Breaking the Survey Illusion: Sometimes It’s Better Not to Ask

This post was published on the now-closed HuffPost Contributor platform.

Once upon a time, everyone made everything under the theory of “if you build it, they will come.” Big organizations created products with little understanding of their customers, put them in the market, and then lived with their luck – or died by it.

Sometimes, big ones – companies selling to tens or hundreds of thousands of people – did something called “focus groups,” bringing in twenty or thirty customers earlier in the process. Considering the numbers involved, this would have been something akin to asking Uncle Larry to speak on behalf of everyone in Cincinnati. And I’m not saying Uncle Larry necessarily lived in Cincinnati. But I digress.

Anyway, one beautiful day, a bright light shone over the corporate business world, and a Deep Booming Voice said, “You know, you could leverage synergistic opportunity by pre-determining a value-based proposition driven by the needs of the customer in advance of, or in parallel to, your development investment.” Then the voice went on and on with a bunch of acronyms that didn’t really make sense, and most of us tuned out. But the point had been made: what customers want should be an input to the development process, not an outcome of it. And, like most points made by a Deep Booming Voice, it was a good one, and people were keen on it.

Around that time, by design or coincidence – and, bearing in mind the cartoonish simplification I’m employing here – a bunch of guys in a bunch of garages invented a bunch of silicon-based devices that finally allowed the common layperson to count to a bazillion, fix typos without white-out in letters without envelopes, and distribute pictures of cats to huge groups of strangers. And these newfangled “computers,” as they were called, turned out also to be pretty good at collecting and summarizing the answers to questions asked to masses of people.

Forget Uncle Larry! Suddenly, for the first time, it was possible for any of us to ask EVERYONE in Cincinnati the same multiple-choice question, and then summarize all those answers on one graph. In light of what the Deep Booming Voice had told us – which, by the way, we’d immortalized into a 71-page PowerPoint executive summary plus 500 back-up slides – this glorious new capability ushered in the Age of the Customer Survey.

The very next day, efforts at creating an egg-scented, moth-repellent, petroleum-based home air freshener were abandoned. The day after that, automotive dealers began in unison to hand out questionnaires to customers while simultaneously begging them to respond only one way. And then, on the third day, the entire rest of the corporate world slipped into a semi-lucid dream state in which they came to believe that if you just ask enough people the same question, you’ll get the right answer.

I’m here, respectfully, to wake you up.

I have no booming voice and no golden halo. I’m just a guy who discovered you on the ground on the side of the road and is worriedly prodding you with a stick. “Hey. Are you OK? Are you awake? You do know that you can’t just throw any question out to a crowd and get a useful answer, right? Right??”

“Crowdsourcing,” you murmur as you writhe and struggle for consciousness. “Hive mind.” “Collective intelligence.”

I know, I know, I get it. I’ve seen the stories and studies, I know it’s real and it can work. Sometimes. But not every infinite group of monkeys-on-typewriters comes up with Shakespeare. Some of them produce gibberish – or the pitch for Jersey Shore.

Just lie there for a minute while I give a few examples.

First, people don’t know when they’re learning. Giving a survey at the end of a class to find out “what people thought” is so common as to be cliché at this point. But in experiential learning (a fancy way of saying “teaching through simulation”), people’s answers don’t always correlate with their learning outcomes. Negative answers do; people don’t learn much if they’re miserable. But positive answers are a different story. Let’s say Learner One leaves the training saying it was “OK, I guess, mostly stuff I already knew.” Learner Two leaves saying it was “the best thing ever, and I learned so much!” Traditionally, we prefer everyone answer like Learner Two. But Learner One is basically saying “yeah, I knew that, if that’s all you want me to do, sure, fine.” Learner Two is saying “wow, I discovered how little I know.” Guess which of those people is more likely to try something new in a stressful situation? If you design a training to produce the “wow” result, you’re designing away from real impact.

Second, people don’t know how to fix problems. Let’s say you have a communication problem between company divisions. You survey all the employees with a list of possible fixes, and ask what would help most. Should we hold monthly brown-bags, create a job rotation program, publish a newsletter? The answers you get will be based first on personal preference, and second on opinion. The question of whether each solution is likely to solve the problem will be at best a distant third; expertise in psychology and sociology is not conferred upon receipt of an online questionnaire. If you decide what to do based upon survey results, your action may seem data-driven, but in reality it will be a crapshoot – even more so if the effort spent analyzing survey data would otherwise have gone into clarifying the problem and researching viable solutions.

Finally, people don’t know what makes them happy. Ask anyone: “would more money make you happy?” You know the answer you’ll get – you and I would answer the same way. The thing is, most of us are wrong. Above a certain threshold, pay has little impact on happiness. (If you’ve ever gotten a big raise, you know this – you felt great for a little while, then the happiness abated as nagging doubts about being underpaid returned.) And money isn’t the only thing we’re wrong about; goals and accountability are another example. Rare is the employee who will stand up to management and say, “if you would just hold me accountable more strictly to attainable yet aggressive goals, my level of happiness would go up!” Though most of us would shy away from stricter oversight, when done well, it can often increase job satisfaction. Here again, if you try to design your employee engagement approach based upon surveys in which employees tell you what they want, you’re in trouble.

All of which brings us back to me, on the side of the road, poking you with this stick. Hey mister. Hey lady. You can’t design your training programs, organizational solutions, or employee engagement based upon survey results. Instead, start with clear problem statements, and research-based solutions likely to create the outcomes you desire. If you must incorporate surveys, use them sparingly for cues on things like receptivity and implementation strategy – questions where opinion matters. But be wary of asking the wrong kinds of questions. And, don’t rule out the possibility of abandoning the e-survey altogether.

Remember: the Deep Booming Voice never said surveys were mandatory. And besides, at this point, we’re all a little overwhelmed with questionnaires. The Age of the Customer Survey, as it turns out, is a lot of work for all of us. One less survey might be the best thing you could do for your customers.

But don’t take my word for it – ask Uncle Larry.
