Beware The Survey ...
This is not to say all surveys are bad or useless. But sometimes results can be misleading. Leading questions can skew responses to produce a certain result, or a faulty data-collection methodology can make a survey's results unrepresentative of the larger truth.
'Are you still using Windows?', for example, is a leading question with an assumption built in, i.e. that you use, or at least used, Windows in the first place. The statistical results from such a question are not particularly meaningful because it doesn't take other operating environments into account.
Vendor "surveys" in particular are notoriously skewed, says Joshua Greenbaum, a principal at Enterprise Applications Consulting. Siebel Systems is an example of a company that uses customer satisfaction surveys to great effect, quoting numbers from leading questions in its quarterly earnings reports as proof that its customers rate the company very highly. (See Greenbaum's Datamation column on this subject.)
"I have a survey research background, and what I found out is that to do anything that is methodologically pure is virtually impossible in this market," he says.
Even IT research firms, which live and die based on reputation, still have an underlying incentive to sell services to customers. The more problems uncovered, the more services can be sold, says Greenbaum. In Greenbaum's experience, 90% of business surveys are methodologically flawed.
"And that may be charitable," he says. "But to paraphrase Sir Winston Churchill, 'Democracy is the worst system possible unless you consider the alternatives.' The point: ... in 90% of cases, this is the best data you can get. Some data is generally better than no data."

So, with this in mind, here are a few tips from two survey experts on how to interpret results:
Frank Newport, editor in chief of the Gallup Poll, perhaps the most easily recognizable survey organization in the country, suggests first going to a subject-matter expert, just as you would to decipher financial data or your taxes.
But since most companies probably don't have such an individual on staff, the next best step is to be a little skeptical until you've uncovered a few basic facts: Who conducted the survey, and who paid for it?
Reputation is key here. Look for surveys conducted by analyst houses you know and respect, then ask yourself, 'What is the company's track record? Has it been historically accurate?' "Consider the source," says Newport.
Rob Daves, director of Strategic and News Research for the Minneapolis-St. Paul Star-Tribune and director of the paper's annual statewide Minnesota Poll, suggests viewing statistics with an eye toward the survey's sample, i.e. who was polled, how it was conducted, and when.
"Question wording, and then the definition of the population under study, that's what I'm talking about," says Daves.
Surveys commissioned by groups or corporations (a common IT industry practice) should be viewed with care. If the results don't match what the sponsoring organization is looking for, they could be withheld. Daves has seen this firsthand more than once.
Also, look for context: a number standing alone is just that. Numbers need to be viewed against the backdrop of who participated in the survey and why they participated ... the last is a question you will probably never get answered, but a good one, along with many others, to ponder before you accept results at face value.
While useful tools, surveys are just that, tools, says Greenbaum. A survey is only as good as its methodology. So the best advice is to compare the results of any survey against the backdrop of your own experience and then do a little digging. Get the questions and read them, and then see if you can talk to some of the survey's participants to get their take on the results.
"These surveys should be guideposts, but you need a better map than just a collection of surveys" to base decisions on, says Greenbaum.
Want to discuss the issues raised in this story? Take it over to our IT Management Forum.