Online surveys: three examples of what not to do!

Fowler (1995) said, “Improving question design is one of the easiest, most cost-effective steps that can be taken to improve the quality of survey data.”

It’s certainly true that no other single action will do as much to improve your survey response rates, and yet every day I see terrible examples of survey questions. Here are just three examples of what NOT to do!

1. Don’t manipulate your data to get a positive result

The classic here involves the Likert scale – the scale that aims to measure opinion. So for example you might ask people how much they agree or disagree with a statement, on a five-point scale. Fine, this is a common survey question type. But hang on – surely there needs to be an equal number of positive and negative answers, right? Not if you’re GfK, who created this survey for Ikea customers! Tut tut: they might have given the client what they wanted to hear, but have they actually created any useful data with this? Something their client can use to drive improvement in their business? I think not 🙂

[Screenshot: GfK’s unbalanced rating scale from an Ikea customer survey]
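If you want a quick sanity check when drafting a scale, it boils down to counting: a balanced scale offers as many negative options as positive ones, with an optional neutral midpoint. A minimal sketch (the function and option wordings are mine, not from any survey tool):

```python
def is_balanced(n_positive, n_negative):
    """A scale is balanced when it offers as many negative options as positive ones."""
    return n_positive == n_negative

# A classic five-point agree/disagree scale:
# Strongly disagree / Disagree / Neither / Agree / Strongly agree
# -> 2 negative, 1 neutral, 2 positive.
print(is_balanced(2, 2))   # balanced

# A skewed scale like the one criticised above:
# Poor / Good / Very good / Excellent / Outstanding
# -> 1 negative, 4 positive.
print(is_balanced(4, 1))   # not balanced
```

Trivial, yes – but it’s exactly the check that the survey above fails.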

2. Don’t use ‘sometimes’ or double negatives in your questions!

This seems basic, right? And yet I see this all the time. Imagine you’re asked to agree or disagree with the following statement: “I’m sometimes overworked”. If you agree, fine. But what if you choose ‘disagree’? Does this mean you’re NEVER overworked, or ALWAYS overworked?
Using ‘sometimes’ and double negatives can get you into an awful lot of trouble. Just look at this example asked by the American Jewish Committee in 1992. Needless to say, their national survey led to some alarming and untrustworthy results!

[Screenshot: the American Jewish Committee’s double-negative survey question from 1992]
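The usual fix for the ‘sometimes’ trap is to drop the agree/disagree framing altogether and ask about frequency directly, so every answer has exactly one reading. A sketch of the two framings side by side (the wording and options here are illustrative, not taken from either survey above):

```python
# Ambiguous: disagreeing with "I'm sometimes overworked" could mean
# "never" or "always" — the respondent's intent is lost.
ambiguous_item = {
    "statement": "I'm sometimes overworked",
    "options": ["Agree", "Disagree"],
}

# Unambiguous: ask about frequency directly, so each option
# has exactly one interpretation.
frequency_item = {
    "question": "How often do you feel overworked?",
    "options": ["Never", "Rarely", "Sometimes", "Often", "Always"],
}

print(frequency_item["options"])
```

With the frequency version, ‘Never’ and ‘Always’ are separate answers rather than two possible meanings of a single ‘Disagree’.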

3. Don’t ask unrealistic questions

Check this one out, from Virgin Media. I’m a Virgin Broadband customer, so they sent me a survey. But how on earth am I supposed to know how Virgin’s broadband speed compares with other suppliers?! They know I’m a customer of theirs – ipso facto, I have no current experience of other providers. This question is a waste of my time and produces invalid, unhelpful data.

[Screenshot: Virgin Media’s question asking customers to compare broadband speeds across suppliers]
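The standard remedy is a screening question with skip logic: ask whether the respondent has recently used another supplier, and only show the comparison question to those who have. A minimal sketch of that routing (question wording is hypothetical, not Virgin Media’s):

```python
def next_question(used_other_supplier: bool) -> str:
    """Route the respondent: only ask for a comparison they can actually make."""
    if used_other_supplier:
        return "How does your current broadband speed compare with your previous supplier's?"
    # No basis for comparison — skip straight to a question they can answer.
    return "How satisfied are you with your current broadband speed?"

print(next_question(True))
print(next_question(False))
```

Most survey platforms offer this kind of branching out of the box; the point is simply that the comparison question should never reach someone with nothing to compare.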

For more examples of what not to do – and how to get survey design and analysis right – why not come along to the next training event in Bristol on 11th December? More details available here.