3 Survey Pitfalls to Avoid

Surveys are popular for gaining feedback on a program or service. With a survey, you can typically gather and analyze data quickly, relative to focus groups and interviews. Plus, developing and administering a survey is often more cost-effective, a benefit for the budget-conscious evaluator.

Yet, developing a good survey requires thoughtfulness and skill. Here are three pitfalls to avoid when working on yours.

Frequently Changing Response Choices

A common type of survey question is a Likert item. A Likert item is a statement followed by response choices presented on a scale. Here is an example:

Please rate your agreement with the following statement.

My motivation for accomplishing my goals increased as a result of participating in this workshop.

  • Strongly disagree
  • Disagree
  • Neither agree nor disagree
  • Agree
  • Strongly agree

When it comes to Likert items, we have a wide range of options for our scales. The scale above measures agreement. We can also measure frequency, importance, likelihood, and value with Likert scales.

When we change scales frequently throughout our survey, it can be jarring for our survey participants. Each time they are presented with a new scale, they have to pause and consider it before responding.

If you do have to change scales mid-survey, you can signal the change by inserting a page break and updating the instruction text to help participants transition to the new scale. 

Using Double-Barreled Questions

Have you ever felt confused or torn when responding to a survey question? Some of that confusion can come from a poorly worded question. Double-barreled questions are the most common error I see in survey construction. A double-barreled question seeks your feedback or perspective on multiple topics at once, often making it difficult to answer. Here is an example:

How well did the presenter understand their topic and keep you engaged?

  • Not at all well
  • Somewhat well
  • Very well
  • Extremely well

What if the presenter understood their topic but failed to keep you engaged? Or what if they were very engaging, yet lacked knowledge about their topic? How would you respond? 

When you notice a double-barreled question, the solution is to split it into two or more questions.

  1. How well did the presenter understand their topic?
  2. How well did the presenter keep you engaged?

The addition of one question will not drastically alter the length of your survey, and your survey participants will thank you for reducing their confusion.

Making the Survey Too Long

When designing a survey to capture impact and process data, it’s easy to let the survey’s length get out of hand. When participants feel like a survey is dragging on, they are likely to experience fatigue and skip questions, both of which can compromise the quality of our data.

A rule of thumb I learned a while ago and continue to apply today is this: seven questions per minute. When it comes to true/false, yes/no, multiple choice, and Likert items, people can typically respond to seven of those questions in one minute. Open-ended questions take more time, sometimes one to three minutes each, depending on what you’re asking and the level of thoughtfulness required in the response.
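If you like to sanity-check your drafts, the rule of thumb above is easy to turn into a quick calculation. Here is a minimal sketch in Python; the function name and the two-minute midpoint for open-ended questions are my own assumptions, not figures from this article.

```python
def estimate_survey_minutes(closed_questions, open_questions,
                            minutes_per_open=2.0):
    """Rough completion-time estimate for a draft survey.

    closed_questions: count of true/false, yes/no, multiple
        choice, and Likert items (~7 answered per minute).
    open_questions: count of open-ended items, at an assumed
        average of `minutes_per_open` minutes each.
    """
    closed_time = closed_questions / 7  # ~7 closed items per minute
    open_time = open_questions * minutes_per_open
    return closed_time + open_time

# Example: 21 closed-ended items plus 3 open-ended items
print(estimate_survey_minutes(21, 3))  # 21/7 + 3*2 = 9.0 minutes
```

A draft that comes out well past your time target is a signal to cut or consolidate questions before you pilot it.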

How long should your survey be? Aim for 10–15 minutes total. If you’ve run a 9-month program with numerous activities, you may need a longer survey if it’s your primary data collection method for your end-of-program evaluation. But be aware that survey length matters when you hope to collect quality data.

Help for Staying out of the Survey Pit

If you need help improving or designing a survey, we would love to help. Schedule a free discovery call, and let’s talk! 
