Do your survey question stems match your answer choices?

Today’s question makeover is for The Scout Guide (TSG), a city guidebook dedicated to highlighting local businesses. TSG considers itself a curator of the best a city has to offer. There may be one in your city to check out!

I’ve been featured in TSG in Charleston twice, so naturally, it was disappointing to see the national headquarters send out a bummer of a survey. Love that they want to collect data; hate that they aren’t putting the required intentionality into their survey design.

Below is one of the questions that TSG sent out to people on their listserv last week:

Seems simple enough: they want to know how people are using their guidebooks. They have a nifty little table of ways you might use the guide, ready to be filled out.

Any thoughts on why this may not be a good question? By “good,” I mean it isn’t a question that will get The Scout Guide information they can use to better understand why people pick up their guide(s).

Let’s take a look.

Every survey question is made up of two main parts: the stem and the response choices. Good survey questions have response choices that match, or "fit," the stem. In this case, we have a mismatch. The proper answer choices for a "How do you use The Scout Guide?" question would be something like: I use it to find restaurants, I use it to learn about local businesses, etc. Really, you should be able to read the question stem and answer choices together as a sentence.

“How do you use The Scout Guide to find out about local events?” … I open it and read. See the problem?

The answer choices are "yes," "no," and "sometimes." Yes/no would be a good match for a question that started "Do you use The Scout Guide to…," and "sometimes" would fit a frequency question: "How often do you use The Scout Guide to…?"

In short: I’m not sure WHAT they want to learn based on this stem and response choice combo. To add insult to injury, the language is vague, which makes the question difficult to answer and the responses difficult to analyze.

Let’s change it up for them:

Can you see the difference? In this first option, we ditch the yes/no/sometimes responses entirely and simply ask people to check off the various ways they use The Scout Guide. It’s simple and clear. If they wanted to get fancy, they could ask follow-up questions about each of the "uses" someone selected.

Depending on how The Scout Guide wants to use the data (that is, their goals for the survey), they may instead want to ask people for the top 3 reasons they use The Scout Guide. A ranking question lets you report things like: 70% of people said finding local restaurants was the top reason they use our guide. A select-all-that-apply question, by contrast, would sound like: 85% of people use our guide to find local restaurants.
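If it helps to see the distinction in practice, here is a minimal sketch of how the two question formats get tabulated differently. The response data and category names are made up for illustration; they are not actual TSG survey results.

```python
# Minimal sketch: tabulating "select all that apply" vs. ranking responses.
# All data below is hypothetical, for illustration only.
from collections import Counter

# Select all that apply: each respondent checks every use that applies.
select_all_responses = [
    {"find restaurants", "learn about local businesses"},
    {"find restaurants"},
    {"find local events", "find restaurants"},
    {"learn about local businesses"},
]

# Ranking: each respondent orders their top 3 reasons (first = most important).
ranking_responses = [
    ["find restaurants", "find local events", "learn about local businesses"],
    ["find restaurants", "learn about local businesses", "find local events"],
    ["learn about local businesses", "find restaurants", "find local events"],
    ["find restaurants", "find local events", "learn about local businesses"],
]

n = len(select_all_responses)

# Select-all reporting: "X% of people use our guide to ..."
checked = Counter(use for resp in select_all_responses for use in resp)
for use, count in checked.most_common():
    print(f"{count / n:.0%} use our guide to {use}")

# Ranking reporting: "X% said ... was their TOP reason."
top_picks = Counter(resp[0] for resp in ranking_responses)
for use, count in top_picks.most_common():
    print(f"{count / len(ranking_responses):.0%} said {use} was their top reason")
```

Same topic, two different statistics: one tells you how widespread each use is, the other tells you which use matters most.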

Your survey goals drive survey design. So again, this is a case where it would be so fun and useful to sit down with the owners of The Scout Guide and ask them what they want to learn from this survey. Certainly, we could craft a better set of questions to get them really useful data!

If you want to get these makeovers and more in real-time, sign up for my weekly newsletter here.
