Do you tell people if their survey is bad?

Last week, I had an internal debate over whether I ought to reach out to a colleague on LinkedIn who had shared a survey from the Massachusetts Institute of Technology that was… well, it left a lot to be desired. Do I message them to tell them it is getting them garbage data? Or, do I just let it go? I always feel weird correcting people — like I’m being a jerk, even though I’m doing it out of the goodness of my heart.

I turned to TikTok for some informal information gathering, and someone said: “To me, it is meaner to let them attempt to collect data and then learn it isn’t usable. But start with asking if they are open to feedback.”

Wise commenter, huh? So, I reached out. Much to my surprise, they said they would appreciate feedback. And then, much to my disappointment (but not surprise), after I spent an hour redoing their survey for them (for free - again, goodness of my heart over here!), all I heard was crickets.

So that my hard work doesn’t go to waste, I’ll give y’all a sneak peek into the survey that may or may not (but definitely should) get a full makeover.

Below is the question that first caught my eye:

This question has a few things wrong with it:

  1. The “if” clause — this suggests that the question may not apply to everyone taking the survey. In this case, they were looking for teachers, school leaders, parents, and students to take the survey. A good survey only shows people the questions that are relevant to them. Don’t rely on an “if” to signal someone to skip a question; that’s lazy design.

  2. The question is double-barreled — it asks people about two different things, the amount and the quality of professional development, and forces them to give a single response. What if there isn’t enough professional development, but it is really good, or vice versa? A good survey asks about only one topic per question.

  3. The response options are all over the place — there seems to be a “core” scale from Poor to Excellent (not my favorite because it uses more advanced language), but then there are two bonus options: one about whether there even was professional development, and then “Unhelpful or counterproductive” — is that not redundant with “Poor”? You always want to use a clear and intuitive set of answer choices.

Y’all, this question is a mess. And, there were many more like it.

When I revised the survey for them, I did a few things.

  • First, I set up survey logic so that only people who work for a school system will see this question about professional development.

  • Then, I split their question into two: one asking about the amount of professional development on generative AI people have received, and a follow-up asking about the quality of that professional development, shown ONLY to people who indicated that they had attended at least some professional development on generative AI (see the sketch after this list).

  • I also noted that they ought to decide on a timeframe for people to reflect on when responding to these questions — is it the past year? Is it EVER in their whole life?
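
Since branching logic is easier to see than to describe, here is a minimal sketch of the revised flow in Python, assuming a simple console-style questionnaire rather than whatever platform the actual survey runs on. The role names, question wording, timeframe, and answer options below are my own stand-ins for illustration, not the wording I sent to the survey’s authors.

```python
# Hypothetical sketch of the revised professional-development (PD) section.
# Roles, wording, timeframe, and answer options are illustrative stand-ins.

ROLES_IN_SCHOOL_SYSTEM = {"teacher", "school leader"}  # parents and students never see PD questions

AMOUNT_OPTIONS = ["None", "A little", "Some", "A lot"]
QUALITY_OPTIONS = ["Not at all helpful", "Slightly helpful", "Moderately helpful", "Very helpful"]


def ask(prompt, options):
    """Print a question, list its answer choices, and return the chosen option."""
    print(prompt)
    for i, option in enumerate(options, start=1):
        print(f"  {i}. {option}")
    choice = int(input("Your answer: "))
    return options[choice - 1]


def run_pd_section(role):
    """Show the PD questions only to school-system respondents, one topic per question."""
    responses = {}

    # Survey logic: people outside the school system skip this section entirely,
    # so no question needs an "if this applies to you" clause.
    if role not in ROLES_IN_SCHOOL_SYSTEM:
        return responses

    # Question 1: amount only, with an explicit timeframe.
    responses["pd_amount"] = ask(
        "In the past 12 months, how much professional development on generative AI have you received?",
        AMOUNT_OPTIONS,
    )

    # Question 2: quality, asked ONLY of people who received at least some PD.
    if responses["pd_amount"] != "None":
        responses["pd_quality"] = ask(
            "How helpful was that professional development?",
            QUALITY_OPTIONS,
        )

    return responses


if __name__ == "__main__":
    role = input("What is your role? (teacher / school leader / parent / student): ").strip().lower()
    print(run_pd_section(role))
```

Notice that gating the quality question on the amount question does double duty: nobody needs an “If you have received…” clause, and the “there was no professional development” bonus option disappears entirely.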

These are not difficult changes to make — but they do require that you take the time to carefully review your survey. The fact that a survey in such terrible shape went out from a major education institution that wants to use this data to inform its future research is shocking, embarrassing, and appalling.

Thoughtful survey design is critical. We all should master it. Just sayin’.
