Surveys – good for reinforcing biases

It’s hard to create a good survey. Even if you can write unbiased questions, it’s the biases you don’t think of that will get you into trouble. Make survey results actionable by focusing on behavior, not speculation.

The two staple methods of marketing folks – focus groups and surveys – don’t much help us get to the root of user behavior. Both methods encourage speculation rather than capture real behavior. Because what users say differs greatly from what they actually do, any method where you aren’t recording direct or indirect user behavior can lead you astray.

Surveys can, however, give you fast answers to your questions, especially when you don’t have direct access to users you can observe or interview. To create a good survey, learn how to avoid biases, focus on users’ past behavior, and don’t invite them to speculate about the future.

Be careful who you recruit

The easiest way to find survey respondents is probably to use your existing customers – either through a link on your Web site or via an e-mail message.

The problem is, by using this group you have introduced selection bias: the people who respond to your survey will have particular views, behaviors, and experiences simply by virtue of being your customers. For instance, they already know about your product, they know the vocabulary that you use, and they have pre-formed attitudes about your business.

It might be that these are exactly the people you are looking for. For instance, there’s no point asking non-users how satisfied they are with your support offerings. On the other hand, if you are doing research to understand what new features to add, you probably want to cast the net wider and find people who could become users if you addressed their needs.

Create actionable surveys

Make sure you are asking questions about users’ past experiences rather than making them speculate about the future. You can always extrapolate from past behavior to future needs, but it’s hard to have faith in users’ statements about the future. Users often say one thing and then do something completely different.

BAD: Encourages speculation

When do you think you will buy your next phone? 
  [Month]   [Year]   

When you purchase your next phone, will it be:
  [  ] iPhone
  [  ] Android 
  [  ] Other 

Who knows whether this purchase will actually happen? Even if it happens in the approximate time frame the respondent states, their desire to own a flashy new smart phone might be offset by the fact that they have no money. They want the smart phone, they speculate that they’ll get the smart phone, but they may actually end up with a feature phone for a whole host of currently unknown reasons.

GOOD: Records actual behavior

When did you buy your last phone?
  [Month] [Year] 

Was it:
  [  ] iPhone
  [  ] Android 
  [  ] Other

This allows you to plot proportions of each phone type being purchased by your target audience over time, and extrapolate future smart phone growth patterns based on actual user data.

You and your team can have confidence in the data because you know it came from real actions rather than fantasies.
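
If your survey tool can export responses as a CSV file, a few lines of analysis code are enough to draw that plot. Here’s a minimal sketch in Python, assuming a hypothetical export named survey_responses.csv with one row per respondent; the column names are placeholders, not any particular tool’s format:

  # Sketch: chart the share of each phone type purchased over time.
  # Assumes hypothetical columns "purchase_date" (e.g. "2011-06") and
  # "phone_type" (iPhone/Android/Other); adjust these to your own export.
  import pandas as pd
  import matplotlib.pyplot as plt

  responses = pd.read_csv("survey_responses.csv", parse_dates=["purchase_date"])

  # Count purchases per quarter and phone type, then convert each
  # quarter's counts to proportions so the chart shows relative share.
  counts = (responses
            .groupby([responses["purchase_date"].dt.to_period("Q"), "phone_type"])
            .size()
            .unstack(fill_value=0))
  proportions = counts.div(counts.sum(axis=1), axis=0)

  ax = proportions.plot(kind="area", stacked=True)
  ax.set_ylabel("Share of purchases")
  ax.set_title("Phone types purchased over time (survey respondents)")
  plt.show()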

Types of questions to ask

Ask questions that will give you quantitative (numerical) answers. It’s easy to make statements about numbers and proportions. Just be sure that the numbers allow you to say sensible things. The best way to check this is to write a draft of your report before you run the survey.

  • For each statement you want to make, write or sketch out what the answer would look like. Even go as far as to create the graph or spreadsheet that you’ll use for analysis (see the sketch after this list).
  • Now, check that the numbers you’ll get back from your survey will give you the input you need to make those statements.
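
A cheap way to do that dry run is to invent a handful of answer counts and build the exact chart you expect to publish. A minimal sketch, where every number is made up purely for the rehearsal:

  # Sketch: rehearse the report graph with invented data before fielding
  # the survey. All counts below are placeholders, not real responses.
  import matplotlib.pyplot as plt

  fake_counts = {"iPhone": 42, "Android": 35, "Other": 23}  # invented numbers

  plt.bar(list(fake_counts), list(fake_counts.values()))
  plt.ylabel("Number of respondents")
  plt.title("Draft report: phone type purchased (rehearsal data)")
  plt.show()

If the rehearsal chart can’t support the statement you planned to make, change the question now, while it’s still cheap.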

If you ask qualitative questions (you have text fields for answers), work out how you are going to analyze and report the results before you field the survey. It can be disheartening to work through a whole mess of written responses, so make your life easier from the beginning.

  • Do you really need the text answers? Sometimes, user statements can be a powerful persuasive tool, but sometimes the responses aren’t worth publishing.
  • Drop-out rates will be higher with more text boxes. Remember that it takes much longer and much more mental effort for respondents to type a response than to choose from options.
  • Remember that there will be team members who dispute your survey results. Don’t make their job easier by asking wishy-washy questions.
  • Qualify the text answers by grouping them in conjunction with other fields in your survey. For instance, comparing the written responses from new and experienced users might yield interesting training opportunities (a short sketch of this grouping follows the list).
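
For that last point, here’s a rough sketch of what the grouping might look like once responses are exported. The file name and the experience_level and comments columns are hypothetical stand-ins for whatever your survey actually collects:

  # Sketch: print free-text answers grouped by another survey field so
  # they can be read side by side. Column names are hypothetical.
  import pandas as pd

  responses = pd.read_csv("survey_responses.csv")

  for level, group in responses.groupby("experience_level"):
      print(f"--- {level} users ({len(group)} responses) ---")
      for comment in group["comments"].dropna():
          print(f"  * {comment}")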

Quantitative questions tend to give you “what” answers. Qualitative questions tend to give you “why” answers. Although it’s important to answer “why” questions, doing so in a survey doesn’t give you much ability to interpret the answer.

It’s better to ask these questions face-to-face in an interview, or to observe users working in the problem area, to work out “why” something is wrong. Then you can combine the “what” (from surveys) and the “why” (from observation) to understand both the scope of the problem and what to do to fix it.

Don’t lead the witness

It’s easy to create questions that give you the answers you want to hear. It’s harder to create truly unbiased surveys. This goes beyond basic leading questions into the problem of omission.

Consider the following question:

What was the last cell phone you bought?  
  [  ] iPhone 
  [  ] Android 
  [  ] Other 

The problem is that the answers aren’t complete. There is no way to indicate that the respondent does not own a cell phone. This is a simple example, but you can see how easy it might be to omit an option.

A similar problem occurs any time you pre-choose responses for a user:

Which is the top feature you'd like to see 
in the next release?
  [  ] Improved splooge gaskets
  [  ] Reticulated flange restraints
  [  ] Positronic warp drive 

If users are struggling to even get the current software to work, new features may not be at the top of their priority list. They’ll answer the question, but the answer might not guide you down the right path.

To resolve this issue, run a pilot test of your survey questions. Do it on paper. The frustrated scrawls that you get back next to each question tell you where you omitted an answer that respondents feel is important. Even if there are no scrawls, have a brief chat with each of your pilot survey-takers. Their insights after they’ve completed the survey will help you improve it before you release it into the wild.

Update: Read this cautionary tale about who actually fills in online surveys, and their real motivations.