User research can be time-consuming and expensive, so it’s best to ask the right questions ASAP and use the right methods to uncover valuable insights. This post addresses common mistakes and provides answers about which questions to use with which method (quantitative or qualitative research) in order to help reduce bias and inform experimentation.
Recently, the Test and Learn Community met with qualitative research subject matter experts Els Aerts, Co-founder of AGConsult, and Justine Erickson, Senior UX Researcher at AnswerLab, to answer the following questions:
- What are the different values to be gained through quantitative vs. qualitative research?
- What should I remember in order to get quality insights from the questions we ask?
- How do I design survey questions?
- Outside of traditional surveys, what are other qualitative methods we can use to understand our customers better?
- What are the risks of asking the wrong questions?
Below I summarize answers to the first two questions, but to get Els and Justine’s full take and the discussion, watch the video here.
Quantitative and qualitative research are both valuable. What’s the difference?
During the TLC session, Els asked a volunteer to describe why qualitative research is important. When a hesitant respondent began to answer, Els interrupted in the middle of his sentence and asked him to answer only using numbers. Els’ impossible ask made everyone chuckle and proved a point: we need quantitative research to know “how many,” and we need qualitative research to know “why.” When gathering data and creating insights, there is value in leveraging both quantitative and qualitative research to get the best results and a holistic analysis.
How do I design survey questions? What to remember to get the most insights from the questions you ask.
Ask yourself, “What am I trying to find out, and is this research method the best way to go?”
This will help identify the best research method to leverage. If your question is primarily behavioral, you can answer it with observational research and quantitative data analysis. For example, “Where do people fall out of the checkout funnel?” can be answered using session recordings and web analytics tools like Google Analytics and Adobe Analytics, whereas “Why do people fall out of the checkout funnel?” requires a different investigative approach. While you can still infer part of the answer by analyzing quantitative data like error rates and time on task, you won’t get users’ thoughts, feelings, and voiceover without conducting a qualitative study and asking them to respond.
Understand mindset as a visitor leaves the site.
Employ exit intent surveys to get the “why” behind task completion. Qualitative research is often used to capture voiceover on pain points and on why tasks weren’t completed, but exit intent surveys, though less frequently deployed this way, can also capture voiceover on delightful experiences and show companies what they should keep doing. This can be as simple as placing an open-ended text field on the thank-you page.
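As a rough illustration (not from the session), an exit-intent trigger can be sketched in a few lines of plain JavaScript: the survey appears once, when the cursor leaves the viewport through the top edge, a common proxy for someone about to close the tab. The `showSurveyModal` stub and the edge-detection threshold here are assumptions, not any specific survey tool’s API.

```javascript
// Hypothetical sketch of an exit-intent trigger; the modal, copy, and
// threshold are assumptions, not a specific tool's API.

let surveyShown = false;

// Pure decision logic, kept separate so it is easy to test:
// show the survey only once, and only when the cursor exits through
// the top of the viewport.
function shouldShowExitSurvey(event) {
  if (surveyShown) return false;
  return event.clientY <= 0; // cursor left via the top edge
}

// Placeholder: in a real page this would reveal a modal containing a
// single open-ended text field, e.g. "What almost stopped you today?"
function showSurveyModal() {}

function onMouseLeave(event) {
  if (shouldShowExitSurvey(event)) {
    surveyShown = true; // never prompt the same visitor twice
    showSurveyModal();
  }
}

// Browser wire-up (commented out so the sketch runs anywhere):
// document.documentElement.addEventListener("mouseleave", onMouseLeave);
```

Keeping the decision logic in its own function makes the trigger trivial to unit test and easy to tune (for example, rate-limiting per session) without touching the DOM code.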
Avoid double-barreled questions.
Double-barreled questions like, “Was this feature interesting and useful to you?” ask two questions with two answers under a single question mark. While we’ll discuss the importance of brevity later in this post, the researcher should either split this into two questions, “Was this feature interesting?” and, “Was this feature useful?” or decide which of the two really matters.
Avoid leading questions.
We see leading questions most often during in-depth interviews, but surveys lead participants too. Let’s say you see someone struggling with a task. Instead of saying, “I see you’re having an issue. What’s going on? What’s happening?” a less leading phrasing would be, “What was easy or difficult about this task?”
Don’t (Like)rt scale
Potentially a hot take from Els and Justine: neither likes to use Likert scales for qualitative research because their results are directional but not actionable. According to Justine, “When you start using scales… there is a danger of the interpretation on the back-end where people want to hang their hat on a number… you shouldn’t be using numbers in that way, because it is really just directional.”
Consider survey fatigue
We know survey fatigue and fallout are real. Justine recommends prioritizing questions and grouping them into “must haves” and “nice to haves” to keep surveys short and participants engaged.
Always run a guinea pig test
Moderators and designers should make sure questions are user-friendly. Running mock interviews and asking colleagues and people outside your industry to participate before opening the survey to your true audience will help you create clear, understandable questions. A great recommendation from Els: have people of different age groups and genders look over the survey to make sure no words carry different meanings or interpretations.
In addition to the points above, Els and Justine discuss a handful of qualitative research methods during the session for listeners to test out. Watch the video to hear their full take (Spoiler: Justine’s favorite method is in-depth interviews).
Approach qualitative research systematically to reduce bias.
People often discredit qualitative research as easy compared to quantitative. But asking the right questions is difficult! Survey tools have made creating and distributing surveys so easy that people conflate that ease with the ease of good survey design and analysis. Because of bias, however, you have to be systematic in your methods. At Search Discovery, we develop and vet our qualitative research questions using a systematic approach to eliminate as much bias as possible.