Create a survey to test hypotheses
When you sit down and talk to potential users, you’re in the perfect setting to learn about the problem you’re tackling with your idea. Interviews are especially well-suited to answering questions about why and how something happens; in other words, they’re a qualitative research method.
But if you’re working on a complex problem, in a complex vertical (like healthcare, finance, education, or even a habit-building product), and your MVP budget is limited, you may need to dig deeper to understand which pain points you should prioritise. This is where a quantitative research method, like a survey, may come in handy.
Surveys aren’t a fix-all
Surveys aren’t a magic validation tool: answers don’t become meaningful and ‘objective’ just because they come with numbers. In fact, it’s harder to build a good survey than a bad one, and the only way to verify you’ve done a good job is to check your results against a secondary source of research. Otherwise, you risk building on bad survey insights and watching the product flop.
Run a survey when:
- Your interviews suggest differences that would let you break your audience down into further ‘buckets’, but you aren’t sure how big those buckets are or which one to focus on first.
- You want to better understand the audience’s attitudes around the problem you’re researching.
- You want to measure the severity and frequency of multiple pain points you’ve identified in the interviews, so you can prioritise them.
Skip the survey when:
- You have WHY and HOW questions, which are better answered through more interviews.
- You can’t think of hypotheses to test.
- You’re trying to delegate a hard decision to your audience instead of making it yourself.
- You could get the information another way (desk research, analytics, white papers, etc.) instead of running a survey.
What you need:
- The insights from audience interviews.
- A specialist onboard (growth marketer, UX researcher, or product manager). If you’ve never run a survey before, you’ll need someone to look over your survey and safeguard you against gathering biased insights.
What to do:
Decide on 1-3 hypotheses to test from the interview insights you’ve gathered.
Write down the screener questions of your survey.
Screener questions make sure the people responding to your survey are part of the audience of your future product. If one of your survey goals is to segment your audience into buckets to identify which to target first, screener questions are doubly important:
- To screen out people who don’t belong to your audience in the first place.
- To place the responder into their respective audience bucket.
What you’ll want to cover in screener questions is usually sociodemographic information: where someone works, their education, income, marital status, age, location, gender, and any other details of that type that might be relevant to you. Don’t ask for this kind of personal information if it’s not relevant.
Write down the research questions of your survey.
To write the questions for a survey, start from the answers you want to get. The hypotheses you set out to test already give you a head start on those answers.
So, let’s start out from one of those as an example of the process.
“People managing small and medium businesses struggle more with managing finances in a regular month than freelancers do.”
There are several variables here, so let’s break it down a little. First, there’s the screener question for job position:
This responder works as:
a. SMB owner
b. Freelancer
c. Employed
d. Something else
The answers tell us how the audience splits: X% work as SMB owners, Y% as freelancers, Z% are employed, and W% do something else.
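Once responses come in, turning the screener answers into segment shares is a simple tally. A minimal sketch in Python (the answer list here is made-up illustrative data, not real results):

```python
from collections import Counter

# Illustrative screener answers, one per responder (hypothetical data)
answers = [
    "SMB owner", "Freelancer", "SMB owner", "Employed",
    "Freelancer", "SMB owner", "Something else", "Employed",
]

counts = Counter(answers)
total = len(answers)

# Share of each segment, i.e. the X%/Y%/Z%/W% breakdown
shares = {segment: count / total for segment, count in counts.items()}

for segment, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{segment}: {share:.0%}")
```

The same tally also answers the bucket-sizing question from the interviews: the biggest shares tell you which segments are worth a tailored look first.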
Then there’s what we’re actually trying to measure: ‘struggle with managing finances’. To pin down what that means, we’d refer back to the interview insights. Since a survey is best at measuring frequency and severity, let’s say struggle = time spent + money spent. And because we need two answers, we’ll ask two separate questions:
This responder spends X time on managing finances in a regular month.
a. Under 8 hours
b. 8 to 16 hours
c. 16 to 40 hours
d. 40 to 80 hours
e. More than 80 hours
This responder spends X$ on managing finances in a regular month.
a. Under 50$
b. 50 – 150$
c. 150 – 300$
d. 300 – 600$
e. More than 600$
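With both the screener answer and the bucketed answers per responder, testing the hypothesis comes down to comparing segments on an ordinal scale. A sketch for the time question, using made-up data (the money question works identically):

```python
# Ordered bucket labels, matching the time question above
HOURS_BUCKETS = ["Under 8 hours", "8 to 16 hours", "16 to 40 hours",
                 "40 to 80 hours", "More than 80 hours"]

# Illustrative responses: (segment, hours bucket) — hypothetical data
responses = [
    ("SMB owner", "16 to 40 hours"),
    ("SMB owner", "40 to 80 hours"),
    ("SMB owner", "8 to 16 hours"),
    ("Freelancer", "Under 8 hours"),
    ("Freelancer", "8 to 16 hours"),
    ("Freelancer", "Under 8 hours"),
]

def share_above(segment, threshold_bucket):
    """Share of a segment answering strictly above a given bucket."""
    threshold = HOURS_BUCKETS.index(threshold_bucket)
    seg = [r for r in responses if r[0] == segment]
    heavy = [r for r in seg if HOURS_BUCKETS.index(r[1]) > threshold]
    return len(heavy) / len(seg)

smb = share_above("SMB owner", "8 to 16 hours")
frl = share_above("Freelancer", "8 to 16 hours")
print(f"Spending over 16h/month: SMB owners {smb:.0%}, freelancers {frl:.0%}")
```

If SMB owners consistently land in higher buckets than freelancers on both time and money, the hypothesis holds; if not, that’s exactly the kind of prioritisation insight the survey was for.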
That’s how the process goes for all the questions of your survey. Try to keep the number of questions to a minimum, and leave out questions about things you could find out from other sources.
Have the specialist review your research hypotheses and survey to make sure you’ll gather the right insights. For more insights on writing good survey questions, check out the dedicated section.
Set up the survey online
You can use a tool with a free tier, like Google Forms, SurveyMonkey, or Typeform.
If you’re unsure how you’ll gather responses from your audience organically, you can also set up your survey on a platform like Pollfish, where you can pay to gather responses from a preselected audience.
Try to gather at least 150 responses from your core audience.
If your product has more than one core audience, you’ll need a tailored survey for each of them, and more than 120 responses from each type.
Have the specialist get insights out of the answers you gathered.
While you work
Good rules of thumb when writing survey questions
- Go from general questions towards specific questions.
- When writing the questions and answer options, use simple, to-the-point language that’s familiar to the audience. Reusing or simplifying expressions you’ve heard in the interviews is not a bad thing.
- Ask about one thing at a time. Don’t combine two or more questions into one: the responder won’t know which to answer, or will answer only one, and you won’t be able to interpret the data.
- People like to be liked. If you load your questions or response options with assumptions or preferred answers, responders will tell you what they perceive you want to hear.
- Humans are very good at being… well, human. That means we’re bad at recalling the distant past, and whatever we say about our future intentions should be taken with a grain of salt. If you want reliable predictors for future behaviours, ask if the behaviour has happened in the recent past.
- Closed questions are easier to answer, especially if you offer the right number of options: balanced to cover both extremes, and enough to cover all likely answers without being overwhelming.
- If you’re writing a multiple-choice question, make sure the answers are mutually exclusive and cover all possible categories. If you’re unsure about that, a catch-all ‘other’ option will save the day.