March 10, 2021
How to Design User Surveys that Get You the Results You Want
We love user research, but even we can admit surveys need to start getting their act together. Too many user surveys are long and full of vague questions that are hard to answer, or just seem pointless altogether.
It’s not just bad for the people taking the survey. Long surveys get fewer responses. Vague surveys deliver unhelpful information.
So, if you want to make sure you’re not wasting your time (and someone else’s) with your user surveys, you need to write the user surveys you’d want to take—something brief, clear, and valuable to the project.
Define your survey’s goals
“If you aim at nothing, you’ll hit it every time,” as the cliche goes. When it comes to surveys, if you aim at nothing, you’ll end up with a lot of data to sift through, no clear direction moving forward, and a burned list of survey participants who probably aren’t willing to give you a mulligan.
Before you worry about specific questions or how you’ll get participants, nail down a plan for your survey and what you want out of it.
Align surveys with the end goal of a better user experience
Even if your project focuses on improving a single feature of an app, most users don’t actually care about any one feature and how it works. Users want a better, more useful experience. That’s it.
So, if you’re working to improve a chat function in your project management software, for example, a great chat tool is not your end goal. Your end goal is to make a project management platform that enables teams to collaborate on projects faster and with more transparency.
That goal of a more collaborative platform should be front and center as you design your survey. Instead of only asking questions about what people like about chat tools, you could ask questions that help you understand the barriers to collaboration in your current software.
Start with the big picture, and you’ll be on track to write a survey that provides deeper and more useful insights.
Plan the decisions you will make based on survey results
Define the decisions you plan to make after you get survey results. This may seem obvious: when you learn what’s wrong with your chat tool, you’ll add features to make it more valuable. Sometimes it really is that simple. But too often, big, vague goals for survey results lead to biased or useless questions.
For example, if you’re trying to decide whether to prioritize improvements or features for your chat tool, don’t just leap into your survey and ask, “What’s wrong with our chat tool?” That question is vague and open-ended, and it doesn’t lead to a decision being made at the end of the survey period.
Instead, decide ahead of time that you’re going to use survey feedback to prioritize an existing list of improvements you know you need to make (hopefully sourced via other user research). This will help you stay on target and write questions that give you actionable information.
Write (and rewrite) your survey questions
Each survey question should tie directly to the goals you established in step one to give the kinds of results you need.
To make that connection between goal and question, work backwards. In a hypothetical scenario where X% of users give you a response of Y, will that give you information you can actually use to make the decision you need to make?
Once you’ve made that decision about the basic shape of your questions, analyze and revise each question to make sure they’re unambiguous and user-friendly. Surveys should be short, simple, and easy to understand—there are already enough barriers between you and user feedback without making it harder for someone to respond.
Make questions easy to understand
You’re already working with a thin attention margin—if you make it a chore for participants to understand your questions, you’ll lose them. Avoid this by revising all your questions for simplicity after you write them.
- Write simply. If you can’t ask your question in an easy-to-understand way, you probably don’t know what you’re asking well enough yet and should rethink. (Use the Hemingway app to analyze the simplicity of your questions.)
- Avoid jargon. If your questions rely on technical terms, industry acronyms, or vague corporate-speak, try again. Good user experience (UX) is simple and accessible; your questions should be, too.
- Keep it human. If your questions sound like an AI wrote them, engagement will suffer. By all means, take precautions to avoid bias, but don’t let that stop you from inserting a little warmth into your writing.
Additionally, it doesn’t matter how simple your questions are if there are too many of them. As you write and revise your questions, cut any questions that may be redundant or tangential to your goal.
Don't leave anything open to interpretation
To get usable data from your survey, you need to get very specific. In our article 3 Rules for Writing Effective Survey Questions, we outlined four common pitfalls you should watch for when revising survey questions.
- Reduce ambiguity. Make your question as specific as possible so you can get the answer you need—and then consider how a user could still misinterpret it. Analyze from all angles, then revise. (Consider the difference in clarity between “How likely are you to recommend our marketplace to friends or family members?” vs. “How likely are you to recommend our marketplace to friends or family members as a place to sell products?”)
- Look out for double-barreled questions. Don’t try to ask two questions at once, or you won’t have a clear idea of which question the user is answering. Start by looking for the words “and” or “or” in your questions.
- Avoid overlapping answer choices. With multiple-choice questions, make it very clear how each possible answer is different. You don’t want a user waffling between two equally relevant answers (such as a 35-year-old trying to decide whether to check “18-35” or “35-50” for their age group).
- Avoid incomplete answer choices. Multiple-choice answers should provide as many specific options for the question as possible, or offer an “other” or “none of the above” category. Use these kinds of questions judiciously, though—in some cases, it’s better to give an open-ended question. If nothing else, at least prompt users who answer “none of the above” to give an optional explanation.
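To see how mutually exclusive, exhaustive choices resolve the “18-35 vs. 35-50” ambiguity, here is a small Python sketch using half-open age bands plus a catch-all option (the specific bands and labels are illustrative, not a recommendation):

```python
# Illustrative answer choices for an age question. Half-open ranges
# guarantee a 35-year-old matches exactly one band, never two.
AGE_BANDS = [
    (18, 25, "18-24"),
    (25, 35, "25-34"),
    (35, 50, "35-49"),
    (50, 65, "50-64"),
    (65, 200, "65+"),
]

def age_choice(age: int) -> str:
    """Map an age to its single matching answer choice."""
    for low, high, label in AGE_BANDS:
        if low <= age < high:  # half-open: no value falls in two bands
            return label
    return "Other / prefer not to say"  # catch-all keeps the choices complete

print(age_choice(35))  # "35-49" — exactly one band, no waffling
```

The same half-open principle applies to income brackets, tenure, or any other numeric range question.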
It’s better to revise your questions before the survey than to look at survey results and realize you can’t understand user answers to your own questions.
Encourage engagement through empathy
If you want people to engage with your surveys, you need to put yourself in their shoes. When it’s time to deliver the survey, switch the focus from your goals and think more directly about what’s best for your participants. Empathy and respect for other people’s time will get you more engagement and better feedback.
As Patrick Campbell, CEO of Profitwell, said in his analysis after sending one million surveys, “You can’t promise to ‘value my opinion’ and ‘take my time seriously’ if you’re asking me 29 questions that don’t intuitively go together.”
Tailor the format and delivery of the survey to participants if you want them to help you.
Tell your participants how this benefits them
Help participants understand why this survey is important and what it’s trying to accomplish. When writing the email or pop-up message to enlist users, make sure the benefit to them is clear.
We’re not even talking incentives (yet). A $50 gift card will get you responses, but for results, you need people who care about the problem you’re trying to solve, or who use your software and want to see it improve. When introducing your survey, give context about how the survey responses will be used—and how the participant’s honest feedback will ultimately benefit their experience.
Caption: A good example from Contact Monkey explaining the benefit to participants. (Source)
Be honest about the time commitment involved, too. Nothing is more annoying than agreeing to take a “short survey” only to face dozens of open-ended or agree/disagree questions. Get as specific as you can about the time involved; “short” is relative and easy to distrust. Ask a couple of coworkers to go through the survey with a stopwatch to give you an estimate.
Tie your surveys to user interactions
For more accurate, thoughtful results, don’t hit people with surveys out of the blue. You’ll have better luck if you connect surveys to specific user experiences or actions. The fresher the interaction with you, the more likely a user is to respond. Recency will also improve the data you gather, since the experience will be top of mind.
Want to know how you can improve customer onboarding? Send new users a survey a week after they sign up for your app. Want to understand where your product falls short of expectations? Send a survey to users whose free trial just expired.
Microsurveys ask one to five questions, and are the ideal way to capture user feedback based on interactions and experiences. Instead of long surveys targeting a larger experience, microsurveys are embedded within your website or software. This captures more immediate and accurate feedback, and provides you and your participants with valuable context.
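The interaction-triggered pattern above can be sketched in a few lines. Everything here—the event names, survey identifiers, and delays—is hypothetical, not any real product’s API:

```python
from datetime import datetime, timedelta

# Hypothetical trigger table: each user event maps to the survey it
# should launch and how long after the event to wait before asking.
TRIGGERS = {
    "signed_up": ("onboarding_survey", timedelta(days=7)),
    "trial_expired": ("expectations_survey", timedelta(days=0)),
}

def survey_due(event: str, event_time: datetime, now: datetime):
    """Return the survey to show if the event's delay has elapsed, else None."""
    if event not in TRIGGERS:
        return None
    survey, delay = TRIGGERS[event]
    return survey if now >= event_time + delay else None

signup = datetime(2021, 3, 1)
print(survey_due("signed_up", signup, datetime(2021, 3, 8)))  # "onboarding_survey"
print(survey_due("signed_up", signup, datetime(2021, 3, 3)))  # None (too early)
```

Tying delivery to the event rather than a calendar blast is what keeps the experience fresh in the participant’s mind.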
(Still struggling with survey questions? UserLeap has you covered with a library of microsurvey templates to get you started.)
Consider offering incentives
The when, why, and how of offering survey incentives could be its own article, but the gist of it comes back to empathy for your user.
If you’re asking someone for a quick NPS (Net Promoter Score) rating, an incentive isn’t really necessary. If your survey is long and in-depth, however, an incentive like a gift card is a good way to make it worth the participant’s investment of time.
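For reference, NPS is the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6). A minimal Python sketch, with made-up sample scores:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 2 promoters and 2 detractors out of 6 responses cancel out:
print(nps([10, 9, 8, 7, 6, 3]))  # 0
```

Because a single 0–10 question yields the whole metric, this is exactly the kind of lightweight ask that doesn’t need an incentive.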
Just remember—at the end of the day, an incentive is the cherry on top, not a tactic that will guarantee high-quality survey engagement.
Keep your user surveys people-focused
The goal of running surveys is to make your software better for users, and the survey experience itself should reflect that. The good news is that a people-focused survey isn’t just good for participants. By focusing on who you’re talking to and the unique insights they have to share, you will write better survey questions that will guide you to a better user experience.
UserLeap lets you run targeted, in-product microsurveys on everything from new features to churn with the analysis done for you—all by adding a simple code snippet to your product.