“Please complete the following 20-minute customer survey.”
If you saw that subject line in your email inbox, would you feel compelled to pop it open and start the survey? Would you pop it open at all? We’re guessing probably not.
There could be exceptions, of course. If you feel strongly about the company that sent the message, you might be willing to take that survey. You might even want to do it.
Maybe the survey is from Disney, and you just got home from a terrific Disney Cruise and want to share your positive feelings about it with the company. Or maybe it’s from a company you believe mistreated you recently, and you’ve got 20 minutes for them as well.
But the point is, you’d be making the decision to give your time to those surveys because of the companies behind them, not because of how they presented the surveys to you.
In fact, that subject line violates the two most important guidelines you’ll want to follow when asking users to take your surveys. Let’s quickly review those guidelines. Then we’ll dive into some best practices for crafting a killer customer survey itself.
There’s a reason you never see a TV commercial where the message is, “We spent a lot of money developing these products, so please buy them.” Advertisers know we don’t buy things to make the seller’s life better—we do it to make our own lives better.
Which is why you need to present your survey as a two-way proposition—a give and take. Your user gives you a few minutes (see guideline two below) of their time, and you give them something in return.
It doesn’t need to be a monetary something, although you can do that as well—with small gift cards, or putting every respondent’s name into a drawing for a prize. It can also be something intangible that your user perceives as valuable—such as the opportunity to help shape your product going forward.
People are busy. They don’t have a lot of 20-minute blocks of time to just give away to anyone who asks.
This is why you need to ruthlessly trim your surveys to include only the most important questions you want to ask. Make customer surveys as quick and easy as possible to complete.
Before we jump in, it’s worth pointing out how we would define a great customer survey. Contrary to popular opinion, it’s not the one that generates the most responses. No, a great survey is one that provides the actionable insights needed to make your product better for your target user.
Here are some best practices to help get you there.
“A great customer survey is one that provides the actionable insights needed to make your product better for your target user.”
When you think you might get your user’s attention for a survey, it can be tempting to include every question you’ve ever wanted to ask them. This is particularly true today, when great tools like SurveyMonkey and Typeform make it so easy to create and send out surveys.
But that strategy is counterproductive: the longer your survey, the fewer respondents will finish it—and the lower the quality of the answers you do get.
Instead, cut, cut, cut—and leave in only those questions you believe are necessary to getting to the actionable information you need.
When they’re creating their surveys, many product teams fall into the trap of relying entirely on questions formatted to generate responses from a pre-selected set of options.
For example, they’ll ask a respondent to rank something from 1 to 10, or to click a button to indicate they “Strongly Disagree,” “Disagree,” “Agree Somewhat,” etc.
Although these questions have their place, they can also produce misleading data—because one respondent’s “7 out of 10” means something different from another’s.
The other risk of filling your survey exclusively with click-a-button questions is that, let’s face it, after answering a few of these questions in the same way, your respondents are likely to just keep scrolling down the form and checking the same box—“10,” “Strongly agree”—because they’re seeing a pattern. If they like your product or service, they assume they’ll have a similar response to the next question, and they might not even fully read or process the question itself.
Instead, craft open-ended questions to elicit your users’ original thinking and specific experience with your product. A few of those responses—written as short narratives—are in many ways worth far more than a bunch of “Strongly agree” button clicks.
Let’s say you want to know whether your respondent found getting started with your product easy. If you simply ask them to select from a dropdown list, you’ll run into the same problem we described above: one respondent’s “easy” might be another’s “very difficult.”
But even if you do that, you can still delve deeper into each respondent’s thinking with a simple follow-up question: Why?
It’s the perfect time to include a “Why?” (with an open text field for a response), because your respondent is still thinking about your original question. When they clicked on, say, “Very Easy,” they had to think back to their setup experience. They briefly tapped into their memory of it—but your dropdown menu left them no room to elaborate.
So always ask “Why” as a follow-up. You’ll catch your respondents at just the right moment.
Even if they pop open your survey with the best of intentions, your respondents are probably going to get distracted at some point as they go through it. But nearly everyone can spare three minutes—so design your survey to fit within that window.
So let your respondent know beforehand (in your survey-request email message, for example) that this is all you’re asking of them.
You should also include in the survey itself a prominent status bar or completion percentage number on the screen. Let your respondent see how far they’ve made it through and how much work remains. That’ll help create a more positive user experience around taking your survey.
If you’ve used one of those restaurant tabletop kiosks to pay your bill, you know how their surveys present screen after screen of questions—without ever telling you how many screens remain. You don’t know how much progress you’ve made until it’s actually over.
This can be very frustrating for a respondent who begins the survey in good faith but at some point—maybe after the fifth or sixth screen—starts to feel taken advantage of. And frustrating your users as they’re thinking about your product is a bad strategy. (As we stated earlier, the survey itself is part of your user experience.)
Don’t make this mistake. Be upfront with your survey respondent about how much of their time you’re asking for. And let them see progress at every stage of the survey.
Another issue we’ve found with those restaurant-kiosk survey questions is they create an obvious pattern in the sequence of questions. Did you enjoy this part of your experience (1 to 10)? How about that part (1 to 10)? Was this part great, too (1 to 10)?
When they sense this pattern, customers will just start clicking the 10 (or Strongly Agree) button as they scroll down your form. Hey, their cursor is already over on the right side; it’s just easier to keep answering, “Great!”
So we’d recommend mixing things up. Ask a question worded from a positive perspective (“Was this great?”) followed immediately by one from a negative perspective (“Anything you’d like us to change?”).
And don’t forget to include your follow-up, open-ended “Why?” right after each question where you’re looking for a deeper explanation.
Don’t overdo it by sending surveys to the same list of users or target personas over and over. Don’t let your surveys (and your company’s name) start to feel like a nuisance.
Allowing your respondents to opt out of certain questions can be an effective strategy. It can make the required questions feel more manageable and less cumbersome.
Don’t ask your survey respondents to “Please take this survey.” Instead, tell them this is “Your chance to influence the future of [your product].” Or, if you’re sending to a small enough list and you have the budget, get practical and tell them, “Give us 3 minutes for a Starbucks gift card.”