9 Customer Support Survey Practices for More Responses and Tastier Insights


Let’s stop bombarding customers with boring customer support surveys that make them want to hurl their computers off a tall building and never check email again.

Customer support isn’t the most glamorous touchpoint in the customer journey. But that doesn’t mean the post-transaction satisfaction surveys have to feel like the final exam for Business Statistics. (I fell asleep in that exam. Face on the desk. Drooled on my test. True story).

Write better surveys and you will be rewarded with higher response rates and deeper insights into what matters most to your customers. Instead of following a trail of bread crumbs through a dark forest, you can be slicing into a juicy filet mignon of customer insight.

Follow these 9 practices to spice up your customer support surveys, improve response rates, and deliver tastier insights.

1. Personalize It

“Dear Valued Customer” is the fastest way to disengage the customer and make them feel like a generic, nameless, meaningless number in your database.

You know your customer’s name, what their issue was, who they worked with, and what the outcome was. Include some of that information in your invitation to take the survey.

72% of consumers say they only engage with personalized marketing messages [1]. A survey is similar – you’re proactively reaching out to a customer and asking them to take an action.

Personalizing the survey invitation improves response rate and also helps the customer remember the interaction at the critical moment when they take the survey.

2. Ask about the customer, not yourself

The intent of your survey should be to learn about the customer’s experience, not your employee’s performance, company’s brand image, or anything else about you. It’s all about the customer.

This subtle but important difference affects the customer’s engagement and the quality of their responses.

“How would you rate Molly’s ability to solve your issue?” is a terrible question. It puts the customer in the uncomfortable position of reviewing Molly’s performance, not knowing if Molly will read it or if they will have to work with Molly again in the future.

Instead, ask “Please rate your satisfaction with the service Molly provided on this issue.” It’s a subtle difference, but it shifts the question to the customer’s feelings, not Molly’s performance.

3. Ace your CTA

When you ask a customer to take your survey, it’s a Call To Action (CTA). Through your words, you are attempting to influence a person to do something they may not normally take time to do.

CTAs like “Click here to take a survey” or “Help us out with your feedback” are not very persuasive. You’ll only get a handful of responses from customers who are either bored or angry.

Instead, answer the customer’s unspoken question, “What’s in it for me?” In other words, how will it benefit the customer to take the survey?

Be more specific than “we’re collecting customer feedback to improve our service.” Tell them how you’re improving, how the feedback is used, and when you’ll be following up.

Ace your CTA and your response rate will improve.

4. Ask the most important question first

It’s a fact of life that customers will abandon surveys. People get bored, busy, or distracted and never complete the survey.

Most survey platforms have the capability to detect abandonment and count partial survey responses. By asking the most important question first, you ensure that you get the most value from partial responses.

For many customer support organizations, “Did we resolve your issue?” should be the first question on the survey. However, if you only send surveys when the customer has agreed to close the issue, you can skip this question.

5. Keep it REALLY short

Aim for 3 questions: two multiple choice and one open-ended. Don’t exceed 5 questions total.

The shorter your survey, the higher your response rate will be.

You should also consider how often individual customers receive surveys. If they only get one survey a year, they will tolerate more questions than if they get 5 surveys per month.

An excellent 3 question customer support survey using the CSAT metric would look like this:

  1. Did we resolve your issue?
  2. Please rate your satisfaction with the service you received. (5-point Likert scale)
  3. What mattered most in your experience?

6. Follow the standard

If you want to measure customer effort, use the standard Customer Effort Score (CES) question. Don’t try to put your own spin on it, improve it, or alter it.

The same goes for Customer Satisfaction (CSAT) and Net Promoter Score (NPS). (My advice is not to use NPS for transactional support surveys – it’s a relationship metric).

There are two reasons why you should follow the standard questions:

  1. The standard questions are unbiased and easy to understand. Their designers spent considerable thought, time, and effort testing and refining each question for its metric.
  2. If you ask the standard questions, you can benchmark against other companies that also ask the standard questions. You can license or purchase benchmark data from third parties and compare your scores to the competition. But only if you ask the same question.

7. Collect customer comments

Many organizations place emphasis on the numerical scores, and that’s important for trending and benchmarking.

However, the tasty insights come from customer comments. The deepest insights lurk in customers’ own words about an experience. Through this raw, verbatim feedback you can pinpoint elusive customer pain points and moments of truth.

Ask at least one open-ended feedback question on your survey.

Through Natural Language Processing (NLP), modern survey platforms can analyze the sentiment – positive or negative – of these comments and also categorize the comments into themes or topics. Statistical analysis of these themes produces the tastiest morsels of customer insight.

For example, I built a linear regression model to understand which topics had the greatest impact on CES for a B2B SaaS company. A customer’s mention of an unresolved or repeat issue had an average impact of -3.6 points on CES on a 5-point scale. Surprisingly, the mention of an agent’s helpfulness – positive or negative – had zero impact on the customer’s effort.
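The core of that analysis can be sketched simply. Once an NLP pipeline has tagged each comment with themes, a binary theme flag regressed against CES yields the theme’s average impact. For a 0/1 predictor, the univariate OLS slope is just the difference in mean CES between comments that mention the theme and those that don’t. The data and names below are illustrative, not the actual model:

```python
# Sketch: estimating a comment theme's average impact on CES.
# Theme flags would normally come from an NLP pipeline that tags
# each comment (e.g. "unresolved_issue"); data here is made up.

def theme_impact(rows, theme):
    """Univariate OLS slope of CES on a binary theme flag.

    For a 0/1 predictor this equals the difference in mean CES
    between responses that mention the theme and those that don't.
    """
    mentioned = [r["ces"] for r in rows if r["themes"].get(theme)]
    not_mentioned = [r["ces"] for r in rows if not r["themes"].get(theme)]
    if not mentioned or not not_mentioned:
        return 0.0  # theme never (or always) appears; no contrast to measure
    return sum(mentioned) / len(mentioned) - sum(not_mentioned) / len(not_mentioned)

# Hypothetical tagged survey responses (CES on a 5-point scale).
rows = [
    {"ces": 1, "themes": {"unresolved_issue": True}},
    {"ces": 2, "themes": {"unresolved_issue": True}},
    {"ces": 5, "themes": {"agent_helpful": True}},
    {"ces": 4, "themes": {}},
    {"ces": 5, "themes": {}},
]

print(round(theme_impact(rows, "unresolved_issue"), 2))  # negative: mentions hurt CES
```

A production model would fit all themes jointly in one multiple regression so correlated themes don’t double-count, but the per-theme contrast above is the intuition behind each coefficient.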

This kind of customer understanding isn’t possible with numerical scores alone. You need to understand the experience driving the number.

8. Eliminate response bias

Response bias is anything that causes survey respondents to give inaccurate results [2]. The way you write your survey can impact the results!

The most common cause of response bias is leading questions – survey questions written to elicit a favorable response from your customer. Words like “could” and “should” are not interchangeable – they may produce a 20% difference in agreement with a statement [3].

Leading language can be subtle:

To what extent do you agree with this statement: Company made it easy to resolve my issue.

This focus on agreement will produce a statistically more favorable response than:

To what extent do you agree or disagree with this statement: Company made it easy to resolve my issue.

If you want unbiased results, don’t ask leading questions.

9. Respect the inbox

Customers are bombarded with survey requests every day from the hundreds of companies they interact with. This over-saturation of survey requests is leading to survey fatigue and survey blindness. The long-term effectiveness of surveys depends on our cumulative respect for the customer’s inbox.

You don’t need to send a survey for every customer support transaction, especially in a B2B context where you have a small population of customers with regular customer support interactions.

Consider only sending surveys on 25% of your support transactions, and use the settings in your survey platform to prevent sending multiple surveys to an individual within a reasonable time frame.
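The sampling-plus-cooldown logic above is simple to implement. This is a minimal sketch, assuming a 25% sample rate and a 30-day per-customer cooldown; the function name, storage, and parameters are illustrative, not any particular survey platform’s API:

```python
# Sketch: should we send a survey for this closed ticket?
# Assumes a 25% random sample and a 30-day per-customer cooldown.
import random
from datetime import datetime, timedelta

SAMPLE_RATE = 0.25
COOLDOWN = timedelta(days=30)

last_surveyed = {}  # customer_id -> datetime the last survey was sent

def should_survey(customer_id, now=None, rng=random.random):
    now = now or datetime.utcnow()
    last = last_surveyed.get(customer_id)
    if last is not None and now - last < COOLDOWN:
        return False  # respect the inbox: surveyed too recently
    if rng() >= SAMPLE_RATE:
        return False  # transaction not selected in the 25% sample
    last_surveyed[customer_id] = now
    return True
```

Checking the cooldown before the random draw means a recently surveyed customer is never even entered into the sample, which keeps the effective per-customer frequency bounded regardless of how many tickets they open.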

Start with an estimate based on knowledge of your business and your customers, then experiment to find the survey frequency that maximizes your response rate and insights.


Endnotes

  1. SmarterHQ – Privacy & Personalization Report
  2. Chattermill – Ultimate Guide to CX Surveys
  3. Qualtrics – 7 Tips for Writing Great Questions
