Surveys for Customer Satisfaction Feedback: Do you make these mistakes?

Because I’m in the industry, I, maybe like you, take every customer satisfaction feedback survey I can, to see what organisations are doing, what works, and what we should avoid doing.

Rather than keep this information to myself, I’ve decided to post the reviews here.  In this series, I’ll review a survey that I’ve received and deconstruct it.

For the most part I’ll delete the company name.  I don’t see any need to “name and shame” organisations who are at least trying to collect customer feedback.  Most often it is their advisors (research providers and related companies) that are letting them down.  These are the organisations that are paid to know the best practices but depressingly often don’t.

Background

I contacted our web-hosting company a few days ago on a small issue.

The service was for the most part prompt (although I spent a little too much time on hold), accurate, and good natured.  All in all I was happy.

Survey Good Points

  1. The survey was delivered the next day: good transactional survey practice is to get feedback from people soon after an interaction, so this was done just right.
  2. It was short: the survey itself was only four questions, plus a service recovery question.  The brevity was good; that the questions could have been better (see below) was not.

Could-improve points

The survey invite was not personalised

Rather than have my name at the top of the survey there was a generic: “Thank you for contacting…”  They know my email address, so a little first-name personalisation would not be that hard.

The sign-off came from the very personal “Customer Service – Leadership Team”

Hey, we all know that these emails are automated, but if you put a (real) person’s name at the bottom of the email, then staff and customers will feel much more strongly that the feedback will be used, i.e. the buck stops with the signatory.
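Both points above (a first name at the top, a real person at the bottom) are cheap to implement. Here is a minimal sketch, assuming a simple contact record and a string template; the field names and template wording are my own illustrations, not the company’s actual system:

```python
# Sketch of a personalised survey invite (illustrative assumptions only:
# the contact fields and template wording are hypothetical).
from string import Template

INVITE_TEMPLATE = Template(
    "Hi $first_name,\n\n"
    "Thank you for contacting us. We'd love your feedback on your "
    "recent support experience.\n\n"
    "Regards,\n"
    "$signatory"  # a real person's name, not "Leadership Team"
)

def render_invite(contact: dict, signatory: str) -> str:
    """Fill the invite with the customer's first name and a real signatory."""
    return INVITE_TEMPLATE.substitute(
        first_name=contact["first_name"], signatory=signatory
    )

print(render_invite({"first_name": "Adam", "email": "adam@example.com"},
                    "Jane Smith"))
```

The point is less the code than the design choice: the system already holds the first name and a responsible manager’s name, so the generic salutation and anonymous sign-off are a choice, not a constraint.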

Question “1.  As you interacted with us last month, how would you rate our Customer Service Support?”

Fine, but I would go with the Net Promoter Score question.

Question “2. How proficient was the consultant who assisted you?”: Well Above Standards to Well Below Standards

Hmmm… not really sure how to answer that.

Do you mean my standards or your standards?  Even the word “proficient” is difficult in this circumstance.

Question “3. Purpose of your request?”

This is one of my pet peeves with surveys: don’t ask what the customer has a right to expect that you already know.

The company obviously has the ability to extract my email details from their systems.  Why couldn’t they also extract the reason code for the call, as entered by the agent?

Not only is this a negative for the customer’s survey experience, but by dropping this question they could ask a more important one and still keep the survey short.
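The fix is a simple lookup, not a survey question. A minimal sketch, assuming the agent logs a reason code with each call (the code table and record fields here are hypothetical):

```python
# Sketch: pre-fill "purpose of your request" from the reason code the
# agent logged, instead of asking the customer. The code table and
# record fields are illustrative assumptions.
REASON_CODES = {
    "BIL": "Billing enquiry",
    "TEC": "Technical issue",
    "ACC": "Account change",
}

def prefill_purpose(call_record: dict) -> str:
    """Return the survey's pre-filled purpose from the agent's reason code."""
    return REASON_CODES.get(call_record.get("reason_code"), "Other")

call_record = {"email": "adam@example.com", "reason_code": "TEC"}
print(prefill_purpose(call_record))  # "Technical issue"
```

The same join that pulled the customer’s email address into the invite can pull the reason code into the survey record, freeing that question slot for something the company genuinely doesn’t know.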

Question “4. Please select one of the following to provide feedback on the highest priority area for us to improve our service delivery”

Customers have great difficulty in objectively determining what is important in the customer experience.  Asking them in a simple “select one” question is almost never reliable.  Plus you’ve used up another question spot that you could get more value from.

Question 5 “If you feel that your enquiry is unresolved with Company X, please provide us with your contact information, and one of our senior representatives will contact you to investigate and resolve.”

This is a good effort to start the service recovery process.

But again, why are you asking for my name, phone, email (hey, you just used that to contact me), etc., etc.?

I may be labouring the point but don’t ask what your customers have a right to expect you to know.  This is asking the customer to invest their time to do what you are too lazy to do.

What’s missing: any sort of qualitative feedback.

All of these quantitative questions are great and make for lots of cool charts.  They can tell you WHAT needs to change but they don’t help you to understand HOW to change.

Only qualitative questions (or lots of quantitative questions) can do that effectively.

My approach would be to replace questions 3 and 4 with “Please tell us the most important reason you gave us that score.”  Then at least you would have some powerful statements in the customer’s own words to drive change in the organisation.

Have you seen a good or bad example of a survey recently?  Contact me via the comment box below and we can deconstruct it for everyone else to learn from.

I've created a Business Leaders’ Practical Guide to Implementing Net Promoter.  Download it Here