Customer feedback is vital, and surveys are a valid way to collect it, but they are often poorly implemented. In this ongoing series I review surveys I’ve received to see what’s working and what’s not.
Today’s example comes from a supplier who shall remain nameless. I don’t name suppliers in these reviews because at least they are trying to collect data, even if they could be doing it better.
Mistake 1: Asking something too private or sensitive too soon
Here is the first question on the survey:
Hey, what about buying me a drink and letting me get comfortable before you dive right into the heavy stuff? For many privately owned businesses, asking about revenue is quite high on the “hey, I’d rather not tell you that” scale.
Don’t ask anything too sensitive in the first few questions. Let the respondent get to know the process and the survey a little bit before asking anything too personal.
Ask some easy questions to get things moving. Later in this survey they ask about social network preferences; that would have been a much better, less confrontational way to start.
Mistake 2: Not giving me an out
A common response to that first question would be: “Ahh, I’d rather not tell you that if that’s okay with you.”
Actually, it’s not okay: this is a mandatory question and there is no “n/a” option, so you have to answer it to move forward.
There are two problems here:
In this case the organisation was offering prizes as an incentive (another questionable idea), so perhaps they felt the customer owed them a response. Perhaps they even felt that some respondents would abuse the process, skipping the feedback just to get a chance at the prize.
These are not reasons to make questions mandatory. If you offer an incentive you have to assume some small level of abuse.
In this survey instrument there are 32 questions (plus sub-questions) and I estimate that 90% of them were mandatory. That is not respecting your customers. Plus, you now have the next problem.
No n/a option
On a mandatory question, not having an “n/a” option causes even more problems than usual: if a person really does not want to answer, you have given them no way out except to provide false feedback. So they will give any response just to move forward.
Now you are receiving feedback, but you have no idea whether it is real or fabricated. Congratulations: all the questions are filled in, but all the data is suspect. A lousy outcome.
Mistake 3: Asking me what you should already know
This is really a pet peeve of mine.
I think it is disrespectful to ask customers to spend their time providing information that you already have but are too lazy to look up yourself. Here is one of the final questions:
The survey invite was sent to me by email AND carried a tracking link, so they know who opened this form, yet they still ask for my email address. At the very least they could pre-fill it for me, on the off chance that I might want to change the contact address for the competition. They also have all of the other data in their system.
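Pre-filling is usually trivial: the same tracked invite link can carry the known address as a query parameter that the survey form reads back. A minimal sketch, assuming a hypothetical survey URL and an assumed `email` field name (real survey tools vary):

```python
from urllib.parse import urlencode

def survey_link(base_url, respondent_email):
    # "email" is an assumed merge-field name; check your survey tool's docs.
    return base_url + "?" + urlencode({"email": respondent_email})

# Each invite gets a personalised link, so the form opens pre-filled
# and the respondent only edits the address if it is wrong.
link = survey_link("https://example.com/survey", "jane@example.com")
print(link)  # https://example.com/survey?email=jane%40example.com
```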
Here is another example:
They have a set of server logs, so they could simply use those to determine who is using iPad devices, but instead they ask me.
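Pulling device type out of access logs is a one-liner, because the browser reports it in the user-agent string on every request. A sketch under made-up, Apache-style log lines (the real log format and fields are assumptions):

```python
import re

# Hypothetical excerpt of a web server access log, user-agent at the end.
log_lines = [
    '1.2.3.4 - - [10/Oct/2012] "GET / HTTP/1.1" 200 "Mozilla/5.0 (iPad; CPU OS 5_1)"',
    '5.6.7.8 - - [10/Oct/2012] "GET / HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 6.1)"',
]

# Count requests whose user-agent mentions iPad as a whole word.
ipad_hits = sum(1 for line in log_lines if re.search(r"\biPad\b", line))
print(f"{ipad_hits} of {len(log_lines)} requests came from an iPad")
```

In practice you would aggregate per account rather than per request, but the point stands: the data is already sitting in the logs.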
And one more:
Again, a quick linkage of my account email address and my usage of the system would tell them whether I am using a particular feature.
The “why not” part is valid in this case, but it’s a mandatory question (see Mistake 2 above). I would not be surprised to see lots of “because I don’t like it”, “I don’t want to” and “asdfg” responses to this question. Forcing people to respond does not predispose them to investing time in giving good-quality information.
Not only is this disrespectful of the customer’s time, it also reduces the amount of information you can collect and your response rate. By using questions for information you already know, you are either:
- Extending the survey and lowering response rates; or
- Reducing the number of really useful questions you can ask.
Either way it is reducing the effectiveness of your feedback approach.