Do you participate in customer service surveys? Sometimes I do, and sometimes I don’t. Frankly, I’m not sure they actually serve to improve customer service, but I do know completing them makes me feel better when I’m upset. Based on my experiences, here are some tips for obtaining better survey results, assuming that is your goal.
Don’t Over-Survey Customers: My first encounter with a customer service survey gone awry was with an equipment vendor. (I think they’re out of business now – might there be a connection?) I called them often, attempting to resolve several ongoing problems. Each call generated a follow-up customer service survey via fax, even when I called multiple times the same day, which irritated me all the more. Even so, I filled out each one, thinking I was doing my part to help them improve their processes and increase quality.
Protect the Identity of the Survey Respondents: I was fair in my evaluations of them, giving high marks when they had been earned and not so high ratings when warranted. Even so, I never gave them a below average grade. Imagine my surprise when I met their staff at a convention a few months later.
Over the course of the three-day event, no fewer than five people from that company, including the CEO, confronted me about my survey scores. They were all aware of the marks I had been giving them – and they were mad. It seems I was lowering the curve; each one claimed my “low” responses negatively affected them. I thought I was helping, but they didn’t share my perspective. I was making them angry, which certainly wasn’t improving my chances of receiving the help I needed when I called. After their repeated rebukes, I never filled out another one of their customer surveys.
Make Sure the Survey Is Evenly Distributed: Next is my Web hosting company. I don’t need to call them often – and when I do, they’re responsive and helpful, usually resolving my issue quickly and on the first call. Sometimes at the call’s completion, they ask if they can email me a customer survey. Over the years, I’ve called enough to realize they generally make this offer when the call went exceptionally well but not when it was difficult or lengthy. By picking whom to survey, they skew the results and garner only favorable feedback. Consequently, any conclusions they draw are meaningless.
Be Careful With Post Call Surveys: The next series of surveys were set in motion by my decision to change cell phone providers. My carrier had been acquired and there were ongoing quality issues with the new one; it was time to switch. First, I needed to do a usage analysis to ensure that the new plans we considered would cost what we anticipated.
Unfortunately, I couldn’t log into my account to download the call detail report; it was my first attempt since my account had been migrated and something wasn’t working. By the time I found a number someone would answer and worked my way to the right department, I was less than pleased. But after thirty minutes of effort, I had successfully logged in and downloaded the needed data. When given the opportunity to complete a post-call survey, I jumped at the chance so I could express my displeasure. On the question, “Would you recommend us?” I gave them a two on a ten-point scale.
Prohibit Survey Coaching: That night, with my analysis complete, my clan headed off to procure our new phones. As we left the store with product in hand, our accommodating salesman mentioned I would receive an automated customer service survey. He asked me to respond with a five on every question – doing so would verify I was pleased with his service.
Interestingly, when selecting cell phones three years prior, I received the same plea by a different rep, at a different store, from a different carrier. Is this a common industry practice?
Only Follow Up When Appropriate: The next day, before my old number was ported over to my new phone, I received a call from my old carrier on my old phone. Not recognizing the number, I didn’t answer it, and they left a voicemail message. The call was in response to my reluctance to recommend them, as noted on my survey.
Call Back in the Manner Requested: In the rep’s message, she asked that I call back with the best time and best number for them to reach me. I did, leaving my office number and asking for a return call on Monday. Instead, they called my cell number on Sunday, irritating me further by ignoring the very information they had requested and I had provided. The agent made no attempt to win me back or leave the door open for my return, but he did condescendingly remark that had I gotten new phones from them, my quality issues would have been resolved.
Leave Understandable Messages: A week later I received a welcome call from my new carrier. It was not automated but from a live person. Curiously, she didn’t call my cell phone but my office number. I was out, and she left a message. The fast-talking agent spewed forth her message, a callback number, and an eleven-character identifier I was to provide on my return call. I played the message four times before I could catch all the digits.
Make Sure Your Phone Numbers Work: I called the number. It rang a couple of times, I received the typical “your call may be recorded” message, heard another ring, and then silence. Repeated calls produced the same result. A few days later, my attempt was greeted with an announcement: “At this time we are unable to answer your call; please try your call again later.” I’ll never know whether they wanted me to take another survey, but if I do get that chance, I’ll be sure to remind them how important it is to make sure their phones are working when they ask me to call – and that they actually answer the phone.
If you’re surveying your customers, what is your goal? Is your methodology supporting that goal or thwarting it?
Peter DeHaan is a magazine publisher by day and a writer by night. Visit peterdehaan.com to receive his newsletter, read his blog, or connect on social media.