An IVR survey is an automated customer feedback survey that lets customers use their phone's dial pad to input responses to your survey questions.
Now more than ever, call centers are striving to enhance the quality of service they provide their customers in an effort to increase customer satisfaction and loyalty. To accomplish this, they enhance call center agent training, implement company-wide customer-centric policies, monitor agents for quality assurance, analyze KPIs, and provide feedback based on this data. But the only way to really know whether all of these efforts are making an impact on your customers is to ask them!
One way to obtain direct feedback from customers is for agents to conduct post-call surveys. This can be helpful, but it is also time-consuming for the call center workforce and introduces response bias, as most customers hesitate to give agents negative feedback.
A better solution is to leverage interactive voice response (IVR) surveys to collect customer feedback. However, adopting this technology is only the first step in the right direction. To collect reliable and meaningful data, you must design the IVR survey with a few things in mind. Below are best practices for creating effective call center IVR surveys. They are compiled from CFI Group, a leader in customer satisfaction analysis, as well as standard practices for research design and methodology from the field of psychology.
Start your survey development by deciding on a specific topic. You can brainstorm areas of focus (e.g., customer satisfaction with customer service, customer satisfaction with product, brand awareness, etc.) and then narrow this down to as specific a topic as possible (e.g., the degree to which customers believed the issue they called about was resolved to their standards, their satisfaction with their interaction with the call center agent, their overall satisfaction with your company or product, etc.). A clear, concise topic will help you keep your survey focused and will yield survey results that can be interpreted unambiguously.
Once you have decided on your IVR survey topic, you should clearly define the key words in your topic so that you know exactly what you will be measuring. For example, if your IVR survey topic is “The degree to which customers believed their issue was resolved to their standards”, define what counts as an “issue”, what “resolved” means, and whose “standards” apply.
Clearly defining the key words in your survey topic will allow you to create questions that are specific to the topic and will enhance the validity of the survey.
Once you have chosen and clearly defined your topic, you'll want to ensure that the questions you develop relate specifically to it. For instance, if you are interested in measuring customer satisfaction with their interaction with the call center agent, it might not be helpful to ask how long they waited in the queue (unless you're interested in measuring whether their perception of wait time affected their satisfaction with the interaction). Stick to the survey topic, and this will enhance the interpretability of the data gleaned from your survey.
It is important that each survey item asks only one question. For example, “Was your agent knowledgeable and helpful?” is actually asking two questions at once. Perhaps the agent was knowledgeable but rude. How would the respondent answer? And based on their answer, how would you interpret their response? Avoid this common survey mistake by keeping your survey questions very precise.
Questions with complex words may not be understood by all survey respondents. If you are using a word that has more than seven letters, a simpler alternative is most likely available and more appropriate.
Asking customers whether they thought their customer service representative was incompetent may make them uneasy, and they may therefore soften their response. Avoid this by keeping your wording neutral.
When organizing your survey questions, make sure similar questions are presented back-to-back. This will build “cognitive ties” between related groups of questions and will be much easier for your respondents to complete.
Asking open-ended questions in IVR surveys can provide rich qualitative data, but interpreting this data can be time-consuming. As an alternative, try asking restricted questions with ordered alternatives. For instance, ask “How many times have you called us for the same issue this year?” with the options “Press zero for never, one for once, two for twice, three for three times, four for four times, five for more than four times, and six if you are unsure.” Because the answers are ordered from low frequency to high frequency, this will reduce respondent confusion and provide enriched data.
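The ordered-alternatives pattern above boils down to a fixed mapping from dial-pad digits to answer labels, plus validation so a stray keypress triggers a re-prompt instead of bad data. Here is a minimal Python sketch of that idea; the `ANSWER_OPTIONS` labels follow the example question, and the function name is illustrative, not from any particular IVR platform:

```python
# Ordered answer options for "How many times have you called us for the
# same issue this year?" -- digits 0-5 map to frequencies in ascending
# order, with 6 as an "unsure" escape option.
ANSWER_OPTIONS = {
    "0": "never",
    "1": "once",
    "2": "twice",
    "3": "three times",
    "4": "four times",
    "5": "more than four times",
    "6": "unsure",
}

def record_response(digit: str) -> str:
    """Validate a DTMF keypress and return the answer label it encodes.

    Raises ValueError for digits outside the defined option set so the
    IVR flow can re-prompt the caller instead of storing bad data.
    """
    if digit not in ANSWER_OPTIONS:
        raise ValueError(f"invalid keypress {digit!r}; re-prompt the caller")
    return ANSWER_OPTIONS[digit]
```

Keeping the mapping in one ordered structure also makes it trivial to tally responses later, since the keys sort in the same low-to-high order the caller heard.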
Many survey respondents tend not to pick answers at the extremes. As a result, if you have a five-point scale and your customers avoid the extremes, you really have a three-point scale, which limits response variability. Additionally, if your list of responses is too long, customers may become impatient or forget their options and choose answers randomly. For IVR-based surveys, ACSI recommends a sweet spot: a nine-point scale using the numbers one through nine. This eliminates the need for the customer to enter double digits while still offering enough options, optimizing the survey response set.
Once you have selected your optimal number of survey responses, remember not to restrict the range too much, as you might miss important data. For instance, customers who answer at the extremes might be your most unhappy customers. Allow them the opportunity to answer at the extremes so that your team can act on the results.
Labeling your scale with the right descriptors is essential to eliminating customer confusion. For example, a seven-point scale with the labels very weak, weak, slightly weak, neutral, slightly strong, strong, and very strong gives your customers enough options to choose an answer that accurately reflects how they feel without introducing confusion.
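In an IVR, those labeled descriptors ultimately become a spoken prompt that pairs each label with a single digit. A small Python sketch of that assembly step is below; the seven labels come from the example above, while the question text and function name are made up for illustration:

```python
# Ordered scale descriptors, from the example seven-point scale.
SCALE_LABELS = [
    "very weak", "weak", "slightly weak", "neutral",
    "slightly strong", "strong", "very strong",
]

def build_prompt(question: str, labels: list[str]) -> str:
    """Turn a question and an ordered label list into one IVR prompt.

    Options are numbered from 1 so callers never enter double digits,
    and each digit is read alongside its descriptor.
    """
    options = ", ".join(
        f"press {i} for {label}" for i, label in enumerate(labels, start=1)
    )
    return f"{question} To answer, {options}."

# Example (question text is hypothetical):
print(build_prompt("How strong was the agent's product knowledge?", SCALE_LABELS))
```

Generating the prompt from the label list keeps the spoken menu and the stored response codes in sync: reordering or relabeling the scale changes both in one place.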
With automated surveys, there is a greater chance of drop-off than with live surveys, so keep your survey short enough that respondents are likely to complete it. Post-call surveys between two and three minutes long collect enough information to be useful while still holding the respondent's interest. Aim for this sweet spot.
Well-constructed IVR surveys are easier to analyze and interpret and therefore offer more meaningful data. Follow these 12 best practices and you'll be well on your way to keeping your finger on the pulse of your customers and enhancing the quality of service you provide.