Contact Center Trends

11 best practices for creating effective call center IVR surveys

By Kevin McNulty


IVR surveys are among the quickest, easiest methods for collecting high-quality data about your customers’ experience. There’s no question that customer feedback is crucial for optimizing workflows in a contact center.

The question is how you can get that data, and how you can ensure it’s accurate, actionable, and unbiased.

Accuracy and bias come into play not only with the system you choose, but also with the way you word your questions and the options you give for responses. Actionable data, on the other hand, requires each person in your contact center to have access to customer feedback.

Your executive team will need to see those IVR responses just as much, if not more, than any individual agent. Fortunately, there are a few strong best practices you can follow to make sure you’re getting the right data.



What is an interactive voice response (IVR) survey?

First, a quick refresher on IVR surveys. IVR stands for interactive voice response; IVR surveys are automated phone surveys that allow customers to use their telephone keypads or voice to respond to survey questions.

An IVR system will most commonly conduct surveys post-call, as an alternative to agent-administered satisfaction surveys. The key benefit to post-call IVR surveys when compared with online surveys is timing. The interaction with the agent is still fresh in the customer’s mind, and there’s no chance the caller will have had a second interaction prior to the survey.

This means when you collect data, it’s crystal clear which interaction that data is referring to.



How contact centers are using IVR surveys.

To increase customer satisfaction and loyalty, modern contact centers enhance agent training, implement company-wide customer-centric policies, monitor agents, analyze KPIs, and provide personalized feedback.

There’s just one problem. Unless you ask your customers, there’s no way to know whether all these actions are making a difference.

Some contact centers try to collect information on the customer experience using agent-administered post-call voice surveys. This approach has two glaring problems. First, it’s time-consuming and takes away time agents could spend serving customers. Second, it introduces response bias. Most customers will hesitate to give agents negative feedback directly, making it difficult to collect accurate data.

IVR surveys are an optimal way to collect feedback on the customer experience since the system can neatly sidestep both problems.



11 tips for creating an effective IVR survey.

Our tips for creating an outstanding IVR survey are compiled from years of industry-leading experience helping contact centers track and improve upon KPIs. Our award-winning strategies for innovation are just as helpful in optimizing the use of our tools as they are in developing new applications.



1. Decide on a specific topic.

Start your IVR survey development by deciding on a specific topic. The topic should align with the KPIs you need to track most urgently.

Brainstorm areas of focus (e.g., customer satisfaction with customer service, customer satisfaction with the product, problem resolution, time on the phone, brand awareness, etc.). Then, get as granular as possible (e.g., the degree to which the issue they called about was resolved to customer standards, their satisfaction with the call center agent, their overall satisfaction, etc.).

Clear, concise topics will help you keep your survey focused on your customers, and they yield survey results that can be measured consistently.



2. Clearly define your survey keywords.

Once you have decided on your IVR survey topic, clearly define the keywords in your topic.

If your IVR survey topic is “The degree to which customers believed their issue was resolved to their standards”, define:

  • Customer—Is this anyone who spoke with an agent? Anyone who has purchased a product? Anyone who has inquired about a product?
  • Issue—Is this the main reason the customer called? Or what the agent determined the main issue to be?
  • Resolved—Does this mean a solution was offered? Or was a solution implemented?
  • Standards—Does this mean the customer’s perceptions of how the issue should be resolved? Is it compared to the way similar issues have been resolved in the past?

Clearly defined keywords in your survey topic allow you to create questions that are specific enough to provide valid survey data.



3. Develop questions that are relevant to the topic.

Next, ensure that the questions you develop relate specifically to the topic you’re analyzing.

One use of an IVR survey is to measure customer satisfaction during their interaction with a call center agent. In this case, it might not be helpful to ask how long they were waiting on hold.

That is unless you’re interested in measuring whether their perception of wait time had an impact on their level of satisfaction with the agent managing the interaction.

Skip logic can help you stick to the survey topic. Closely aligning with your topic enhances the interpretability of the data gleaned from your survey.



4. Make your questions precise.

Each IVR survey item must ask only one question.

For example, “Was your agent knowledgeable and helpful?” is actually asking two questions.

Perhaps the agent was knowledgeable and rude. How would the respondent answer? And based on their answer, how would you interpret their response?

If you interpreted the data incorrectly, how might this impact your customer experience? Avoid this common survey mistake by keeping your survey questions very precise.



5. Keep your wording simple.

Questions with complex words might not be understood by all IVR survey respondents. Even when your customer base is highly educated, complex words could cause respondents to disengage, as the survey suddenly feels “too difficult.”

As a general rule, stick to a 5th-grade reading level and/or only use words with fewer than six letters.



6. Avoid biased wording.

What if an IVR survey asked a customer if they thought their customer service representative was incompetent?

Such wording may cause the respondent to feel uneasy and they may therefore give a softer, more positive response.

On the other hand, what if your survey began with, “We’re sorry you had a negative experience”? This leading introduction may prompt the customer to give harsher, more negative feedback.

Avoid this by keeping your wording neutral and to the point.



7. Give your survey continuity.

When organizing IVR survey questions, present similar questions back-to-back.

This builds cognitive ties between related groups of questions. The questions then feel related and relevant, so completing the survey will be much easier for your respondents.



8. Develop restricted questions with ordered answers.

Asking open-ended questions in IVR surveys can provide rich qualitative data, but interpreting this data can be time-consuming.

Instead, ask restricted questions with ordered alternatives.

For instance, ask “How many times have you called us for the same issue this year?” with options such as, “Press or say zero for never, one for once, two for twice, three for three times, four for four times, five for more than four times, and six if you are unsure,” as ordered answers.

Ordered answers run from low frequency to high frequency, which reduces respondent confusion. They provide richer data than a simple yes/no, without the overhead of open-ended responses.
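
The ordered-answer pattern above can be sketched as a simple lookup from keypad digit to recorded answer. This is a minimal illustration, not a real IVR vendor API; the `ORDERED_ANSWERS` mapping and `parse_response` helper are hypothetical names.

```python
# Hypothetical sketch: mapping DTMF keypad digits to ordered answers for
# "How many times have you called us for the same issue this year?"
ORDERED_ANSWERS = {
    "0": "never",
    "1": "once",
    "2": "twice",
    "3": "three times",
    "4": "four times",
    "5": "more than four times",
    "6": "unsure",
}

def parse_response(digit: str) -> str:
    """Translate a keypad press into the recorded answer; flag anything else."""
    return ORDERED_ANSWERS.get(digit, "invalid")

print(parse_response("2"))  # twice
print(parse_response("9"))  # invalid
```

Because the options are ordered, the recorded answers can later be analyzed as an ordinal scale rather than free text.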



9. Use the right scale.

When you limit the questions on an IVR survey to “yes” or “no,” the responses you receive will always be limited. Sometimes, this is a good thing, but when it comes to learning exactly where and how you can improve, the limited feedback from a “yes”/“no” response can prevent you from gathering the data you need.

On the other hand, if your list of response options is too long, customers may become impatient and hang up, or forget their options and choose answers randomly. Typically, we recommend a five-point scale, which gives your caller specific options without overwhelming them.



10. Use the right labels.

Labeling your scale with the right descriptors is essential to eliminating customer confusion.

Generically asking customers to rate “on a scale of one to five”, for example, may cause overlap in the mid-range answers. Instead, label your scale, for example “Excellent,” “Very Good,” “Good,” “Fair,” and “Poor.”

This gives your caller valuable insight into how they can best answer the survey. People tend not to answer in the extremes, but when they do, you’re receiving a piece of strong feedback, especially when your survey is clearly labeled.
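
A labeled five-point scale like the one above can be generated from a single mapping, so every question in the survey uses the same descriptors. This is a hypothetical sketch; `SCALE_LABELS` and `prompt_text` are illustrative names, not part of any real IVR product.

```python
# Hypothetical sketch: one labeled five-point scale reused across questions,
# with 1 = best and 5 = worst, matching the labels suggested in the text.
SCALE_LABELS = {1: "Excellent", 2: "Very Good", 3: "Good", 4: "Fair", 5: "Poor"}

def prompt_text(question: str) -> str:
    """Build the spoken prompt: the question plus the labeled keypad options."""
    options = ", ".join(f"{digit} for {label}" for digit, label in SCALE_LABELS.items())
    return f"{question} Press {options}."

print(prompt_text("How would you rate your agent's helpfulness?"))
```

Keeping the labels in one place also ensures the scale reads identically across every question, which supports the continuity advice in tip 7.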



11. Keep your survey short.

Automated surveys of any length save agent time, but to keep the customer experience smooth, keep IVR surveys short. While the chance of a customer beginning an IVR survey is higher than with online surveys, the chance of drop-off mid-survey is also higher.

Post-call surveys that are 1-2 minutes in length collect enough information to be useful and still hold the respondent’s interest.



Start improving your customer service with a more impactful IVR survey.

Well-constructed IVR surveys offer meaningful data that’s easier than ever to analyze and interpret. Armed with these best practices, you’ll be well on your way to having your finger on the pulse of your customers and enhancing the quality of service you provide.

But increasing customer satisfaction doesn’t stop there.

Get access to a free demo to see how your contact center can leverage features like AI-powered agent assist, automated post-call workflows, automated reports, and more.


FAQs.

IVR systems are complex tools. Here are the answers to a few of the most common questions we get about IVR.



Can IVR surveys ask open-ended questions?

Yes. IVR surveys can ask open-ended questions, but you need to proceed with caution. Remember that respondents would be able to leave recorded answers to these open-ended questions, and that data is difficult to interpret. While AI tools can help, they require a significant amount of training to translate audio data into actionable insights.



What is skip logic?

Skip logic directs customers through the survey by automatically skipping questions that don’t apply, based on their earlier answers.

For instance, your survey might ask the following questions:

  • Have you called about this issue before?
  • How many calls did it take to resolve your issue?
  • Would you like us to reach out for more detailed feedback about your experience today?

Skip logic would allow the customer to bypass the second question if they answered “no” to the first. If they had never called about the issue before, asking how many calls it took to resolve the issue is no longer relevant.
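
The branching described above can be sketched as a small routing table: each question names the next question to play depending on the answer. The question IDs and the `next_question` helper are illustrative, not a real survey-builder API.

```python
# Hypothetical sketch of skip logic as a routing table. A "no" on q1
# skips q2 (how many calls it took), since that question no longer applies.
SURVEY = {
    "q1": {"text": "Have you called about this issue before?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "How many calls did it take to resolve your issue?",
           "next": {"default": "q3"}},
    "q3": {"text": "Would you like us to reach out for more detailed feedback?",
           "next": {"default": None}},  # None ends the survey
}

def next_question(current, answer):
    """Pick the next question ID from the answer, falling back to 'default'."""
    branches = SURVEY[current]["next"]
    return branches.get(answer, branches.get("default"))

print(next_question("q1", "no"))   # q3
print(next_question("q1", "yes"))  # q2
```

Visual flow builders express the same idea graphically; the table form just makes the skip explicit.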



What are the benefits of conducting a post-call IVR survey?

Post-call IVR survey benefits will depend on the quality of your IVR tool and the questions you ask. With Talkdesk, you can expect a multitude of benefits, such as:

  • Ease of execution. A visual flow builder lets you point, click, and publish any survey you design, with no coding or IT involvement.
  • Cost-effectiveness. Talkdesk Studio is a core component of Talkdesk CX Cloud and comes with every edition.
  • Higher response rate than online surveys. Since post-call IVR surveys happen as part of the initial customer service interaction, the customer is more likely to respond to at least part of the survey.
  • Immediate feedback. With automated, instant post-call surveys, you don’t have to wait for a customer to check their email. You can instantly see how each agent is performing on a call-to-call basis.
  • Accurate feedback. Since the caller’s experience is fresh in their mind, they’ll be able to provide the most accurate feedback possible.


Kevin McNulty

Kevin McNulty is a senior director of product marketing at Talkdesk. He has helped launch numerous enterprise SaaS products for some of the leading technology companies in Silicon Valley and Boston. He has written extensively on the impact of artificial intelligence, cloud computing, and digital transformation in the modern workplace, and he keenly understands the challenges businesses face when updating their legacy systems. Prior to Talkdesk, Kevin headed up go-to-market strategies for Everbridge and Veeva Systems.