My father always says, “It’s not what you say, but how you say it.” Indeed, there are other dimensions to understanding what people are saying beyond the words being used.
There are three key analytics capabilities that can extract valuable information from conversations beyond analyzing the words themselves – talk-over, silence, and emotion detection. All three can shed light on agent capabilities and customer intent.
Let’s drill down into each of these and see how they can help improve the customer experience.
He said, she said
“Talk over” is exactly what it sounds like – a situation where the customer and the agent talk at the same time, interrupt one another, and probably don’t listen to what’s being said. This can indicate some kind of argument, which can quickly lead to a poor customer experience or outright customer dissatisfaction. By tracking these conversations, organizations can gain useful insights into agent performance: Is it a one-off event or a consistent pattern? Does it happen during certain types of calls, or does this behavior occur across the organization? Persistent talk-over can indicate poor call-handling skills and highlight opportunities for additional training or coaching.
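As a rough illustration of the idea (not how any particular analytics product implements it), talk-over can be found by comparing the time intervals of diarized speaker turns and keeping the overlaps; the function name, the turn format, and the minimum-overlap threshold below are all illustrative assumptions:

```python
def detect_talk_over(turns, min_overlap=0.5):
    """Find intervals where two different speakers talk at the same time.

    turns: list of (speaker, start_sec, end_sec) tuples, e.g. the output
           of a speaker-diarization step (format assumed for this sketch).
    min_overlap: ignore very brief overlaps, which are usually harmless
                 backchannels ("mm-hm", "right") rather than arguments.
    Returns a sorted list of (start, end) overlap intervals.
    """
    overlaps = []
    for i, (spk_a, a_start, a_end) in enumerate(turns):
        for spk_b, b_start, b_end in turns[i + 1:]:
            if spk_a == spk_b:
                continue  # a speaker cannot talk over themselves
            start, end = max(a_start, b_start), min(a_end, b_end)
            if end - start >= min_overlap:
                overlaps.append((start, end))
    return sorted(overlaps)
```

For example, an agent turn from 0–5 s and a customer turn from 4–9 s overlap for one second, which the sketch would flag; a 0.2-second interjection would fall below the threshold and be ignored. Note that this works on timing alone, with no transcription needed.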
Sounds of silence
We can also learn about agent performance from situations where agent responses include long periods of silence. This may indicate a knowledge gap, especially when it occurs repeatedly during calls of the same type. Silence may also indicate that an agent is “muting” the phone, which can be hard to detect otherwise. The beauty of these two capabilities is that they are language- and speaker-independent.
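The same turn-timing data can flag long silences: a minimal sketch, assuming the same speaker-labeled turn intervals as before, with an illustrative threshold for what counts as a “long” gap:

```python
def long_silences(turns, min_gap=4.0):
    """Flag long silent gaps between consecutive turns in a call.

    turns: list of (speaker, start_sec, end_sec) tuples (assumed format).
    min_gap: minimum silence length, in seconds, worth flagging.
    Returns (speaker_before_gap, gap_start, gap_end) for each long gap,
    so analysts can see who went quiet and when.
    """
    ordered = sorted(turns, key=lambda t: t[1])  # sort by start time
    gaps = []
    for (spk_a, _, a_end), (_, b_start, _) in zip(ordered, ordered[1:]):
        if b_start - a_end >= min_gap:
            gaps.append((spk_a, a_end, b_start))
    return gaps
```

A five-second pause after a customer question, repeated across many calls about the same topic, is the kind of pattern that points to a knowledge gap.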
Some things aren’t said explicitly but can be implicitly understood – tone of voice can make all the difference. For example, a customer may make a genuinely positive comment, “You have great service,” or an ironic, angry statement, “Oh, you have GREAT service, alright!” The words are almost identical, but they certainly don’t mean the same thing. Here, you need sophisticated speech analytics tools to be able to differentiate.
Emotion detection is critical to truly understanding the nuances of the voice of the customer. The emotional state of a speaker can be identified through characteristics such as pitch variations, energy, and patterns of stress and intonation. Interactions (spoken or written) in which customers exhibit high levels of emotion provide a good indication of their level of satisfaction. Based on this, organizations can take corrective action where needed, or simply use these cases as lessons in best practices. It’s important to always keep in mind that customers may not remember what they were told, but will always remember how you made them feel.
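To make the “energy” characteristic concrete, here is a deliberately simplistic sketch of one prosodic feature: per-frame loudness (RMS energy) and how much it varies across a call. Real emotion-detection engines combine many such features, including pitch and intonation; the function name, frame size, and sample format are assumptions for illustration only:

```python
import math

def energy_features(samples, frame_len=400):
    """Compute a crude prosodic feature pair from raw audio:
    mean frame energy (overall loudness) and its variance (how much
    loudness swings during the call - large swings often accompany
    heightened emotion).

    samples: list of floats in [-1, 1] (assumed mono PCM audio).
    frame_len: samples per analysis frame (e.g. 50 ms at 8 kHz).
    """
    energies = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        # RMS energy of this frame
        energies.append(math.sqrt(sum(s * s for s in frame) / frame_len))
    mean = sum(energies) / len(energies)
    var = sum((e - mean) ** 2 for e in energies) / len(energies)
    return {"energy_mean": mean, "energy_var": var}
```

A flat, calm delivery yields near-zero energy variance, while a call that swings between quiet and shouting yields a high variance – one small signal, among many, that emotion may be running high.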
Click here to download a full copy of the white paper ‘Why Combining Phonetics and Transcription Works Best’ to better understand how these capabilities work and how NICE Interaction Analytics can help your organization make the most of your speech technologies and engines.