November 13, 2020
In the era of advanced customer listening, tonal voice of customer sentiment analysis offers a lot less business value than it might suggest. Some might even call it primitive. So, why is focusing on the emotional tone of a customer conversation alone not going to give you the conversation insight you’re looking for? And why won’t tonal sentiment analysis tools and systems equip you to make business decisions or move key CX metrics in your organization? Let’s dig in and excavate the basics together, shall we?
You’ve seen them before in a product, support or customer service email or on a piece of content you’ve just read. Or even at a kiosk in the airport near security. (Those were the days, eh? But do any of you actually miss going through lengthy airport security lines this long into a global pandemic?)
What I’m speaking to are those customer communications and interactions where a business uses smiley faces, neutral faces or frowny faces to ask, “Was your experience good? Did it fall flat? Was it bad?” What if we told you that all you’re getting out of tonal sentiment analysis is a smiley or a frowny face?
“All you get out of tonal sentiment is a smiley or frowny face.” Matt Dixon, Ph.D., chief product and research officer, Tethr
Here’s the concern with most companies touting their customer sentiment analysis tool for scaling customer listening or improving customer satisfaction, service quality and customer experience: they fail to mention that “sentiment analysis is merely one of the first and easiest steps in analyzing customer experience,” explains Tom Shepherd, Ph.D., senior software engineer, machine learning and analytics at Tethr.
Tonal sentiment analysis alone seeks to answer the question: "Is the customer happy?" And the problem with stopping there, as Shepherd shares, is that “it can only provide the most rudimentary answer to that question by telling you whether the overall emotional content of the conversation was positive or negative, but it can’t tell you why.”
So, if tonal sentiment analysis translates to the most primitive customer voice data you can get about your customers’ experience, what’s better and why? Well, since we’re getting prehistoric, let’s imagine back to the bygone days of air travel. Your customer is just leaving the security line and they pass an airport kiosk asking them about their visit. They press a button for the smiley face or the frowny face on their way to their gate. Both faces are meant to tell the airline how their experience has been.
“That is the lowest form of experiential data,” says Matt Dixon, Ph.D., our chief product and research officer. As the airport or airline CX leader, you’ve probably just spent lots of money on a machine that tells you what customers are experiencing in a sort of caveman-speak: “Experience bad. Me no like.”
Imagine instead that you could listen in and have access to all the conversations that same person had along their entire customer journey through the airport. So now, you’re gathering data from their interactions with a greeter and check-in kiosk rep in the ticket line at bag check. You’re also collecting customer data on their interaction with the ticketing service agent and their chats about their experience with random strangers (other customers) heading to the same or different destinations in the security line. Finally, you’re collecting the VoC in their interaction with the TSA agent right before they go through the airport scanner.
“At Tethr, we choose to be the best in the world at syntax-based customer understanding.” Matt Dixon, Ph.D., chief product and research officer, Tethr
The above scenario describes the fundamental difference between understanding the how in your voice of customer data through tonal sentiment analysis and understanding the what and why of your VoC data by enriching it with syntax-based customer sentiment analysis. “Both are helpful, of course, but for business people syntax matters,” explains Dixon. “No company makes business decisions off of tonal sentiment analysis alone. They’re going to need to know what the customer said and why they said it.”
Why does a customer’s syntax matter more than just understanding a customer’s tone? Shepherd sums it up like this: “It's the difference between listening to a conversation in a language you don't understand versus one that you do. (Let’s say a caveperson is speaking to you, actually, in grunts and squeals.) You’re going to get the emotional gist, but you don't understand what's actually happening.”
Tonal offers customer sentiment detection, or a sense of overall sentiment, but it fails at more nuanced sentiment classification as well as emotion detection and emotion classification. And, while tonal can get at the basic emotional tone, “it doesn't cover the range of possible human emotions. And it doesn't address how emotions changed over the course of a customer service interaction or what led to the detected emotions,” explains Shepherd.
Tethr's approach to customer sentiment using syntax is more detailed and nuanced than crude, tone-based sentiment analysis systems. Shepherd expands, “we don't merely try to identify how positive or negative a conversation was; we try to identify specific emotions such as frustration, confusion or apprehension. Utilizing a semi-recursive, hybrid tensor network with a Markov field type surface structure that is fed by graph-based Markov classifier outputs, real-valued variables and raw lexical inputs, our machine learning-based approach to customer sentiment helps us identify where these emotions occur in the customer conversation and even what caused those emotions.”
So, why are these nuances so essential to have in your voice of customer dataset? They drive more in-depth insight into the customer interaction as a snapshot of an entire customer experience. And, they allow us to equip our customers with a sentiment analysis system offering more actionable insight. A conversation analysis system that drives a positive customer experience by informing you of not only how, but what and why.
Tethr’s syntax-based machine learning approach to customer sentiment covers these essential angles of the customer experience:
While tonal sentiment analysis can be useful as a general barometer for customer satisfaction — it can indicate whether things are generally good or generally bad — that’s where its customer insight ends. Shepherd goes on to explain that “we can evaluate tonal customer sentiment analysis easily using the presence of certain keywords which have positive or negative sentiment, or connotations, without the need to understand what’s being said fully.”
“If you see words like ‘terrible,’ ‘awful,’ ‘bad,’ ‘frustrated,’ or ‘angry’ in a conversation, you can be reasonably sure that things aren’t going well,” adds Shepherd. “When you’re analyzing raw customer conversation audio, certain changes in volume and pitch can clearly indicate that a person is experiencing negative emotions. But you don’t have to hear what someone is saying to know they are upset.”
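The keyword approach Shepherd describes can be sketched in a few lines. This is a deliberately crude illustration of the idea, not Tethr’s actual system; the lexicon and scoring scheme here are invented for the example.

```python
# Toy keyword-based tonal sentiment scorer. The lexicons and the
# count-based scoring are illustrative inventions, not a production system.
NEGATIVE = {"terrible", "awful", "bad", "frustrated", "angry"}
POSITIVE = {"great", "helpful", "happy", "perfect", "thanks"}

def tonal_sentiment(utterance: str) -> str:
    words = utterance.lower().replace(",", " ").replace(".", " ").split()
    # Net score: positive keyword hits minus negative keyword hits.
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(tonal_sentiment("This is terrible, I'm so frustrated"))  # negative
print(tonal_sentiment("Thanks, that was great"))               # positive
```

Note what this approach cannot do: it says nothing about which emotion is present, what caused it, or how it changed over the conversation, which is exactly the limitation the article goes on to describe.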
“Tonal sentiment analysis can be useful as a general barometer for customer satisfaction in that it can indicate whether things are generally good or bad. This, however, is often where the value of tonal customer sentiment analysis ends.” Tom Shepherd, Ph.D., senior software engineer, machine learning and analytics, Tethr
Similarly, positive sentiment, or happy customer emotions, can be easily detected from both keywords and audio attributes. Tonal customer sentiment analysis is useful because it can provide a simple indication of customer satisfaction or dissatisfaction, allowing companies to scrutinize specific customer calls or multiple customer interactions across channels more closely.
At Tethr, we look at tonal sentiment as the rudimentary analysis of positive versus negative mentions in a customer interaction. “When a certain product name or problem keyword occurs in the presence of good or bad keywords, it can provide a general indication of customer satisfaction with a product or problem. This, however, is often where the value of (basic tonal) sentiment analysis ends,” states Shepherd. To get to the bottom of what’s really happening still requires someone (like a quality assurance manager or call center floor supervisor) to review the customer calls and omnichannel customer interactions across live chat, support emails or any other service channel. Ultimately, confirms Shepherd, “customer sentiment analysis can tell you how a customer is feeling, in a limited way, but it can’t tell you what or why.”
“For comparison,” illustrates Shepherd, “traditional tonal sentiment analysis is like asking someone, ‘How was your day?’ with the expected answer of ‘good’ or ‘bad.’ What Tethr does is akin to ‘Tell me all about your day.’ Then Tethr provides constructive advice to ensure that the next day is better.”
At Tethr, we look at syntax-based sentiment analysis as using both the presence and the arrangement of words to identify customer sentiment. So, rather than stopping at simple positive-versus-negative ratings based on word connotations, we use “the lexical patterns in utterances to classify them as having a given emotional content,” says Shepherd.
In other words, we look at the complete pattern of words used in an utterance to determine what emotion is being expressed rather than looking at the emotional content of individual words. Both the presence and arrangement of all words determine sentiment.
If we don’t use sentiment ratings of individual words or groups of words to detect and calculate customer sentiment, how does our conversation analysis and intelligence platform do tonal and syntax-based sentiment analysis? We utilize semantic classification of utterances in a customer interaction (a customer call, chat or email message, for example), which we call a category. Sequences of these simple categories, related by conditions like inclusion, exclusion, co-occurrence, order, time, and part of the interaction, help determine what emotions are being expressed throughout the customer interaction, or across multiple interactions over days, weeks or months.
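To make the category-sequence idea concrete, here is a minimal sketch under assumptions of my own: each utterance is tagged with a category, and a rule fires when its inclusion, exclusion and order conditions are all met. The category names and the rule are hypothetical, not Tethr’s actual taxonomy.

```python
# Hypothetical category-sequence matcher. An interaction is a list of
# (speaker, category) pairs; a rule fires when the required categories
# appear in order and no excluded category appears anywhere.
from dataclasses import dataclass, field

@dataclass
class Rule:
    name: str
    must_include: list          # categories that must appear, in this order
    must_exclude: set = field(default_factory=set)  # categories that must be absent

def matches(rule: Rule, interaction: list) -> bool:
    cats = [cat for _speaker, cat in interaction]
    if rule.must_exclude & set(cats):
        return False
    pos = 0
    for needed in rule.must_include:       # check ordered inclusion
        try:
            pos = cats.index(needed, pos) + 1
        except ValueError:
            return False
    return True

frustration = Rule(
    name="escalating_frustration",
    must_include=["problem_statement", "repeat_request", "negative_emotion"],
    must_exclude={"apology_accepted"},
)

call = [
    ("customer", "problem_statement"),
    ("agent", "troubleshooting"),
    ("customer", "repeat_request"),
    ("customer", "negative_emotion"),
]
print(matches(frustration, call))  # True
```

Because the rule operates on sequences rather than isolated keywords, it can express things like “the customer repeated the request *after* stating the problem and *before* any resolution,” which a bag-of-keywords score cannot.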
Syntax-based sentiment analysis can also utilize sets of real-valued variables: things like the frequency of category occurrences within a customer call, the call’s duration, or the customer service agent’s talk time and silence time. Shepherd shares an example to illustrate how these real-valued variables work:
“For example, a sequence of short, terse utterances followed by extended silence can convey a great deal of emotion (emotion detection) without the use of specific verbiage (sentiment lexicon).
‘I'm... not... interested... (long pause) OK?’
‘I'm not interested, OK?’
Both have the same verbiage, but we can detect very different presentations and very different emotional intensities.”
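Shepherd’s two deliveries can be told apart with nothing but real-valued features. The sketch below, with invented thresholds, represents each utterance as its text plus the trailing silence in seconds and flags the terse, pause-heavy delivery:

```python
# Illustrative prosodic features only: utterances are (text, trailing_silence_sec)
# pairs. Short, clipped utterances plus long silences suggest higher emotional
# intensity than the same words delivered fluently. Thresholds are invented.
def intensity_features(utterances):
    words_per_utt = [len(text.split()) for text, _ in utterances]
    return {
        "avg_utterance_words": sum(words_per_utt) / len(words_per_utt),
        "max_silence_sec": max(silence for _, silence in utterances),
    }

def seems_tense(features, max_words=2.0, min_silence=2.0):
    return (features["avg_utterance_words"] <= max_words
            and features["max_silence_sec"] >= min_silence)

terse = [("I'm", 0.4), ("not", 0.5), ("interested", 3.0), ("OK?", 0.0)]
fluent = [("I'm not interested, OK?", 0.2)]

print(seems_tense(intensity_features(terse)))   # True
print(seems_tense(intensity_features(fluent)))  # False
```

The point of the example: both inputs contain the exact same words, so a sentiment lexicon scores them identically, while the pacing features separate them immediately.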
Tethr’s sentiment analysis model “detects sentiment (or, more accurately, emotional expressions like customer frustration, confusion and concern) based on a Markov classifier using a training set of full utterances,” details Shepherd.
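The general flavor of a Markov classifier over utterances can be shown with a toy: train one token-transition model per emotion on labeled utterances, then label new text by whichever model assigns it the higher likelihood. This is a simplified stand-in for illustration, not Tethr’s actual model, and the tiny training set is invented.

```python
# Toy first-order Markov text classifier: one token-transition count table
# per emotion label, with add-one smoothing at classification time.
from collections import defaultdict
import math

def train(utterances_by_label):
    models = {}
    for label, utts in utterances_by_label.items():
        counts = defaultdict(lambda: defaultdict(int))
        for utt in utts:
            tokens = ["<s>"] + utt.lower().split()
            for a, b in zip(tokens, tokens[1:]):
                counts[a][b] += 1        # count transition a -> b
        models[label] = counts
    return models

def log_likelihood(counts, utt, vocab_size=1000):
    tokens = ["<s>"] + utt.lower().split()
    ll = 0.0
    for a, b in zip(tokens, tokens[1:]):
        total = sum(counts[a].values())
        ll += math.log((counts[a][b] + 1) / (total + vocab_size))  # smoothed
    return ll

def classify(models, utt):
    return max(models, key=lambda label: log_likelihood(models[label], utt))

models = train({
    "frustration": ["this is not working", "i am not interested", "not working again"],
    "confusion":   ["i do not understand", "which button do i press", "what does this mean"],
})
print(classify(models, "it is still not working"))  # frustration
```

Because the model scores word-to-word transitions rather than individual word sentiment, “not working” contributes evidence as a pattern, which is the syntax-based distinction the article is drawing.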
When you want sentiment analysis insight that supports a broader customer conversation analysis strategy, it’s helpful to understand where you are on the spectrum of sentiment analysis.
On one side of this sentiment analysis spectrum, you’ve got tonal sentiment analysis. On the other side, you’ve got a type of automatic speech recognition (ASR) called LVASR, or large vocabulary automatic speech recognition, which is defined just like it sounds.
“Tonal sentiment analysis — especially coming from audio — lends itself to the caveman analogy,” adds Ted McKenna, SVP of Product, and ultimately leaves you with a lot of useless customer grunts. Meanwhile, on the other side of that spectrum are Tethr’s scoring capabilities drawn from large vocabulary automatic speech recognition, which work to exponentially increase the level of analytical sophistication you’re going to get out of the voice of your customer.
So you might be asking yourself where your company’s conversation analysis strategy falls on this spectrum. Well, let’s look at the types of sentiment analysis that fall between the two ends we’ve just covered to help you assess where you are today and where you want to evolve.
From primitive audio-only tonal sentiment analysis, the level of analytical sophistication increases by including phonetics. How? Automatic speech recognition begins with raw audio processing (via conversion to Fourier spectra, then cepstral coefficients) and progresses to natural speech sound (phoneme) recognition. Similarly, audio-only sentiment analysis builds from superficial audio characteristics like volume and pitch changes. Because the quality of our voices and the speech sounds we make change with our emotions, audio sentiment analysis then moves toward more complex speech sounds associated with specific human emotions.
For example, an angry person tends to have a more strained and gravelly sound to their voice than a calm person. The way we enunciate in our speech and the specific vowels and words we stress changes. Analyzing the voice of customer for these kinds of phonetic changes, using spectral features, gives us more information about the emotions contained in human speech.
And from phonetic analysis, you can evolve analytic sophistication on the spectrum using traditional text analytics, or sentiment keywords drawn from large vocabulary speech recognition. So just as basic phonemes (speech sounds) we mentioned above form words, and words convey higher meaning in human speech, so too can words express human emotion.
For example, words like ‘angry,’ ‘happy,’ ‘sad,’ ‘confused,’ express quite specific emotions and tell you more about the customer or agent’s emotional state than simple voice quality or phonetic analysis. However, words alone only express these emotions in very isolated and general ways as they don’t fully capture the meaning of what the customer or agent is expressing.
For more on the difference between traditional text analytics and how we approach scoring data on three levels using LVASR at Tethr, check out the limitations of text analytics.
But to significantly leap forward on the sentiment analysis spectrum (and evolve your conversation analysis from prehistoric to making history) without initiating a huge data science project in your org, you’ll need to consider layering other aspects of the conversation’s full transcript and historical data, as just two examples.
This is where Tethr’s built-in scoring capabilities drawn from large vocabulary automatic speech recognition (LVASR) come into play. By taking into account not just words, but phrases, sentences and conversational interactions with customers across channels, Tethr detects and derives much more detailed insight about the expressed emotions and what’s driving those emotions. Tethr’s conversation analysis model can isolate specific emotional reactions to particular exchanges during a conversation and determine which interactions lead to positive and negative experiences across a wide variety of emotions. Our model allows us to identify what customer service representative actions (or behaviors) will lead to the most desirable CX outcomes based on the conversational context.
For the speech, language and conversation analysis geeks out there who may be wondering more about the gory, geeky details of our unique sentiment and, more so Tethr’s conversation analysis model, stay tuned for more in our customer listening superpowers series.
Until then, check out seven of our customer conversation analysis features that will put you on the evolutionary path to becoming a conversational analysis pro.
Then ask us a question or reach out for a free demo today. Tonal customer sentiment analysis should be your first step, not your only step, in deepening your voice of customer data and evolving your voice of customer insight.