“By 2022, your personal device will know more about your emotional state than your own family,” says Annette Zimmermann, research vice president at Gartner, in the firm's latest study on Emotion AI.
Virtual assistants (VAs) like Alexa, Google Home, and Cortana are in millions of households, and tech giants are continually trying to make them "seem more human". Their latest feat is enabling these AI marvels to read our emotions.
This development is part of a larger effort called Emotion AI, also known as affective computing, in which everyday objects detect and respond to human emotion, creating more personalized experiences.
If you're wondering exactly how emotion-detecting voice interfaces can be used, here are a few use cases.
Add a "human quality" to virtual assistants
By incorporating emotion, virtual assistants can achieve two things: 1) speak more naturally by expressing varying moods of their own, and 2) detect your mood and modify their behavior accordingly.
Imagine this: after a long day at work, you greet your virtual assistant with noticeable exhaustion in your voice. Your assistant instantly recognizes it and proceeds to automatically dim the lights and switch on your favorite Netflix show.
In another scenario, you're feeling down and wistfully ask your virtual assistant to cancel a gift order. It hears you, cancels the order, then gently offers suggestions to cheer you up, like playing an upbeat playlist or ordering a tub of Ben & Jerry's.
Provide relationship advice
According to an article in The Independent, virtual assistants like Alexa and Google Home can use their "always listening" feature to analyze the communication between couples. They'll be able to detect red flags in the couple's speech patterns, like sadness or anger, and even interrupt an argument to suggest a peaceful resolution.
You can soon think of your virtual assistant as a live-in couples counselor, which could either save your relationship or earn your VA a one-way ticket out the window.
Improve the customer experience
Interacting with branded voice apps is already routine for many users. With Emotion AI, a virtual assistant can detect your emotion and modify its next response to improve the interaction (and save the brand).
For example, if you're using Alexa for banking and find yourself unable to make a transaction, Alexa can notice the frustration in your voice and offer a solution, like calling the support team for you.
Another example is if you're interacting with a voice-first shopping app and can't find the pants you've been searching for. The disappointment or moodiness in your voice can prompt the VA to offer you a discount code.
Improve road safety
Drivers routinely make rash decisions when they feel angry, aggressive, anxious, or drowsy. If a virtual assistant can detect these feelings and help manage them, it can improve road safety for everyone.
Monitor the health of patients
Our tone of voice changes during an anxiety attack, a depressive episode, or in the midst of a migraine. These acoustic nuances can prompt virtual assistants to take appropriate action.
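As a toy illustration of the idea, here is a minimal rule-based sketch of mapping prosodic features to a coarse mood label. The feature names and thresholds are invented for this example; production Emotion AI systems use trained classifiers over far richer acoustic features.

```python
# Illustrative only: real systems learn these decision boundaries from data.

def classify_mood(mean_pitch_hz, pitch_variance, speech_rate_wps):
    """Map a few simple prosodic features to a coarse mood label."""
    if pitch_variance > 900 and speech_rate_wps > 3.0:
        return "agitated"    # fast speech with highly varied pitch
    if mean_pitch_hz < 140 and speech_rate_wps < 1.5:
        return "low-energy"  # slow, flat, low-pitched speech
    return "neutral"

# A slow, flat, low-pitched utterance reads as low-energy:
print(classify_mood(mean_pitch_hz=120, pitch_variance=200, speech_rate_wps=1.2))
# → low-energy
```

An assistant could then route each label to an action, e.g. a "low-energy" result might trigger a wellness check-in or an alert to a caregiver.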
Coupled with wearables, emotion-detecting virtual assistants can help monitor a patient's mental, emotional, or physical health and alert the right people. This is an important feature for healthcare, which has long incorporated voice technology in clinics, ambulances, and even the homes of the elderly.
As Zimmermann states in the Gartner study, “We can expect technology and media giants to team up and enhance their capabilities in the next two years, and to offer tools that will change lives for the better.”
To keep learning about what's new in voice technology, follow VOICE on Twitter.