Behavioral Signals

Award-Winning Tech Enables Emotionally Intelligent Conversations with AI


Emotion underpins most human interaction. Even in the briefest conversation, you instinctively read the other person's facial expression, body language, tone of voice, and mood so you can adjust your approach. When someone is sad, you react sympathetically; when your friend is angry, you speak to them calmly; when your mother-in-law is angry, you leave the house immediately. (You get our point.)

This element of emotion is what differentiates an interaction with a fellow human from one with a machine. But that may not be true for much longer.

Decades of research have been dedicated to teaching machines how to read human emotions just as well as we can. What for? Javier Hernandez, a research scientist with the Affective Computing Group at the MIT Media Lab, explained:

“How can [a machine] effectively communicate information if it doesn’t know your emotional state, if it doesn’t know how you’re feeling, if it doesn’t know how you’re going to respond to specific content?”

The idea of bridging the communication gap between humans and machines with emotion AI is precisely what led Alex Potamianos and Shri Narayanan to create a company dedicated to achieving it. Not just for them, but for every developer, designer, or company looking to add the power of emotion to their own AI. 

Behavioral Signals

Behavioral Signals is squarely focused on developing "emotionally intelligent conversations with AI." Over a decade ago, in a lab at USC, the team took pioneering steps in a field known as Behavioral Signal Processing (BSP).

In short, BSP analyzes scores of human voices and turns emotional cues into machine-readable data, adding a sense of empathy and personality to seemingly mindless machines. It enables your car's voice assistant to suggest a different route when you sound frustrated with nearby drivers, or signals your kid's smart toy to coax them through the morning chaos before school.

In our email exchange, co-founder and CTO Alex Potamianos explained, "Without emotional intelligence, voice assistants are just there to perform tasks for you: play music, set alarms, control your lights, etc. With emotional intelligence, voice assistants become companions, teachers, healers, and even friends."

Behavioral Signals' award-winning and patented technology was then neatly packaged into the fast-evolving Oliver API, available to anyone hoping to add the nuance of emotion to their voice applications.

Oliver API

"Understanding, interpreting and regulating emotions is at the very heart of who we are as humans," Alex wrote in his email, hinting that AI should do the same if it's ever to move beyond the role of mechanical order-taker.

Fueled by years of voice data and built on deep learning, the Oliver API is an emotion- and behavior-understanding engine capable of transforming indifferent voice experiences into emotionally intelligent ones. With this API, voice designers and developers can finally get their hands on emotional and behavioral metrics to add a very human spark to their virtual assistants.

But don't let your mind settle on the picture of just an emotionally aware Alexa. We're talking about social robots for assisted living, smart toys that teach kids, customer service bots, and even the voice assistant in your car.

If you're keen to try it yourself, you can register for the Oliver API and get a complete emotional intelligence SDK for a range of languages. It even includes examples to help you get everything set up.
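To give a feel for what working with emotion metrics might look like, here is a minimal, purely hypothetical sketch. The response shape, field names (`emotions`, `label`, `confidence`), and values below are illustrative assumptions, not the actual Oliver API schema; consult the official documentation after registering for the real interface.

```python
def dominant_emotion(response):
    """Pick the highest-confidence emotion from a hypothetical
    per-utterance analysis result (shape assumed, not Oliver's real schema)."""
    emotions = response["emotions"]  # assumed: list of {"label", "confidence"}
    top = max(emotions, key=lambda e: e["confidence"])
    return top["label"], top["confidence"]


# Mocked response in the assumed shape -- not real API output.
mock_response = {
    "utterance_id": "u1",
    "emotions": [
        {"label": "neutral", "confidence": 0.21},
        {"label": "frustrated", "confidence": 0.65},
        {"label": "happy", "confidence": 0.14},
    ],
}

label, score = dominant_emotion(mock_response)
print(label, score)  # frustrated 0.65
```

An application could branch on `label` — for example, having a car assistant offer an alternate route when the dominant emotion comes back as "frustrated."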

"From call centers and business VAs to working with children on the spectrum and behavioral and couples therapy—the sky’s the limit."

Ask Alex at VOICE Summit

Alex Potamianos' upcoming talk, "Virtual Assistants are both marvelous and god awful," has already piqued the interest of many attendees. The talk will sift through the missing pieces of the voice user experience puzzle and describe what it should actually look like.

"You will walk away with a clearer picture of how to add emotional and social intelligence to your application and how the Oliver API from Behavioral Signals can help you get there," he noted.

This talk is open to everyone, of course, but it gives an extra inviting wink to voice-assistant developers, UI/UX experts, and voice professionals of all kinds.

To save your seat at this exceptional talk and dozens of workshops, panels, and even a VOICE Award ceremony, nab your ticket and we'll see you at VOICE this July. Alex and a very excited voice-first crowd await!

RSVP TO VOICE 2020