Artificial intelligence (AI) has come a step closer to understanding and interacting with humans, this time in the form of emotional intelligence.

Affective computing, a field that emerged in the mid-1990s, is a branch of AI focused on recognizing emotions expressed in speech, written text, and video. Out of affective computing grew a newer branch of AI, known as behavioral signal processing, which enables computers to understand human emotions and turn them into actionable insights.

Alex Potamianos, CEO of Behavioral Signals, has spearheaded new “speech-to-emotion” AI technology, which “[analyzes] emotions and behaviors in speech, things like if you’re happy, angry, sad.”

In a demo of the technology, the AI determined emotions, arousal, and speaking rate when listening to and analyzing a conversation on the podcast “Ask Gary Vee.”
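To give a rough sense of the kind of acoustic cues such a system can draw on, the sketch below estimates pitch, loudness, and a crude speaking-rate proxy from an audio file using the open-source librosa library. This is a generic illustration, not Behavioral Signals’ actual pipeline; the file name and parameter choices are hypothetical.

```python
# Minimal sketch: prosodic features often used as inputs to speech-emotion models.
# Raised pitch and energy tend to correlate with arousal; onsets per second
# serve as a rough speaking-rate estimate.
import numpy as np
import librosa

def prosodic_features(path="call_snippet.wav"):
    y, sr = librosa.load(path, sr=16000)

    # Fundamental frequency (pitch) track over the recording.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    mean_pitch = float(np.nanmean(f0))

    # Short-term energy (RMS) as a loudness cue.
    mean_energy = float(librosa.feature.rms(y=y).mean())

    # Crude speaking-rate proxy: acoustic onsets per second.
    onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
    duration = len(y) / sr
    speaking_rate = len(onsets) / duration if duration > 0 else 0.0

    return {
        "mean_pitch_hz": mean_pitch,
        "mean_energy": mean_energy,
        "onsets_per_sec": speaking_rate,
    }
```

Production systems go well beyond these hand-crafted cues, but features like these illustrate how arousal and speaking rate can be read directly from the audio signal rather than from a transcript.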

Potamianos says that much of the data behind Behavioral Signals’ AI research is shared by call centers, citing the example of identifying a customer’s emotional state during a call and correlating it with what is being said.

As for future applications of the software, Potamianos believes it could be useful in health care, where the AI could learn to detect the onset of certain mental illnesses, such as depression, by analyzing a person’s voice and speech.
