Artificial intelligence (AI) has taken another step toward analyzing and interacting with humans, this time in the form of emotional intelligence and understanding.
Affective computing, a field that emerged in the mid-1990s, is a form of AI that focuses on understanding emotions expressed in speech, written text, and video. From affective computing came a newer branch of AI, known as behavioral signal processing, which enables computers to recognize human emotions and turn them into actionable insights.
Alex Potamianos, CEO of Behavioral Signals, has spearheaded new “speech-to-emotion” AI technology, which “[analyzes] emotions and behaviors in speech, things like if you’re happy, angry, sad.”
In a demo of the technology, the AI identified emotions, arousal, and speaking rate as it listened to and analyzed a conversation on the podcast “Ask Gary Vee.”