'Emotion AI' may be the next trend for business software, and that could be problematic
As businesses experiment with embedding AI everywhere, one unexpected trend is companies turning to AI to help their many newfound bots better understand human emotion.
It’s an area called “emotion AI,” according to PitchBook’s new Enterprise SaaS Emerging Tech Research report, which predicts this tech is on the rise.
The reasoning goes something like this: if businesses deploy AI assistants to execs and employees, and make AI chatbots their front-line salespeople and customer service reps, how can an AI perform well if it doesn’t understand the difference between an angry “What do you mean by that?” and a confused “What do you mean by that?”
Emotion AI claims to be the more sophisticated sibling of sentiment analysis, the pre-AI tech that attempts to distill human emotion from text-based interactions, particularly on social media. Emotion AI is what you might call multimodal, employing sensors for visual, audio, and other inputs combined with machine learning and psychology to attempt to detect human emotion during an interaction.
Major AI cloud providers offer services that give developers access to emotion AI capabilities, such as Microsoft Azure Cognitive Services’ Emotion API or Amazon Web Services’ Rekognition service. (The latter has had its share of controversy over the years.)
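To make the developer-facing side of these services concrete, here is a minimal sketch of how an application might pull an emotion prediction out of the kind of response Amazon Rekognition’s DetectFaces operation returns. The response dict below is a trimmed, hypothetical example shaped like Rekognition’s documented output; no actual AWS call or credentials are involved, and the confidence numbers are made up for illustration.

```python
# Sketch: extracting the top emotion prediction from a Rekognition-style
# DetectFaces response. The sample_response dict mimics the documented
# response shape; no API call is made here.

def top_emotion(face_detail):
    """Return the (type, confidence) pair of the highest-confidence emotion,
    or None if the face detail carries no emotion predictions."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Hypothetical, trimmed response shaped like Rekognition's DetectFaces output.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 12.3},
                {"Type": "ANGRY", "Confidence": 81.5},
                {"Type": "CONFUSED", "Confidence": 6.2},
            ]
        }
    ]
}

for face in sample_response["FaceDetails"]:
    print(top_emotion(face))  # ('ANGRY', 81.5)
```

In a real integration, `sample_response` would instead be the return value of a `detect_faces` call made with the AWS SDK, with the image supplied by the application and emotion attributes requested explicitly.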
While emotion AI, even offered as a cloud service, isn’t new, the sudden rise of bots in the workforce gives it more of a future in the business world than it has ever had before, according to PitchBook.
“With the proliferation of AI assistants and fully automated human-machine interactions, emotion AI promises to enable more human-like interpretations and responses,” writes Derek Hernandez, PitchBook’s senior analyst for emerging technology, in the report.
“Cameras and microphones are integral parts of the hardware side of emotion AI. These can be on a laptop, phone, or individually located in a physical space. Additionally, wearable hardware will likely provide another avenue to employ emotion AI beyond these devices,” Hernandez tells TechCrunch. (So if that customer service chatbot asks for camera access, this may be why.)