Emotion could transform the way we experience artificial intelligence

Every time you talk to Alexa, Google Assistant, Bixby, Siri, or Cortana, they parse the words of your request but miss the emotional context behind them.

If these digital assistants hope to replace today's primarily touch-driven devices, they need to understand emotion.

Emotion, as defined by Wikipedia, is “any conscious experience characterized by intense mental activity and a certain degree of pleasure or displeasure.” One key reason smart assistants cannot gauge the mental state or tone behind your request is that they lack that conscious experience.

“AI – whether consumer-centric or business-centric – needs to be emotionally aware in the next five years,” said Ranjan Kumar, founder and CEO of Entropik Tech, a company working on Emotion AI. Amazon and Google are both working on adding an emotion layer to their digital assistants.

A user might ask Alexa, Amazon’s digital assistant, something angrily, but Alexa is unable to pick up on that emotional context.

Ranjan says Alexa and Google Assistant are not capable of understanding emotional context in their current form; his company aims to add that layer and calls it EmotionAI. How can emotion be added to AI in its current form? Emotion is integral to humans, and it can be detected through facial expressions, voice analysis, and neural responses.
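
Ranjan does not detail Entropik's methods, but a common approach to one of these signals, voice, is to extract acoustic features from speech and train an ordinary classifier on clips that human annotators have labeled with emotions. The sketch below illustrates that pipeline in Python; the synthetic tones, the labels, and the choice of MFCC features with an SVM are all illustrative assumptions, not Entropik's or Amazon's implementation.

```python
# A minimal sketch of voice-based emotion detection, one of the three
# signals mentioned above: summarize each audio clip as acoustic
# features (MFCCs), then train a standard classifier on labeled clips.
import numpy as np
import librosa
from sklearn.svm import SVC

def mfcc_features(audio: np.ndarray, sample_rate: int) -> np.ndarray:
    """Summarize a clip as the mean of its MFCC frames."""
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=13)
    return mfcc.mean(axis=1)  # one fixed-length vector per clip

# Stand-in data: two synthetic tones as placeholders for real
# recordings tagged by annotators (labels here are hypothetical).
sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
clips = [np.sin(2 * np.pi * f * t).astype(np.float32) for f in (220, 440)]
labels = ["neutral", "angry"]

X = np.stack([mfcc_features(c, sr) for c in clips])
clf = SVC().fit(X, labels)
print(clf.predict(X))  # a real system would predict on unseen speech
```

A production assistant would combine this kind of acoustic signal with the other channels mentioned, such as facial expressions, before deciding how to respond.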

“With EmotionAI, the interaction between an AI assistant and a human could be as good as one between two humans, contextually aware as well as emotionally aware,” says Ranjan Kumar.
