Software systems, including those that control robots, are beginning to be able to interpret the emotional nuances of human behaviour. These innovations may very well lead to machines that are emotionally intelligent enough to engage in human interactions on a deeper level. For the most part, these will not be like the AIs in Her or Ex Machina, seducing us into romance. Instead, their EQ will enable us to forget they are machines at all.

Human emotional intelligence is a balance of awareness of self and awareness of others. For machines, self-awareness is limited to their effect on our affect. Artificial intelligence, for example, raises its EQ by monitoring human behaviour and correctly interpreting our emotions as the context for its actions. Until the AIs “wake up”, this will suffice for most economic applications.

“Advertising is the low-hanging fruit,” says Nick Langeveld, CEO of emotional analytics software maker Affectiva. “It’s a great way to monetise and fund our research.” Affectiva captures human facial data and uses them to interpret emotional states. These kinds of insights are also extremely valuable to the $40bn global market research industry—but this is just the start.

Affectiva emerged out of the MIT Affective Computing Lab in 2009. Elliott Hedman, also an alumnus of that lab, now runs research consultancy mPath. He uses psycho-physiology—the study of how the mind and body interact—to detect user stress and help companies make their products more empathetic. The healthcare industry is a prime target for this approach, says Mr Hedman: “If patients have a better experience, they have better outcomes and are more likely to come back.”

Yuval Mor, CEO of Israeli software company BeyondVerbal, also sees healthcare and wellness as key applications for emotional AI. His company uses vocal intonation data to understand a speaker’s mood, attitude and personality. Because of how much speech people produce every day and how easy it is to capture, vocal analysis lends itself to continuous monitoring. “Being able to measure changes over time,” says Mr Mor, “is where a lot of the insights are coming from.”

The Internet of Things will provide many opportunities for these technologies. Mr Langeveld and Mr Mor concur that the multi-trillion-dollar auto industry is particularly interested in improving the driver’s emotional experience of its products—electronics and software are a big innovation focus for the industry. Household companion robots such as Jibo will ship this year, but the eventual effect of affective autos is expected to be more significant still.

By the time this happens, people will already be accustomed to simulated humans on screens or in immersive goggles. “I have visits from executives from Fortune 500 companies weekly,” says Jeremy Bailenson of Stanford’s Virtual Human Interaction Lab. “One of the most common questions I get involves AI—when will virtual reality characters feature true emotional intelligence?” These characters will not only sell, but also teach, coach and advise.

Artificial emotional intelligence, therefore, does not have to be as good as a human’s to be useful. It will mimic the appearance and timing of human emotion with “subtle micro-movements that make interactions feel real”, says Mr Bailenson. Voice, face, gestures and physiology are deeply complementary. Successful AI EQ could help weave them all together to help us cross the “uncanny valley” of human-machine communication.

This article is published in collaboration with GE Look Ahead. Publication does not imply endorsement of views by the World Economic Forum.


Author: Anthony Wing Kosner writes for GE Look Ahead.

Image: A humanoid robot from British company RoboThespian “blushes” during the opening ceremony of the Hanover technology fair Cebit. REUTERS/Wolfgang Rattay