Computers can now read your emotions. Here’s why that’s not as scary as it sounds
Computers are starting to build emotional intelligence. Image: REUTERS/Peter Nicholls
Emotions critically influence all aspects of our lives, from how we live, work, learn and play, to the decisions we make, big and small. Emotions drive how we communicate and connect with each other, and impact our health and well-being. Human emotional intelligence (EQ) is the ability to recognize not only our own emotions but also those of other people, and to use that awareness to guide our behaviour, adapt to different environments and achieve our goals. People with high EQ tend to be more successful professionally and personally: they are more likable and more persuasive, make more effective leaders, and generally live healthier, happier and even longer lives.
Today, our lives play out in a digital world. We are surrounded by hyper-connected systems, smart devices and advanced artificial intelligence (AI): lots of IQ, but no EQ. That’s a problem, especially as our interactions with technology become more conversational and relational; just look at how we use our mobile devices and intelligent agents such as Siri and Amazon’s Alexa. Technologies designed to interact with humans need emotional intelligence to be effective. Specifically, they need to be able to sense human emotions and then adapt their operation accordingly. My company, Affectiva, is on a mission to humanize technology with artificial emotional intelligence, or as I like to call it: Emotion AI.
What is Emotion AI? You might also know it as emotion recognition technology. Our Emotion AI unobtrusively measures facial expressions of emotion. Using just a standard webcam, our technology first identifies a human face, whether in real time, in an image or in a video. Computer vision algorithms then locate key landmarks on the face – for example the corners of your eyebrows, the tip of your nose, the corners of your mouth. Machine learning algorithms analyse the pixels in those regions to classify facial expressions, and combinations of these facial expressions are then mapped to emotions. Using deep learning approaches, we can now tune our algorithms very quickly for high performance and accuracy.
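To make that pipeline concrete, here is a minimal sketch in Python. It uses the open-source OpenCV library for the face-detection step; the landmark-and-expression classifier and the expression-to-emotion mapping are hypothetical stand-ins for illustration, not Affectiva’s actual models.

```python
import cv2

# Standard OpenCV Haar-cascade face detector (ships with opencv-python).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Hypothetical mapping from detected expressions to emotions, illustrating
# the step where combinations of facial expressions are mapped to emotions.
EXPRESSION_TO_EMOTION = {
    ("smile",): "joy",
    ("brow_furrow",): "anger",
    ("brow_raise", "jaw_drop"): "surprise",
}

def detect_expressions(face_region):
    """Placeholder for a trained model that would locate facial landmarks
    (eyebrow corners, nose tip, mouth corners) and classify the pixels
    around them into expressions. Stubbed out for illustration."""
    return ["smile"]

def classify_emotions(frame):
    """Detect faces in a frame and map each face's expressions to an emotion."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        expressions = detect_expressions(gray[y:y + h, x:x + w])
        emotion = EXPRESSION_TO_EMOTION.get(tuple(sorted(expressions)), "neutral")
        results.append({"box": (x, y, w, h), "emotion": emotion})
    return results

# Usage with a standard webcam, as described above.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(classify_emotions(frame))  # e.g. [{'box': (210, 95, 180, 180), 'emotion': 'joy'}]
cap.release()
```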
Emotion AI uses massive amounts of data. In fact, Affectiva has built the world’s largest emotion data repository, having analysed more than 5.2 million faces in 75 countries. That matters because people around the world don’t look the same, and they certainly express emotion differently as they go about their daily business “in the wild”. In all this data we are seeing very interesting aspects of human emotional behaviour.
For example, take the common belief that women are more expressive than men. Our data not only confirms it, but also shows that women smile more and that their smiles last longer. Then there are cultural differences: in the US women smile about 25% more than men, in France 40% more, but curiously in the UK we found no difference between the sexes. The Spanish are more expressive than Egyptians, yet we Egyptians show more positive emotion. We can also detect the polite smile seen in cultures such as Japan. In general, in more collectivist cultures people dampen their emotions in group settings but are very expressive when they are at home alone. In more individualistic cultures, such as those of North America and Europe, it’s the opposite: people are more expressive in group settings than when they are by themselves. Perhaps that reflects cultural norms that encourage the individual to stand out from the pack.
So how is Emotion AI being used?
Over 1,400 brands are using our technology to measure and analyse how consumers respond to digital content such as videos, ads and even TV shows. Emotion data helps media companies, brands and advertisers improve their advertising. Emotion AI is also being integrated into other technologies to make them emotion-aware. With our Software Development Kit (SDK), any developer can embed Emotion AI into the apps, games, devices and digital experiences they are building, so that these can sense human emotion and adapt. This is rapidly driving the use of Emotion AI across many different industries.
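To give a flavour of what embedding Emotion AI might look like from a developer’s point of view, here is a hypothetical sketch in Python. The EmotionSDK class, its analyze_frame method and the fixed scores it returns are all invented for this illustration and are not Affectiva’s actual SDK API; the point is the shape of the loop: sense, classify, adapt.

```python
class EmotionSDK:
    """Hypothetical stand-in for an embedded emotion-analysis engine."""

    def analyze_frame(self, frame):
        # A real SDK would run computer-vision models on the frame here;
        # fixed scores are returned purely for illustration.
        return {"joy": 0.10, "frustration": 0.80, "surprise": 0.05}

def adapt_experience(frame, sdk):
    """Pick the strongest detected emotion and decide how the app reacts."""
    scores = sdk.analyze_frame(frame)
    dominant = max(scores, key=scores.get)
    if dominant == "frustration":
        return "offer_help"          # e.g. a learning app simplifies the lesson
    if dominant == "joy":
        return "increase_challenge"  # e.g. a game raises the stakes
    return "no_change"

print(adapt_experience(frame=None, sdk=EmotionSDK()))  # -> "offer_help"
```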
Robots such as Mabu and Tega are using Emotion AI to understand the moods and expressions of the people they interact with. In education, Emotion AI can tell when a student is frustrated or bored – but what if the learning content could adapt in response? The Little Dragon learning app is among the first such adaptive apps, designed to help children learn language in a more interactive and engaging way.
Video games are designed to take us on an emotional journey, yet most do not change their gameplay based on the emotions of the player. The Nevermind game turns that on its head: this biofeedback thriller becomes more surreal and challenging as players show signs of distress – a feedback loop sketched below.

In healthcare, the impact of Emotion AI could be the most significant of all, from drug efficacy testing and telemedicine to research in depression, suicide prevention and autism. The team at Brain Power has built an autism program that is already changing the lives of families with children on the autism spectrum. There are many more examples in automotive, retail and even the legal industry where emotion recognition technology is in use.
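Returning to the Nevermind example, the mechanic described above is a biofeedback loop in which difficulty rises with the player’s distress. The small sketch below assumes a distress score between 0 and 1 and a linear scaling rule; both are illustrative assumptions, not the game’s actual mechanics.

```python
def scale_difficulty(base_difficulty: float, distress: float) -> float:
    """Raise difficulty as player distress (0.0-1.0) rises, capped at 1.0.
    The linear scaling rule is an illustrative assumption."""
    distress = max(0.0, min(1.0, distress))  # clamp the biofeedback signal
    return min(1.0, base_difficulty * (1.0 + distress))

print(scale_difficulty(0.5, 0.0))  # calm player stays at 0.5
print(scale_difficulty(0.5, 0.8))  # stressed player faces 0.9
```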
We recognize that emotions are private, and we always want to be transparent about how our technology works and how it’s being used. In these emotion-aware digital experiences, we want people to have the option to opt out or turn off the emotion-sensing capability. And if these solutions are compelling enough, people will choose to use them and get value from them.
I am often asked what the future holds for Emotion AI, and my answer is simple: it will be ubiquitous, ingrained in the technologies we use every day, running in the background, making our interactions with technology more personalized, relevant, authentic and interactive.