
This AI can predict your personality just by looking at your eyes

Studies have shown that specific eye movements are signifiers of certain personality traits. Image: REUTERS/Morris Mac Matzen

Emma Charlton
Senior Writer, Forum Agenda

Are you neurotic, agreeable, extraverted, or conscientious? New technology purports to know you better than you know yourself.

Machine learning is giving new meaning to the phrase “in the blink of an eye”: researchers at the University of South Australia and the University of Stuttgart are using eye movements to predict key personality traits.

The research underscores the close link between eye movements and personality and could support efforts to help robots interact more naturally with humans. Although the study was small, it highlights the importance of non-verbal communication and could aid the development of systems that recognize and interpret social signals.

“Thanks to the machine learning approach, we could automatically analyze a large set of eye movement characteristics and rank them by their importance for personality trait prediction,” the researchers wrote. “Going beyond characteristics investigated in earlier works, this approach also allowed us to identify new links between previously under-investigated eye movement characteristics and personality traits.”
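
The researchers' code is not reproduced here, but as a rough illustration of what "ranking eye movement characteristics by importance" can look like, the sketch below fits a classifier and sorts its importance scores. It is an assumption-laden example, not the study's pipeline: the random-forest model, the feature names and the placeholder data are all illustrative.

```python
# Minimal sketch (not the authors' code): ranking eye-movement features
# by their importance for predicting a single personality trait.
# Feature names and data are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

feature_names = [
    "mean_fixation_duration", "fixation_rate", "mean_saccade_amplitude",
    "saccade_rate", "mean_pupil_diameter", "pupil_diameter_std",
    "blink_rate",
]

# Placeholder data: 42 participants x 7 eye-movement features, with a
# binary label (e.g. above/below the median questionnaire score).
X = rng.normal(size=(42, len(feature_names)))
y = rng.integers(0, 2, size=42)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Rank features by the classifier's importance scores, largest first.
ranking = sorted(zip(model.feature_importances_, feature_names), reverse=True)
for importance, name in ranking:
    print(f"{name}: {importance:.3f}")
```

In a real analysis the feature matrix would hold measured eye-movement statistics for each participant, and the labels would be derived from their questionnaire scores.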

The project recorded the eye movements of 42 individuals with eye-tracking equipment from SensoMotoric Instruments and used machine learning to analyse the recordings. The resulting predictions were then cross-checked against well-established personality questionnaires.

Of the five key traits – openness, conscientiousness, extraversion, agreeableness and neuroticism – the technology was able to predict four better than chance: neuroticism, extraversion, agreeableness and conscientiousness.
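
As a hedged sketch of how such per-trait results might be checked, the snippet below cross-validates a classifier against questionnaire-derived labels for each of the five traits and compares mean accuracy with chance. The data, the labels and the choice of model are placeholders, not the study's actual pipeline.

```python
# Minimal sketch (assumptions, not the study's pipeline): checking which
# traits can be predicted better than chance by cross-validating a
# classifier against questionnaire-derived labels for each trait.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(42, 7))  # placeholder eye-movement features

traits = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

for trait in traits:
    # Placeholder labels; in the study these would come from the
    # participants' questionnaire scores (e.g. above/below the median).
    y = rng.integers(0, 2, size=42)
    scores = cross_val_score(
        RandomForestClassifier(n_estimators=200, random_state=0),
        X, y, cv=5)
    print(f"{trait}: mean accuracy {scores.mean():.2f} (chance ~0.50)")
```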

Tracking eye movements

The 42 participants were fitted with an eye tracker and given five Australian dollars and 10 minutes to make a purchase in a university campus shop. When they returned, they removed the eye tracker and filled in personality and curiosity questionnaires.

The findings were analyzed to show how trait-specific eye movements vary across activities. While the study used a small sample and the authors said the predictions aren’t yet accurate enough for practical applications, it does shed light on the close link between personality and eye movements. Pupil diameter, for example, was important for predicting neuroticism.
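
As an illustration of the kind of eye-movement statistics involved, the sketch below computes simple pupil-diameter features and a crude saccade count from raw eye-tracker samples. The sample format, column layout and threshold are hypothetical, not taken from the study.

```python
# Minimal sketch (illustrative, not the study's feature set): deriving a
# few eye-movement statistics, including pupil diameter, from raw samples.
import numpy as np

# Hypothetical recording: one row per sample as
# (timestamp_s, gaze_x_px, gaze_y_px, pupil_diameter_mm)
samples = np.array([
    [0.000, 512.0, 384.0, 3.1],
    [0.004, 514.0, 385.0, 3.2],
    [0.008, 900.0, 120.0, 3.4],   # large jump -> likely a saccade
    [0.012, 902.0, 118.0, 3.3],
])

pupil = samples[:, 3]
gaze = samples[:, 1:3]

# Pupil-diameter statistics (the article notes pupil diameter mattered
# for predicting neuroticism).
features = {
    "mean_pupil_diameter": float(pupil.mean()),
    "pupil_diameter_std": float(pupil.std()),
}

# Crude saccade detection: flag sample-to-sample gaze jumps above a
# pixel threshold (a real pipeline would use velocity-based detection).
jumps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
features["saccade_count"] = int((jumps > 50.0).sum())

print(features)
```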

The next step, they said, is the development of systems that can recognize and interpret human social signals.

This sort of research is likely to grow in importance as AI becomes ever more embedded in our lives. The technology already powers virtual assistants such as Apple’s Siri and Amazon’s Alexa.

Human-like robots

Making these services and other robots more human-like is a focus of much research, since machines can already perform many practical tasks. In China, hotels use robots to deliver goods to rooms or help customers who have lost their way, and in Shanghai you can have your coffee made by a robot.

Yet at the moment, humans interact with these robots via an app rather than face-to-face. Enabling them to converse intelligently with humans, learn from those interactions and respond with a computerized version of empathy could pave the way for many other uses.

“Such knowledge of human non-verbal behaviour might also be transferred to socially interactive robots, designed to exhibit human-like behaviour,” the researchers wrote. “These systems might ultimately interact with humans in a more natural and socially acceptable way, thereby becoming more efficient and flexible.”
