Should you believe your eyes? Not necessarily in virtual reality


In virtual reality, customers are more likely to base their product expectations on past experiences, new research suggests. Image: Unsplash/Lucrezia Carnelos

Neuroscience News
  • Virtual reality (VR) is advancing but there's still a lot to learn about how we process information in virtual environments.
  • Our perception in VR is more strongly influenced by expectations than by the visual information before our eyes, new research suggests.
  • But a deeper understanding of perceptual differences is needed before assuming that research outcomes from VR will generalize to the real world.

A new study by Western neuroscientists suggests that, unlike true reality, perception in virtual reality is more strongly influenced by our expectations than the visual information before our eyes.

The researchers point to the challenge of online shopping, where customers sometimes misestimate the size of a product based on their expectations, discovering, for example, that a sweater purchased online is indeed lovely but sized for a doll, not an adult.

This happens in part because the physical cues to size that are present when seeing an item in a store are typically eliminated when viewing photos online. Without seeing the physical object, customers base their expectations of familiar size on prior experience. Since most sweaters are sized for people, not dolls, the visual system assumes that an unfamiliar sweater is, too.

The advent of virtual reality offers new opportunities for applications like online shopping and also for research on visual perception. But researchers wanted to understand whether users of virtual reality perceive size as accurately as they do in the real world.

A research team, led by Canada Research Chair in Immersive Neuroscience Jody Culham, presented study participants with a variety of familiar objects like dice and sports balls in virtual reality and asked them to estimate the object sizes. The trick? Objects were presented not only at their typical ‘familiar’ sizes, but also at unusual sizes (e.g., die-sized Rubik’s cubes).

The researchers found that participants consistently perceived the virtual objects at the size they expected, rather than the actual presented size. This effect was much stronger in virtual reality than for real objects.

“While virtual reality is a useful research tool with many real-world applications, we cannot assume it is always an accurate proxy for the real world,” said Culham, a psychology professor and senior author on the study.

“It is promising to see advances in virtual reality and its applications, but there is still a lot we don’t understand about how we process information in virtual environments. If we need to rely heavily on past experiences to judge the size of objects in virtual reality, this suggests other visual cues to size may be less reliable than in the real world.”

The advent of virtual reality offers new opportunities for applications like online shopping. Image: Neuroscience News.

Yet, the results of this study also have some promising implications.

“If we know that familiar objects can serve as strong size cues in virtual reality, we can use this information to our advantage,” said Anna Rzepka, a former student in the Culham Lab and co-first author on the study.

“Think about viewing an item in a scene where accurate size perception is crucial, such as when removing a tumor using image-guided surgery. Adding other familiar objects to the virtual scene could improve perception of the tumor’s size and location, leading to better outcomes.”


Abstract

Familiar size affects perception differently in virtual reality and the real world

The promise of virtual reality (VR) as a tool for perceptual and cognitive research rests on the assumption that perception in virtual environments generalizes to the real world. Here, we conducted two experiments to compare size and distance perception between VR and physical reality (Maltz et al. 2021, J. Vis. 21, 1–18).

In experiment 1, we used VR to present dice and Rubik’s cubes at their typical sizes or reversed sizes at distances that maintained a constant visual angle. After viewing the stimuli binocularly (to provide vergence and disparity information) or monocularly, participants manually estimated perceived size and distance.
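For context, holding the visual angle constant while changing an object's size amounts to scaling the viewing distance in proportion to that size, since the angle subtended satisfies tan(θ/2) = size / (2 × distance). The following is a minimal sketch of that scaling rule, using hypothetical numbers rather than the study's actual stimulus parameters:

```python
import math

def distance_for_constant_visual_angle(reference_size_cm: float,
                                        reference_distance_cm: float,
                                        presented_size_cm: float) -> float:
    """Return the viewing distance at which a resized object subtends
    the same visual angle as the reference object.

    The visual angle theta satisfies tan(theta / 2) = size / (2 * distance),
    so keeping the ratio size / distance constant keeps theta constant.
    """
    theta = 2 * math.atan(reference_size_cm / (2 * reference_distance_cm))
    return presented_size_cm / (2 * math.tan(theta / 2))

# Hypothetical example: a 5.7 cm Rubik's cube viewed at 100 cm, shrunk to
# die size (1.6 cm), must be placed at roughly 28 cm to subtend the same
# visual angle (i.e., produce the same retinal image size).
print(round(distance_for_constant_visual_angle(5.7, 100.0, 1.6), 1))  # ~28.1
```

Because the retinal image is identical across such pairings, only the remaining cues, such as binocular disparity, vergence, and familiar size, can disambiguate true size and distance, which is what the binocular versus monocular manipulation tests.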

Unlike physical reality, where participants relied less on familiar size and more on presented size during binocular versus monocular viewing, in VR participants relied heavily on familiar size regardless of the availability of binocular cues.

In experiment 2, we demonstrated that the effects in VR generalized to other stimuli and to a higher-quality VR headset. These results suggest that the use of binocular cues and familiar size differs substantially between virtual and physical reality.

A deeper understanding of perceptual differences is necessary before assuming that research outcomes from VR will generalize to the real world.


