Nature and Biodiversity

How mixed reality is helping scientists study forests

New research shows how mixed reality can quantify relatively mature trees in the wild, one measure of forest health.

Image: Jeff Fitlow/Rice University

Mike Williams
Senior Media Relations Specialist, Rice University

  • New research shows how mixed reality can quantify relatively mature trees in the wild, one measure of forest health.
  • VegSense could be a low-cost alternative to traditional field measurements.
  • The software is open source and runs on the Microsoft HoloLens.

To measure vegetation in the wild, researchers set up a Microsoft HoloLens as a mixed reality sensor to feed their application called VegSense.

A proof-of-concept study by Rice University graduate student Daniel Gorczynski and bioscientist Lydia Beaudrot shows VegSense could be a suitable low-cost alternative to traditional field measurements.

Their study in Methods in Ecology and Evolution shows the hardware-software combination excels at quantifying relatively mature trees in the wild, which is one measure of a forest’s overall health.

Gorczynski came up with the idea to try HoloLens, commonly marketed as a productivity tool for manufacturing, health care, and education. He developed the open-source software for the device and notes that while the combination is less effective at picking up saplings and small branches, there’s ample room for improvement.

Gorczynski says he first encountered mixed reality sensing while an undergraduate at Vanderbilt University and recognized its potential for biological studies. “It seemed sort of like a natural fit,” he says. Gorczynski brought the idea to Beaudrot in 2019 shortly after his arrival at Rice.

The combination of stock hardware and custom software costs far less than the lidar ("light detection and ranging") systems most often used in three-dimensional field studies, says Gorczynski, who developed VegSense on a platform geared more toward 3D games and interactive experiences than hard science.

Field tests at Houston’s Memorial Park showed that at least for mature trees, the smaller solution is just as good. In their case study, VegSense easily detected 48 of 50 such trees in the target area, a circle about 30 feet in diameter that Gorczynski walked, looking up, down, and around to build the 3D database. (“Imagine an asterisk with a circle around it,” he says, describing the data-capture pattern.)
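The study doesn't publish VegSense's detection code, but the tree count it describes could in principle be recovered from a scan like this by slicing the captured 3D points at roughly breast height and clustering what remains in the horizontal plane; each cluster is a candidate trunk. The sketch below is purely illustrative: the function name `count_trunks`, the slice heights, and the grid-based clustering are assumptions, not the published algorithm.

```python
def count_trunks(points, slice_z=(1.2, 1.4), cell=0.5):
    """Count candidate trunks in a 3D scan (x, y, z in metres).

    Keeps only points in a horizontal slice around breast height,
    snaps them to a coarse XY grid, then treats each connected group
    of occupied cells as one trunk.
    """
    lo, hi = slice_z
    # Cells occupied by points inside the breast-height slice.
    cells = {(int(x // cell), int(y // cell))
             for x, y, z in points if lo <= z <= hi}
    # Flood-fill connected cells (8-neighbourhood) so that the many
    # mesh points on one trunk collapse into a single cluster.
    seen, clusters = set(), 0
    for start in cells:
        if start in seen:
            continue
        clusters += 1
        stack = [start]
        while stack:
            cx, cy = stack.pop()
            if (cx, cy) in seen:
                continue
            seen.add((cx, cy))
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in cells and nb not in seen:
                        stack.append(nb)
    return clusters
```

For example, two tight groups of points at breast height plus one ground-level point yield a count of two; the ground point is filtered out by the height slice.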

“For this study, we wanted to be really deliberate in trying to replicate more traditional understory vegetation structure measurements,” Gorczynski says. “We tried to get that level of detail.”

What he sees as he scans the environment is a holograph-like grid pattern that tracks the surfaces of vegetation. “What’s really cool about that is you can see what the scanner is picking up, but also the spots you missed,” Gorczynski says. “The idea is to get the mesh to cover as much of the vegetation as possible because that’s what gets you the best scan.”
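The goal of getting the mesh to "cover as much of the vegetation as possible" suggests a simple check an operator could run mid-scan: what fraction of the survey plot has at least one captured mesh vertex? This is a speculative sketch, not a VegSense feature; the function `coverage_fraction`, the bounds format, and the grid resolution are all invented here for illustration.

```python
def coverage_fraction(mesh_vertices, bounds, cell=0.25):
    """Rough scan-coverage estimate.

    Returns the fraction of grid cells inside `bounds`
    ((xmin, xmax), (ymin, ymax)) that contain at least one mesh
    vertex (x, y, z). A low value flags areas worth re-scanning.
    """
    (xmin, xmax), (ymin, ymax) = bounds
    nx = max(1, int((xmax - xmin) / cell))
    ny = max(1, int((ymax - ymin) / cell))
    # Grid cells touched by at least one vertex inside the plot.
    hit = {(int((x - xmin) // cell), int((y - ymin) // cell))
           for x, y, *_ in mesh_vertices
           if xmin <= x < xmax and ymin <= y < ymax}
    return len(hit) / (nx * ny)
```

On a 1 m by 1 m plot split into 0.5 m cells, two vertices landing in two of the four cells give a coverage of 0.5, signalling the other half of the plot still needs scanning.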

“The results were so nice that Dan quickly wrote it up for publication,” Beaudrot says, noting that Gorczynski expanded his validation of the gear during a subsequent field trip to Tanzania, the site of one of 15 tropical forests in a recent rainforest study by the group.

“This device can facilitate a lot of great ecological research, particularly because it’s so cost-effective,” she says. “Collecting vegetation information on the forest floor right now is really hard to do without a lot of manual labor, or a really expensive lidar system.”

“So this is a groundbreaking, cost-effective device,” Beaudrot says. “It’s not going to give you the same resolution data that lidar will, but this is just the first application. We hope making VegSense open-source to the ecological research community will spur all the potential ways it can be developed.”

Northrop Grumman, Conservation International, and Rice supported the research.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.




© 2024 World Economic Forum