Mental Health

The language you speak doesn't appear to change how your brain is organized


Study shows brain 'reads' sentences in the same way across languages. Image: REUTERS/Chris Helgren

Shilo Rea
Director of Media Relations, Dietrich College of Humanities & Social Sciences, Carnegie Mellon University
When the brain “reads” or decodes a sentence in English or Portuguese, its neural activation patterns are the same, researchers report.

Published in NeuroImage, the study is the first to show that different languages have similar neural signatures for describing events and scenes. Using a machine-learning algorithm, the research team learned the relationship between sentence meaning and brain activation patterns in English, then recognized sentence meaning from activation patterns in Portuguese.

The findings can be used to improve machine translation, brain decoding across languages, and, potentially, second language instruction.

“This tells us that, for the most part, the language we happen to learn to speak does not change the organization of the brain,” says Marcel Just, professor of psychology at Carnegie Mellon University.

Carnegie Mellon brain scan. Image: Carnegie Mellon

“Semantic information is represented in the same place in the brain and the same pattern of intensities for everyone. Knowing this means that brain to brain or brain to computer interfaces can probably be the same for speakers of all languages,” Just says.

For the study, 15 native Portuguese speakers—eight were bilingual in Portuguese and English—read 60 sentences in Portuguese while in a functional magnetic resonance imaging (fMRI) scanner. A computational model developed at Carnegie Mellon was able to predict which sentences the participants were reading in Portuguese, based only on activation patterns.

The computational model uses a set of 42 concept-level semantic features and six markers of the concepts’ roles in the sentence, such as agent or action, to identify brain activation patterns in English.
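The approach the article describes, predicting an activation pattern from a sentence's semantic features and then identifying sentences by matching observed patterns against those predictions, can be sketched with synthetic data. This is a minimal illustration, not the study's actual model: the voxel count, noise level, and random feature codings below are all made up, and only the 42 + 6 feature dimensionality comes from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sentences = 60   # sentences read in the scanner (per the article)
n_features = 48    # 42 semantic features + 6 thematic-role markers (per the article)
n_voxels = 200     # illustrative number of brain-image voxels (assumption)

# Hypothetical data: feature codings of the sentences and fMRI activation patterns
# generated from a hidden linear map plus noise.
X = rng.standard_normal((n_sentences, n_features))
true_map = rng.standard_normal((n_features, n_voxels))
Y = X @ true_map + 0.1 * rng.standard_normal((n_sentences, n_voxels))

# Fit a linear map from semantic features to activation (least squares) --
# an encoding model in the spirit of the one the article describes.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decode: for each observed activation pattern, pick the sentence whose
# predicted pattern is most similar under cosine similarity.
pred = X @ W
pred_unit = pred / np.linalg.norm(pred, axis=1, keepdims=True)
obs_unit = Y / np.linalg.norm(Y, axis=1, keepdims=True)
sims = pred_unit @ obs_unit.T          # rows: predictions, columns: observations
decoded = sims.argmax(axis=0)          # best-matching sentence per observation
accuracy = (decoded == np.arange(n_sentences)).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

With clean synthetic data the decoder is nearly perfect; the study's 67 percent figure reflects real fMRI noise and the harder cross-language setting.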

The model predicted which sentences were being read in Portuguese with 67 percent accuracy. The resulting brain images showed that the activation patterns for the 60 sentences occurred in the same brain locations, and at similar intensity levels, for both English and Portuguese sentences.

Additionally, the results revealed that the activation patterns could be grouped into four semantic categories, depending on the sentence's focus: people, places, actions, and feelings. The groupings were very similar across languages, reinforcing the conclusion that the organization of information in the brain is the same regardless of the language in which it is expressed.
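The four-way grouping can be illustrated with a toy similarity analysis: if each sentence's activation pattern is treated as a noisy copy of one of four category prototypes, patterns within a category are measurably more similar to each other than to patterns from other categories. Every number here (pattern length, noise level, sentences per category) is invented for illustration; this is not the study's analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
categories = ["people", "places", "actions", "feelings"]

# One hypothetical prototype activation pattern per semantic category.
prototypes = rng.standard_normal((4, 100))

# 15 hypothetical sentences per category: the prototype plus noise.
vectors = np.vstack([p + 0.5 * rng.standard_normal((15, 100)) for p in prototypes])
labels = np.repeat(np.arange(4), 15)

# Pairwise cosine similarities between all sentence patterns.
unit = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
sims = unit @ unit.T
np.fill_diagonal(sims, np.nan)         # exclude each pattern's self-similarity

same = labels[:, None] == labels[None, :]
within = np.nanmean(sims[same])        # mean similarity inside a category
across = np.nanmean(sims[~same])       # mean similarity between categories
print(f"within-category: {within:.2f}, across-category: {across:.2f}")
```

A clear within-versus-across gap is what lets patterns cluster into categories at all, which is the structure the researchers found mirrored across the two languages.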

“The cross-language prediction model captured the conceptual gist of the described event or state in the sentences, rather than depending on particular language idiosyncrasies. It demonstrated a meta-language prediction capability from neural signals across people, languages, and bilingual status,” says Ying Yang, a postdoctoral associate in psychology and first author of the study.

Additional coauthors are from Carnegie Mellon and the Federal University of Santa Catarina in Brazil.

The Office of the Director of National Intelligence and the Intelligence Advanced Research Projects Activity via the US Air Force Research Laboratory funded this research.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
