- The 2023 Trust Barometer highlights that trust in media is declining, with especially low trust in social media.
- Studies show how changes to the incentive structures on social media could help reduce the spread of misinformation.
- As trust in media is declining and giving way to echo chambers, it is becoming harder to solve problems collaboratively.
- Exploring novel approaches to reduce the spread of misinformation could strengthen action on global issues.
By the end of today, 2.4 billion pieces of content will have been shared on Facebook, more than 90 million photos will have been uploaded to Instagram, users will have uploaded over 700,000 hours of video content to YouTube and 490 million tweets will have been released into cyberspace. That’s around 340,000 tweets every minute.
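As a quick sanity check on those figures, the per-minute rate follows directly from the daily total (a back-of-the-envelope sketch; the daily counts are the article's, the calculation is ours):

```python
# Convert the quoted daily totals into per-minute rates.
MINUTES_PER_DAY = 24 * 60  # 1,440 minutes in a day

daily_tweets = 490_000_000
tweets_per_minute = daily_tweets / MINUTES_PER_DAY  # ~340,000

daily_facebook_shares = 2_400_000_000
facebook_shares_per_minute = daily_facebook_shares / MINUTES_PER_DAY  # ~1.7 million
```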
Why do we spend so much precious time every day sharing information and updates with others? There are probably many reasons, but a key one is that sharing insights and information with others is intrinsically rewarding. Brain imaging studies have shown that the brain’s reward centre activates strongly when we share what we know with others, and that is no different on social media platforms.
Do readers trust what they read?
The 2023 Trust Barometer highlights that the media is not trusted, with especially low trust in social media (trust in traditional media has declined from 66% to 58%; social media down from 44% to 39%). As a shared media environment has given way to echo chambers, it is becoming harder to solve problems collaboratively.
If we take a closer look across selected media markets, news from social media is most popular in South Africa, while social media is considered the least trustworthy source of general news and information in Europe and North America.
What is clear is that we live in an era where social media has a key role as a platform, despite consumers’ doubts and reservations.
How does the brain process information?
If we look more closely at readers’ behaviour, we discover a familiar pattern of confirmation bias: when people read new information, they are quick to accept data that fits their preconceived ideas, while any counterevidence is scrutinized with a much more critical eye.
As Tali Sharot, director of the Affective Brain Lab and Professor of Cognitive Neuroscience, shared with us, the brain tries to assess any piece of evidence in light of the knowledge it already stores.
“When a new piece of evidence doesn’t fit, it’s either ignored or substantially altered. So, perhaps instead of trying to break an existing belief, we can attempt to implant a new belief altogether and highlight the positive aspects of the information that we’re offering.”
There is more at play: social media platforms offer incentive structures – from “likes” to “shares” – that can help misinformation spread, because these social rewards are entirely dissociated from the truthfulness of the information.
In an age where the internet is the main source of news and information for many, audiences globally are at higher risk than ever of encountering and sharing fake news.
Every day, consumers read, watch, and share updates and posts, from their favourite celebrities to political candidates, often taking it for granted that their sources provide truthful and reliable information. As navigating the news media landscape is becoming more difficult, consumers have started actively avoiding the news altogether.
What if we could reduce the spread of misinformation?
A recent study tried exactly that. By adding reaction buttons that let users signal how trustworthy they found posts on social media sites, the team showed that small changes to the incentive structure reduced the spread of misinformation while keeping user engagement high.
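The intuition behind such interventions can be sketched as a toy model: if the expected social reward for sharing depends only on popularity, true and false posts are shared alike; if crowd trust feedback is folded into the reward, false posts become costlier to share. This is an illustrative sketch only – the functions, weights and thresholds below are hypothetical, not taken from the study:

```python
# Toy model of sharing incentives (all parameters hypothetical).

def expected_reward(popularity, is_true, trust_weight):
    """Expected social payoff for sharing a post.

    trust_weight = 0 models likes-only incentives;
    trust_weight > 0 adds a crowd trust signal assumed to
    reward true posts and penalize false ones on average.
    """
    trust_signal = 1.0 if is_true else -1.0
    return popularity + trust_weight * trust_signal

def share_rate(posts, trust_weight, threshold=1.0):
    """Fraction of posts shared when the reward beats a fixed threshold."""
    shared = [p for p in posts
              if expected_reward(p["popularity"], p["true"], trust_weight) > threshold]
    return len(shared) / len(posts)

# A small feed of equally popular false posts.
false_posts = [{"popularity": 1.5, "true": False} for _ in range(10)]

baseline = share_rate(false_posts, trust_weight=0.0)    # likes only: all shared
with_trust = share_rate(false_posts, trust_weight=1.0)  # trust feedback: none shared
```

Under these (hypothetical) numbers, every false post clears the sharing threshold when only popularity counts, and none do once trust feedback is weighed in – the qualitative effect the study reports, reduced to arithmetic.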
Crowdsourcing approaches to solving information challenges aren’t new. The ability of crowds to accumulate relatively high-quality knowledge has been clearly demonstrated in other online settings, such as Wikipedia.
Such efforts are currently happening alongside other targeted crowdsourced approaches to fact-checking. For example, Twitter’s Birdwatch (now Community Notes) was tested in 2021 as a community-based system that mobilizes users to write and rate notes on suspicious tweets. While recent studies suggest it is effective as a fact-checking method, they also revealed shortcomings: low consensus among raters and a clear focus on tweets from highly visible profiles.
Looking at the response of traditional media players, a newly formed team of around 60 highly specialized journalists at the BBC is taking on the challenge of mistrust in a new way through its new brand, BBC Verify. The ambition behind BBC Verify is to build audience trust by pulling back the curtain on how its journalists work and how they know that what they report is true and fact-based.
Exploring these and other measures to shift incentive structures on social media platforms could contribute positively to issues that are often targeted by misinformation, strengthening efforts towards collective action on key priorities such as climate action and pandemic preparedness.
License and Republishing
The views expressed in this article are those of the author alone and not the World Economic Forum.