The four key ways disinformation is spread online


The speed at which online disinformation can spread makes it uniquely nefarious. Image: Unsplash/camilo jimenez

Spencer Feingold
Digital Editor, World Economic Forum

  • Social media has allowed online disinformation to flourish.
  • False and out-of-context information is largely propagated by people looking to distort public opinion and advance particular agendas.
  • Disinformation is often advanced in four key ways, according to two experts.

Social media has ushered in an era of unprecedented connectivity. Yet it has also allowed disinformation and so-called fake news campaigns to proliferate and flourish.

Disinformation—which includes false and out-of-context information spread with the intent to deceive or mislead—is largely propagated by people looking to distort public opinion and advance particular agendas.

“Key is how they exploit the inherent openness and opaqueness of the content ecosystem,” explained Doowan Lee and Adean Mills Golub, two experts on disinformation analysis and the co-founders of Veracity Authentication Systems Technology (VAST).

Disinformation can be propagated by a host of online actors, including governments, state-backed entities, extremist groups and individuals. For example, the World Economic Forum recently reported how one anonymous anti-Semitic account on the image board 4chan sparked a misinformation campaign that targeted the Forum.

At VAST, Lee and Mills Golub monitor content from over 10 billion websites in 75 languages to track how content spreads online. Disinformation campaigns, according to Lee and Mills Golub, are propagated in four key ways:

  • Social engineering: Providing a framework to mischaracterize and manipulate events, incidents, issues and public discourse. Social engineering is often aimed at swaying public opinion in favor of a certain agenda.
  • Inauthentic amplification: Using trolls, spam bots, false identity accounts known as sock puppets, paid accounts and sensational influencers to increase the volume of malign content.
  • Micro-targeting: Exploiting targeting tools designed for ad placements and user engagements on social media platforms to identify and engage the most likely audiences that will share and amplify disinformation.
  • Harassment and abuse: Using a mobilized audience, fake accounts and trolls to obscure, marginalize and drown out journalists, opposing views and transparent content.

Examples of disinformation infecting online and offline discourse in recent years are plentiful

During the 2016 US presidential election, for instance, Twitter identified over 50,000 Russian-linked spam accounts that were spreading divisive content related to the election. Climate change denial, the Russian invasion of Ukraine and war in Syria are other issues that have been steeped in disinformation.

The COVID-19 pandemic has also been plagued by disinformation. In fact, the issue has been so severe that pandemic-related disinformation was dubbed an "infodemic." "There seems to be barely an area left untouched by disinformation in relation to the COVID-19 crisis," Guy Berger, a top UNESCO official and one of the United Nations' leading figures combating disinformation, said in 2020.

Experts note that one of the common facets of disinformation campaigns is discrediting authoritative voices

Ruth Ben-Ghiat, a professor of history at New York University who studies authoritarian leaders and propaganda, explained that purveyors of disinformation often attempt to cast doubt on elites and trustworthy sources by connecting them to "supposed conspiracies to control and harm the population" and by portraying them as corrupt "cabals" associated with lewdness.

“Anti-science and anti-'globalism' are related,” Ben-Ghiat said.

The best way to combat disinformation remains a complicated topic of debate. Yet experts largely agree that cooperation between the public, regulators and social media companies is necessary—and that curbing the spread of disinformation is crucial.

As Lee and Mills Golub note, "the more of the content ecosystem [disinformation campaigns] occupy, the more challenging for trusted organizations to compete for audiences."



© 2024 World Economic Forum