
What is information warfare and how pervasive is it?

Information warfare has flourished, via new forms of media and old. Image: REUTERS/Pilar Olivares

John Letzing
Digital Editor, World Economic Forum

This article is part of: World Economic Forum Annual Meeting

  • Using information distortion and suppression as weapons of war is nothing new.
  • But a wider range of options for waging information warfare means it's now more difficult to avoid.
  • The disinformation accompanying Russia's assault on Ukraine provides a stark example.

A confession: during the earliest days of Russia’s invasion of Ukraine, I’d steal an occasional glimpse at RT, the Russian broadcaster accused of peddling propaganda, before it was yanked from my cable package in Switzerland.

Those peeks at an inverted version of events were infuriating. They also gave me a firmer grip on the unsettling but important fact that not everyone has been watching the same war unfold.

In fact, distorted counter-programming has flourished on messaging apps like Telegram, social media, and elsewhere – touting an imaginary logic for Russia’s “special military operation,” advocating carnage, and smearing Ukrainian refugees.

Russia’s onslaught spawned just the latest example of combat by propaganda, which may be more pervasive than many people assume. Efforts to destabilize rivals with weaponized disinformation have exploited divisive issues like vaccines, race, and human rights. They’ve effectively stirred fear and anger, and impacted elections.

Western governments pressed social media firms to remove pro-Russia propaganda after the war in Ukraine began, and outlets like RT have been wiped from popular platforms. That may create an impression in much of the world that related disinformation is on the decline.

But in places like Latvia, where the Soviet era left behind a large Russian population, people living side by side are confronting each other after being served starkly different takes on reality.

Accounting for information warfare operations. Image: World Economic Forum

Information warfare has come a long way since the days of dropping leaflets from an airplane (though that particular tactic is still in use).

Now that well over half the global population is online, and social media services have been engineered to activate our worst instincts, opportunities to distort and deceive abound.

Information warfare: then and now

During the Cold War, Soviet “active measures” included manipulating the media – for example, by tampering with the making of an otherwise legitimate documentary in West Germany to aggravate tensions over the country's Nazi past. The US and its allies countered with their own unsavory efforts.

The contemporary version relies on the internet for broader, more tailored distribution. Russia’s onslaught in Ukraine has been called the world’s first TikTok war, because so much messaging both for and against it has been boiled down to short videos on the app.

On Twitter, bots typically used to target people on a broad range of issues, including COVID-19 vaccines, were shifted into a narrower effort to prop up the Russian invasion. The shift was dubbed a "bot holiday."

British troops fill shells with propaganda leaflets in January 1945. Image: Imperial War Museums/Public Domain

Companies with large online audiences have tried to restrict pro-Russia messaging, but its purveyors seem to be finding ways around that.

Twitter began publishing data related to "state-backed information operations" on its service nearly five years ago.

A few years after that, it detailed related policing efforts, like removing hundreds of Venezuelan accounts parroting official government narratives, and others attacking the Libyan government.

Soon after the invasion of Ukraine, Twitter started labeling certain tweets containing links to Russian state-backed media, alongside a prompt to “stay informed.” Within Russia, the government has simply blocked Twitter and other social media services.

While the flow of information may be warped in Russia, some people living outside of its restrictions nonetheless choose to opt in to pro-war propaganda – a group of Russian speakers that’s been called the “deceived generation.”

In a recent interview, Ukraine’s First Lady despaired of ever being able to successfully appeal to these people. “They don’t want to hear and see,” she said.

More reading on information warfare

For more context, here are links to further reading from the World Economic Forum's Strategic Intelligence platform:

  • It would be difficult to design a social platform more optimized for disinformation than WhatsApp, according to this piece, but the end-to-end encrypted service has recently showcased a feature that may help stem the flow. (NiemanLab)
  • Millions of Russians are blocked from accessing anything resembling free and independent news, according to this piece, but international news organizations are finding ways around this digital iron curtain. (The Atlantic)
  • This think tank created a website to provide a visual breakdown of state-linked information operations on social media that have been deployed to undermine NATO, Ukrainian leaders, and US domestic politics. (Australian Strategic Policy Institute)
  • An example of how information warfare works, according to this piece – tweets suggesting Ukrainians are faking civilian casualties featured a “corpse” moving in a body bag who was actually a climate protestor in Vienna. (ProPublica)
  • During the buildup to and early days of the invasion of Ukraine, Russian state-backed propaganda designed to build support for the effort performed surprisingly well on Google News, according to this analysis. (Brookings)
  • Russian misinformation seeks to confound, not convince, according to this piece – and is designed to create decision paralysis that leads to inaction. (Scientific American)
  • A lack of action against social media sites for enabling information warfare on science has led to measurable harm, according to this piece, which attributes hundreds of excess deaths per day in the US to vaccine disinformation. (STAT)

On the Strategic Intelligence platform, you can find feeds of expert analysis related to Social Media, Peace and Resilience and hundreds of additional topics. You’ll need to register to view.


