
Researchers want you to play this game to understand the risks of nuclear conflict

Researchers hope to learn more about how countries might act in nuclear warfare scenarios. Image: Lorenzo Vidali/Sandia National Laboratories

Kara Manke
Science Writer, News Office, Duke University

Researchers designed SIGNAL, an online strategy game, to explore how various weapons capabilities, such as low-yield, high-precision nuclear weapons, may affect the behavior of different actors in an escalating global conflict.

Starting May 15, SIGNAL will be available during open play windows—currently scheduled from 1 to 5 PM PDT every Wednesday and Thursday—for anyone to log on and play. The researchers may expand those times if there is enough interest.

“What we’re working toward is being able to better understand how different force structures, like what types of weapons you have in your arsenal, might change how people act in a crisis,” says Bethany Goldblum, a researcher in the nuclear engineering department at the University of California, Berkeley.

A demo of the online version of SIGNAL. Image: Lorenzo Vidali/Sandia National Laboratories

“The more we can understand that, the better we can inform policymakers on possible options for reducing the risk that those weapons pose to the world.”

On its surface, SIGNAL looks like many other military strategy board games: Each online player represents one of three hypothetical countries, and the goal of the game is to maintain territorial integrity while amassing more resources and infrastructure than your opponents. Players have the opportunity to “signal” their intent to take actions such as building civilian and military infrastructure or attacking an opponent with conventional, cyber, or nuclear weapons. Players can also negotiate trades and agreements with other players.
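For readers who like to see mechanics as data, here is a minimal sketch of how such a turn structure might be represented in Python. It is illustrative only, not SIGNAL's actual implementation; every class and field name below is an assumption drawn solely from the mechanics described in this article.

import enum
from dataclasses import dataclass, field

# Hypothetical action types mirroring the mechanics described above.
class ActionType(enum.Enum):
    BUILD_CIVILIAN = "build_civilian"
    BUILD_MILITARY = "build_military"
    ATTACK_CONVENTIONAL = "attack_conventional"
    ATTACK_CYBER = "attack_cyber"
    ATTACK_NUCLEAR = "attack_nuclear"
    NEGOTIATE = "negotiate"

@dataclass
class SignalledAction:
    """One player's signalled intent for a turn."""
    player: str                   # one of the three hypothetical countries
    action: ActionType
    target: str | None = None     # the opponent involved, if any

@dataclass
class GameState:
    """Per-country score components named in the article: territory,
    resources, and infrastructure, plus a log of every signalled move."""
    territory: dict[str, int] = field(default_factory=dict)
    resources: dict[str, int] = field(default_factory=dict)
    infrastructure: dict[str, int] = field(default_factory=dict)
    log: list[SignalledAction] = field(default_factory=list)

    def record(self, move: SignalledAction) -> None:
        # The accumulating log is the kind of behavioral trace researchers
        # could analyze after open play ends.
        self.log.append(move)

A real system would add turn resolution, win conditions, and persistence; the point of the sketch is only that each signalled move can be stored as a small, analyzable record.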

By tracking how players behave, the researchers hope to get a better understanding of how countries might react in times of conflict.

Military leaders and policymakers often explore these questions through “war games”—seminar-style discussions or tabletop exercises that examine tactical, operational, or strategic aspects of a simulated conflict scenario. But such war games are limited in that they reveal only how one specific set of people reacts to one specific set of circumstances, says Andrew Reddie, a PhD candidate in Berkeley’s political science department.

Social scientists may also use surveys to understand how a broader swath of the population would respond to a particular conflict, but this approach also has its drawbacks, Reddie says.

“If I give you a piece of paper and say, ‘Read this, and then write down a response,’ you have no skin in the game, as a research participant. You don’t win anything or lose anything on the basis of answering in any particular way, whereas, if I put you inside of a gaming environment, and I say, ‘Here are the win conditions,’ we are arguably more likely to get a true sense of how you would respond,” Reddie says.

“What’s cool about what we’re doing is we’re pioneering a new term, the experimental war game, which is to use this type of wargaming framework but for scientific inquiry.”

Goldblum and Reddie say they plan to collect gameplay through the end of the summer, at which point they will start analyzing data from the game (a hypothetical sketch of that kind of analysis appears after the quotes below). In the future, they may also add capabilities to examine how other emerging technologies, such as cyberwarfare, artificial intelligence, or drones, might affect conflict escalation.

“There is no reason why we couldn’t include missile defense capabilities, or artificial intelligence decision support, and then use the same kind of experimental wargaming framework to be able to explore the potential impact of those capabilities,” Reddie says.

“I think that’s what’s really interesting about this method—while we were looking at this specific research question, what we ended up doing was developing a method that can be applicable for a variety of different research questions. It opened up a lot of space for inquiry.”
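To make that kind of analysis concrete, here is a purely hypothetical sketch in Python. It assumes a CSV log named game_log.csv with "condition" and "action" columns and two assumed force-structure conditions; the researchers' actual data format and analysis methods have not been published.

import csv
from collections import Counter

def nuclear_use_rate(path: str, condition: str) -> float:
    """Fraction of logged actions that are nuclear attacks under one condition."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["condition"] != condition:
                continue
            counts["total"] += 1
            if row["action"] == "attack_nuclear":
                counts["nuclear"] += 1
    return counts["nuclear"] / counts["total"] if counts["total"] else 0.0

if __name__ == "__main__":
    # Hypothetical labels for two different arsenals being compared.
    for cond in ("low_yield_available", "high_yield_only"):
        print(cond, nuclear_use_rate("game_log.csv", cond))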

Sandia National Laboratories and Lawrence Livermore National Laboratory are partners in the effort, providing subject-matter expertise and student mentoring. Funding for the project came from a grant from the Carnegie Corporation of New York.


