Elon Musk leads 116 experts calling for a ban on autonomous weapons

The humanoid robot AILA (Artificial Intelligence Lightweight Android) operates a switchboard during a demonstration by the German Research Center for Artificial Intelligence at the CeBIT computer fair in Hanover on March 5, 2013. The biggest fair of its kind opened its doors to the public on March 5 and ran until March 9, 2013. REUTERS/Fabrizio Bensch

The campaign to stop 'killer robots' is calling on the United Nations for strict oversight of autonomous weapons. Image: REUTERS/Fabrizio Bensch

David Z. Morris
Technology Writer, Fortune

One hundred and sixteen roboticists and AI researchers, including SpaceX founder Elon Musk and Google DeepMind co-founder Mustafa Suleyman, have signed a letter to the United Nations calling for strict oversight of autonomous weapons, a.k.a. "killer robots." Though the letter itself is more circumspect, an accompanying press release says the group wants "a ban on their use internationally."

Other signatories of the letter include executives and founders from Denmark’s Universal Robotics, Canada’s Element AI, and France’s Aldebaran Robotics.

Image: Campaign to Stop Killer Robots

The letter describes the risks of robotic weaponry in dire terms, and says that the need for strong action is urgent. It is aimed at a group of UN officials considering adding robotic weapons to the UN’s Convention on Certain Conventional Weapons. Dating back to 1981, the Convention and parallel treaties currently restrict chemical weapons, blinding laser weapons, mines, and other weapons deemed to cause “unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.”

Robotic warriors could arguably reduce casualties among human soldiers – at least, those of the wealthiest and most advanced nations. But the risk to civilians is the headline concern of Musk and Suleyman’s group, who write that “these can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways."

The letter also warns that failure to act swiftly will lead to an “arms race” towards killer robots – but that’s arguably already underway. Autonomous weapons systems or precursor technologies are available or under development from firms including Raytheon, Dassault, MiG, and BAE Systems.

Element AI founder Yoshua Bengio had another intriguing warning – that weaponizing AI could actually “hurt the further development of AI’s good applications.” That’s precisely the scenario foreseen in Frank Herbert’s sci-fi novel Dune, set in a universe where all thinking machines are banned because of their role in past wars.

The UN weapons group was due to meet on Monday, August 21, but that meeting has reportedly been delayed until November.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.



© 2024 World Economic Forum