Elon Musk leads 116 experts calling for a ban on autonomous weapons
The campaign to stop 'killer robots' is calling on the United Nations for strict oversight of autonomous weapons. Image: REUTERS/Fabrizio Bensch
One hundred and sixteen roboticists and AI researchers, including SpaceX founder Elon Musk and Google DeepMind co-founder Mustafa Suleyman, have signed a letter to the United Nations calling for strict oversight of autonomous weapons, a.k.a. "killer robots." Though the letter itself is more circumspect, an accompanying press release says the group wants "a ban on their use internationally."
Other signatories of the letter include executives and founders from Denmark’s Universal Robots, Canada’s Element AI, and France’s Aldebaran Robotics.
The letter describes the risks of robotic weaponry in dire terms, and says that the need for strong action is urgent. It is aimed at a group of UN officials considering adding robotic weapons to the UN’s Convention on Certain Conventional Weapons. Dating back to 1981, the Convention and parallel treaties currently restrict chemical weapons, blinding laser weapons, mines, and other weapons deemed to cause “unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.”
Robotic warriors could arguably reduce casualties among human soldiers – at least, those of the wealthiest and most advanced nations. But the risk to civilians is the headline concern of Musk and Suleyman’s group, who write that “these can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways."
The letter also warns that failure to act swiftly will lead to an “arms race” towards killer robots – but that’s arguably already underway. Autonomous weapons systems or precursor technologies are available or under development from firms including Raytheon, Dassault, MiG, and BAE Systems.
Element AI co-founder Yoshua Bengio offered another intriguing warning: that weaponizing AI could actually “hurt the further development of AI’s good applications.” That echoes the scenario in Frank Herbert’s sci-fi novel Dune, set in a universe where all thinking machines have been banned because of their role in past wars.
The UN weapons group was due to meet on Monday, August 21, but that meeting has reportedly been delayed until November.