Why we need to regulate non-state use of arms

Non-state actors can now convert civilian products like drones into lethal autonomous weapons.

Stuart Russell
Professor of Computer Science and Director of the Center for Human-Compatible AI, University of California, Berkeley

This article is part of: World Economic Forum Annual Meeting

  • Open-source AI and lightweight onboard processing make it possible to convert civilian devices, such as drones used for photography, into lethal autonomous weapons.
  • Once the software of commercial unmanned aerial vehicles (UAVs) is adapted for lethal purposes, global dissemination is inevitable and there is no practical way to prevent the spread of this code.
  • States must agree on anti-proliferation measures so that non-state actors cannot create very large numbers of weapons by repurposing civilian products.

An emerging arms race between major powers in the area of lethal autonomous weapons systems is attracting a great deal of attention. Negotiations on a potential treaty to ban such weapons have stalled while the technology rapidly advances.

Less attention has been paid to the fact that open-source artificial intelligence (AI) capabilities and lightweight, low-power onboard processing make it possible to create “home-made” autonomous weapons by converting civilian devices, such as camera-equipped quadcopters.

Home-made lethal autonomous weapons

Non-state actors can now deploy home-made, remotely piloted drones, as well as weapons that, like cruise missiles, can pilot themselves to designated target locations and deliver explosive payloads. Examples include an attack on Russian bases in Syria involving 13 drones and an assassination attempt against the Prime Minister of Iraq. The Washington Post reports that “[t]he [Iranian] Quds Force has supplied training for militants on how to modify commercial UAVs [unmanned aerial vehicles] for military use.”

Will home-made, fully autonomous weapons be the next logical step? Such weapons could evade or destroy defensive systems, locate and attack targets based on visual criteria or hunt down individual humans using face or gait recognition.

Already, commercial UAVs can manoeuvre through fields of dense obstacles and lock onto specific humans. Once such software is adapted for lethal purposes, global dissemination is inevitable. There is no practical way to prevent the spread of this code. The only plausible countermeasure is to prevent the proliferation of the underlying physical platforms.

[Chart: More countries are using drones, which can potentially be converted into lethal autonomous weapons. Source: Statista]

There seems to be no obvious way to stop the manufacture and use of small numbers of home-made autonomous weapons. These would not present a very different threat from small numbers of remotely piloted weapons, which non-state actors can already deploy with ease. With either technology, we might expect civilian casualties in “atrocity” attacks to rise from two figures to three. On the other hand, because autonomous weapons need no communication links, which can be jammed or traced, remote assassination could become more of a problem.

The real threat, however, comes from large numbers of weapons: swarms capable of causing tens of thousands to, eventually, millions of casualties. Aerobatic displays involving more than 3,000 small, centrally controlled UAVs are already routine at corporate events. At present, making such UAVs fully autonomous imposes onboard processing and power requirements that only relatively large (50cm or so) quadcopters can carry. However, ASICs – low-power, special-purpose chips with the AI built in – could lead to lethal autonomous weapons only a few centimetres in diameter. A million such devices could fit inside a standard shipping container. In other words, any entity able to put together large numbers of small autonomous weapons can create a weapon of mass destruction.
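As a rough check of that last claim, consider the back-of-the-envelope calculation below. The dimensions are illustrative assumptions, not specifications: a device packed as a 3cm cube inside a standard 20-foot container of roughly 33 cubic metres.

container_volume_cm3 = 33 * 100**3   # ~33 m^3 of interior volume, in cm^3 (assumed)
device_volume_cm3 = 3**3             # one device packed as a 3cm cube (assumed)
devices_per_container = container_volume_cm3 // device_volume_cm3
print(f"{devices_per_container:,}")  # 1,222,222 - on the order of a million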

It would make sense, therefore, for nation-states to agree on anti-proliferation measures so that non-state actors cannot create very large numbers of weapons by repurposing civilian products. There are ample precedents for this: for example, the Nuclear Non-Proliferation Treaty’s rules on the ownership and transfer of nuclear materials, reflected in the Nuclear Regulatory Commission’s detailed rules on how much uranium individuals may purchase or possess; and the Organisation for the Prohibition of Chemical Weapons’ procedures for ensuring that industrial chemicals are not diverted into weapons production.

Specifically, all manufacturers of civilian UAVs such as quadcopters and small, fixed-wing planes should implement measures such as know-your-customer rules, geofencing and hardware kill switches. Nation-states can also use intelligence measures to detect and prevent attempts to accumulate components in large quantities or build assembly lines. It would also be helpful for global professional societies in AI – including the Association for Computing Machinery, the Association for the Advancement of Artificial Intelligence and the Institute of Electrical and Electronics Engineers – to adopt policies and promulgate codes of conduct prohibiting the development of software that can choose to kill humans.
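To make the geofencing measure concrete, here is a minimal sketch of the kind of check a flight controller could run before arming its motors. The zone list, coordinates and radius below are hypothetical; production firmware would consult a signed, regularly updated no-fly database held in tamper-resistant storage, not a hard-coded list.

import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres.
    r = 6371.0  # mean Earth radius, km
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical no-fly zones: (latitude, longitude, radius in km).
NO_FLY_ZONES = [(51.5000, -0.1200, 5.0)]

def takeoff_permitted(lat, lon):
    # Refuse to arm the motors inside any listed zone.
    return all(haversine_km(lat, lon, zlat, zlon) > zradius
               for zlat, zlon, zradius in NO_FLY_ZONES)

# Example: takeoff_permitted(51.5000, -0.1200) -> False inside the zone.

Pairing software checks of this kind with hardware kill switches matters precisely because a software-only check can be patched out by the same people repurposing the platform.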

Lethal autonomous weapons have low support

[Chart: Support for lethal autonomous weapons is low in many countries. Source: Statista]

As remotely piloted weapons become widespread tools of war, it is also important to ensure that they cannot easily be converted to autonomous operation via software changes. A small terrorist group can deploy only a small number of remotely piloted weapons, but it can deploy a very large number of autonomous weapons, for the simple reason that they do not require human pilots. Conversion can be made much more difficult if these weapons are designed with no connection between onboard processors and firing circuits.
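One way to picture that design constraint is the toy sketch below. It is not real avionics, and every name in it is hypothetical: the firing circuit acts only on commands it can authenticate against the operator’s ground controller, while the autopilot holds neither the key nor any reference to the circuit, so modifying the autopilot’s software alone cannot close the kill loop.

import hashlib
import hmac

# Hypothetical key that would exist only in the operator's ground
# controller; a real design would use asymmetric signatures so that
# the airframe carries no secret at all.
GROUND_CONTROLLER_KEY = b"never-present-in-autopilot-firmware"

class FiringCircuit:
    # Acts only on a command carrying a valid tag from the ground.
    def authorise(self, command: bytes, tag: bytes) -> bool:
        expected = hmac.new(GROUND_CONTROLLER_KEY, command,
                            hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected)

class Autopilot:
    # Perception and navigation live here. Deliberately no attribute,
    # method or bus links this class to FiringCircuit: the missing
    # "connection between onboard processors and firing circuits".
    def navigate_to(self, waypoint):
        pass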

Of course, global efforts to prevent the large-scale diversion of civilian technology into lethal autonomous weapons are pointless if non-state actors can simply obtain such weapons directly from manufacturers. For example, Turkey’s STM sells the Kargu drone, announced in 2017, which carries a 1.1kg warhead and is claimed to possess “autonomous hit” capability, face recognition and so on. According to the UN, Kargu drones were delivered to non-state actors and used in Libya in 2020 despite an arms embargo.

If it makes sense to prevent non-state actors from building their own weapons of mass destruction, then it also makes sense to prevent arms manufacturers from doing it for them. In other words, the world's major powers have every reason to support, rather than block, a treaty banning the development, manufacture, deployment and use of lethal autonomous weapons.
