World Economic Forum Annual Meeting

20-23 January 2016 Davos-Klosters, Switzerland

Would you rather send man or machine to fight in a war? Would you rather be attacked by a human or a robot?

These were poll questions posed to the session audience. The response was clear: when it came to the battlefield, people would prefer combatants to be automated. But faced with the prospect of being attacked themselves, most would prefer a human assailant.

“This betrays extraordinary confidence in the sophistication of artificial intelligence,” says Alan Winfield, engineer and roboethicist. “The state of the art, now and in the near future, isn’t that high.”

So, while lean, mean killing machines may be some way off, we need to act now to ensure regulations are up to date and ready for when innovations in automated weaponry arrive.

One group has been working hard to galvanize policy-makers: the Campaign to Stop Killer Robots believes it’s time for formal policies on the use of autonomous weapons to be put in place.

Here’s their report from the 2015 Convention on Conventional Weapons at the United Nations in Geneva. Its central premise: human control.

Two-thirds of the states that spoke referred to the need for meaningful or effective or adequate human control. Countries continued to return to the notion of meaningful human control throughout the week, indicating its central relevance as a “touchstone” for addressing fully autonomous weapons.
- Campaign to Stop Killer Robots

Here's an excerpt from the Special Rapporteur’s Report to the Human Rights Council:

Lethal autonomous robotics (LARs) are weapon systems that, once activated, can select and engage targets without further human intervention. They raise far-reaching concerns about the protection of life during war and peace.

Can machines be moral?

The laws of war are difficult even for human soldiers to follow, says Stuart Russell, Professor of Computer Science at Berkeley. You can’t attack civilians or pilots as they parachute from a burning aeroplane. Can a machine follow the rules?

Can machines be accountable?

Autonomous weapons are devoid of any responsibility and struggle to discriminate, says Roger Carr of BAE Systems. They observe the rules of war with no emotion or concern, no mercy, and no ability to identify friend or foe. There needs to be a treaty on the proper use and development of unmanned weapons.

This video from the International Committee for Robot Arms Control explains that a ban on killer robots would not have a negative effect on the development of other robotics applications and research.

Small, lightweight and deadly

There are very good uses for automated systems, says Angela Kane of the Vienna Center for Disarmament and Non-Proliferation. "There's underwater engineering and mine clearance, for example. Nuclear weapons are not easy to make because you have to get hold of all the materials. With drones, however, development is accelerating. People are already printing 3D guns."

Russell explains: "The robots we’re talking about here are systems that weigh less than an ounce and fly faster than a person can run. They can blow holes in people’s heads with 1 gram of explosive, and they can be launched in the millions."

"We should never give the decision to kill humans to a machine. I would like to see those countries that have the knowledge elevating the debate on this," he adds.

What do you think?

Do you agree with the panellists as to the danger of artificial weapons? If so, write to your member of parliament and tell them, says Russell. "Policy-makers need to know from us, the people, that it’s not acceptable," he says.


Moderated by

Michael Duffy

Panellists

Stuart Russell

Roger Carr

Alan Winfield

Angela Kane