Autonomous weapons

Autonomous weapons might seem like the far-off stuff of sci-fi and dystopian pop culture, but autonomous weapons in various shapes and sizes are being used and developed across the world. This raises serious concerns over ethics, responsibility and the law, and emphasises the need for greater restrictions and guidance on the use and development of these weapons.

An autonomous weapon is a weapon that can search for, identify, track and attack a target without human intervention. These weapons require a human to activate them and to input a target, but the weapon then uses artificial intelligence alone, searching with its sensors and software, to find and attack the target. If the weapon loses its communication link, the operator will not know exactly where, when or who or what it will attack. Autonomous weapons are sometimes called ‘killer robots’.

While some autonomous weapons have existed for years, they have been limited in how long they can operate, how far they can range and the environments in which they can be used. This is changing as technological advances enable existing weapons to be developed further and new weapons with more advanced capabilities to be created.

Examples of autonomous weapons include drones equipped with AI, stationary autonomous guns and unmanned vehicles capable of carrying weapons.

Although it is hard to know for certain, as technological developments are kept quiet, it is widely believed that autonomous weapons are already being used. Foreign Policy magazine reported back in May 2022 that Israel, Russia, South Korea and Turkey had already used autonomous weapons.1 Countries including Australia, the UK, China and the US were also listed as investing heavily in developing them.2

In a more recent article, the BBC reported on drones with autonomous weapons being used by Russia in Ukraine, stating that ‘the move to drone warfare is a combination of necessity and innovation’.3  

Autonomous weapons are here, and they are taking lives.  


There are many ethical and legal issues with autonomous weapons, particularly around accountability and responsibility. There are varying degrees of autonomy and, on the whole, militaries will want to use weapons with autonomous capabilities while keeping a person in the loop who permits the machine to strike a target (after the machine has identified and selected it). The concern here is that people tend to place too much trust in technology. What awareness of the battlefield will the operator (potentially located thousands of miles away) have in practice?

If communication links are broken or if fast decisions are required, a military advantage could be gained by dispensing with the person in the loop. Using a weapon to identify and eliminate an enemy target autonomously weakens the link between action and responsibility that is intrinsic to centuries of accepted ethics on warfare and international law.  

Autonomous weapons also pose problems for legal responsibility, particularly when things go wrong. In war, things inevitably do go wrong, regardless of whether a human or a machine is in control. Accountability is much harder to establish when a human did not fire the weapon or order the specific attack. If international law is broken, if the wrong target is killed, or if a civilian bus is attacked instead of a military vehicle, who is responsible and who can explain what happened?

The main ethical issue with autonomous weapons is that they allow life-and-death decisions to be made by technology, sensors and software, removing all agency, context and feeling from those decisions. The International Committee of the Red Cross (ICRC) describes this as a ‘dehumanising process that undermines our values and our shared humanity’.4

Do we want a battlefield where technology reduces humans to ‘datasets’, stripped of agency, context and feeling?

We are calling for greater restrictions and guidance to make the use of artificial intelligence safer and to protect the well-being of people and planet. The United Nations and the International Committee of the Red Cross are calling on political leaders to urgently establish new international rules on autonomous weapon systems to protect humanity.5 Many governments, NGOs and faith groups have joined the call for negotiations to begin on legally binding prohibitions and restrictions on autonomous weapons. They also call for clear restrictions on all autonomous weapons to ensure compliance with international law and ethics. This would include limits on where, when, for how long and with what force these weapons can be used, and on the types of targets they can attack, as well as ensuring effective human supervision, intervention and deactivation.

“Despite the increasing reports of testing and use of various types of autonomous weapon systems, it is not too late to take action.” – UN Secretary-General António Guterres and ICRC President Mirjana Spoljaric

Learn more about these issues and engage further through JPIT’s Future of Arms project.

Sign the Stop Killer Robots petition

Call on government leaders around the world to launch negotiations for new international law on autonomy in weapons systems – to ensure human control in the use of force and to prohibit machines that target people, reducing us to objects, stereotypes, and data points.

What you need to know about autonomous weapons | ICRC

Killer Robots | Human Rights Watch

Lethal Autonomous Weapons Systems Are Here—and We Need to Regulate Them | Foreign Policy

Ukraine thrown into war’s bleak future as drones open new front | BBC News

UN and Red Cross call for restrictions on autonomous weapon systems to protect humanity | UN News

First Committee Approves New Resolution on Lethal Autonomous Weapons, as Speaker Warns ‘An Algorithm Must Not Be in Full Control of Decisions Involving Killing’ | UN Meetings Coverage and Press Releases

Annie Sharples, former JPIT intern
