The case against Killer Robots

Image description: A protest, with banners with slogans like “Stop Killer Drones”. 

On Saturday 12th December, youth representatives from 20 countries, alongside members of the UN Disarmament Committee, representatives of the Japanese Ministry of Foreign Affairs and the Director of Human Rights Watch Japan, convened online for the International Student Conference on Fully Autonomous Weapons.

Tokyo’s office of Human Rights Watch, one of the 170+ NGOs that comprise the global Campaign to Stop Killer Robots, hosted the event. I was honoured to speak as the youth representative from the UK, and it was inspiring to hear fellow youth activists from around the world make powerful speeches highlighting the myriad issues surrounding lethal autonomous weapons systems (also known as “killer robots”, or LAWS) and urging governments to support international legal regulation of them.

Thanks to coverage by the Japanese broadcasting service NHK World, the event is intended to reach a widespread audience, placing pressure on the Japanese government to fully engage with discussions on legal regulation of LAWS at the UN Convention on Certain Conventional Weapons (CCW). The CCW is the forum which seeks to “ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants, or to affect civilians indiscriminately”.

Ultimately, campaigners want the adoption of a treaty to regulate the development, acquisition and use of LAWS, much like the bans introduced by the 1993 Chemical Weapons Convention, the 1997 Mine Ban Treaty, and the 1995 pre-emptive ban on blinding laser weapons.

If you’ve not come across “killer robots” or “LAWS” before, you may be wondering what all the fuss is about. What are these systems? And why are so many people opposed to them?

Lethal autonomous weapons are weapons systems which identify and engage human targets without meaningful human control, delegating decisions over the deployment of lethal force to a machine, with no human ‘in the loop’. Such weapons systems have been dubbed the ‘third revolution’ in warfare, after gunpowder and the atomic bomb, for their potential to radically alter the way in which wars are conducted.

LAWS make killing a more efficient, impersonal, coldly calculated process. This is the heart of the moral and ethical objection to them: when complex algorithms alone reduce human lives to data points for value assessment, the line between life and death becomes as fine as a line of code.

In addition, no algorithm is perfect. Anyone who opposes discrimination on the basis of race or gender must be concerned about the AI used in LAWS absorbing the biases present in our world. Facial recognition technology is far worse at identifying and differentiating people of colour than white people, which could lead to a much higher rate of mistaken identity and wrongful killings when LAWS are used in non-white communities. The same is true for facial recognition of women compared to men.

Similarly, if autonomous weapons are used to identify high-risk targets, it is likely that minority communities would be disproportionately targeted, in the same way that programmes used to grant or deny bail calculate that African-American people are 45% more likely to re-offend than their white counterparts of the same age, gender and criminal record.

We must also question whether LAWS can adequately distinguish soldiers from civilians; without that ability, they will inevitably breach International Humanitarian Law. Scientists and tech companies already fear that development towards lethal autonomous weapons has gone too far, and that we may be approaching a treacherous point of no return, after which we will no longer be able to control or understand the technology we create.

In 2015 over 3,000 AI and robotics researchers – joined by Stephen Hawking and Elon Musk – warned against autonomous weapons in an open letter, and a 2017 letter from 116 tech companies called on the UN to ban lethal autonomous weapons. Moral integrity and human dignity, alongside these intersectional, legal and technological concerns, build a compelling case for demanding effective international legal regulation of LAWS.

The bad news is that many states continue to deny the need for effective legal regulation of these weapons systems. The USA, Russia, China, South Korea, Israel, France and the UK are the world leaders in the pursuit of autonomous weapons. Here in the UK, the Ministry of Defence’s pursuit of AI and machine-learning technology to facilitate autonomous weapons is extensive.

The MoD runs an Autonomy programme with activities including “developing underpinning technologies to enable next generation autonomous military system”. The MoD’s research arm, the Defence Science and Technology Laboratory (DSTL), conducts extensive public-private military collaboration, one example being the Autonomous Systems Underpinning Research (ASUR) programme. ASUR is funded by the DSTL and led by the arms company BAE Systems, with support from universities including Cranfield and Loughborough.

Further evidence of collaboration between academia, government and industry in this field comes from GCHQ’s strategic partnership with the Alan Turing Institute, the UK’s national institute for data science and AI, which was established by five UK universities (Oxford, Cambridge, Edinburgh, Warwick and UCL). The Institute’s activities include a defence and security programme, with one project pursuing large-scale coordination of autonomous swarm robotics.

Perhaps most striking is the completion of the Taranis armed drone: a project by the MoD and BAE Systems, with input from other arms companies including QinetiQ and Rolls-Royce, which the MoD described as ‘fully autonomous’, with the ability to ‘think for itself, navigate, and search for targets’. The UK’s pursuit of autonomous weapons was epitomised just last month in our Chief of Defence Staff’s comments that the UK will ‘absolutely’ avail itself of autonomous systems, and that in the near future “we could have an army of 120,000, of which 30,000 might be robots”.

An army of robots must be avoided. Resorting to autonomous weapons makes waging war too easy, and reduces war to a surreal, video-game-like experience. Far from making wars ‘risk-free’, easy access to autonomous weapons lowers the threshold for war, threatening global peace and security.

The possible domestic consequences of autonomous weapons are also stark. Once autonomous weapons become commonplace, they could be bought and deployed by powerful individuals, corporations, or terrorist groups. Autonomous weapons would allow such actors to destabilise the national order without needing popular support. It is a threat to democracy to enable unelected, minority interests to exert unaccountable and disproportionate influence in this way.

Effective regulation becomes much more difficult if the development and circulation of LAWS is allowed to occur freely. Governments must act now to regulate the development and use of these weapons systems. In particular, democracy-loving states like the UK, with historic traditions of alleged respect for human rights, must realise that there is far more to lose than gain from the proliferation of autonomous weapons.

This tumultuous year has seen the world united in its efforts to confront the COVID-19 pandemic. The unified voices of representatives at the International Student Conference raise the hopeful possibility that the problem of lethal autonomous weapons will be confronted in a similarly determined and cooperative manner, in order to create a safer and more humane future for us all.

Japan’s hosting of this event was especially poignant, coming soon after the commemoration of the 75th anniversary of the world’s first nuclear attack, at Hiroshima. The country knows better than any other the devastating effects that a ‘revolution’ in warfare can bring, and we must follow the lead of its activists in working to ensure that lethal autonomous weapons do not become the next technology to cast a dark, haunting shadow across human history.

You can find out more about the Campaign to Stop Killer Robots and follow the youth campaign at #YouthInTheLoop.

Image credits: Tony Webster