‘Killer Robot’ projects found in Oxford

A September 2022 report investigating the role of UK universities in the development of autonomous weapon systems (AWS) found that the University of Oxford has been involved in at least six projects deemed medium risk or higher for contributing to the production of such systems.

The report was carried out by the Campaign to Stop Killer Robots, a coalition of more than 180 NGOs working across 60 countries. It examined 13 UK higher education institutions that receive significant funding from bodies such as the UK Ministry of Defence, the US Department of Defense, or private arms manufacturers and dealers. A total of 65 separate projects across these institutions were identified as potentially contributing to AWS programmes.

Publication of the report coincides with recent strides in AI development, as well as rising concern amongst political analysts, AI experts, and scholars about the dangers that new technologies potentially bear. The geopolitical risk consultancy Eurasia Group ranked new AI technologies third among its ten defining political risks for 2023. AWS development, however, has not received as much attention.

AWS, colloquially 'killer robots', can be understood as the range of weapon systems that detect and apply force to a target based on sensor inputs, without further human input. The first recorded use of LAWS (lethal autonomous weapon systems) was observed by the UN in Libya in March 2020; a UN report filed in 2021 classified a skirmish involving a Kargu-2 drone as autonomous. It is unclear whether the drone caused any casualties, and STM, the Turkish arms manufacturer that produces the Kargu-2, denies that the drone has autonomous capabilities.

A UN summit in December 2021 discussed LAWS within the framework of the UN Convention on Certain Conventional Weapons (CCW), a platform used to restrict dangerous or indiscriminate weapons. However, because consensus is required before action can be taken within the CCW framework, the positions of major world powers, including the US, UK, Russia, and China, meant that no significant progress was made. The US has claimed that current frameworks are sufficient to control the use of LAWS, while the UK government has stated that it possesses no LAWS and has no intention of developing them. Despite this, the Campaign to Stop Killer Robots report found that the Ministry of Defence has spent heavily on research and development of technologies that could constitute LAWS.

The report sorts research projects into four tiers of potential contribution to AWS: higher risk, medium risk, lower risk, and insufficient information to decide. Higher risk means that a project directly contributes to the development of AWS and is funded by military or arms groups. Medium risk identifies research with what is termed 'dual-use potential': work that could legitimise AWS in future or be used to develop AWS in the absence of proper controls. Lower risk identifies no immediate dual-use or AWS potential.

The report identifies six projects affiliated with the University, five of which are categorised as medium risk and one as higher risk. It also states that Oxford is among the universities with the highest levels of military funding in the UK, alongside the University of Cambridge, Cranfield University, Imperial College London, and the University of Sheffield.

Moreover, seven of the military contractors sponsoring research at Oxford have been criticised by Amnesty International for an ‘alarming indifference to the human cost’ of their business. Between 2017 and 2019, Oxford also received £6m from Rolls-Royce, whose engines are frequently used in aerospace engineering. During this period, Rolls-Royce paid £671m in penalties after anti-corruption investigators found it had engaged in bribery across multiple countries to secure government contracts. The University received a further £700,000 over the same period from the Atomic Weapons Establishment, the company responsible for producing the UK’s nuclear warheads.

Another issue that the report identifies across the University and the 12 other institutions is a lack of transparency. For many projects, little information is available online; the report claims that the University, for example, does not make public the details of the research grants it receives. It also alleges that the activities of the University of Oxford’s Committee to Review Donations and Research Funding are not publicly available, despite the body being responsible for deciding whether specific donations breach the University’s internal frameworks on the acceptability of donations. Moreover, none of the universities the report examined mentions AWS or LAWS in its ethical frameworks for research.

Following the publication of the report, students at universities in Lancaster, Warwick, Nottingham, Bristol, and Sheffield engaged in ‘demilitarisation’ protests aimed at removing the influence of the military and arms traders from UK universities. Some, however, argue that those goals are misplaced given the wider underfunding of UK universities and academic research. River Butterworth, education officer at the University of Nottingham’s student union, stated: “I asked at the beginning of the [academic] year about demilitarising, and the university was basically like — OK, work out how we’re going to finance the whole of the engineering department then.”

Six college JCRs (Junior Common Rooms), the Oxford University Amnesty International Society, and more than 70 individual students have signed an open letter to the University expressing ‘deep concern’ at its potential role in the development of AWS and LAWS. The letter asks the University to establish a policy forbidding the development and production of LAWS and to sign the Future of Life Pledge.

The report also highlights the attitudes of researchers working within the universities. One researcher at the University of Cambridge, sympathetic to the Campaign to Stop Killer Robots’ views on AWS technologies, stated that he would nonetheless have taken money from groups furthering AWS if it ensured funding for his research. Researchers were also largely unaware of the potential issues attached to ‘dual-use’ frameworks, despite many academics within the institutions having signed the Future of Life Pledge, a call on international leaders to regulate AWS technologies. These comments reflect the precarious position in which academics at the University and other institutions can find themselves: despite ethical qualms about the sources of funding for their research, some may feel they have no choice, given the lack of funding available elsewhere.

The report encourages all UK universities to sign the Future of Life Pledge, calling for international regulations and laws against LAWS, and to establish mechanisms to minimise the risks that dual-use research could pose. It also encourages the adoption of ethical frameworks specific to LAWS and other AI technologies, as well as further measures to increase transparency.

Groups operating within the University could help to devise such frameworks. The Institute for Ethics in AI at the University was founded in February 2021 following £175m in donations from Blackstone financier Stephen Schwarzman. It is led by John Tasioulas, a Greek-Australian philosopher who has argued that, despite the currently embattled state of the humanities, they will be essential to discussions surrounding the regulation and use of AI and technologies like AWS in the 21st century.

Dr Linda Eggert, the Institute’s philosopher dedicated to examining the ethics of killer robots on the battlefield, has publicly probed the issue of AWS. In 2023, she asked: “Do we have a right for decisions, even the decision to kill us, to be made by people not machines?”

A University spokesman commented that ‘The University of Oxford, working with collaborators, funders and other organisations around the world, pursues world-leading research that advances knowledge and discovery. Our research has wide-ranging applications across many domains from the development of new technologies to addressing key societal challenges such as climate change and pandemic preparedness. We maximise dissemination through open publication and sharing of our research outputs, whilst ensuring we maintain strict compliance with relevant legislative and regulatory frameworks such as export control requirements governing the transfer of sensitive technologies outside of the UK.’