All Student Vote (Spring 2023)
Motion to Take a Stance Against the Development of Autonomous Weapons
For Warwick SU to lobby the University of Warwick to publicly pledge not to knowingly contribute to the development of autonomous weapons.
This union notes:
- Warwick SU has no publicly declared stance on the subject of autonomous weapons systems (AWS) development. AWS ‘are the range of weapons systems that detect and apply force to a target based on sensor inputs. The specific object that is struck by an autonomous weapons system, and the time and place of this, are determined by sensor processing: following activation, there is a period of time where such systems can apply force to a target without direct human approval’.1 In short, AWS are weapons systems that can identify and engage targets without human intervention. Examples include the STM Kargu-2 and the Jaeger-C by Gaardtech.
- Warwick University has no publicly declared position on the subject of autonomous weapons.
- Warwick University lacks both a policy on the ethics of AI research and a Research Ethics Committee for Computer Science and AI.
This union believes:
- Autonomous weapons systems are both ethically and practically problematic. UN Secretary-General António Guterres has stated that they “are politically unacceptable, morally repugnant and should be prohibited by international law”.2
- Concerns over autonomous weapons arise from the belief that an AI's decisions will be made according to algorithms, and will therefore rely on pre-packaged decision rules ill-suited to complex human environments. Furthermore, given that an AI will learn what constitutes a target by being trained to recognise targets in data sets, there is great concern that those data sets may be intentionally or unintentionally biased.3 Moreover, there is a risk that the perceived low cost of deploying these machines may lower the threshold for war.
- The Campaign to Stop Killer Robots acknowledged that projects contributing to the development of autonomous weapons may do so unintentionally, by virtue of their outputs being ‘dual-use’ technology that can have military applications. This is also explicitly noted by the MoD.4 As the Campaign to Stop Killer Robots notes, ‘AWS-relevant research, projects and technologies… could contribute different building blocks or components that make up such systems. Acknowledging that much research can have a range of applications or be dual-use (have both civilian and military applications), we are not inferring that all these projects are currently and directly contributing to autonomous weapons systems, but rather, that they may have the potential to do so whether intentionally or unintentionally. This is why action by universities… is urgently needed to contain such risks’5.
- In the past three years the Warwick Computer Science department has undertaken projects funded by DSTL (the Defence Science and Technology Laboratory), which have focused on the development of AI for surveillance.6
- In 2021 the British government confirmed that DSTL is engaged in research to develop AI for military applications.7
- The Campaign to Stop Killer Robots released its report on universities in September 2022, with Warwick being one of the 13 universities studied. It concluded that Warwick University lacks a university-wide ethics policy ‘and safeguarding mechanisms to control the specific applications of their research and thus prevent dual-use technology developed in their premises from being incorporated by third parties into component technologies that are liable for adoption into an autonomous weapons system or other malicious uses’.8
- Additionally, the Campaign to Stop Killer Robots' universities report documented significant links between British universities and arms companies such as BAE Systems, Thales UK and QinetiQ. The Campaign highlighted a number of university projects with significant relevance to autonomous weapons development that were simultaneously funded by arms companies known to be pursuing such weapons.9 Specifically, the Campaign states that ‘Our investigation identified at least 65 recent and ongoing projects within the realms of sensor technology, AI, robotics, mathematical modelling and human-machine pairing. Of these we judged 17 to pose a higher risk of use in the development of autonomous weaponry. Developers of AWS relevant technology are consistently present within our target institutions as major strategic partners, or more specific project funders, whilst it is not uncommon for university research staff members to also occupy positions within defence companies’.10
- There is thus a real risk that university students may be contributing to the development of autonomous weapons.
This union resolves:
- Warwick SU should lobby the University of Warwick to publicly pledge not to knowingly contribute to the development of autonomous weapons.
- Warwick SU should lobby the University of Warwick to form an internal Research Ethics Committee for Computer Science and AI, in order to assess and mitigate the risk that the university's dual-use research could be applied to unintended malicious uses or incorporated into harmful weapons systems, such as autonomous weapons systems.
- Warwick SU should lobby the University of Warwick to incorporate, into a university-wide ethical framework (and within the frameworks of relevant faculties), a specific policy for assessing the risks of dual-use research involving AI and autonomous technology, to help guide ethical decision-making on research funding and activities.