In 2023 the UK’s House of Lords established a committee to investigate the use of artificial intelligence (AI) in weapon systems. Inquiries of this kind are valuable and important; however, the resultant report has failed to take proper account of the current state of AI in weapon systems, and how it might change in the immediate future.

The committee published its findings in December 2023 and the British government was required to respond by 19 February 2024. The report’s findings are predictable: it provides recommendations on definitions of lethal autonomous weapons (LAWs), the importance of the law of armed conflict, a suggestion about the spectrum of autonomy, and a request to prohibit the use of AI in nuclear command and control (C2). It also recommends that the UK should become a leader in AI regulation and seek to ban certain use cases. However, these recommendations are unlikely to enable the British government and Ministry of Defence to consider and prepare for the future of LAWs, because they do not address the current and emerging state of autonomy in defence.

The report’s definitions of a spectrum of autonomy are not dissimilar from the US Department of Defense’s own definitions, which apply to a variety of systems from the Javelin missile through to the Phalanx air defence system. The primary difference in the House of Lords report is that it discusses the role of AI (by which it means machine learning and other elements of AI), as opposed to hard-coded automation. Hard-coded autonomous weapons are not new: the Harpy loitering munition was developed during the late 1980s to hunt air defence radars, and was used in the 2020 Nagorno-Karabakh war. It autonomously detects, locates, and attacks air defence radars based on their emissions. It does this without human oversight in the targeting process, although a human decides where and when to launch it. The difference between Harpy and the use cases discussed in the House of Lords report is that Harpy did not learn how to identify those radars from a set of data fed into an algorithm. It is eminently possible to achieve the same ends, and more, against a wider target set with AI; the question then is whether the means are important. Here, it is worth examining the war in Ukraine.
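To make that distinction concrete, the targeting logic of a hard-coded system of Harpy’s type can be thought of as fixed rule matching against a pre-programmed signature library, with no training data involved. The sketch below is illustrative only: the signature parameters and threat library are invented for the example and bear no relation to any real system.

```python
from dataclasses import dataclass

# Hypothetical, simplified threat library. Real emitter signatures are
# classified; these frequency and pulse-repetition windows are invented.
THREAT_LIBRARY = {
    "search_radar_A": {"freq_ghz": (2.9, 3.1), "pri_us": (800, 1200)},
    "fire_control_B": {"freq_ghz": (9.0, 10.0), "pri_us": (50, 150)},
}

@dataclass
class Emission:
    freq_ghz: float  # measured carrier frequency of the detected signal
    pri_us: float    # measured pulse repetition interval

def classify(emission: Emission) -> str | None:
    """Hard-coded rule matching: no learning, no training data.
    The munition engages only if the emission falls inside a
    pre-programmed signature window."""
    for name, sig in THREAT_LIBRARY.items():
        lo_f, hi_f = sig["freq_ghz"]
        lo_p, hi_p = sig["pri_us"]
        if lo_f <= emission.freq_ghz <= hi_f and lo_p <= emission.pri_us <= hi_p:
            return name
    return None  # no match: do not engage

print(classify(Emission(freq_ghz=3.0, pri_us=1000)))  # search_radar_A
print(classify(Emission(freq_ghz=5.0, pri_us=300)))   # None
```

The rules are fully auditable but brittle: an emitter outside the pre-programmed windows is simply invisible to the system. A machine-learned classifier trades that transparency for the ability to generalise across a wider target set, which is exactly the shift the report is grappling with.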

It is highly likely that weapons using AI and with a very high degree of autonomy have already been used in this war, and that fully autonomous weapons will be deployed by both sides before the war is over. One driver is the presence of electronic warfare (EW). Russia made EW a pillar of its way of war long before 2022 in a bid to degrade NATO’s perceived superiority in C4ISR and the resultant precision strikes. Its EW forces have performed this role in Ukraine, but they have also found themselves in a near-constant struggle against the thousands of drones used by the Ukrainian forces. Ukraine has likewise developed and deployed its own EW to counter Russia’s corresponding adoption of small drones. Each force is likely adapting its drones and EW at a tactical level to improve countermeasures and survivability. This process will drive them inexorably towards AI and autonomy, because a fully autonomous drone, no longer dependent on a jammable command link or satellite navigation signal, would be largely immune to EW.

At least some of Russia’s Lancet loitering munitions are known to carry the Nvidia Jetson TX2, a single-board computer designed for AI applications, including computer vision. Such Lancets are therefore likely capable of navigating without GPS by matching pre-loaded imagery to what they see around them. It is also possible that they can conduct a portion of their engagements autonomously. New iterations of Lancet, such as ‘Product 53’, are claimed to be fully autonomous and capable of selecting and engaging targets within a geofenced area. Ukraine has in turn deployed the Saker Scout, an AI-enabled reconnaissance drone with the ability to engage targets and adjust its targeting with some degree of autonomy. AI-enabled autonomy will reduce Russia’s EW advantage if it can be realised at scale, because a drone becomes self-reliant, navigating to a target based on what it detects in the environment around it rather than the easily jammed satellite navigation signals drones currently rely upon. It may also ease the demand on 155 mm artillery ammunition and help restore Ukraine’s combat power.
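The navigation technique described here is, in general terms, scene matching: comparing a live camera frame against geo-referenced imagery stored onboard. The Python sketch below, using OpenCV template matching, shows the basic idea; it is not the Lancet’s actual software, and the file names, confidence threshold, and map scale are invented for illustration.

```python
import cv2

# A minimal sketch of GPS-denied navigation by scene matching, assuming
# the drone carries a geo-referenced reference image of its route and
# that the camera frame is smaller than, and roughly aligned with, the
# reference map. File names are hypothetical.
reference = cv2.imread("reference_map.png", cv2.IMREAD_GRAYSCALE)  # pre-loaded imagery
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)       # live downward view

# Slide the camera frame across the reference map and score similarity
# at each offset; the best-scoring offset gives the drone's position in
# map pixel coordinates, independent of any satellite signal.
scores = cv2.matchTemplate(reference, frame, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(scores)

if best_score > 0.7:  # arbitrary confidence threshold for the sketch
    px, py = best_xy
    metres_per_pixel = 0.5  # map scale, invented for illustration
    print(f"Estimated position: {px * metres_per_pixel:.0f} m east, "
          f"{py * metres_per_pixel:.0f} m south of map origin")
else:
    print("No confident match; fall back to inertial navigation")
```

In practice the live view rarely matches stored imagery pixel-for-pixel across changes in altitude, season, and lighting, which is precisely where learned feature matching, the ‘AI’ element, earns its place over simple correlation.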

If Russia perceives its use of AI as a success, and Ukraine is able to embrace the benefits of AI to reduce the efficacy of Russian EW, it follows that the UK might have no choice but to develop its own AI-enabled weapons. The kind of LAWs represented by Russia’s Lancet are neither destabilising nor disruptive to the international order. However, they have a very valid and specific battlefield application in providing frontline forces with a tactical reconnaissance-strike capability, which amplifies their firepower and can make or break an operation. The House of Lords’ recommendations are commendable in their intent, but it is time to move beyond trying to establish definitions for autonomous weapons and into an active process of exploring their tactical combat utility.

Sam Cranny-Evans