We are calling for evidence to see how AI could determine the future of warfare
The House of Lords Committee on Artificial Intelligence in Weapon Systems has just launched its call for evidence.
This is a pressing, critical issue and we are looking for advice, views and opinions to help steer our work.
Artificial intelligence has spread into so many areas of life, and defence is no exception. AI has applications across the military sphere, from optimising logistics chains to processing large quantities of intelligence data. There is a growing sense that AI will determine the future of warfare, and forces around the world are investing heavily in capabilities powered by AI. But despite these advances, fighting is still largely carried out by humans.
Bringing AI into the realm of fighting through the use of AI-powered weapon systems could be a revolution in warfare technology, and it is one of the most controversial uses of AI today. Procedures, rules and regulations play a crucial part in warfare, and the use of autonomous weapon systems is no exception. Our Committee is investigating how autonomous weapon systems (AWS) might be developed and utilised, both now and in the future.
In 2018, a previous House of Lords inquiry into AI described the application of AI for military purposes as likely to be the “most emotive and high-stakes area of AI development”, and that assessment remains true today. A 2020 poll conducted across 28 countries found that more than 60% of people opposed lethal autonomous weapons systems.
Autonomous weapons systems have been defined as systems that can select and attack a target without human intervention. These systems could revolutionise warfare, with some suggesting that they would be faster, more accurate and more resilient than existing weapons systems and could limit the casualties of war.
However, there are concerns about the ethics of these systems, how they can be used safely and reliably, whether they risk escalating wars more quickly, and about their compliance with International Humanitarian Law.
We are at the edge of a step change in warfare. It is crucial that the next moves are the right ones. This is a period of intense scrutiny of the use of AI in the military, with international conferences and UN groups dedicated to building international consensus on the new norms of warfare.
The UK Government is a major player in these negotiations, but the Government’s own policy on autonomous weapons systems is, at the moment, somewhat sparse. The UK does not have an operational definition of autonomous weapons systems. The Government’s current policy position is that weapons systems which use AI should have “context-appropriate human involvement”, and that the UK does not have fully autonomous weapon systems and does not intend to develop them.
We will be investigating the UK Government’s policy on autonomous weapons systems and international efforts to regulate them. We have just begun our inquiry and will be hearing from experts in the field on the benefits, risks and challenges posed by autonomous weapons systems.
Our work relies on the input of a wide range of individuals and is most effective when it is informed by as diverse a range of perspectives and experiences as possible. We are inviting all those with views to share, experts and non-experts alike, to respond to our call for evidence by 10 April 2023.
Visit the Committee’s website or follow us on Twitter @HlAIWeapons for more information and to take part in the call for evidence.