KILLING BY ALGORITHM

Australia lacks adequate ethical restraints on its booming development of autonomous weapons that kill with minimal human input.
Australian-made DefendTex Drone40 kamikaze drones were sent to Ukraine in 2022. (Photo: United States Marine Corps)

The development of autonomous weapons technologies in the military domain is heralded by academics and analysts as the third revolution in warfare, after gunpowder and nuclear arms, and rapidly increasing autonomy in weapons is already well underway.

Australia is in a group of countries leading the charge, along with the US, UK, Russia, China, Israel, India, and South Korea. A range of aerial, land and underwater systems with autonomous capabilities are being developed and deployed, including in current conflicts. Weapons that would operate without human control over the selection of targets and the decision to attack, referred to as fully autonomous weapons or ‘killer robots’, are only a few steps away from current reality. Such weapons pose legal, ethical and security risks as decisions over who to kill are delegated to machines.

Australia is, however, embracing this rapidly developing industry with barely any public discussion. 

Australian company DefendTex provided 300 of its Drone40 loitering munitions to Ukraine last year. There have been mainstream media reports, including in the New York Times and Financial Times, on the use of drones, loitering munitions and other autonomous capabilities in Ukraine, and on how the conflict has become a vital testing ground for Western weapons. This is mostly framed as a positive development, with only limited criticism.

DefendTex is just one of many companies in an expanding landscape of Australian development of AI for defence, involving private industry and universities. To foster collaboration between these sectors, the Coalition government established Defence Cooperative Research Centres from 2017 as part of its Next Generation Technologies Fund.

The inaugural centre, Trusted Autonomous Systems, based in Brisbane, was awarded an initial $50 million in government funding for its first seven years.

Universities are active in projects facilitated by Trusted Autonomous Systems, as well as in other initiatives focused on autonomy and related capabilities in partnership with Australian defence or arms companies. For instance, both the Defence Science Institute, which connects defence, industry and Victorian universities, and the NSW Defence Innovation Network have autonomy as a focus area.

STELaRLab is a partnership between the University of Melbourne and the controversial international arms company Lockheed Martin, which has been accused of complicity in war crimes. A core area of work for STELaRLab is research and development in autonomy and robotics. An Australian student activist group, Lockout Lockheed Martin, has protested against the partnership and the university’s collaboration in weapons production.

Given the nature of autonomy-related technologies, it’s hard for university students to know the end use of their projects or research when these are undertaken in collaboration with arms companies or defence. Some students concerned about these ethical risks will turn down opportunities to work on projects connected to defence, while others lack awareness, as ethics education is limited in fields such as computer science.

AUSTRALIAN COMPANIES ON THE FRONTLINE

Large arms manufacturers, as well as smaller Australian arms companies, undertake collaborative projects supported by Trusted Autonomous Systems, many pushing the autonomy envelope. 

DefendTex, the creators of the Drone40 loitering munition being supplied to Ukraine, are developing a range of capabilities, including swarming technology. In a swarm, numerous weapons are deployed as a connected group, moving en masse as they find targets. Without limits on their geographic area and duration of operation, swarms would be difficult for operators to adequately control, increasing risks to civilians and intensifying the pace of warfare.

Cyborg Dynamics has built the Warfighter unmanned ground vehicle (UGV). (Photo: Cyborg Dynamics)

Cyborg Dynamics and Skyborne Technologies are two Queensland-based Australian arms companies developing weapons that skirt moral and ethical red lines in the absence of any specific limits on autonomy. Skyborne Technologies is developing the Cerberus GLH, an autonomous drone carried in a backpack and armed with a multi-shot grenade launcher. Cyborg Dynamics is co-developing the Warfighter Unmanned Ground Vehicle, also armed with various munitions, with Australian robotics company BIA5. Both companies have exhibited these weapons at Australian and international arms fairs, and will do so at an upcoming US convention for military and industry.

To avoid disastrous consequences, particularly for civilians, these weapons must remain under the control of human operators who can understand and evaluate the environment, not be unleashed in conflict zones that are increasingly urban. This is most crucial in decisions over targeting and whether or not to attack.

The two companies also share a collaborative venture, Athena AI, which focuses on the development of an AI-enabled targeting system to track, identify and select targets. These targeting capabilities can be integrated into other weapons and are a dangerous step towards fully autonomous weapons.

Athena AI capabilities are being utilised by Red Cat, which makes drones for the US used for the protection of military bases and for border control. There are risks in exporting systems, components or software for use by other companies or countries, especially as they may be used or adapted in new ways that are not lawful. It has not yet been clearly delineated which uses of autonomy in weapons are legally and morally acceptable.

When this concern was put to the Albanese government in Questions on Notice in April 2023, the written response on behalf of the Minister for Defence, Richard Marles, avoided confirming whether weapons from Australian arms companies or Trusted Autonomous Systems projects were being exported, asserting that “there is no widely agreed definition of autonomous weapons, and Defence exports a range of goods and technologies including for training and operations.”

There are some significant projects in which Australian defence has partnered with large arms companies. Recently, the Royal Australian Navy agreed a partnership for autonomous submarines with Anduril, a key collaborator of the US military that is also providing autonomous weaponry to Ukraine.

Anduril founder Palmer Luckey made news for developing a VR headset that would kill its wearer if they died in a video game. Anduril expanded to Australia in 2022, with the submarines central to that venture.

‘Ghost Bat’ is an autonomous aircraft developed in a flagship project by Boeing Australia with the Royal Australian Air Force. Boeing is a multinational aerospace company that works in civil areas such as commercial aircraft and communication satellites, as well as in defence. The company sells equipment to repressive states such as Saudi Arabia.

The Ghost Bat project has facilitated the establishment of a Boeing manufacturing facility near Toowoomba in Queensland, part of the Wellcamp Aerospace and Defence Precinct and the first Boeing manufacturing site outside the US. The aerospace hub currently centres on the Ghost Bat contract with the Australian government.

When the hub was announced in 2021, Queensland Treasurer Cameron Dick said that “our vision for this precinct is to be the epicentre, globally, of aerospace and defence development, advanced manufacturing, research and development and education.”

Autonomy is integral to innovation in aerospace, in both defence and civil domains, but there is currently an absence of regulation in these areas. To innovate responsibly, clearer guardrails are needed from government and within the private sector.

AUKUS EMBRACES AI

This innovation in autonomy and investment in AI for defence is shared by Australia’s allies, especially the US and UK. Cooperation between these countries on autonomous capabilities is set to increase through the AUKUS security alliance. The alliance was announced as fostering cooperation for regional security between the partners, but has drawn criticism in Australia from analysts, academics, past government leaders including former prime ministers Paul Keating and Malcolm Turnbull, and the public, including the civil society Australian Anti-AUKUS Coalition.

AUKUS’s next phase, or ‘second pillar’, focuses on the sharing of ‘advanced capabilities’, a major aspect being AI. Cooperation on advanced capabilities is intended to increase security and the ability of the three partners and their defence forces to work together. The pillar was recently showcased in a joint ‘autonomy trial’ hosted by the UK, with all three militaries collaborating in the testing. This is just the beginning of collaboration on AI-enabled capabilities through the alliance.

The “autonomy trial” in 2023 involved experimental work by Australia, UK and US on detecting and tracking military targets. (Photo: UK Ministry of Defence)

Recently, the Australian government also announced a new scheme, the Advanced Strategic Capabilities Accelerator, with autonomy as a priority area. This responded to the findings of the recent Defence Strategic Review and to AUKUS’s second pillar. The Defence Strategic Review contained only one other reference to autonomy: a general reference to air capabilities, and specifically Ghost Bat. Given the extensive landscape of development in autonomy for defence, and how repeatedly it is articulated as a priority, such limited reference seems unusual.

By contrast, the 2020 Defence Strategic Update outlined how “emerging technology will be rapidly utilised and incorporated into the new strategic framework, with autonomous weapon systems and long-range weapons being increasingly developed, researched and tested.” 

The last few years have illustrated the pursuit of this. Deputy Prime Minister and Defence Minister Richard Marles recently commented on the Advanced Strategic Capabilities Accelerator, saying that “Australia must invest in the transition to new and innovative technologies for our Defence Force.” Autonomy is seen as central to these goals.

QUESTION OF ETHICS

Some of the ethical concerns raised by development in this area were recognised in a paper on AI ethics in defence commissioned by the Australian Department of Defence in 2021. It proposed three tools: an Ethical AI for Defence Checklist, an Ethical AI Risk Matrix, and a Legal and Ethical Assurance Program Plan. None of these, however, reflects current Australian defence policy.

The Australian government uses a framework for the development of all weapons called the “System of Control”. This framework contains no specific considerations for autonomous capabilities in a weapon’s design or operation. Policy is lacking on limits to how autonomy is used in weapons and on the degree of human control required, in particular over the ‘critical functions’ of selecting targets and deciding to attack.

Without the establishment of clear policy, development is unfettered. The legal, ethical and security risks are not being adequately addressed. 

In 2017, leading Australian AI experts called for the Australian government to support a ban on lethal autonomous weapons. An open letter from global AI and robotics researchers and companies drew high-profile endorsements, including from Elon Musk and Apple co-founder Steve Wozniak. Recently, the Australian Human Rights Commissioner also urged the prohibition of lethal autonomous weapons. Australia is yet to heed such calls, which are echoed globally.

In response to the many legal, ethical, security and humanitarian concerns raised by autonomous weapons, the international community has called for new international law to be established. This includes the United Nations Secretary-General, the International Committee of the Red Cross (ICRC), the tech sector, AI experts, and the Stop Killer Robots campaign (I’m the National Coordinator of the Australian Stop Killer Robots campaign).

The UN Secretary-General’s recent New Agenda for Peace calls for negotiations of a new legally binding instrument to address autonomous weapons to be concluded by 2026.

A legally binding international instrument on autonomous weapons would establish specific prohibitions and other obligations. These may include prohibitions on weapons that select and apply force to targets without human control, or obligations limiting the duration and geographical area over which a weapon with autonomous capabilities operates.

These regulations would seek to address the legal challenges of accountability and international humanitarian law, such as ensuring distinction between combatants and civilians and the proportionality of an attack. Such judgments require inherently human evaluation and cannot be made by a machine.

It would also establish a strong precedent for responding to ethical concerns, notably the delegation of life-and-death decisions to machines and digital dehumanisation. Digital dehumanisation is the process whereby humans are reduced to data, which is then used in automated decisions that may have negative effects; autonomous weapons that decide to attack and kill illustrate the most acute of these harms. Such regulation would also mitigate security risks, including the acceleration and intensification of conflict caused by the potential pace and scale of these weapons, or by machine error.

International talks have sought to address autonomous weapons for almost a decade. Since 2014, dedicated diplomatic meetings have been held each year at the United Nations in Geneva under the framework of the Convention on Certain Conventional Weapons (CCW). To date, over 90 countries have called for new international law to be established.

However, the countries leading in AI development for the military, especially the US, Russia, Israel and India, have expressed their opposition to any regulation. Under the CCW’s consensus rules, this opposition has prevented the process from advancing towards any kind of concrete action. The stymied diplomatic process mirrors past international disarmament efforts, such as on landmines, where international law was eventually established in line with global momentum.

Regarding autonomous weapons, Australia, along with the US and UK, rejects the need for new international law. These governments have often acted as a group at the diplomatic meetings, together with Canada, Japan and South Korea, offering proposals that disregard ethical concerns and obfuscate human control.

Since the election of the Labor government in May 2022, Australia has started to participate more constructively, engaging with ideas presented by other countries; however, it insists that any measures must not take the form of a legal obligation.

In previous cases, Australia’s position has diverged from the US to join global efforts to establish a new treaty, most notably when Australia signed the Mine Ban Treaty in 1997 while the US did not. On autonomous weapons, Australia is currently out of step with global progress in favour of new international law.

Momentum is building towards a new legally binding instrument on autonomous weapons. Last year, a multilateral joint statement on autonomous weapons was delivered at the United Nations General Assembly, the first such engagement outside the continually stagnating CCW meetings.

In 2023, a number of regional conferences have been hosted by countries including the Netherlands, Luxembourg and, for the Latin America and Caribbean region, Costa Rica, in addition to the continuing but still stymied CCW meetings. The United Nations General Assembly session is also approaching, and it is likely many countries will use the opportunity to take further action on autonomous weapons, such as advancing a resolution.

Policy and international law typically trail behind the advent of new technology, but decisions are always required around what can and should be pursued for humanity’s betterment. Autonomous weapons are no different. 

Matilda Byrne

Matilda Byrne is currently undertaking a PhD at RMIT’s Social and Global Studies Centre, where she is a sessional lecturer in international relations, security, global governance and disarmament. She is the National Coordinator of the Australian Stop Killer Robots campaign, based at SafeGround, an Australian not-for-profit that seeks to reduce the impacts of legacy and emerging weapons.