As Australia resists new laws to restrain AI weapons development, Australian companies are using real wars to test their killing power.
Australian company DefendTex's kamikaze Drone 40 is being used and tested in Ukraine. Photo: British Ministry of Defence.

Since Russia's invasion of Ukraine in February 2022, the conflict has become a testing ground for new weapons, particularly autonomous capabilities. Australia provided 300 of Australian arms company DefendTex's Drone 40 loitering munitions to Ukraine in 2022, after field testing in Poland.

The war in Ukraine has seen advances transforming the battlefield, particularly in drone technology: small aerial drones are used to attack Russian personnel and vehicles, while larger drones target radars and similar military installations. These drones are currently remote-controlled. Saker Scout drones, however, according to the Ukrainian company's spokesperson, are able to identify and attack 64 different types of Russian 'military objects' autonomously. Use in this mode would mark the start of autonomous weapons conducting attacks on the battlefield.

Despite these military innovations, however, Ukraine is struggling to repel Russia, and even the most sophisticated weapons supplied by the US are proving increasingly ineffective against Moscow's forces.

AI targeting systems are also being deployed in the Middle East. The Gospel (Habsora), Lavender and Where's Daddy?, systems that suggest, select or monitor human targets, are being used by Israel in Gaza. A former Israeli intelligence officer said these systems enable the army to run a "mass assassination factory." At least 40,000 Palestinians have been killed in Gaza since 7 October 2023, the vast majority of them civilians.

Increasing autonomy in weapons poses new risks for the conduct of warfare and dangers for humanity, and demands urgent and decisive regulatory action by countries.

Australia dragging its feet

Autonomy has repeatedly been identified as a priority area for Australia's defence development and strategic capabilities. At present, projects and partnerships between the Department of Defence, the Australian Defence Force, industry (including large arms manufacturers and other Australia-based companies) and universities highlight the push towards military AI innovation in Australia, without a responsible approach or adequate restraints.

The Trusted Autonomous Systems Defence Cooperative Research Centre has supported collaborative projects on autonomy since 2017. Earlier this year, construction began on the Wellcamp aerospace hub in Toowoomba, where Boeing's new manufacturing facility will be established to deliver the autonomous 'Ghost Bat' aircraft for the Royal Australian Air Force.

Boeing’s MQ-28 Ghost Bat in test flight. Photo: Australia’s Department of Defence.

Australian arms companies continue to develop a range of weapons systems and autonomous capabilities, such as Cyborg Dynamics' Warfighter Unmanned Ground Vehicle, armed with various munitions and developed in collaboration with Australian robotics company BIA5. Boeing subsidiary Insitu Pacific and Australian company Innovaero have developed the one-way loitering (OWL) munition known as the Owl, which can travel 200km, loiter in the air for 30 minutes, and is currently being trialled by an unspecified army special operations unit.

Many universities collaborate with Trusted Autonomous Systems, directly through arms company partnerships or through defence networks for research development. Autonomy and related technologies are a key area for these defence and arms manufacturer research collaborations. It's an issue that barely receives any media or political attention, and yet the stakes couldn't be higher.

On the international level, the Australian government continues to insist that no new international law is needed regarding autonomous weapons. Instead, Australia remains committed to “building a shared understanding of how existing IHL [international humanitarian law] applies” to autonomous weapons within the ‘Group of Governmental Experts’ discussions. This is a particular set of diplomatic talks on autonomous weapons within the United Nations Convention on Certain Conventional Weapons (CCW) forum, which usually addresses conventional weapons.

However, Australia’s approach does not address the unprecedented moral and ethical risks posed by autonomous weapons, or the imperative to establish new international law to address them. Furthermore, any regulation should be inclusive and have diverse participation. Australia highlighted this in its statement at a recent global autonomous weapons conference in Vienna, while insisting the CCW’s Group of Governmental Experts is the appropriate forum for autonomous weapons discussions. Yet 64 states are not party to the CCW, including 16 countries from our region, among them Indonesia, Malaysia, Myanmar, Thailand and the vast majority of Pacific small island states.

There is no indication that the consensus-based group will reach any agreement, due to obstruction by certain countries, notably Russia. Countries need to look outside the CCW if they want to establish meaningful and urgent regulation and to negotiate a new legal instrument that will set clear international norms.

Australia has supported some regulatory initiatives, including the Responsible AI in the Military Domain (REAIM) Call to Action and the Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy. But none of this is enough, and the Albanese government shows no sign of taking the issue seriously.

The dangers of autonomous weapons

Technological developments have accelerated and transformed the military domain, with increasing autonomy in the weapons used in current conflicts. The slippery slope to unregulated autonomous weapons is an ever-present threat.

Autonomous weapons are those that detect and apply force based on sensor data (such as heat signature, facial recognition or acoustic signature) rather than direct human inputs. They raise moral, ethical, legal, humanitarian and security concerns. Used to target people, they pose the most significant moral, ethical and human rights risks, including digital dehumanisation: the reduction of people to data points, causing them harm and undermining their humanity and rights. Most acutely, decisions to kill are made by machines.

Countries have recognised these concerns, and the majority are calling for a new legally binding instrument to address autonomous weapons. Momentum among countries working towards regulation is mounting. At the end of April, 144 countries gathered with more than 1,000 participants from academia, industry and civil society for the “Vienna Conference on Autonomous Weapons Systems – Humanity at the crossroads: challenges for regulation.”

This was the first multilateral conference outside the United Nations (UN) on this topic, hosting high-level panels, expert discussions and the delivery of country statements on the last afternoon. The unprecedented participation highlights the critical mass building on this issue.

The Chair’s Summary reiterated some key messages from the conference. First, that the window to negotiate a legally binding instrument preventatively, by the end of 2026 as per the UN Secretary-General’s call, is closing. Second, this is an “Oppenheimer Moment” where, writes Mary Wareham, deputy director of the Crisis, Conflict and Arms Division at Human Rights Watch, “Military investments in autonomy and other emerging technologies are sending humanity down a dangerous path, and nations need to act together to meet the challenge with foresight and courageous political leadership.”

‘AI-powered Genocide’ in Gaza

There’s a profound disconnect between the use of AI targeting systems in Gaza to facilitate mass killings and the relative silence on these issues in debates or discussions in Vienna and elsewhere.

The representative of the State of Palestine opened their statement in Vienna by bringing “attention to the extremely urgent real-life case”, noting those who “champion efforts to regulate and prohibit these systems are more comfortable speaking in theoretical terms but reluctant to mention the real-life example of Gaza.”

Statement by ICRC President Mirjana Spoljaric to the ‘Vienna Conference on Autonomous Weapon Systems 2024: Humanity at the Crossroads’ in April 2024. Photo: ICRC.

The statement by the State of Palestine criticised the status quo, asserting, “We warned the GGE [diplomatic talks on autonomous weapons] cycle after cycle that AI-powered systems are likely to be used to accelerate international crimes including genocide, and to be tested on populations of the Global South”, adding that governments cannot have it both ways, drawing out discussions while actively pursuing AI in weapons. The statement also called for a moratorium on autonomous weapons until the adoption of a legally binding instrument.

UN and region-based progress

During the UN General Assembly in October 2023, the first resolution on autonomous weapons was adopted. This step increased the number of countries engaged in multilateral activity on the issue, with 164 in favour, 5 against and 8 abstentions. Even some countries that do not support new international law, such as Australia, the UK and the US, voted for the resolution.

The resolution requested that the Secretary-General submit a substantive report to the UN General Assembly session in 2024, based on the views of states on “ways to address the related challenges and concerns they [autonomous weapons] raise from humanitarian, legal, security, technological and ethical perspectives and on the role of humans in the use of force.” The volume of submissions, from individual countries, groupings and civil society organisations, further highlights the interest in addressing this issue urgently.

Several regional conferences on autonomous weapons have been held since early 2023, highlighting the political will for regional approaches, with conferences organised by Costa Rica, Luxembourg, Trinidad and Tobago, the Philippines and Sierra Leone. The Communiqué from Costa Rica was the first to express a regional commitment to a legally binding instrument, as have those from Trinidad and Tobago and Sierra Leone. The Manila Meeting called for Indo-Pacific voices to address the risks of autonomous weapons.

Regional initiatives, particularly those involving small island developing states, Latin America, the Caribbean and the Pacific, have emphasised concerns about proliferation to non-state actors and illicit activity, as well as environmental impacts.

Pacific island nations have begun engaging on the issue of autonomous weapons over the last twelve months, with Fiji, for example, identifying linkages between autonomous weapons and other priority areas such as non-state actor risks and environmental security. Weapons have historically caused extensive environmental damage. Autonomous weapons could exacerbate environmental degradation through testing, through reduced human oversight when attacking targets, and through malfunction, particularly in marine and land environments.

While the energy cost of training and operating autonomous weapons is not fully known, studies indicate that the carbon footprint of AI technologies could substantially contribute to climate change. At the Manila Meeting, the Vice President of the Republic of Palau highlighted the need for more inclusive discussions that take into account the primary concerns of small island developing nations, such as climate change and potential environmental damage.

After the crossroads 

The milestone first UN General Assembly resolution on autonomous weapons was passed in 2023, and the Secretary-General was tasked with writing a report based on the views of states on the concerns raised by autonomous weapons and how to address them. But if this is the crossroads, then action must be taken to negotiate new international law. This is the call of the UN Secretary-General, the International Committee of the Red Cross, thousands of AI experts and global civil society organisations. It would be irresponsible for Australia not to be part of such a process. The path we take now is critical.



Matilda Byrne

Matilda Byrne is undertaking a PhD at RMIT's Social and Global Studies Centre, where she is a sessional lecturer in international relations, security, global governance and disarmament. She is the National Coordinator of the Australia Stop Killer Robots campaign, based at SafeGround, an Australian not-for-profit that seeks to reduce the impacts of legacy and emerging weapons.