By failing to acknowledge the humanitarian risks of weaponising AI, the Australian government signals its unwillingness to place any limits on devastating autonomous weapons.
Matilda Byrne is a sessional academic at RMIT University’s School of Global Urban and Social Studies, where she specialises in international relations, global security and multilateralism. She is the National Coordinator of the Australia Stop Killer Robots campaign, based at SafeGround, an Australian non-profit that seeks to reduce the impacts of legacy and emerging weapons. She is also the Co-President of the Australian Peace and Security Forum. Find more at @matildabyrne.
As Australia resists new laws to restrain AI weapons development, Australian companies are using real wars to test their killing power. Australia lacks adequate ethical restraints on its booming development of autonomous weapons, which kill with minimal human input.