The government has cancelled a multimillion-dollar contract with the university department it commissioned to devise ethical rules for "killer robots" deployed alongside Australian soldiers and fighter pilots in war.
The University of NSW in Canberra was working on how artificially intelligent machines capable of deciding when to use lethal force should be programmed so they wouldn't commit atrocities.
The five-year contract has been cut abruptly, with three years to run.
A dispute arose between the military and the civilian academic researchers, who were unhappy at what they saw as attempts to control the research and its results. It is a tenet of universities that researchers should be free to publish whatever results their research leads them to.
A formal complaint about attempts to limit academic freedom was raised and then the contract was cut, according to one person familiar with the matter.
Two years ago, the Defence Department announced the $4 million contract with the UNSW for the research to be done at its site at the Australian Defence Force Academy in Canberra.
The Defence Department had set up a special unit to develop robots for use in warfare. Military planners feared that China was developing robotic technology quickly and that Australia urgently needed to do the same.
The result was the Trusted Autonomous Systems Defence Cooperative Research Centre, which would work with technology companies such as Boeing and with universities, both on the robotic machines themselves and on the legal and ethical codes to be embedded in the battlefield robots.
The Centre agreed a $9 million contract to work on the rules. The UNSW would develop the ethics side at its Canberra campus and the University of Queensland would do the legal work.
The $4 million ethics part of the contract was ended abruptly.
The University of NSW said: "UNSW and the Trusted Autonomous Systems Defence Cooperative Research Centre have mutually agreed to terminate the contract effective Friday 12 February."
The Defence Department said: "Defence is aware that UNSW and the Trusted Autonomous Systems Defence Cooperative Research Centre (TAS DCRC) mutually agreed to the termination of their contract."
After the contract was cut, the Department of Defence said work on the ethics of military robots was "an ongoing priority and Defence is committed to developing, communicating, applying and evolving ethical AI (Artificial Intelligence) frameworks".
The ethics researchers in Canberra would not comment.
While the full background to the rift is not known, it is understood there had been personality clashes.
However, one source of friction is said to have been the academics' research into what officers felt about fighting alongside robot warriors. A survey of 8000 cadets indicated widespread unease.
The results published in respected journals indicated that substantial numbers of Australia's trainee officers at military colleges in Canberra would be "unwilling" to operate near robot warriors.
The officers of the future were asked "about their willingness to deploy in a team 'involving robots to achieve a combat mission'."
They were given different scenarios. When the machine with lethal power had the most ability to think and act on its own, most trainee officers said they would be "unwilling" or "somewhat unwilling".
"While a minority would be currently willing, the majority of this cohort harbors a discomfort with deploying alongside autonomous systems with the independent capability to apply force," the researchers said in their published paper.
The main reason, given by 80 per cent of the trainee officers, was the safety of robot warriors in battle.
Whatever the cause, a bitter rift emerged between the civilian researchers, working within a university where academic freedom is paramount, and officials in the offshoot of the Defence Department.
The Defence Department is currently engaged in a major program to develop and deploy Robotic and Autonomous Systems - RAS, or "robot warriors", in plain English.
The department outlined its reasoning in a report, Concept for Robotic and Autonomous Systems.
Swarms of killer drones are one concept being discussed. Pilotless, "intelligent" fighter planes are already being developed by the Royal Australian Air Force and Boeing.
Proponents of robot warriors say they do not commit the atrocities humans are capable of in anger or revenge, but they must be programmed, and that raises complex questions.
A human soldier who sees a target surrounded by children can make a decision about firing. Can a machine?
Can robot warriors be programmed to recognise combatants who are surrendering?
Can they distinguish reliably between a small aggressive adult and a large innocent child? And how would they act in each case?
The Defence Department says incorporating ethics into robots remains important, but it has not said how the necessary research will be carried out now that the existing contract has been cut.