An official report prepared for President Biden and Congress has urged the US government to reject calls for a global ban on AI-powered autonomous weapons, arguing that any commitments from Russia or China "likely would be empty ones."
The recommendation was made by the US National Security Commission on Artificial Intelligence, a panel headed by ex-Google CEO Eric Schmidt and former deputy secretary of defense Robert Work. Work had previously said that the US has a "moral imperative" to explore AI weapons.
The new report asserts that the US could use autonomous weapons in a safe and lawful manner:
Provided their use is authorized by a human commander or operator, properly designed and tested AI-enabled and autonomous weapon systems can be used in ways that are consistent with international humanitarian law.
However, critics were quick to dispute the claims.
"The most senior AI scientists on the planet have warned them about the consequences, and yet they continue," Professor Noel Sharkey, spokesman for the Campaign to Stop Killer Robots, told the BBC. "This will lead to grave violations of international law."
The report was also criticized by the International Committee for Robot Arms Control. The NGO tweeted that the commission featured “all US tech companies looking for a big chunk of the US defense budget.”
Autonomous machines with the power and discretion to select targets and take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law. https://t.co/DhG1DqY8Rx
— António Guterres (@antonioguterres) March 25, 2019
Human Rights Watch and UN Secretary-General António Guterres have also called for prohibitions on fully autonomous weapons. However, only around 30 countries currently support the ban.
With China and the US among the many absentees, campaigners fear that lethal autonomous weapons could soon become the norm.