The US Department of Defense is moving closer to deploying AI-powered drones that can autonomously decide to kill human targets, according to The New York Times.

The US, China, and Israel are developing autonomous killer robots that use AI to select targets and fire weapons. The Pentagon aims to deploy AI-controlled drones in the near future, according to a notice released earlier in 2023.

Multiple governments are lobbying the United Nations (UN) to pass a binding resolution that would prohibit the use of such drones and other AI weaponry. But the US, along with Russia, Australia, Israel, and other countries, is pushing back, urging that the resolution be non-binding instead. China is also pushing to build major loopholes into any international rules on the matter.

Members of the US delegation to the UN have argued that existing human rights law should simply be extended to cover victims of drone attacks, and that new resolutions are unnecessary.

“This is really one of the most significant inflection points for humanity,” said Austria’s chief negotiator on the issue. “What’s the role of human beings in the use of force — it’s an absolutely fundamental security issue, a legal issue and an ethical issue.”


In August, US Deputy Secretary of Defense Kathleen Hicks said AI-controlled drone swarms and similar technology are crucial to America’s strategy for a potential conflict with China. She cited the People’s Liberation Army’s (PLA) advantage in sheer numbers of personnel and weapons, and argued that autonomous drones would offset those strengths.

“We’ll counter the PLA’s mass with mass of our own, but ours will be harder to plan for, harder to hit, harder to beat,” she said, according to Reuters. She added that America will deploy “multiple thousands” of autonomous killing machines over the next two years.

US Air Force Secretary Frank Kendall told The New York Times that AI drones must be able to make autonomous decisions about eliminating targets in order to win a war, but assured the reporter that humans will still control when and why they are deployed.

“Individual decisions versus not doing individual decisions is the difference between winning and losing — and you’re not going to lose,” Kendall said. “I don’t think people we would be up against would do that, and it would give them a huge advantage if we put that limitation on ourselves.”

AI-controlled drones have already been deployed by Ukraine in its war against Russia, according to an October report from New Scientist. It is not known whether any have killed human targets without human oversight.
