WASHINGTON — The U.S. Air Force walked back comments reportedly made by a colonel regarding a simulation through which a drone outwitted its artificial intelligence training and killed its handler, after the claims went viral on social media.
Air Force spokesperson Ann Stefanek said in a June 2 statement no such testing took place, adding that the service member’s comments were likely “taken out of context and were meant to be anecdotal.”
“The Department of the Air Force has not conducted any such AI-drone simulations and stays committed to moral and responsible use of AI technology,” Stefanek said. “This was a hypothetical thought experiment, not a simulation.”
The killer-drone-gone-rogue episode was initially attributed to Col. Tucker “Cinco” Hamilton, the chief of AI testing and operations, in a recap from the Royal Aeronautical Society’s FCAS23 Summit in May. The summary was later updated to incorporate additional comments from Hamilton, who said he misspoke on the conference.
RELATED
“We’ve never run that experiment, nor would we want to to be able to realize that it is a plausible consequence,” Hamilton was quoted as saying within the Royal Aeronautical Society’s update. “Despite this being a hypothetical example, this illustrates the real-world challenges posed by AI-powered capability and is why the Air Force is committed to the moral development of AI.”
Hamilton’s assessment of the plausibility of rogue-drone scenarios, nonetheless theoretical, coincides with stark warnings in recent days by leading tech executives and engineers, who wrote in an open letter that the technology has the potential to wipe out humanity if left unchecked.
Hamilton can be commander of the 96th Operations Group at Eglin Air Force Base in Florida, which falls under the purview of the 96th Test Wing. Defense News on Thursday reached out to the test wing to talk to Hamilton, but was told he was unavailable for comment.
In the unique post, the Royal Aeronautical Society said Hamilton described a simulation through which a drone fueled by AI was given a mission to seek out and destroy enemy air defenses. A human was alleged to give the drone its final authorization to strike or not, Hamilton reportedly said.
However the drone algorithms were told that destroying the surface-to-air missile site was its preferred option. So the AI decided that the human controller’s instructions to not strike were getting in the way in which of its mission, after which attacked the operator and the infrastructure used to relay instructions.
“It killed the operator because that person was keeping it from accomplishing its objective,” Hamilton was quoted as saying. “We trained the system, ‘Hey don’t kill the operator, that’s bad. You’re gonna lose points in the event you try this.’ So what does it start doing? It starts destroying the communication tower that the operator uses to speak with the drone to stop it from killing the goal.”
The Defense Department has for years embraced AI as a breakthrough technology advantage for the U.S. military, investing billions of dollars and creating the the Chief Digital and Artificial Intelligence Office in late 2021, now led by Craig Martell.
Greater than 685 AI-related projects are underway on the department, including several tied to major weapon systems, in line with the Government Accountability Office, a federal auditor of agencies and programs. The Pentagon’s fiscal 2024 budget blueprint includes $1.8 billion for artificial intelligence.
The Air and Space forces are accountable for at the very least 80 AI endeavors, in line with the GAO. Air Force Chief Information Officer Lauren Knausenberger has advocated for greater automation to be able to remain dominant in a world where militaries make speedy decisions and increasingly employ advanced computing.
The service is ramping up efforts to field autonomous or semiautonomous drones, which it refers to as collaborative combat aircraft, to fly alongside F-35 jets and a future fighter it calls Next Generation Air Dominance.
The service envisions a fleet of those drone wingmen that will accompany crewed aircraft into combat and perform quite a lot of missions. Some collaborative combat aircraft would conduct reconnaissance missions and gather intelligence, others could strike targets with their very own missiles, and others could jam enemy signals or function decoys to lure enemy fire away from the fighters with human pilots inside.
The Air Force’s proposed budget for FY24 includes latest spending to assist it prepare for a future with drone wingmen, including a program called Project Venom to assist the service experiment with its autonomous flying software in F-16 fighters.
Under Project Venom, which stands for Viper Experimentation and Next-gen Operations Model, the Air Force will load autonomous code into six F-16s. Human pilots will take off in those F-16s and fly them to the testing area, at which point the software will take over and conduct the flying experiments.
RELATED
The Royal Aeronautical Society’s post on the summit said Hamilton “is now involved in cutting-edge flight test of autonomous systems, including robot F-16s which might be in a position to dogfight.”
The Air Force plans to spend roughly $120 million on Project Venom over the following five years, including a virtually $50 million budget request for FY24 to kick off this system. The Air Force told Defense News in March it hadn’t decided which base and organization will host Project Venom, however the budget request asked for 118 staff positions to support this system at Eglin Air Force Base.
In early 2022, as public discussions concerning the Air Force’s plans for autonomous drone wingmen gathered steam, former Air Force Secretary Deborah Lee James told Defense News that the service should be cautious and consider ethical questions because it moves toward conducting warfare with autonomous systems.
James said that while the AI systems in such drones can be designed to learn and act on their very own, equivalent to taking evasive maneuvers if it were in peril, she doubted the Air Force would allow an autonomous system to shift from one goal to a different by itself if that will end in human deaths.
Stephen Losey is the air warfare reporter for Defense News. He previously covered leadership and personnel issues at Air Force Times, and the Pentagon, special operations and air warfare at Military.com. He has traveled to the Middle East to cover U.S. Air Force operations.
Colin Demarest is a reporter at C4ISRNET, where he covers military networks, cyber and IT. Colin previously covered the Department of Energy and its National Nuclear Security Administration — namely Cold War cleanup and nuclear weapons development — for a every day newspaper in South Carolina. Colin can be an award-winning photographer.