Art by Julian Goblet, Getty Images
Back at the base, I head to the Robot Operations Center, where a group of humans monitor sensors scattered across the ocean. The ROC is a windowless room with several rows of tables and computer monitors, mostly featureless except for walls adorned with inspirational quotes from figures like Winston Churchill and Steve Jobs. It’s here that I meet Capt. Michael Brasseur, leader of Task Force 59. He has tanned skin, a shaven head, a ready smile, and the squinting eyes of a sailor. (Brasseur has since retired from the Navy.) As he strides between the tables, he cheerfully explains how the ROC operates. “This is where all the data coming from the unmanned systems is integrated. This is where we use AI and machine learning to get some really interesting insights,” Brasseur says, rubbing his hands together as he talks and grinning broadly.
Monitors are flashing with activity. Task Force 59’s AI is highlighting suspicious vessels in the area. Already today, the AI has flagged several vessels that didn’t match their identification signals, prompting the fleet to investigate further. Brasseur shows me a new interface in development that will let the team perform many tasks on a single screen, from reviewing camera footage from drone ships to guiding them toward areas of activity.
Brasseur and others at the base emphasize that the autonomous systems they are testing are only intended for sensing and detection, not armed intervention. “Task Force 59’s focus right now is on improving visibility,” Brasseur says. “Everything we do here is to support crewed vessels.” But some of the robotic vessels that took part in the exercises have shown how short the distance is between unarmed and armed. It’s just a matter of swapping payloads and tweaking software. One autonomous speedboat, the Seagull, is designed to track mines and submarines while trailing sonar arrays in its wake. Amir Alon, a senior director at Elbit Systems, the Israeli defense company that developed the Seagull, told me it could also be equipped with remote-controlled machine guns and deck-launched torpedoes. “It could engage autonomously, but we don’t recommend that,” he said, smiling. “We don’t want to start World War III.”
No, we don’t. But Alon’s quip touches on an important truth: autonomous systems with lethal capabilities already exist all over the world. In any major conflict short of World War III, each side will soon face the temptation not only to arm these systems but also, in some cases, to remove human oversight and allow machines to fight at machine speed. The only ones who will die in this war of AI vs. AI are humans. So it’s fair to wonder: How do these machines, and the people who build them, think?
This article appears in the September 2023 issue of WIRED. Photo: Sam Cannon
Hints of autonomous technology have been present in the U.S. military for decades, from autopilot software for planes and drones to automated deck guns that protect warships from incoming missiles. But these were limited systems, designed to perform specific functions in specific environments and situations. They may have been autonomous, but they weren’t intelligent. It wasn’t until 2014 that Defense Department leaders began looking at more capable autonomous technology as a solution to a much bigger problem.
Bob Work, then deputy secretary of defense, was concerned that the country’s geopolitical rivals were “approaching parity with the U.S. military.” Work says he wanted to know “how to regain the upper hand”—how to ensure the U.S. could prevail in a potential conflict even if it couldn’t field as many soldiers, planes, and ships as, say, China. So Work asked a group of scientists and engineers where the Pentagon should focus its efforts. “They said autonomy, powered by AI,” Work recalls. He went on to craft a national defense strategy meant to foster innovation from the technology sector, including new capabilities provided by machine learning.