
AI brings the robot wingman to aerial combat

A recently revised Pentagon policy on the use of AI in weapons systems allows for the autonomous use of lethal force.

The experimental Kratos XQ-58 unmanned combat aerial vehicle at Eglin Air Force Base near Fort Walton Beach, Florida, on July 13, 2023. An Air Force program shows how the Pentagon is starting to embrace the potential of a rapidly emerging technology, with far-reaching implications for war-fighting tactics, military culture and the defense industry. (Edmund D. Fountain/The New York Times)

Written by Eric Lipton

It is powered into flight by a rocket engine. It can fly a distance equal to the width of China. It has a stealthy design and is capable of carrying missiles that can hit enemy targets far beyond its visual range.

But what really distinguishes the Air Force’s pilotless XQ-58A Valkyrie experimental aircraft is that it is run by artificial intelligence, putting it at the forefront of efforts by the U.S. military to harness the capacities of an emerging technology whose vast potential benefits are tempered by deep concerns about how much autonomy to grant to a lethal weapon.

Essentially a next-generation drone, the Valkyrie is a prototype for what the Air Force hopes can become a potent supplement to its fleet of traditional fighter jets, giving human pilots a swarm of highly capable robot wingmen to deploy in battle. Its mission is to marry artificial intelligence and its sensors to identify and evaluate enemy threats and then, after getting human sign-off, to move in for the kill.

On a recent day at Eglin Air Force Base on Florida’s Gulf coast, Maj. Ross Elder, 34, a test pilot from West Virginia, was preparing for an exercise in which he would fly his F-15 fighter alongside the Valkyrie.

“It’s a very strange feeling,” Elder said, as other members of the Air Force team prepared to test the engine on the Valkyrie. “I’m flying off the wing of something that’s making its own decisions. And it’s not a human brain.”

The Valkyrie program provides a glimpse into how the U.S. weapons business, military culture, combat tactics and competition with rival nations are being reshaped in possibly far-reaching ways by rapid advances in technology.

The emergence of AI is helping to spawn a new generation of Pentagon contractors who are seeking to undercut, or at least disrupt, the long-standing primacy of the handful of giant firms who supply the armed forces with planes, missiles, tanks and ships.

The possibility of building fleets of smart but relatively inexpensive weapons that could be deployed in large numbers is allowing Pentagon officials to think in new ways about taking on enemy forces.

It also is forcing them to confront questions about what role humans should play in conflicts waged with software that is written to kill.

And gaining and maintaining an edge in AI is one element of an increasingly open race with China for technological superiority in national security.

After decades of building fewer and fewer increasingly expensive combat aircraft — the F-35 fighter jet costs $80 million per unit — the Air Force now has the smallest and oldest fleet in its history.

That is where the new generation of AI drones, known as collaborative combat aircraft, will come in. The Air Force is planning to build 1,000 to 2,000 of them for as little as $3 million apiece, or a fraction of the cost of an advanced fighter, which is why some at the Air Force call the program “affordable mass.”

There will be a range of specialized types of these robot aircraft. Some will focus on surveillance or resupply missions, others will fly in attack swarms and still others will serve as a “loyal wingman” to a human pilot.

The drones, for example, could fly in front of piloted combat aircraft, doing early, high-risk surveillance. They could also play a major role in disabling enemy air defenses, taking risks to knock out land-based missile targets that would be considered too dangerous for a human-piloted plane.

The AI — a more specialized version of the type of programming now best known for powering chat bots — would assemble and evaluate information from its sensors as it approaches enemy forces to identify other threats and high-value targets, asking the human pilot for authorization before launching any attack with its bombs or missiles.
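The authorization flow described above can be sketched in code. This is a purely hypothetical illustration, not the Air Force's actual software: it shows the general shape of a human-in-the-loop gate, where the autonomy stack may rank threats on its own but cannot take the lethal step without an explicit human decision. All names and scores here are invented for the example.

```python
# Hypothetical sketch of a human-in-the-loop authorization gate.
# Nothing here reflects the real system; names and scores are illustrative.
from dataclasses import dataclass

@dataclass
class Contact:
    track_id: str
    threat_score: float  # 0.0-1.0, from a (hypothetical) sensor-fusion model

def rank_targets(contacts: list[Contact]) -> list[Contact]:
    """Autonomous step: order detected contacts by estimated threat."""
    return sorted(contacts, key=lambda c: c.threat_score, reverse=True)

def request_engagement(contact: Contact, pilot_approves) -> bool:
    """Lethal step: proceeds only if the human pilot explicitly approves."""
    return bool(pilot_approves(contact))

contacts = [Contact("T1", 0.91), Contact("T2", 0.40)]
top = rank_targets(contacts)[0]
# The pilot withholds consent, so no engagement occurs.
cleared = request_engagement(top, pilot_approves=lambda c: False)
print(top.track_id, cleared)  # T1 False
```

The essential design point is that the ranking function and the engagement function are separated, with the latter structurally unable to act without a human input.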

The cheapest ones will be considered expendable, meaning they will likely fly only one mission. The more sophisticated of these robot aircraft might cost as much as $25 million, according to an estimate by the House of Representatives, still far less than a piloted fighter jet.

“Is it a perfect answer? It is never a perfect answer when you look into the future,” said Maj. Gen. R. Scott Jobe, who until this summer was in charge of setting requirements for the air combat program, as the Air Force works to incorporate AI into its fighter jets and drones.

“But you can present potential adversaries with dilemmas — and one of those dilemmas is mass,” Jobe said in an interview at the Pentagon, referring to the deployment of large numbers of drones against enemy forces. “You can bring mass to the battle space with potentially fewer people.”

The effort represents the beginning of a seismic shift in the way the Air Force buys some of its most important tools. After decades in which the Pentagon has focused on buying hardware built by traditional contractors like Lockheed Martin and Boeing, the emphasis is shifting to software that can enhance the capabilities of weapons systems, creating an opening for newer technology firms to grab pieces of the Pentagon’s vast procurement budget.

“Machines are actually drawing on the data and then creating their own outcomes,” said Brig. Gen. Dale White, the Pentagon official who has been in charge of the new acquisition program.

The Air Force realizes it must also confront deep concerns about military use of AI, whether fear that the technology might turn against its human creators (like Skynet in the “Terminator” film series) or more immediate misgivings about allowing algorithms to guide the use of lethal force.

“You’re stepping over a moral line by outsourcing killing to machines — by allowing computer sensors rather than humans to take human life,” said Mary Wareham, the advocacy director of the arms division of Human Rights Watch, which is pushing for international limits on so-called lethally autonomous weapons.

A recently revised Pentagon policy on the use of AI in weapons systems allows for the autonomous use of lethal force — but any particular plan to build or deploy such a weapon must first be reviewed and approved by a special military panel.

“It is an awesome responsibility,” said Col. Tucker Hamilton, the Air Force chief of AI Test and Operations, who also helps oversee the flight-test crews at Eglin Air Force Base, noting that “dystopian storytelling and pop culture has created a kind of frenzy” around AI.

“We just need to get there methodically, deliberately, ethically — in baby steps,” he said.

Humans will continue to play a central role in the new vision for the Air Force, top Pentagon officials said, but they will increasingly be teamed with software engineers and machine learning experts, who will be constantly refining algorithms governing the operation of the robot wingmen that will fly alongside them.

Almost every aspect of Air Force operations will have to be revised to embrace this shift. It’s a task that through this summer had largely been entrusted to White and Jobe.

The Pentagon has already spent several years building prototypes like the Valkyrie and the software that runs it. But the experiment is now graduating to a so-called program of record, meaning if Congress approves, substantial taxpayer dollars will be allocated to buying the vehicles: a total of $5.8 billion over the next five years, according to the Air Force plan.

Back in 1947, Chuck Yeager, then a young test pilot from Myra, West Virginia, became the first human to fly faster than the speed of sound.

Seventy-six years later, another test pilot from West Virginia has become one of the first Air Force pilots to fly alongside an autonomous, AI-empowered combat drone.

Tall and lanky, with a slight Appalachian accent, Elder last month flew his F-15 Strike Eagle within 1,000 feet of the experimental XQ-58A Valkyrie — watching closely, like a parent running alongside a child learning how to ride a bike, as the drone flew on its own, reaching certain assigned speeds and altitudes.

The basic functional tests of the drone were just the lead-up to the real show, where the Valkyrie gets beyond using advanced autopilot tools and begins testing the war-fighting capabilities of its AI. In a test slated for later this year, the combat drone will be asked to chase and then kill a simulated enemy target while out over the Gulf of Mexico, coming up with its own strategy for the mission.

An unusual team of Air Force officers and civilians has been assembled at Eglin, which is one of the largest Air Force bases in the world. They include Capt. Rachel Price from Glendale, Arizona, who is wrapping up a doctorate at the Massachusetts Institute of Technology on computer deep learning, as well as Maj. Trent McMullen from Marietta, Georgia, who has a master’s degree in machine learning from Stanford University.

One of the things Elder watches for is any discrepancy between simulations run by computer before the flight and the drone’s actions when it is actually in the air — a “sim to real” problem, they call it — or, even more worrisome, any sign of “emergent behavior,” where the robot drone acts in a potentially harmful way.
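A “sim to real” check of the kind described above can be illustrated with a toy sketch. This is not the test program’s actual tooling; the function name, telemetry values and tolerance are all invented for the example. The idea is simply to compare predicted flight telemetry against what the drone actually did and flag any sample that diverges beyond a tolerance.

```python
# Hypothetical "sim to real" comparison: flag samples where recorded flight
# telemetry diverges from the pre-flight simulation by more than a tolerance.
# All names, values and thresholds here are illustrative assumptions.
def sim_to_real_deviations(sim_altitudes, real_altitudes, tolerance_ft=50.0):
    """Return indices of samples where real flight diverged from the simulation."""
    return [
        i for i, (sim, real) in enumerate(zip(sim_altitudes, real_altitudes))
        if abs(sim - real) > tolerance_ft
    ]

sim = [10_000, 10_500, 11_000, 11_500]   # simulated altitude profile (ft)
real = [10_010, 10_490, 11_200, 11_480]  # recorded altitude profile (ft)
print(sim_to_real_deviations(sim, real))  # [2] — sample 2 deviates by 200 ft
```

In practice a test team would run this kind of comparison across many channels (speed, altitude, heading) and investigate any flagged sample before trusting the autonomy further.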

The hardest part of this task, Elder and other pilots said, is the vital trust building that is such a central element of the bond between a pilot and wingman — their lives depend on each other, and on how each of them reacts. It is a concern back at the Pentagon too.

Officials estimate that it could take five to 10 years to develop a functioning AI-based system for air combat. Air Force commanders are pushing to accelerate the effort — but recognize that speed cannot be the only objective.

“We’re not going to be there right away, but we’re going to get there,” Jobe said. “It’s advanced and getting better every day as you continue to train these algorithms.”
