
Opinion: Lessons from US’s AI-powered fighter plane for Indian armed forces

The Indian Air Force has long been battling falling squadron numbers while trying to maintain enough operational airpower to safeguard two hostile borders. AI-powered drones with a degree of autonomy that can act as wingmen will greatly enhance the IAF’s deterrence capabilities

September 4, 2023 06:13 PM IST

Later this year, the United States Air Force will test the combat capabilities of a new aircraft, including having it chase and destroy a simulated target. The test is unlike any the US Air Force has undertaken before. For one, the aircraft, dubbed “Valkyrie”, is an experimental machine with no human pilot. Instead, it will be “piloted” by artificial intelligence that autonomously creates its own flight path and strategy to destroy the simulated target, without any human intervention. The Valkyrie is currently undergoing tests to determine its flight capabilities. If it passes the combat test later this year, along with other similar tests, it is likely to be inducted into the US Air Force to fly alongside human-piloted fighter jets as their “wingman”.

This moment has been long in coming. Since the 1990s, a Revolution in Military Affairs (RMA) has been heralded, in which traditional battlefields would be fundamentally transformed by rapid advances in technology. In the last decade, AI has come into focus as a potential catalyst for such an RMA through the development of Lethal Autonomous Weapon Systems (LAWS): weapon systems that can, in theory, select and fire upon targets autonomously, without any human decision-making. While this might conjure images of killer robots and the Terminator, in practice such autonomous systems are more likely to be used to enhance the capabilities of human warfighters and decision-makers by rapidly synthesising battlefield data, helping select potential targets and, in the air domain specifically, acting as wingmen to human pilots by providing combat and intelligence support during live flying missions. This is exactly what Valkyrie is expected to do.


There are four advantages to having AI-powered wingmen. First, these machines and systems are relatively inexpensive, with even the most advanced costing a fraction of a traditional fighter aircraft. Second, because they are inexpensive, they can be deployed in greater numbers, allowing militaries to maintain air superiority or overwhelm enemy air defences through sheer weight of numbers. They are also more expendable, and can therefore be used for missions traditionally deemed too dangerous for human pilots. Third, with no human in the cockpit, they can execute manoeuvres and flight patterns that would be physically impossible for a human pilot to withstand, opening new possibilities for tactics and strategy. Fourth, as mentioned above, they can collect and process battlefield data in real time, providing actionable inputs in a fraction of the time a human crew would need.

There are, of course, several concerns with the use of AI in warfare. Chief among them is the risk of AI making mistakes by misidentifying targets, leading to unacceptable collateral damage. There is also the moral question of whether AI systems should be allowed to autonomously select and fire upon targets, or whether humans should be “kept in the loop”, and if so, to what extent. In either scenario, accountability for any mishap or wrongdoing needs to be clearly outlined. Fixing such accountability in military systems will also have a knock-on effect on civilian AI safety standards, where many of the same questions need to be answered.

Though the degree of autonomy such systems are granted will depend on the doctrines of the relevant military, it is unlikely that any military would deploy a fully autonomous weapon system whose behaviour cannot be accurately predicted. The fog of war is already thick enough; commanders are unlikely to want a new, unknowable factor on the battlefield that further complicates their own decision-making.


The US, for example, recently updated its policy directive on autonomy in weapon systems, permitting such systems to use lethal force, but with the caveat that they must be explicitly designed to allow commanders to “exercise appropriate levels of human judgement over the use of force”.

These developments hold immediate lessons for India. The use of autonomous systems in warfare is fast becoming an operational reality. With the US publicly encouraging development in this space, it can safely be assumed that China is not far behind. The Indian Air Force has long been battling falling squadron numbers while trying to maintain enough operational airpower to safeguard two hostile borders. While the Indian military has been rapidly acquiring drones over the last couple of years, these tend to be smaller, manually operated and without much “intelligence”. AI-powered drones with a degree of autonomy that can act as wingmen, by contrast, would greatly enhance the IAF’s deterrence capabilities and make it easier to rapidly deploy air cover along forward positions in difficult terrain such as the Himalayan border. India therefore needs to urgently undertake an in-depth assessment of the benefits and limitations of AI for its military aims, and begin putting in place doctrines and policies that would allow the Indian armed forces to adequately, and safely, take advantage of this RMA.

The writer is Managing Partner, Evam Law & Policy
