    Defence Applications and Dilemma of Artificial Intelligence

By Shailender Arya and Udaya Arun | May 12, 2026 | AI and Robotics | 6 Mins Read

[Image: Military AI]

The AI Impact Summit in New Delhi focused our attention on AI, before the conflict in West Asia took it away. Even in this conflict, behind the missile and drone strikes, AI remains a force multiplier, distilling intelligence and enabling mission planning at unprecedented rates for the U.S. military. Earlier this year, Palantir’s Project Maven, an AI-powered surveillance and targeting system, was used by the U.S. military in operations targeting Venezuela.

The defence applications of AI are immense. Almost everything will change – surveillance and reconnaissance, targeting, command and control, electronic warfare, and logistics. By monitoring fuel levels, equipment wear and tear, and supply chain vulnerabilities in real time, AI enables a near-perfect just-in-time supply model. This shifts logistics from reactive to predictive, ensuring that operational interruptions due to shortages become a thing of the past. Some change is simply efficient, such as AI-enabled predictive maintenance of ships and aircraft, but certain key combat functions are becoming irreversibly AI-dependent. At the top of this irreversible list are ballistic missile defence, air defence, and counter-drone systems. With increased battlefield complexity, only AI can assign optimal sensors, track multiple incoming threats in real time – swarm, kamikaze and surveillance drones, ballistic and cruise missiles, and aircraft – assign shooters and interceptors, and not be overwhelmed. Only AI will not panic and waste a USD 5 million missile on a USD 25,000 drone.
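The cost-matching logic described above can be sketched as a simple greedy assignment. This is a hypothetical illustration, not any fielded system: the threat types, interceptor names, and costs below are invented, and real weapon-target assignment uses far richer models of kill probability and engagement geometry.

```python
# Hypothetical sketch: cost-aware assignment of interceptors to incoming
# threats, illustrating the principle of not spending a USD 5M missile on
# a USD 25K drone. All names and values are invented for illustration.

def assign_interceptors(threats, interceptors):
    """Greedily pair each threat (highest value first) with the cheapest
    interceptor still in stock that is capable against its type."""
    plan = {}
    stock = {name: spec.copy() for name, spec in interceptors.items()}
    for threat in sorted(threats, key=lambda t: -t["value"]):
        options = [
            (spec["unit_cost"], name)
            for name, spec in stock.items()
            if threat["type"] in spec["capable_against"] and spec["count"] > 0
        ]
        if options:
            cost, name = min(options)   # cheapest capable interceptor
            stock[name]["count"] -= 1
            plan[threat["id"]] = name
    return plan

threats = [
    {"id": "T1", "type": "drone", "value": 25_000},
    {"id": "T2", "type": "cruise_missile", "value": 2_000_000},
]
interceptors = {
    "gun_system": {"unit_cost": 5_000, "count": 10,
                   "capable_against": {"drone"}},
    "sam": {"unit_cost": 5_000_000, "count": 4,
            "capable_against": {"drone", "cruise_missile", "ballistic_missile"}},
}
print(assign_interceptors(threats, interceptors))
# the cheap gun system handles the drone; only the cruise missile draws a SAM
```

The greedy ordering (highest-value threat first, cheapest capable interceptor per threat) is the simplest way to encode the cost-discipline argument; an operational system would solve this as an optimization problem instead.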

However fascinating, this is not the ultimate use of AI in defence. Militaries do not merely adopt technologies – they reorganize around them, with training, promotion, procurement, and doctrine realigning around the technology. After the success of German Panzer divisions and Stuka bombers in World War II, militaries around the world gave up horses and trench warfare, reorienting to mechanized warfare for the next many decades.

The Gulf War (1990–1991) led to the adoption of missile warfare, particularly precision-guided munitions (PGMs), theatre ballistic missile defence, and the deployment of advanced surveillance technologies. Drones were next to be adopted. They were the “Poor Man’s Air Force”, playing a key role in the Nagorno-Karabakh conflict. The Russo-Ukrainian War cemented their role as cost-effective, scalable, and autonomous systems acting as force multipliers, transforming everything from tactical surveillance to high-value strikes.

AI is next. Militaries will reorient around AI. The most impactful military role of AI is in decision-making. It shall commence with AI-assisted decision support systems presenting a quick, comprehensive picture to commanders on the battlefield. In due course, AI will take over, presenting the operational situation and options in a manner that makes the response of commanders largely predictable. It shall transition from AI-assisted to AI-enabled, and thence to AI-dependent. There will not be a Rubicon to cross, but by delegating battlefield military decision-making to AI, we are not upgrading; we are fundamentally altering the nature of human responsibility. The disagreement between Anthropic and the Trump administration in the U.S. centred on retaining legal safeguards on the use of Anthropic models in fully autonomous weapons. And that is the AI dilemma in defence.

Diving deeper, increased AI complexity is leading us to a ‘Black Box’ problem. We often do not know what lies inside the system, and we can no longer repair what we cannot understand. Emerging AI technologies offer limited explainability. Coupled with algorithmic unreliability and unpredictable battlefields, these systems can be difficult to manage.

The lack of predictability – or of predictability as comprehended by the human brain – shall grow in AI systems. The line between human-assisted AI and independent AI shall blur. In the global debate on military AI, often framed around Lethal Autonomous Weapon Systems (LAWS) and discussed under the Convention on Certain Conventional Weapons (CCW) at the United Nations, there is deep disagreement over the ‘Meaningful Human Control’ threshold.

The next question is technical. AI in any system can run in the cloud, on the device itself (called Edge AI), or in a combination of the two. In the military, Edge AI will always be a default option on systems such as drones, Unmanned Ground Vehicles (UGVs), and Unmanned Underwater Systems (UUS) to enhance redundancy. If the networks are compromised, or the cloud cannot be accessed due to network denial, would Edge AI on a drone make soldier-like decisions? Or will we see Killer Robots?
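The cloud-versus-edge split described above can be sketched as a simple fallback pattern. Everything here is a stand-in: `cloud_inference` and `edge_inference` are invented placeholder functions, not real APIs, and the simulated network denial only illustrates the redundancy argument.

```python
# Hypothetical sketch of a cloud-first, edge-fallback inference path.
# Both model functions are illustrative stand-ins, not real systems.

class NetworkDenied(Exception):
    """Raised when the uplink to the cloud model is unavailable."""
    pass

def cloud_inference(observation):
    # Stand-in for a large cloud-hosted model; here the uplink is always
    # denied, simulating jamming or network denial.
    raise NetworkDenied("uplink jammed")

def edge_inference(observation):
    # Stand-in for a smaller on-device (Edge AI) model: under uncertainty
    # it returns a conservative default rather than acting autonomously.
    return {"action": "hold_and_report", "source": "edge"}

def decide(observation):
    """Prefer the cloud model; fall back to the edge model when denied."""
    try:
        return cloud_inference(observation)
    except NetworkDenied:
        return edge_inference(observation)

print(decide({"sensor": "eo_camera", "contact": "unidentified"}))
# the edge model answers when the cloud is unreachable
```

The design question the article raises is precisely what `edge_inference` should be allowed to return when the network is gone: a conservative "hold and report", or a lethal decision.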

We are not staring at the rise of the machines. Military AI is manageable. The advent of nuclear weapons raised similar scenarios of uncontrolled escalation. It led to the concept of Mutually Assured Destruction (MAD), a Cold War-era military doctrine. MAD transformed the nature of warfare, making direct, full-scale conflict between major powers irrational and effectively preventing it. Useful nuclear treaties such as the Anti-Ballistic Missile Treaty (1972), the Intermediate-Range Nuclear Forces Treaty (1987), and the Strategic Arms Reduction Treaty (1991) enabled global nuclear arms control.

Similarly, AI treaties may ban machines that reduce human beings to data points (stereotypes or patterns) and prohibit unpredictable military AI systems. Temporal limits, which restrict how long an AI system can operate autonomously, and spatial limits, which restrict its geographical area of operation, could be set. A low-hanging fruit is banning AI from nuclear targeting and launch decisions.
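The temporal and spatial limits proposed above can be sketched as an autonomy envelope that a system must check before acting. This is a hypothetical illustration: the field names, the coordinates, and the thresholds are invented, and the distance calculation is a short-range approximation.

```python
# Hypothetical sketch of a temporal + spatial autonomy envelope.
# All parameters below are invented for illustration.
import math
import time

class AutonomyEnvelope:
    def __init__(self, center, radius_km, max_seconds):
        self.center = center            # (lat, lon) of the permitted area
        self.radius_km = radius_km      # spatial limit
        self.start = time.monotonic()
        self.max_seconds = max_seconds  # temporal limit on autonomy

    def _distance_km(self, pos):
        # Equirectangular approximation, adequate for short ranges
        lat1, lon1 = map(math.radians, self.center)
        lat2, lon2 = map(math.radians, pos)
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return 6371 * math.hypot(x, y)  # mean Earth radius in km

    def may_act(self, pos):
        """Autonomous action is permitted only inside both limits."""
        within_time = time.monotonic() - self.start < self.max_seconds
        within_area = self._distance_km(pos) <= self.radius_km
        return within_time and within_area

env = AutonomyEnvelope(center=(28.61, 77.21), radius_km=5, max_seconds=600)
print(env.may_act((28.62, 77.22)))   # roughly 1.5 km away: inside the envelope
print(env.may_act((28.90, 77.21)))   # roughly 32 km north: outside the envelope
```

The point of the sketch is that such limits are cheap to encode and verify, which is what makes them plausible treaty material compared with harder-to-audit behavioural constraints.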

The safe growth of military AI needs close supervision of AI algorithms, increasing predictability, and minimizing AI hallucinations by grounding models in reliable data using Retrieval-Augmented Generation (RAG) and crafting specific prompts with military-specific context. International Humanitarian Law (IHL), including legal principles like distinction, proportionality, and precaution, could be built into system design, training data, and algorithmic decision-making. Rigorous testing frameworks could verify IHL compatibility. Countries could define rules of engagement that adapt to AI’s speed and set limits on algorithmic decision-making. Essentially, the military must build a legal framework that keeps pace with its technological reach. The ethical infrastructure is equally important.
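The RAG pattern mentioned above can be reduced to two steps: retrieve trusted passages, then constrain the model's prompt to them. This is a toy sketch, not any real military system: the corpus, the keyword-overlap scoring, and the prompt template are all illustrative stand-ins (production RAG uses embedding-based retrieval).

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG): ground a model's
# prompt in passages retrieved from a trusted corpus. All data below is
# invented for illustration.

def retrieve(query, corpus, k=2):
    """Rank documents by simple keyword overlap with the query."""
    words = set(query.lower().split())
    scored = sorted(corpus, key=lambda d: -len(words & set(d.lower().split())))
    return scored[:k]

def build_grounded_prompt(query, corpus):
    """Build a prompt that restricts the model to the retrieved context."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return (
        "Answer using ONLY the context below; say 'unknown' if it is absent.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

corpus = [
    "Radar site Alpha reported two slow low-altitude contacts at 0400.",
    "Logistics: fuel convoy delayed six hours on route three.",
    "Weather: visibility below two kilometres until 0900.",
]
print(build_grounded_prompt("what did radar site alpha report", corpus))
```

The "answer only from context, else say unknown" instruction is the hallucination-minimizing step the article refers to: the model is steered away from inventing facts the retrieved data does not support.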

Mitigation also lies in preventing the decline of practical skills, cognitive abilities, and critical thinking among military leaders. Training the military in a combination of AI skills and human decision-making capabilities remains the way ahead. Military AI is a force multiplier; it is not the force. The AI Impact Summit brought inclusive AI into the conversation. Military AI should bring human intelligence and oversight over battlefield AI.

Authors

    • Shailender Arya and Udaya Arun

Shailender Arya is a military veteran, a former Advisor in the Ministry of Defence, and a Senior Advisor at The Asia Group.

      Udaya Arun is Director, Aerospace & Defense, US-India Business Council (USIBC).

      © 2026 StratNews Global, A unit of BharatShakti Communications LLP