
The fog of AI war: how drones are changing combat
The rise of AI drones in conflict zones creates a responsibility gap. New data shows how compressed targeting cycles limit vital human oversight
The accelerating deployment of AI in conflict zones
The landscape of modern warfare is undergoing a significant transformation, driven by the escalating deployment of AI-powered autonomous drones. This evolution is redefining military strategy, particularly concerning battlefield transparency and the extent of human oversight. The emerging paradigm is being termed the 'Fog of AI War,' a condition characterized by an overwhelming influx of machine-generated data and severely compressed targeting cycles. This phenomenon inherently limits the capacity for meaningful human intervention.
Traditionally, the 'fog of war' originated from a scarcity of reliable information. The 'Fog of AI War' stems from the opposite challenge: an excessive volume of machine-generated data combined with expedited targeting procedures. AI systems can process terabytes of sensor data at speeds no human operator can match, and their capacity to coordinate complex air defense systems at the speeds required to intercept ballistic missiles offers demonstrable advantages in force protection. This acceleration has a critical consequence, however: it compresses the entire cycle of sensing, analysis, and targeting, significantly reducing the window available for human operators to critically scrutinize machine-generated outputs.
The result is a scenario where human oversight, while present in a procedural sense, may be substantively hollow: there is insufficient time for informed judgment on target identification, proportionality assessments, and ultimate strike decisions. The rapid pace of AI development, coupled with less constrained deployment practices by various actors, exacerbates concerns about whether current frameworks can ensure appropriate human control. Certain AI systems, particularly those employing 'black box' algorithms, can interpret instructions rather than merely execute them. This introduces the potential for such systems to select targets based solely on mission-accomplishment criteria, disregarding established moral or ethical limitations. It also raises an acute 'accountability problem': responsibility for actions can become fragmented and difficult to attribute definitively among a diverse group of stakeholders, including developers, data engineers, procurement officials, system operators, and commanding officers.
Autonomous deployment case studies
Recent global conflicts provide empirical insights into the practical implications of AI-driven combat operations.
Operation Epic Fury: US-Iran conflict, February 2026
The conflict termed 'Operation Epic Fury' between the United States and Iran in February 2026 marked the United States' first combat deployment of Low-cost Uncrewed Combat Attack System (LUCAS) drones. The LUCAS drone, reportedly reverse-engineered from the Iranian Shahed-136, incorporates vision-based object recognition for enhanced precision, a notable departure from reliance on static satellite coordinates. These one-way attack drones are strikingly cost-effective, with an approximate unit price of $35,000 against the $2 million cost of a Tomahawk cruise missile.
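At those reported prices, the cost asymmetry can be checked with a quick calculation (unit costs taken from the figures above; the drones-per-missile comparison is purely illustrative):

```python
# Reported unit costs from the article (USD).
LUCAS_UNIT_COST = 35_000
TOMAHAWK_UNIT_COST = 2_000_000

# How many LUCAS drones can be fielded for the price of one Tomahawk?
drones_per_tomahawk = TOMAHAWK_UNIT_COST // LUCAS_UNIT_COST
print(drones_per_tomahawk)  # 57 drones, with $5,000 to spare
```

A roughly 57-to-1 ratio is what makes saturation tactics with one-way attack drones economically viable.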
US Central Command reported the deployment of 'hundreds' of these unmanned platforms during Operation Epic Fury. Furthermore, large language models were reportedly utilized to process satellite imagery, evaluate signals intelligence, and conduct battle simulations throughout the engagement. In the initial four days of Operation Epic Fury, the United States and Israel claimed to have struck 4,000 targets, a number that surpasses the total strikes conducted during the first six months of the bombing campaign against ISIS. Reports indicated a strategic objective by the US to achieve 1,000 strikes within a single hour.
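The reported tempo can be put in perspective with simple arithmetic (strike counts from the paragraph above; the even-distribution assumption is ours):

```python
targets_struck = 4_000  # claimed strikes in the first four days
hours = 4 * 24

# Sustained average strike rate over the four-day period.
average_per_hour = targets_struck / hours
print(f"{average_per_hour:.0f} strikes/hour on average")  # ≈ 42

# The stated surge goal of 1,000 strikes in a single hour would be
# roughly 24x that sustained average rate.
surge_goal = 1_000
print(f"Surge goal is {surge_goal / average_per_hour:.0f}x the average")
```

Even the sustained average, let alone the surge goal, illustrates why per-strike human review time collapses to seconds.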
Russia-Ukraine war
The ongoing Russia-Ukraine war serves as a prominent contemporary illustration of the evolving nature of warfare, particularly highlighting the widespread deployment of drones. This conflict exemplifies a definitive shift towards low-cost, high-volume drone warfare and the effective democratization of precision strike capabilities. Since 2022, both Russia and Ukraine have deployed millions of one-way attack drones against each other. On April 16, 2026, Russian forces initiated a substantial aerial assault on Ukraine, launching nearly 700 drones alongside dozens of ballistic and cruise missiles. This coordinated attack resulted in a reported 16 fatalities and over 100 injuries. In response, Ukraine's air force reported successful interception or disabling of 667 out of 703 incoming targets, a total that included 636 Shahed-type drones and other uncrewed aerial vehicles. Furthermore, the tactical application of drones extends beyond direct strike capabilities.
In December 2025, Ukraine's 13th National Guard Brigade Khartiya reportedly secured a strategic position north of Kharkiv by utilizing dozens of land and aerial drones, reportedly achieving this objective without deploying infantry or sustaining any losses. Ukrainian President Zelenskyy has also stated that human-guided bots have conducted over 22,000 missions in the preceding three months alone. In parallel, Ukrainian forces have developed and implemented low-technology countermeasures, notably covering hundreds of kilometers of roads with nets specifically designed to entangle FPV (First Person View) drones. In an effort to counter advanced aerial threats, a Franco-Ukrainian startup named Alta Ares has developed AI interceptors that demonstrate a 54% success rate in downing Russian drones, including Shahed models, with the stated objective of mass-producing these systems at a significantly lower cost than traditional anti-aircraft missiles.
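Two of the reported figures can be sanity-checked with back-of-the-envelope arithmetic: the April 16 interception rate, and the interceptor expenditure implied by Alta Ares's 54% success rate (treating attempts as independent is our modeling assumption, not the article's):

```python
# April 16, 2026 air-defense figures reported by Ukraine's air force.
launched = 703  # total incoming drones and missiles
downed = 667    # intercepted or disabled

rate = downed / launched
print(f"Interception rate: {rate:.1%}")  # ≈ 94.9%; 36 targets got through

# Alta Ares AI interceptors: reported 54% success per attempt.
# Assuming independent attempts, shots-to-first-kill follows a geometric
# distribution with expectation 1 / p.
p_kill = 0.54
expected_shots = 1 / p_kill
print(f"Expected interceptors per downed drone: {expected_shots:.2f}")  # ≈ 1.85
```

Under that assumption, fewer than two interceptors per kill would still undercut the cost of a traditional anti-aircraft missile by a wide margin, which is the startup's stated premise.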
Ethical and strategic implications
The accelerated pace of decision-making enabled by AI systems introduces substantial ethical and strategic challenges, particularly concerning human control and accountability in military operations.
Human control and accountability
The integration of AI into military decision-making processes frequently creates what is termed a 'responsibility gap' concerning potential war crimes. Algorithmic mediation can obscure the direct attribution of criminal responsibility. While the Department of Defense (DoD) Directive 3000.09 governs autonomy in weapon systems, it does not explicitly prohibit autonomous weapons or mandate a human operator 'in the loop' for every use-of-force decision. Experts contend that a sole focus on 'humans in the loop' is insufficient, as the primary locus of control increasingly shifts to the system's design phase, where ethical constraints and operational realities must be embedded. AI-enabled systems can also induce 'cognitive overload' in human operators due to the sheer volume of data requiring assessment. Studies have indicated that human operators involved in recent AI-targeting operations in Gaza spent only seconds verifying and approving a target strike. This compressed verification period raises concerns that automated systems are outpacing the human capacity for thorough target assessment and approval.
Accelerating global AI arms race
The global landscape is witnessing an accelerating AI arms race, with nations including the United States, China, and Russia making substantial investments in AI-driven military systems and counter-drone technologies. The Pentagon has requested over $13 billion for autonomous systems for fiscal year 2026, signaling a significant commitment to this area. China has demonstrated advanced capabilities, including drones that can autonomously fly in formation with manned fighter jets. Russia is reportedly developing Lancet drones with autonomous target selection capabilities. The global counter-Unmanned Aerial System (UAS) market is projected to experience substantial growth, expanding from approximately $6.64 billion in 2025 to an estimated $20.31 billion by 2030, representing a compound annual growth rate of approximately 25.1%.
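The projected market figures are internally consistent: the compound annual growth rate implied by the 2025 and 2030 endpoints can be recomputed directly (values from the paragraph above):

```python
start_value = 6.64  # 2025 counter-UAS market size, $ billions
end_value = 20.31   # 2030 projection, $ billions
years = 5

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ≈ 25.1%, matching the stated rate
```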
Military units, such as the US Army's XVIII Airborne Corps, are specifically focusing on counter-UAS capabilities and the integration of disparate sensors, response systems, and mission software into a cohesive operational picture. A critical development in this domain is 'edge AI,' which enables drones to process data, make decisions, and execute missions independently, without continuous internet connectivity. This capability mitigates the effectiveness of GPS denial and signal jamming tactics. Companies like ZenaTech's ZenaDrone are actively showcasing AI defense drones and advancements in counter-UAS defense systems, interceptors, and underwater drones, further illustrating the breadth of innovation in this rapidly expanding field.
Key takeaways
- The 'Fog of AI War' is characterized by an excess of machine-generated data and accelerated targeting processes, contrasting with traditional 'fog of war' caused by information scarcity.
- AI systems process terabytes of sensor data and coordinate air defenses at speeds exceeding human capabilities, enhancing soldier protection.
- Compressed sensing, analysis, and targeting cycles reduce the time for human scrutiny of AI-generated outputs, potentially rendering human oversight nominal rather than substantive.
- Operation Epic Fury (US-Iran Conflict, February 2026) saw the US deploy hundreds of Low-cost Uncrewed Combat Attack System (LUCAS) drones, reverse-engineered from Iranian Shahed-136s.
- LUCAS drones utilize vision-based object recognition for precision, costing approximately $35,000 per unit compared to $2 million for a Tomahawk missile.
- During the first four days of Operation Epic Fury, the US and Israel reported striking 4,000 targets, aiming for 1,000 strikes per hour, utilizing large language models for intelligence processing and battle simulations.
- The Russia-Ukraine war features widespread deployment of millions of one-way attack drones, indicating a shift towards high-volume, low-cost drone warfare.
- On April 16, 2026, Russia launched nearly 700 drones and dozens of ballistic/cruise missiles at Ukraine, resulting in 16 fatalities and over 100 injuries.
- Ukraine's air force reported downing or disabling 667 out of 703 incoming targets, including 636 Shahed-type drones and other uncrewed aerial vehicles, on April 16, 2026.
- Ukraine's 13th National Guard Brigade Khartiya successfully secured a position north of Kharkiv in December 2025 using land and aerial drones without infantry losses.
- Ukrainian forces have implemented low-tech countermeasures, such as covering roads with nets to entangle FPV drones.
- A Franco-Ukrainian startup, Alta Ares, developed AI interceptors with a 54% success rate against Russian drones, aiming for mass production at reduced costs.
- The integration of AI in military decision-making creates a 'responsibility gap' for potential war crimes due to algorithmic mediation.
- DoD Directive 3000.09 governs autonomy but does not prohibit autonomous weapons or mandate continuous human-in-the-loop for every use-of-force decision.
- Studies indicate human operators in recent AI-targeting scenarios spent mere seconds verifying and approving target strikes, leading to concerns about cognitive overload and verification capacity.
- Global investment in AI-driven systems and counter-drone technologies is accelerating, with the Pentagon requesting over $13 billion for autonomous systems for fiscal year 2026.
- China has demonstrated drones capable of autonomous flight alongside fighter jets; Russia is developing Lancet drones for autonomous target selection.
- The global counter-UAS market is projected to reach approximately $20.31 billion by 2030, growing at a compound annual rate of about 25.1%.
Sources
- carnegieendowment.org: https://carnegieendowment.org/europe/strategic-europe/2026/04/the-fog-of-ai-war
- thenews.com.pk: https://www.thenews.com.pk/latest/1399216-ai-warfare-can-humans-really-control-autonomous-weapons
- thetelegraph.com: https://www.thetelegraph.com/news/article/one-way-attack-drones-low-cost-high-tech-22209684.php
- militarytimes.com: https://www.militarytimes.com/news/your-military/2026/02/28/us-confirms-first-combat-use-of-lucas-one-way-attack-drone-in-iran-strikes/
- defensescoop.com: https://defensescoop.com/2026/04/07/us-launches-more-attack-drones-iran-epic-fury-adm-cooper-centcom/

