In August 2021, a U.S. drone strike in Kabul killed 10 civilians – seven of them children – due to flawed data, a grim reminder of human error in war. Now, imagine that tragedy amplified by machines stripped of human judgment. The Pentagon’s headlong rush into AI-driven warfare – spanning autonomous drones, algorithmic targeting, and “agentic” systems – threatens to reshape conflict in ways that outpace ethical oversight and imperil global stability, particularly for the Global South.
The United States is betting billions on AI to dominate future battlefields. According to congressional analysts, the Pentagon has broadened AI spending across surveillance, targeting, and battlefield-management platforms, although exact totals have not been publicly disclosed.
The Replicator initiative, launched in 2023, originally sought to field thousands of autonomous or semi-autonomous systems by mid-2025, a target officials now concede is slipping amid delays and shifting requirements.
Projects like the XQ-58A Valkyrie, tested in 2024, integrate AI for semi-autonomous missions, while the Artificial Intelligence Rapid Capabilities Cell (AI RCC) pushes systems that initiate actions with reduced human intervention. Recent reporting indicates deeper collaboration with major AI companies, although confirmed contract values so far appear smaller than some industry claims.
The appeal is clear: AI’s speed in processing data and pinpointing targets promises tactical superiority. But the risks are staggering. AI systems, trained on incomplete or biased data, can misidentify civilians as threats or drones as hostile, risking escalations no human can reverse.
A 2024 Public Citizen report warns that Replicator’s drone swarms could overwhelm human oversight, increasing civilian casualties. In the Global South – often the stage for such conflicts – Western-centric data sets may misread local contexts, amplifying harm. And if an autonomous or semi-autonomous system commits a fatal error, accountability turns murky: no one has resolved who in the chain of responsibility answers for the mistake.
The global stakes are rising fast. China is rapidly expanding its AI-enabled military programs, including reported testing of drone swarm coordination capabilities, according to regional analysts, while Russia’s drone innovations continue to reshape the war in Ukraine. Israel’s AI-assisted targeting in Gaza, documented in 2024, has raised serious ethical concerns. Yet a United Nations treaty on lethal autonomous weapons remains stalled, with the United States, China, and Russia all resisting binding constraints.
This unchecked arms race risks miscalculations that could spiral into global conflict, with the Global South bearing the heaviest burden. Washington’s policy debates, however, remain superficial. The 2025 Munich Security Report highlights global distrust in Western leadership, yet U.S. Senate hearings focus on AI’s “revolutionary” potential, sidestepping ethical perils. The Pentagon’s secrecy, evident in Replicator’s limited public documentation and shifting delivery timelines, restricts democratic
accountability. The Global South’s voice, essential given its disproportionate exposure to AI-driven errors, is almost entirely missing from the conversation.
This isn’t a far-off dystopia – it’s happening right now. The Pentagon’s haste, driven by fear of losing ground to rivals, overlooks AI’s fragility. Biased algorithms and cyber vulnerabilities could trigger unintended strikes, eroding the human restraint that averted Cold War disasters.
A 2024 Amnesty International report underscores the need for robust oversight to prevent AI from becoming an unchecked authority over lethal force in regions like Africa and Asia. Recent developments have heightened these concerns. Defense sources disclosed in mid-2025 that AI-enabled targeting tools performed inconsistently in simulations of dense urban terrain, raising questions about their reliability in real-world conflicts.
The Global South, still grappling with the aftermath of earlier drone campaigns, faces mounting risks as newer systems scale up. Community groups in countries affected by prior U.S. strikes, including in Somalia and Yemen, have warned that without transparency they may once again become testing grounds for unpredictable technologies. China’s advances add new urgency.
By mid-2025, Beijing’s accelerated testing of AI-powered drones in sensitive maritime zones highlighted a widening competitive gap, increasing pressure on the Pentagon to deploy systems still under development. This dynamic raises the likelihood of catastrophic miscalculations in fragile regions such as the Horn of Africa. Without global norms, competing AI-driven military ecosystems could trigger unintended escalations, particularly in complex environments such as Yemen and Afghanistan, where even human-directed operations often struggle to distinguish threats from civilians.
The absence of accountability deepens the danger. A mid-2025 Human Rights Watch analysis criticized the opacity surrounding U.S. autonomous-system testing and noted that Replicator-related platforms still lack transparent, verifiable fail-safe mechanisms. In the Global South, where conflicts often blend civilian and combatant spaces, such opacity invites disaster. Involving local stakeholders, from civil society advocates to regional governments, is vital to ensuring AI systems reflect local realities. Yet Washington’s centralized approach consistently overlooks this necessity.
The Global South, from Somalia to Pakistan, cannot afford to be a proving ground for experimental warfare. Washington must slow its reckless sprint and open an inclusive debate that brings legislators, ethicists, technologists, and affected communities into the discussion.
Can algorithms understand the sanctity of life? Can speed justify pushing humanity to the margins? Transparency, global norms, and strict oversight are non-negotiable. Without them, the Pentagon risks a future in which machines, not soldiers, shape the outcomes of war and the Global South pays the steepest price. History warns of the dangers of unchecked arms races. It is time to heed that warning or risk a world where humanity loses control.
