“This technology is our future threat,” warns Serhiy Beskrestnov as he inspects a recently captured Russian drone. Unlike conventional weapons, it uses artificial intelligence to identify and strike targets without human control.
Beskrestnov, a consultant for Ukraine’s defence forces, has analysed countless drones since the invasion began. This one stands out. It neither sends nor receives signals, making it impossible to jam or detect.
Both Ukrainian and Russian forces now experiment with AI on the battlefield. They use it to locate enemy positions, process intelligence, and clear mines faster than ever.
Artificial intelligence becomes a frontline force
AI has become indispensable for Ukraine’s army. “Our forces receive over 50,000 video streams from the front each month,” says Deputy Defence Minister Yuriy Myronenko. “Artificial intelligence analyses the footage, identifies threats, and maps them for commanders.”
The technology allows rapid decision-making, optimises resources, and reduces casualties. Its most dramatic impact is on unmanned systems. Ukrainian troops now operate drones that lock onto targets and complete the final approach autonomously.
Because they need no radio link during the final approach, these drones cannot be jammed and are extremely difficult to shoot down. Experts predict they will evolve into fully autonomous weapons capable of seeking and destroying targets independently.
Drones that act on their own
“All a soldier needs to do is press a button on a smartphone,” explains Yaroslav Azhnyuk, CEO of Ukrainian tech firm The Fourth Law. “The drone will locate the target, drop explosives, assess the damage, and return to base. No piloting skills are required.”
Azhnyuk says these drones could greatly strengthen Ukraine’s air defences against Russian long-range drones such as the Shaheds. “A computer-guided system can outperform humans,” he notes. “It reacts faster, sees better, and moves more precisely.”
Myronenko admits that fully autonomous systems are still under development but says Ukraine is close. “We have already integrated parts of the technology into some devices,” he adds. Azhnyuk predicts thousands of these drones could be operational by the end of 2026.
Innovation meets risk
Full automation carries serious dangers. “AI might not distinguish a Ukrainian soldier from a Russian one,” warns Vadym, a defence engineer who requested anonymity. “They often wear the same uniform.”
Vadym’s company, DevDroid, produces remotely controlled machine guns that use AI to detect and track people. Automatic firing is disabled to prevent friendly fire. “We could enable it,” he says, “but we need more field experience and feedback before trusting it fully.”
Legal and ethical questions remain. Can AI follow the laws of war? Will it recognise civilians or surrendering soldiers? Myronenko stresses that humans must make the final decision, even if AI assists. Yet he warns that not every military will act responsibly.
The global AI arms race intensifies
AI is driving a new kind of arms race. Traditional defences—jamming, missiles, or tanks—struggle against swarms of intelligent drones.
Ukraine’s “Spider Web” operation last June, when 100 drones struck Russian air bases, reportedly relied on AI coordination. Many fear Moscow could copy the tactic, both on the front lines and deep inside Ukraine.
President Volodymyr Zelensky told the United Nations that AI is fuelling “the most destructive arms race in human history.” He called for urgent global rules on AI weapons, stressing the challenge is “as urgent as preventing the spread of nuclear arms.”