UN struggles to agree rules on killer robots

“There is little time left.”

The warning, delivered to a room of diplomats at the Palais des Nations in Geneva, cut through hours of technical debate. Around the table, delegates leaned over heavily annotated draft texts, negotiating not only the future of warfare, but the limits of machines that may soon decide who lives and who dies.

After more than a decade of discussions, negotiations on lethal autonomous weapons systems — often described as “killer robots” — remain stalled. Yet outside the conference room, the technology is advancing rapidly, reshaping conflicts in real time. The result is a regulatory process trying to define limits for weapons that are already being deployed.

At the United Nations Convention on Certain Conventional Weapons (CCW), states are attempting to define rules for weapons capable of selecting and engaging targets with minimal human intervention. But progress is slow, and consensus remains elusive.

“This is a technology that already exists and is starting to be seen in combat,” Carolyne Mélanie Régimbal, Chief of Service at UNODA Geneva, said. “And we, as an international community, are trying to catch up,” she added.

The gap between technological acceleration and diplomatic process now lies at the heart of the debate.

“The international system is always one step behind technological developments,” Régimbal said.

The world is trying to regulate a technology it is already using.

Inside the negotiations, divisions run deep.

At the centre of the discussions is a deceptively simple question: how much control should humans retain over machines capable of using lethal force?

Delegations disagree not only on principles, but on language. Some states advocate for strong, legally binding prohibitions, while others favour more flexible frameworks based on existing international law.

Even terminology has become contested. Phrases such as “human judgment and control” — seen by many as essential — are interpreted differently across delegations, with some resisting their formulation as strict legal obligations.

These disagreements reflect broader geopolitical tensions and competing military interests, making consensus difficult to achieve.

Beyond definitions, a more fundamental issue remains unresolved: accountability. If an autonomous system commits a violation of international law, who is responsible?

“We don’t know yet,” the UN official acknowledged.

Responsibility could lie with the state, the military operator, or the developer of the system — but no agreement currently exists. This legal uncertainty is one of the main obstacles to any future treaty.

While diplomats debate, the nature of warfare is already evolving.

Across recent conflicts, including in the Middle East, the growing use of drones and automated systems has transformed the battlefield. Autonomous and AI-assisted systems have already been deployed in conflicts involving Ukraine, Gaza and Iran. These technologies enable faster decision-making, lower costs, and the ability to deploy large numbers of systems simultaneously — challenging traditional military doctrines and defence systems.

What is emerging is a model of warfare defined by speed, scale and reduced human intervention.

The trend raises concerns among experts that autonomous capabilities may soon outpace the legal frameworks designed to regulate them.

Back in Geneva, negotiators are aware of the urgency. “There is little time left for discussing new ideas,” one diplomat warned, urging delegations to focus on compromise and build on existing proposals.

The immediate goal is modest: agree on a set of elements that could form the basis of a future international instrument. But even that remains uncertain.

Despite the complexity of the negotiations, the underlying objective is clear. “The ultimate goal is the protection of civilians,” the UN official said. Ensuring that human beings remain at the centre of decision-making is widely seen as essential to achieving that goal. Yet the definition of what constitutes “meaningful human control” — a cornerstone of the debate — remains unresolved. “If we had the answer,” the official said, “we would already have an agreement.”

After more than ten years of discussions, the process is approaching a critical moment. Negotiators hope to reach consensus on key elements by the end of the year — a first step towards a potential treaty. But with military technologies evolving faster than ever, the window for meaningful regulation may be closing.

The debate is no longer about whether autonomous weapons will shape war. It is about whether the rules will arrive before they do.
