Can we control weapons that fire automatically?

Autonomous weapons systems: When weapons make the difference between life and death

The opening of the Winter Olympics was a spectacle: Hundreds of drones painted the Olympic rings in the night sky, recalls Toby Walsh.

"That was impressive evidence of what is possible. Those drones weren't human-controlled. One human can control a single drone. But if someone asks how many people controlled those 1,000 drones, the answer is simple: none. Not even 1,000 people could control 1,000 drones in this way. An algorithm has to do that."

How can you stop autonomous drones?

The professor of artificial intelligence at the University of New South Wales in Sydney, Australia, is actually very open to such automated systems. But to illustrate what such drones could do, the researcher points to the incident at London's Gatwick Airport, when a few drones paralyzed the airport. Those drones could still be taken out of action relatively easily.

"Those drones were still remote-controlled, and that was their weakness: they were stopped by jamming the radio link. If they were autonomous, they would operate without radio and would be much harder to fight. You would have to shoot down each one individually. That is a far bigger challenge, especially with a whole swarm. Shooting down a hundred such autonomous drones would overwhelm even the most modern military weapons."

The software is coming together, piece by piece

The example shows that all that is missing is the software to turn everyday technology into autonomous weapons systems. And some of the software components already exist. Software from Amazon, for example, can recognize faces. Car manufacturers are working on assistance systems designed to recognize pedestrians. Such software could be adapted so that weapons systems can recognize people.

"There are a lot of technologies out there that can be pieced together. We really have to decide that giving a few lines of software code the right to determine who may live and who must die would cross a moral line. That is morally unacceptable."

The third revolution in warfare

Researchers and activists consider the development of such weapons systems to be the third revolution in warfare, after the invention of gunpowder and nuclear weapons. They also fear that the mere presence of these weapons could increase the likelihood of armed conflict. Peter Asaro of the School of Media Studies at The New School in New York City, co-founder of the International Committee for Robot Arms Control, believes that warring parties become less inhibited when combat robots come into play, because they would have fewer losses of their own to fear.

The good side of technology: Bundeswehr robot "Theodor" can remove dangerous objects independently (picture alliance / Peter Steffen)

"In addition, there would be a psychological effect: each side wants to act before the other, whether to strike before the opponent can develop countermeasures against combat robots, or simply to be the first to deploy them."

Preventing the negative side of the technology

The devices could also escalate conflicts, says Asaro, for example if they independently guarded a border.

"If such systems encounter each other, conflicts could be triggered without any military or political decision behind them. We see parallels to the flash crash on the stock market, when competing computer systems worked against each other, reinforced each other, and share prices fell by 40 percent within a few hours. Back in 2010, the entire system had to be reset."

The point is not to demonize the technology as such, says Toby Walsh, the professor of artificial intelligence. One simply has to choose its good side and prevent the negative one.