Why you shouldn't be afraid of killer robots

Original author: Paul Scharre

The dystopia in which killer robots kill innocent people sounds awful, but let's be clear: this is all science fiction



A still from the film Slaughterbots, in which autonomous killer drones fall into terrorist hands in a dystopian future

Killer drones fall into the hands of terrorists and massacre innocent people. Robotic weapons of mass destruction sow havoc and fear. The short film, made by advocates of a ban on autonomous weapons, was designed to convince you that this dystopia is just around the corner, and that action must be taken today. Slaughterbots was released in November 2017 to coincide with the UN conference on autonomous weapons. The UN meeting ended without any concrete result, but the video keeps gaining popularity. It has already been viewed more than 2 million times and covered in dozens of articles. As propaganda, it works perfectly. As an argument for banning autonomous weapons, it fails entirely.


Of course, a world in which terrorists could unleash a swarm of killer drones on innocent people would be terrible, but is the future the film shows realistic? Slick production hides the gaps in its logic. The film plunges the viewer into a dystopian nightmare, but let's be clear: this is all science fiction.

The film's central premise is that militaries will one day build autonomous microdrones carrying shaped charges that can fly up to a person's head and detonate, killing them. In the film, these killer bots quickly fall into the hands of terrorists, leading to massive casualties around the world.



The core concept has a foundation in reality. In our world, the Islamic State has used off-the-shelf quadcopters, fitted with small amounts of explosives, to attack Iraqi forces, killing and wounding several dozen soldiers. Today's terrorist drones are mostly remotely piloted, but hobbyist drones are gaining ever more autonomy. The latest models can already fly to a fixed point on their own while avoiding obstacles, and can track and follow moving objects. A small drone equipped with face recognition could, in principle, be used to autonomously hunt down and kill specific people, as Slaughterbots shows. In just a few minutes of searching the internet, I found resources for downloading and training a free neural network that recognizes faces. And although no one has yet combined these technologies as the film depicts, all the components are already real.

Let me say it plainly: there is no way to keep this technology out of the hands of future terrorists. That is a grim fact, but one we must accept. Just as terrorists can and do drive cars into crowds, the technology needed to turn hobbyist drones into crude autonomous weapons is already too widespread to be stopped. This is a real problem, and the best response is to focus on defensive measures that can counter drones, and on surveillance that can catch the terrorists who use them.

The film seizes on this problem but inflates it beyond all measure, claiming that terrorists could use drones as weapons of mass destruction and kill thousands of people. Fortunately, that nightmare scenario is about as likely as HAL 9000 locking you out of the pod bay. The technologies shown in the video are realistic, but everything else is complete nonsense. The video rests on the following assumptions:

  • Governments will launch mass production of microdrones for use as weapons of mass destruction.
  • There is no effective defense against deadly microdrones.
  • Governments are unable to protect military-level weapons from terrorists.
  • Terrorists are capable of launching large-scale coordinated attacks.

These assumptions vary from controversial to fantastic.

Of course, the video itself is fiction, and defense planners often use fictional scenarios to help policymakers think through the consequences of plausible events. I am a defense analyst at a think tank, and in my earlier work at the Pentagon I did strategic planning, using fictional scenarios to illustrate the choices the US military faces in deciding which technologies to invest in. But for such scenarios to be useful, they must at least be plausible. They must describe something that could actually happen. The scenario in Slaughterbots ignores the political and strategic realities of how governments use military technology.

First, there is no evidence that governments plan to mass-produce small drones to kill large numbers of civilians. In my forthcoming book, Army of None: Autonomous Weapons and the Future of War, I survey the next generation of weapons being built in defense laboratories around the world. Russia, China, and the United States are all racing toward autonomy and artificial intelligence. But the weapons they are building are designed chiefly to fight other militaries. They are aimed at military targets (counterforce), not at civilian populations (countervalue). Autonomous weapons of the first kind raise problems of their own, but they are not built for mass destruction of civilians, and they could not simply be repurposed for it.

Second, the video claims these drones can overcome any countermeasure. Talking heads on screen cry that "we cannot defend ourselves." That isn't fiction; it's farce. Every military technology has a countermeasure, and countermeasures against small drones aren't even hypothetical. The US government is actively developing ways to shoot down, jam, fry, hack, ensnare, and otherwise defeat small drones. The microdrones in the video could be foiled by something as simple as chicken wire. The video shows larger drones blasting holes in walls so others can get through, but simple multilayered defenses would counter that. Military analysts weigh the cost ratio of offense to defense, and in this case the advantage clearly lies with static defenses.


In a world where terrorists launch occasional attacks with homemade drones, people are unlikely to tolerate the inconvenience of hardened defenses, just as people don't wear body armor against the unlikely event of a shooting. But if a hostile nation built hundreds of thousands of drones capable of wiping out a city, you can be sure the chicken wire would go up. The video takes a real problem, terrorist drone attacks, and scales it up without accounting for anyone's response. If deadly microdrones were being produced on an industrial scale, defenses and countermeasures against them would become a national priority, and in this case the countermeasures would be simple. Any weapon you can stop with wire mesh is not a weapon of mass destruction.

Third, the video implies that militaries cannot keep military-grade weapons out of terrorists' hands. But today we do not hand terrorists grenades, anti-tank rockets, or machine guns [though the recent Tom Cruise biopic begs to differ / translator's note]. Terrorist drone attacks worry everyone precisely because they pair improvised explosives with off-the-shelf technology. That is a real problem, but again, the video inflates the threat to unrealistic proportions. Even if militaries built deadly microdrones, terrorists would find them no easier to acquire in quantity than any other military hardware. Weapons do leak to combatants on the wrong side of a conflict, but the fact that Syria is awash in guided anti-tank missiles does not mean they are easy to find in New York. Terrorists use airplanes and trucks precisely because smuggling military-grade weapons into a Western country is hard.


The killer microdrones from the film

Fourth, the video assumes the terrorists can pull off an attack with incredible coordination. In one scene, two people release a swarm of 50 drones from the back of a van. That scene itself is plausible; one of the problems with autonomous weapons is that they let a small group mount a far larger attack than conventional weapons would allow. A van with 50 drones is a reasonable capability. But the film pushes the idea to absurdity, claiming that about 8,300 people are killed in simultaneous attacks. If two people with a van represent a typical attack, that death toll would require roughly 160 attacks around the world. Terrorists do sometimes carry out coordinated attacks, but these usually number no more than ten. The video assumes not only a super-weapon, but super-villains to wield it.
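The scale the film implies can be checked with back-of-the-envelope arithmetic. The figures below come straight from the paragraph above; the one-kill-per-drone assumption is mine:

```python
# Rough scale check for the film's scenario, using the article's numbers.
# Assumption (mine, for illustration): each drone in a van-launched swarm
# kills exactly one person.
deaths_claimed = 8300   # death toll the film attributes to simultaneous attacks
drones_per_van = 50     # swarm size released by the two attackers in the film

attacks_needed = deaths_claimed / drones_per_van
print(f"Coordinated attacks required: {attacks_needed:.0f}")  # ~166
```

At roughly 166 van-sized attacks, more than an order of magnitude above the ten or so coordinated strikes terrorist groups have historically managed, the film's death toll demands a level of coordination no group has ever demonstrated.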

The film uses hype and fear to paper over its shaky assumptions, and in doing so it forecloses rational discussion of the risks of terrorists acquiring autonomous weapons. The video makes clear that we should be afraid. But afraid of what? Weapons that choose their own targets (the video is vague on this point, by the way)? Weapons with no countermeasures? Weapons terrorists can get their hands on? The way autonomy lets a small group scale up its attacks? If you want to stoke fear of killer robots, this video will do the job. But as an attempt at serious analysis, it does not survive even the simplest scrutiny. The video offers no argument, only cheap sensationalism and fearmongering.

Naturally, the video's purpose is to use that fear to spur viewers to action. It ends with University of California, Berkeley professor Stuart Russell warning of the dangers of autonomous weapons and calling on viewers to act so that this nightmare never becomes reality. I have great respect for Stuart Russell as an AI researcher and as a contributor to the debate on autonomous weapons. I have invited Russell to events at the Center for a New American Security, where I am involved in a research program on AI and global security. I have no doubt that Russell's views are sincere. But in its bid to persuade, the video makes claims it cannot back up.

Worse, the proposed solution, a treaty banning autonomous weapons, would not solve the real problems humanity faces as autonomy in weapons advances. A ban would not stop terrorists from building homemade robotic weapons. Nor would a ban of the kind the video advocates address the risks posed by militaries fielding autonomous weapons. In fact, it is not even clear that the weapon shown in the film would fall under such a ban, since it attacks its targets quite selectively.

By fixating on extreme and unlikely scenarios, the film actually hinders progress on the real problems autonomous weapons pose. States leading the development of robotic weapons are likely to dismiss fears grounded in this film. It plays into the hands of those who claim that fear of autonomous weapons is irrational and overhyped.

Autonomous weapons raise important questions about compliance with the laws of war, about risk and control, and about the moral role of humans in warfare. These are serious issues worth serious discussion. When Russell and others engage in substantive debate on them, I welcome it. But this film is not that. The video has succeeded in capturing media attention, yet its sensationalism undermines the serious intellectual discussion that the topic of autonomous weapons demands.

Paul Scharre is a Senior Fellow and Director of the Technology and National Security Program at the Center for a New American Security (CNAS). From 2009 to 2012 he led the US Department of Defense working group that drafted the department's guidance on autonomy in weapon systems.
