Hackers can remotely control a Tesla Model S via its Autopilot system

Original author: Elizabeth Montalbano
Security researchers managed to gain remote control over the Autopilot system of a Tesla Model S and steer the car using a game controller. In doing so, they drew attention to potential security problems in modern Advanced Driver Assistance Systems (ADAS), whose very purpose is to make driving safer.

Researchers from Tencent Keen Security Lab successfully activated Tesla's Autopilot system and gained control over it, as reported in a new publication that details the study.

The group, which previously demonstrated its findings at the Black Hat USA 2018 security conference, has posted a video showing the hack. The new report describes three ways to gain control of a car's Autopilot system by exploiting several vulnerabilities in its electronic control unit (ECU).

The researchers highlight three major achievements in attacking the Autopilot ECU, version 18.6.1. First, exploiting a flaw in the automatic wipers' image-recognition system, they triggered the windshield wipers without rain. Second, by placing interference stickers on the road that fooled the lane-recognition system, they caused the Tesla to steer into the oncoming traffic lane. Third, they were able to remotely steer the car even when Autopilot had not been engaged by the driver.

“Thus, it was proved that by slightly changing the physical environment, we can control the car to a certain extent without a remote connection to it,” the researchers conclude in the report. “We hope that the potential defects revealed by this series of tests will draw manufacturers' attention and prompt them to improve the stability and reliability of their self-driving cars.”

Risks of Progress


The researchers say they notified Tesla after successfully compromising the Autopilot system, and, according to Tencent, Tesla “immediately fixed” a number of the bugs.


Researchers at Tencent Keen Security Lab were able to compromise the Autopilot system of the Tesla Model S, an advanced driver assistance system. (Source: Tesla)

Regardless, the study demonstrates the persistent danger of hackers exploiting the connectivity and intelligence of modern cars as an attack surface; this possibility was first vividly demonstrated in the 2015 hack of a Jeep Cherokee, covered in Wired.

“The average modern car contains hundreds of sensors and many on-board computers, each of which is potentially vulnerable to physical, software, and/or logical attacks,” said Jerry Gamblin, lead security intelligence engineer at Kenna Security, in an interview with Security Ledger. “This fact creates an enormous attack surface that car manufacturers must defend, and an extensive target field for potential attackers.”

Since the Jeep hack, cars have only become more complex, and ADAS technologies like Tesla Autopilot are developing rapidly across the automotive industry.

These systems are meant to augment the driver's capabilities and equip the car with smart safety features, such as collision avoidance, in order to increase safety. At the same time, their growing complexity makes them potentially destructive when compromised, which casts doubt on the safety of ADAS technologies.

Privileges equal control


Researchers at Keen Security Lab said they used root privileges (obtained by connecting remotely through a chain of vulnerabilities; translator's note) to carry out the most alarming part of their hack: taking control of Tesla's steering system in a “contactless way,” as they put it. The researchers used these privileges to send Autopilot control commands while the car was in motion.
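In-vehicle control commands like these typically travel over the CAN bus as short fixed-size frames. The sketch below shows, in pure Python, what packing a hypothetical steering command into a Linux SocketCAN-style frame might look like; the arbitration ID, payload layout, and angle scaling are all invented for illustration and are not Tesla's actual (non-public) message format.

```python
import struct

def encode_steering_frame(angle_deg: float) -> bytes:
    """Pack a hypothetical steering-angle command into a SocketCAN frame.

    The CAN ID (0x488) and the 0.1-degree scaling are illustrative only;
    real Tesla CAN messages are proprietary and differ from this sketch.
    """
    can_id = 0x488                        # illustrative arbitration ID
    raw = int(angle_deg * 10) & 0xFFFF    # pretend 0.1-degree resolution
    payload = struct.pack(">H6x", raw)    # 2-byte angle + 6 bytes padding
    # Linux struct can_frame layout: 4-byte ID, 1-byte DLC, 3 pad bytes,
    # then 8 data bytes, for 16 bytes total.
    return struct.pack("=IB3x8s", can_id, 8, payload)

frame = encode_steering_frame(12.5)
assert len(frame) == 16  # a classic SocketCAN can_frame is 16 bytes
```

On a real Linux system such a frame could be written to a raw CAN socket; with root access on an ECU, an attacker who knows the proprietary message format can inject commands the rest of the vehicle treats as legitimate.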

The ability to influence the wipers and the lane-recognition system was achieved through an improved optimization algorithm used to generate so-called “adversarial examples,” which were fed as input to the corresponding vehicle systems.

As the researchers found, both the wipers and the lane-recognition system base their decisions on camera data. It was therefore not especially difficult to deceive them into “seeing” conditions that did not actually exist.

The researchers achieved this by feeding crafted images to the wipers' neural network and, in the case of the lane-recognition system, by modifying road markings. In both experiments, the system responded to what it “saw” rather than to actual road conditions.
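The core idea behind such adversarial inputs can be shown on a toy model. The sketch below uses a tiny linear “rain detector” (all names and numbers are invented for illustration, not the researchers' actual models or code) and applies a fast-gradient-sign-style perturbation: each input value is nudged slightly in the direction that most increases the model's score, flipping its decision while barely changing the input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "rain detector": score = w . x + b, "rain" if score > 0.
# This stands in for the camera-based classifier; it is purely illustrative.
w = rng.normal(size=100)
b = 0.0

def detects_rain(x: np.ndarray) -> bool:
    return float(w @ x + b) > 0

# A benign "image" (flattened pixel values) the model classifies as dry:
# it points against w, giving a clearly negative score.
x = -0.3 * w / np.linalg.norm(w) + 0.01 * rng.normal(size=100)

# FGSM-style attack: for a linear model the score's gradient w.r.t. the
# input is simply w, so stepping each pixel by eps * sign(w) maximally
# raises the score for a given per-pixel budget eps.
eps = 0.1
x_adv = x + eps * np.sign(w)

print(detects_rain(x))      # benign input
print(detects_rain(x_adv))  # adversarially perturbed input
```

Real attacks against convolutional networks work the same way in spirit, but obtain the gradient by backpropagation rather than reading it off a weight vector; the Keen Lab stickers are a physical-world analogue of this digital perturbation.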

Competitors' models are under similar scrutiny. As more and more systems rely on machine learning, more researchers are looking for ways to subvert them by feeding them false input.

Tesla's response


On its blog, Tencent Keen published Tesla's response to the hack, which, surprisingly, was distinctly defensive. The company dismissed the wiper and lane-recognition compromises on the grounds that they “will not happen in real life” and therefore should not be a cause for concern for drivers.

In its response, Tesla also emphasized that drivers can turn off the automatic wiper system if they wish. In addition, they have the ability to “switch to manual control using the steering wheel or brake pedal and must be constantly prepared to do this,” especially if they suspect the system is not working correctly.

As for the use of root privileges to hijack the car, Tesla reminded the researchers that the company fixed the primary vulnerability described in the report in a 2017 security update, followed by a comprehensive system update last year (the vulnerability was fixed in software version 2018.24; translator's note). Moreover, according to Tesla's response, both of these updates were available even before Tencent Keen Security Lab told the company about its research.

“In the many years that we have had cars on the road, we have never seen a single customer fall victim to any of the vulnerabilities presented in the report,” the company added.

The company's protests aside, security experts are still not convinced that ADAS systems like Tesla Autopilot would not cause chaos and damage if they fell under attackers' control. “Manufacturers should take this into account when developing new systems,” said Jerry Gamblin.

“Most attention must be concentrated on securing the systems that could seriously harm consumers and passengers if compromised,” the expert advised. “Manufacturers must allocate resources appropriately and respond correctly to any attacks on secondary systems that can affect the end user, who must never be put at risk.”
