Robotic cars must learn to understand people.



    More and more companies are developing robotic cars. This is understandable: many scientists, futurologists, and artificial intelligence specialists predict that in the future people will travel in vehicles that drive themselves, without driver control. Robotic cars — and that is what this is about — have already appeared. True, they do not yet read the situation on the roads perfectly, but the hardware and software of such machines are constantly being improved, and overall progress is noticeable.

    The motion of autonomous vehicles is controlled by rather complex algorithms that analyze the road situation and drive the vehicle. If the car sees a green light, the computer system allows it to move; if the light is red, it must stop. The set of traffic rules is quite clear, even if not mathematically precise, so in principle it should be possible to teach a computer to drive along the roads. Or not? Specialists from Stockholm University claim that the developers of autonomous control systems fail to take into account another factor: the social one. Driving a car, they say, is not just moving from point A to point B, but also a social interaction between drivers. Without this factor, it is impossible to train a computer to drive without incident.
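    The codified part of this logic — the part that is easy to program — can be sketched as a toy decision function. The names and states below are purely illustrative assumptions, not any real autopilot code:

```python
from enum import Enum

class Light(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"

def may_proceed(light: Light) -> bool:
    # Codified rule: move only on green, stop otherwise.
    # A rule like this is easy to write down; the social negotiation
    # between drivers that the researchers point to is not.
    return light is Light.GREEN

print(may_proceed(Light.GREEN))  # True
print(may_proceed(Light.RED))    # False
```

    The point of the sketch is how little it captures: the hard part of driving is everything that never appears in such a rule table.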

    Barry Brown, a specialist in human–computer interaction, states the following: “Driving a car is not only a set of mechanical operations, but also a complex social activity. Until all cars become autonomous, autopilots will be safe and effective only if they can interact with human drivers. Autonomous car developers must take into account the decisions and influence of other drivers and passengers.”

    The scientist is dissatisfied with the fact that the developers of robotic cars are unwilling to open their projects to the public. Nevertheless, some data can be gathered from sources that are in the public domain — namely, YouTube videos showing robotic cars in motion. Most of the recordings were made by Tesla drivers, or by people who decided, for one reason or another, to film a Tesla driving on the road. There are also videos featuring Google, Volvo, and Honda cars.

    According to the scientists, videos uploaded by YouTube users are a great source of information about autonomous vehicle control systems. These videos also show how people interact with robotic cars. In total, the experts analyzed 69 different clips uploaded to YouTube by 63 users from the USA, UK, Germany, France, Sweden, Hong Kong, Iceland, and Canada. On average, each video lasted 9 minutes, but 7 of them were much longer, each exceeding half an hour. In one case, the author filmed his entire route using a system of 8 cameras.

    At the very outset, the scientists agreed on one important point: they would draw information about the operation of autonomous machines only from the videos, not from news resources or documents provided by the companies developing robotic cars. This paid off — it allowed them to study how the systems work without a preconceived opinion. It also helped that many videos were accompanied by comments from the people filming and from other YouTube users.

    The authors of the work studied two main aspects of how robotic cars move on the road. The first is the interaction of the driver with the computer control system: when such a system is present in the car, the driver learns to interact with it, and the scientists tried to observe the important moments of this interaction. The second aspect is the autopilot's interaction with other drivers on the road.


    A road situation studied by the project participants. First frame: sun glare hits the Tesla camera. Second frame: the system begins to signal danger. Third: the autopilot steers the car into the oncoming lane. Fourth: the driver takes over control, correcting the situation (the frames are screenshots from a video and not of the best quality — Ed.)

    As it turned out, in many cases the machines with automatic control systems performed flawlessly. But problems were also identified. For example, the Autopilot in Tesla cars sometimes misrecognized road markings, confused lanes, or practically stopped functioning normally when bright sunlight hit the front camera (sun glare, combined with a number of unfortunate coincidences, led to an accident between a Tesla and a van in which the owner of the electric car died).


    In general, it turned out that most of the video authors do not rely 100% on the control systems of their machines. And this is justified, since there were quite a few dangerous moments. One of them, for example, concerned the interaction between an autopilot and other road users: a motorcyclist overtaking a Tesla gestured that he was moving into another lane. The computer system did not recognize the gesture and cut the motorcyclist off.

    The second case showed the difference between the actions of a human motorist and a computer control system. Two robotic cars were driving along the road, keeping the interval prescribed by traffic regulations to ensure the safety of both vehicles and their occupants. Then a third, human-driven car appeared. Its driver decided that the gap between the two cars ahead was just right for him and merged into it. Autopilots, by contrast, always strive to maintain a safe distance — even at traffic lights, where human drivers pack their cars close together, leaving almost no free space.
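    The distance-keeping behavior described above can be illustrated with a simple time-headway rule. The two-second figure below is a common driving heuristic and an assumption for this sketch, not a value taken from any specific autopilot:

```python
def safe_gap_m(speed_mps: float, headway_s: float = 2.0) -> float:
    """Gap (in meters) a distance-keeping controller would hold
    to the car ahead under a fixed time-headway rule."""
    return speed_mps * headway_s

# At roughly highway speed (25 m/s, about 90 km/h) the rule demands
# a 50 m gap - exactly the kind of space a human driver may read
# as an invitation to merge into.
print(safe_gap_m(25.0))  # 50.0
```

    The gap grows linearly with speed, which is precisely why a rule-following autopilot leaves openings that human drivers treat as free lane space.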


    In general, drivers on the road often use informal signals such as gestures — for example, to let a pedestrian or another car pass. An autopilot, of course, does not recognize such signals, and so the behavior of a robot and that of a human driver can differ markedly.

    For this reason, the authors of the study believe that the developers of computer control systems for cars need to take into account a very important factor: the social component of road traffic. This is difficult to do, but necessary, because for now there are far more human drivers on the roads than robotic cars. The developers of the latter quite literally need to teach their systems to “understand” people, not just read road signs and markings. One can argue with this, but there is a kernel of truth in it.

    The scientists hope that the results of their research will be used by developers of robotic cars to create more efficient control systems and a full-fledged autopilot.