The singularity is approaching: AI is starting to control robots
Bringing the latest advances in artificial intelligence into robotics could change a great deal, in particular by modernizing how products are manufactured and warehoused. And this is only the beginning: the changes will reach a huge number of fields.
But it all starts small. In one modern robotics laboratory you can watch this scene: a robotic arm hovers over cut pieces of cooked chicken, dives down, and comes back up holding a single piece. A moment later, the manipulator places the piece into a package and seals it.
The robot is driven by specialized software developed by Osaro, a company from San Francisco, and it is smarter than many robots you have seen before: the software lets it complete the whole operation in just five seconds. Osaro's developers hope their system will soon be working in a Japanese food factory.
What does this mean? Are robots taking over the world? No. Anyone worried about a robot uprising should visit a modern factory to see how far off that uprising is. Most robots are powerful and precise, yes, but they can do nothing they were not programmed to do. A typical robotic arm can pick up an object and move it where needed, but the system as a whole is not especially smart: it cannot tell a marshmallow from a lead cube, because to it both are simply objects to be moved. Against that background, handling an asymmetrical piece of cooked chicken is something out of the ordinary for a robot.
For a long time, industrial robots remained plain "iron," with no artificial intelligence added, and it shows: the manipulators used in production cannot open a door or pick up an apple. Nor was anyone pursuing industrial uses of AI. For the past few years, AI developers have been preoccupied with board and computer games, or more precisely with building systems to play them, while the ordinary tasks people perform in factories went unnoticed.
But all of this is about to change. The (weak) artificial intelligence controlling Osaro's robots lets a manipulator handle a variety of tasks equally well: opening doors, sorting things, deciding where to move items in a warehouse. Like other AI systems, Osaro's software comes to "understand" its tasks through learning, using a camera combined with AI software and a powerful computing system located nearby.
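As a rough illustration of this trial-and-error style of learning (not Osaro's actual algorithm, whose details are not public), here is a minimal sketch in Python: a simulated arm tries candidate grasp positions, records which ones succeed, and gradually settles on the one that works best. Every name and number here is invented for the example.

```python
import random

def simulate_grasp(offset_mm, rng):
    # Toy physics: grasps closer to the object's true centre (offset 0) succeed more often.
    p_success = max(0.0, 1.0 - abs(offset_mm) / 20.0)
    return rng.random() < p_success

def learn_grasp_offset(trials=2000, epsilon=0.1, seed=0):
    """Epsilon-greedy trial and error: try candidate grasp offsets, record
    which ones succeed, and increasingly favour the best-performing one."""
    rng = random.Random(seed)
    offsets = [-10, -5, 0, 5, 10]            # candidate grasp positions (mm)
    wins = {o: 0 for o in offsets}
    tries = {o: 0 for o in offsets}
    rate = lambda o: wins[o] / tries[o] if tries[o] else 0.0
    for _ in range(trials):
        if rng.random() < epsilon:           # occasionally explore a random offset
            o = rng.choice(offsets)
        else:                                # otherwise exploit the current best estimate
            o = max(offsets, key=rate)
        tries[o] += 1
        if simulate_grasp(o, rng):
            wins[o] += 1
    return max(offsets, key=rate)

print(learn_grasp_offset())
```

With these toy numbers the learner converges on the centred grasp, having discovered it purely from feedback rather than from an explicit program, which is the essential difference from a conventionally programmed arm.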
A robot controlled by AI will be able to perform a wide range of production tasks, including some that until now were considered the prerogative of humans. It may well be that in the near future robots replace people in certain areas, especially where routine operations dominate: wherever objects need to be sorted, unpacked, or packed. Such a robot can find its way around an industrial floor crammed with all sorts of things. “We are running a large number of experiments now, with people trying out different tasks,” says Willy Shih, a specialist who studies manufacturing trends at Harvard Business School. “There are a lot of opportunities in routine tasks,” he says.
And the revolution will touch not only robots but artificial intelligence itself. Wrapping AI in a physical shell lets it apply pattern recognition, learn to communicate, and move through the real world. Artificial intelligence is getting smarter as it consumes more and more data, and the "AI plus robot" pairing is becoming more common.
“This could lead to a change, a breakthrough, that would have been impossible without all this data,” says Pieter Abbeel, a professor at the University of California, Berkeley. He is also a co-founder of Covariant.ai, a startup that uses machine learning and virtual reality to train robots.
Separated at birth
In the end, this was bound to happen. In 1954, the inventor George Devol patented the design of a programmable mechanical arm. In 1961, the entrepreneur Joseph Engelberger turned that design into a remarkable and unusual machine, first put to work on a General Motors assembly line in New Jersey.
From the very beginning, a tradition took hold of overstating the "mind" of these simple machines. Engelberger chose the name "robot" for Unimate in honor of the robots Isaac Asimov wrote about. Yet his machines were a far cry from Asimov's creations: they were simple mechanical devices facing a simple production task, accomplished with the help of specialized software. Even modern industrial robots have not moved far beyond their ancestors; they must be programmed for every action they perform.
Artificial intelligence went a different way. In the 1950s, specialists began developing tools to mimic human logic and reasoning, and some scientists gave these systems a body. Back in 1948 and 1949, William Grey Walter, a researcher in Bristol, built two small autonomous machines he called Elsie and Elmer. These tortoise-like devices carried very simple hardware and control circuitry. They did not look like much, but they could find their own way to a light source. Walter built them to show that just a few neurons' worth of circuitry can give a robot a fairly complex pattern of behavior.
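The idea that minimal circuitry can produce light-seeking behavior can be sketched in a few lines of Python. This is a loose, invented illustration of the principle, not a model of Walter's actual electronics: two brightness sensors, angled left and right, steer a toy vehicle toward whichever side reads brighter.

```python
import math

def phototaxis_step(x, y, heading, light, dt=0.1):
    """One update of a toy two-sensor light-seeker: the vehicle turns toward
    whichever sensor sees more light, then rolls forward at constant speed."""
    lx, ly = light
    readings = []
    for side in (+1, -1):                    # +1: left sensor, -1: right sensor
        a = heading + side * 0.5             # sensors angled off the heading
        sx, sy = x + math.cos(a), y + math.sin(a)
        d2 = (sx - lx) ** 2 + (sy - ly) ** 2
        readings.append(1.0 / (1.0 + d2))    # brightness falls off with distance
    left, right = readings
    turn = max(-2.0, min(2.0, (left - right) * 50.0))  # steer toward the brighter side
    heading += turn * dt
    x += math.cos(heading) * dt              # roll forward
    y += math.sin(heading) * dt
    return x, y, heading

# Starting at the origin facing east, the vehicle homes in on a lamp at (5, 5).
x, y, h = 0.0, 0.0, 0.0
closest = math.hypot(5.0, 5.0)               # initial distance to the lamp
for _ in range(400):
    x, y, h = phototaxis_step(x, y, h, (5.0, 5.0))
    closest = min(closest, math.hypot(x - 5.0, y - 5.0))
```

There is no plan and no map in this controller, only a direct coupling from sensors to steering, yet the vehicle reliably closes in on the lamp; that is the point Walter's tortoises made in hardware.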
But scientists have never fully agreed on what intelligence, or mind, actually is. While they argued, AI made little practical progress. Still, it was clear that AI and robots formed a natural partnership, one built, so to speak, for the ages.
About six years ago, researchers figured out how to make AI genuinely capable. They turned to neural networks: layered algorithms, loosely modeled on the brain, that learn which features of their input matter and pass that information forward. Before long, networks had been developed that could cope with enormous amounts of data.
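To make the neural-network idea concrete, here is a deliberately tiny example in pure Python (a textbook sketch, not any production system): a network with four hidden units learns the XOR function by gradient descent, adjusting its weights until its errors shrink.

```python
import math
import random

random.seed(0)
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

# A 2-input, 4-hidden-unit, 1-output network with randomly initialised weights.
W1 = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(2)]
b1 = [0.0] * 4
W2 = [random.gauss(0.0, 1.0) for _ in range(4)]
b2 = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # the XOR function

def forward(x):
    h = [sigmoid(x[0] * W1[0][j] + x[1] * W1[1][j] + b1[j]) for j in range(4)]
    out = sigmoid(sum(h[j] * W2[j] for j in range(4)) + b2)
    return h, out

def mean_squared_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

initial_loss = mean_squared_error()
lr = 1.0
for _ in range(5000):                        # plain stochastic gradient descent
    for x, t in data:
        h, out = forward(x)
        d_out = (out - t) * out * (1 - out)            # output error signal
        for j in range(4):
            d_h = d_out * W2[j] * h[j] * (1 - h[j])    # backpropagated hidden error
            W2[j] -= lr * d_out * h[j]
            W1[0][j] -= lr * d_h * x[0]
            W1[1][j] -= lr * d_h * x[1]
            b1[j] -= lr * d_h
        b2 -= lr * d_out
final_loss = mean_squared_error()
```

Nothing in the code spells out what XOR is; the network extracts the rule from examples alone, and deep learning scales this same weight-adjustment loop to millions of parameters and images instead of four binary pairs.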
The whole field of AI was turned upside down. Deep learning, as the technology came to be known, made it possible to tackle tasks such as face recognition, speech recognition, and training self-driving vehicles on public roads. Now you could dream of a robot that not only recognizes you but also talks to you, or even brings you a cold cola from the refrigerator.
One of the first skills AI will give ordinary robots is more precise, more dexterous movement. Within the next few years, Amazon plans to use robots instead of humans to sort parcels. Many companies are already applying machine learning, and the robots themselves keep gaining experience.
“I have spent 35 years improving the coordination of robot movements, and in that time we have made very great progress. Thanks to advances in the field of AI, we are ready to take a big step forward,” says Ken Goldberg of UC Berkeley.
The AI gets a body
AI and the "iron," so to speak, have now found each other again. All that remains is to give AI a body, which could drive breakthrough development of both technologies in the near future. Robots may be a very important piece of this puzzle.
In principle, this makes sense, because a person learns by interacting with the outside world, not just contemplating it. Children begin learning about the world through games and toys, and over time a person accumulates an enormous amount of knowledge about it. According to experts, training industrial manipulators and other robots on the same principle holds great potential.
It is worth remembering that when human ancestors began walking upright and interacting more with the objects around them, the brains of those distant ancestors grew larger (although some anthropologists dispute this).
But can AI retrace the human path? Until now, AI has evolved in a closed digital space, in virtual models of fragments of our world. If an AI-controlled robot is set loose in an ordinary environment, the consequences could be very interesting. Of course, the robot will not become sentient overnight, but who knows what might happen.