“Blind” Cheetah 3 robot can climb stairs littered with obstacles

Original author: Jennifer Chu
  • Translation

MIT's Cheetah 3 robot can now leap and gallop across rough terrain, climb stairs littered with debris, and quickly recover its balance.

The 90-pound mechanical beast, about the size of a full-grown Labrador, is intentionally designed to do all of this without relying on cameras or any external environmental sensors. Instead, it nimbly “feels” its way through its surroundings in a manner the engineers describe as “blind locomotion,” much like making one's way across a pitch-black room.

“There are a lot of unexpected things a robot has to handle without relying too much on vision,” says the robot's designer, Sangbae Kim, associate professor of mechanical engineering at MIT. “Vision can be noisy, slightly inaccurate, and sometimes not available, and if you rely too much on vision, your robot has to be very precise in position and ultimately will be slow. So we want the robot to rely more on tactile information. That way, it can handle unexpected obstacles while moving fast.”


In October, at the International Conference on Intelligent Robots in Madrid, the researchers will present the robot's vision-free capabilities. In addition to blind locomotion, the team will demonstrate the robot's improved hardware, including an expanded range of motion compared to its predecessor, Cheetah 2, which allows the robot to stretch backwards and forwards and twist from side to side, much like a cat limbering up before a pounce.

Kim envisions that, within the next few years, the robot will be carrying out tasks that would otherwise be too dangerous or inaccessible for humans.

“Cheetah 3 is designed to do versatile tasks such as power plant inspection, which involves various terrain conditions including stairs, curbs, and obstacles on the ground,” says Kim. “I think there are countless occasions where we would want to send robots to do simple tasks instead of humans. Dangerous, dirty, and difficult work can be done much more safely through remotely controlled robots.”

Decision Algorithm


Cheetah 3 can blindly make its way up staircases and through unstructured terrain, and can quickly recover its balance in the face of unexpected forces, thanks to two new algorithms developed by Kim's team: a contact detection algorithm and a model-predictive control algorithm.

The contact detection algorithm helps the robot determine the best moment for a given leg to switch from swinging in the air to stepping on the ground.

“When it comes to switching from the air to the ground, the switching has to be done very well,” says Kim.

The algorithm helps the robot decide when a leg should transition between swing and step by constantly calculating three probabilities for each leg: the probability that the leg is making contact with the ground, the probability that the leg is generating force once it hits the ground, and the probability that the leg is in midswing. It computes these probabilities from data coming from gyroscopes, accelerometers, and the positions of the leg joints, which record the leg's angle and height relative to the ground.

If, for example, the robot unexpectedly steps on a wooden block, its body suddenly tilts, changing the robot's angle and height. That data immediately feeds into calculating the three probabilities for each leg, which the algorithm combines to estimate whether each leg should commit to pushing down on the ground or lift up and swing away in order to keep its balance, all while the robot is virtually blind.
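The article does not include any of the controller's code, but the idea of fusing several per-leg probabilities into a stance-or-swing decision can be sketched roughly as follows. This is a minimal Python illustration; the probability estimates, the weighting, and the threshold are all assumptions made for the example, not the actual Cheetah 3 implementation.

    from dataclasses import dataclass

    @dataclass
    class LegEstimate:
        """Per-leg probabilities, each in [0, 1], estimated from the IMU
        (gyroscope + accelerometer) and the leg-joint encoders."""
        p_contact: float  # probability the foot is touching the ground
        p_force: float    # probability the foot is producing ground force
        p_swing: float    # probability the leg is in midswing

    def decide_leg_mode(est: LegEstimate, commit_threshold: float = 0.6) -> str:
        """Fuse the three probabilities into a stance score and decide whether
        the leg should commit to pushing on the ground ("stance") or lift up
        and keep swinging ("swing"). Weights and threshold are illustrative."""
        stance_score = 0.5 * est.p_contact + 0.5 * est.p_force
        if stance_score >= commit_threshold and stance_score > est.p_swing:
            return "stance"  # press down and start producing force
        return "swing"       # lift and keep the leg moving

    # Example: a foot unexpectedly lands early on a wooden block.
    early_touchdown = LegEstimate(p_contact=0.85, p_force=0.70, p_swing=0.10)
    print(decide_leg_mode(early_touchdown))  # -> "stance"

In the real controller such estimates would be refreshed at every control tick and passed to the gait scheduler; the sketch only shows how a fused estimate could trigger an early or delayed touchdown.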

“If humans close our eyes and make a step, we have a mental model of where the ground might be, and we can prepare for it. But we also rely on the feel of touching the ground,” says Kim. “We are sort of doing the same thing, by combining multiple sources of information to determine the transition time.”

The researchers tested the algorithm in experiments in which Cheetah 3 trotted on a laboratory treadmill and climbed a staircase. Both surfaces were littered with random objects such as wooden blocks and rolls of tape.

"He does not know the height of each step and does not know that there are obstacles on the stairs, but he just walks without losing his balance," says Kim. "Without this algorithm, the robot would be very unstable and easily fell."

Future plans


The robot's blind locomotion is also partly made possible by the model-predictive control algorithm, which predicts how much force a given leg should apply once it has committed to a step.

“The contact detection algorithm will tell you, ‘this is the time to apply forces on the ground,’” says Kim. “But once you're on the ground, now you need to calculate what kind of forces to apply so you can move the body in the right way.”

The model-predictive control algorithm calculates the predicted positions of the robot's body and legs a half-second into the future, should a certain force be applied by any given leg as it makes contact with the ground.

“Say someone kicks the robot sideways,” says Kim. “When the foot is already on the ground, the algorithm decides: how should I specify the force on the foot? Because there is an unwanted velocity going to the left, I want to apply a force in the opposite direction to kill that velocity. If I apply 100 newtons in that opposite direction, what will happen half a second later?”

The algorithm is designed to make these calculations for each leg every 50 milliseconds, or 20 times per second. In experiments, the researchers applied unexpected forces by kicking and shoving the robot as it trotted on the treadmill, and by tugging on its leash as it climbed the obstacle-strewn staircase. They found that the model-predictive algorithm lets the robot quickly produce counter-forces to regain its balance and keep moving forward, without tipping too far in the opposite direction.
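The loop described here, predicting roughly half a second ahead and re-planning 20 times per second, can be sketched in simplified form. The single-axis toy dynamics, the candidate force grid, and the body mass below are assumptions made for illustration; they are not the Cheetah 3 controller.

    import numpy as np

    BODY_MASS_KG = 41.0      # ~90 lb robot (approximate)
    HORIZON_S = 0.5          # predict half a second into the future
    CONTROL_PERIOD_S = 0.05  # re-plan every 50 ms, i.e. 20 times per second

    def predict_lateral_velocity(v0: float, force_n: float,
                                 horizon_s: float = HORIZON_S) -> float:
        """Predict the body's sideways velocity after `horizon_s` seconds if a
        constant lateral ground force `force_n` is applied (1-D toy model)."""
        return v0 + (force_n / BODY_MASS_KG) * horizon_s

    def choose_counter_force(v0: float,
                             candidates_n=np.arange(-150.0, 151.0, 25.0)) -> float:
        """Pick the candidate force whose half-second prediction best cancels
        the unwanted sideways velocity (smallest predicted |velocity|)."""
        predicted = [abs(predict_lateral_velocity(v0, f)) for f in candidates_n]
        return float(candidates_n[int(np.argmin(predicted))])

    # Example: a shove leaves the body drifting left at 1.2 m/s (negative x).
    # Re-running this choice every CONTROL_PERIOD_S yields a corrective push.
    v_lateral = -1.2
    force = choose_counter_force(v_lateral)
    print(f"apply {force:+.0f} N; predicted drift after 0.5 s: "
          f"{predict_lateral_velocity(v_lateral, force):+.2f} m/s")

Running this kind of selection at 20 Hz, with a whole-body model in place of the single-axis toy dynamics, is the essence of the predictive scheme Kim describes: check what a candidate force does to the body half a second from now, then apply the one that kills the unwanted velocity.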

“It's thanks to this predictive control, which can apply the right forces on the ground, combined with the contact transition algorithm, that each contact is very quick and secure,” says Kim.

The team has already added cameras to the robot to give it visual feedback about its surroundings. This will help in mapping the general environment and will give the robot visual cues about larger obstacles such as doors and walls. But for now, the team is working to further improve the robot's blind locomotion.

“First, we need a very good controller without vision,” says Kim. “And when we do add vision, even if it might give you the wrong information, the leg should be able to handle the obstacle. Because what if it steps on something the camera doesn't see? What will it do? That's where blind locomotion can help. We don't want to trust our vision too much.”

This research was supported, in part, by Naver, Toyota Research Institute, Foxconn, and the Air Force Office of Scientific Research.
