What do NASA, the Oculus Rift and Kinect 2 have in common?

NASA's Jet Propulsion Laboratory has been looking for simpler, more natural ways to control robots in space for some time. Its experiments so far have favored the Leap Motion controller for remotely driving a rover, and the Oculus Rift paired with the Virtuix Omni for a virtual tour of the Red Planet.

So it made sense for JPL to sign up as a Kinect for Windows developer to get its hands on the new Kinect 2 (which, oddly enough, is not available as a standalone device separate from the Xbox One) and to see what Microsoft's technology can offer robotics.
The laboratory received its dev kit at the end of November, and after several days of work it was able to connect the Oculus Rift to the Kinect 2 in order to manipulate a robotic arm remotely. In an interview, a group of JPL engineers said that the combination of the Oculus head-mounted display and the Kinect motion sensor produced the “most exciting interface” JPL has developed to date.
JPL took part in the first Kinect developer program, so the lab was already very familiar with how Kinect works. It developed a number of applications and worked with Microsoft to release a game in which the player had to safely land Curiosity on Mars. The second Kinect, however, offers far greater precision and accuracy than the first. “This allowed us to track open and closed hand states and the rotation of the wrist,” says engineer Victor Law. “Given all these new tracking points and rotational degrees of freedom, we were able to manipulate the arm much better.”
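To make that idea concrete, here is a minimal sketch of how hand state and wrist rotation from a skeletal tracker might be mapped onto arm commands. The `HandSample` and `ArmCommand` types, the enum, and the joint limits are illustrative assumptions, not JPL's actual code or the Kinect SDK API.

```python
from dataclasses import dataclass
from enum import Enum

class HandState(Enum):
    # Kinect 2 can distinguish open and closed hand states; this enum
    # is a simplified stand-in, not the SDK's own type.
    OPEN = 0
    CLOSED = 1

@dataclass
class HandSample:
    """One frame of (hypothetical) hand-tracking data."""
    state: HandState
    wrist_roll_deg: float   # rotation about the forearm axis

@dataclass
class ArmCommand:
    """Command sent to a (hypothetical) robot arm controller."""
    gripper_closed: bool
    wrist_roll_deg: float

def hand_to_command(sample: HandSample) -> ArmCommand:
    # Clamp the wrist angle to the arm's mechanical limits
    # (the +/-90 degree range is an assumed limit, for illustration).
    roll = max(-90.0, min(90.0, sample.wrist_roll_deg))
    return ArmCommand(
        gripper_closed=(sample.state == HandState.CLOSED),
        wrist_roll_deg=roll,
    )

if __name__ == "__main__":
    cmd = hand_to_command(HandSample(HandState.CLOSED, 112.0))
    print(cmd)  # ArmCommand(gripper_closed=True, wrist_roll_deg=90.0)
```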

Alex Menzies, also a human-machine interaction engineer, describes this combination of a head-mounted display with a Kinect motion sensor as nothing short of a revolution. “For the first time, with a consumer-grade sensor, we can control the orientation and rotation of a robotic limb. Plus, we can really immerse someone in the environment, so that the robot feels like an extension of their own body: you look at the scene from the person's perspective with full stereo vision, and all the visual input is correctly displayed where your limbs are in the real world...” This, he says, is very different from simply watching yourself on a screen, where it is very hard to correlate what you see with your own movements. “I felt there was a much better understanding of where objects were located.”
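The “visual input displayed where your limbs are” part comes down to a coordinate transform: joint positions reported in the sensor's frame must be re-expressed in the headset's frame before rendering. Below is a hedged sketch of that single step; the sensor-to-head calibration values are made up for illustration.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Assumed calibration: the sensor sits 0.5 m in front of the user's head,
# facing back at them (180-degree yaw). A real setup would calibrate this.
yaw180 = np.array([[-1.0, 0.0,  0.0],
                   [ 0.0, 1.0,  0.0],
                   [ 0.0, 0.0, -1.0]])
sensor_to_head = make_transform(yaw180, np.array([0.0, 0.0, 0.5]))

def to_head_frame(joint_sensor: np.ndarray) -> np.ndarray:
    """Re-express a joint position (metres, sensor frame) in the head frame."""
    p = np.append(joint_sensor, 1.0)       # homogeneous coordinates
    return (sensor_to_head @ p)[:3]

# A wrist seen 0.3 m to one side of and 1.2 m in front of the sensor:
print(to_head_frame(np.array([0.3, -0.2, 1.2])))  # -> [-0.3 -0.2 -0.7]
```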
As you can imagine, signal delay is a pressing problem, since most of these robots are in space. Jeff Norris, JPL's mission operations innovation lead, says the setup is used mainly to designate targets for the robots to work toward.
Law and Menzies note, however, that the display shows a ghosted state indicating where your hand is and a solid one showing where the robot currently is, so the delay itself is visualized on screen.
“It feels pretty natural, because the ghost hand moves immediately and you see the robot catching up to its position,” says Menzies. “You command a little ahead, but you don't feel the lag.”
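The ghost-and-solid trick can be modeled as two poses drawn every frame: the commanded pose, taken straight from the tracker, and the confirmed pose, which arrives after the round-trip delay. Here is a toy sketch of that buffering; the frame-count delay, the scalar pose, and the class name are assumptions for illustration, not the actual JPL software.

```python
from collections import deque

class DelayedRobotView:
    """Tracks what to draw each frame: the instant 'ghost' command and
    the delayed 'solid' robot state (a toy model of link latency)."""

    def __init__(self, delay_frames: int):
        # Pre-fill the pipeline so the solid pose is defined from frame 0.
        self.pipeline = deque([0.0] * delay_frames, maxlen=delay_frames)

    def step(self, commanded_pose: float) -> tuple[float, float]:
        """Push this frame's command; return (ghost_pose, solid_pose)."""
        solid_pose = self.pipeline[0]          # what the robot reports now
        self.pipeline.append(commanded_pose)   # still in flight to the robot
        return commanded_pose, solid_pose

view = DelayedRobotView(delay_frames=3)
for frame, cmd in enumerate([10.0, 20.0, 30.0, 40.0, 50.0]):
    ghost, solid = view.step(cmd)
    print(f"frame {frame}: ghost={ghost}, solid={solid}")
# The ghost tracks the hand instantly; the solid pose lags 3 frames behind.
```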

“We are building partnerships with commercial companies that make devices which, perhaps, weren't primarily built for space exploration,” says Law. “That stretches our space-exploration funding much further than if we started from scratch. It also means we can build systems the general public can access. Imagine how inspiring it would be for a seven-year-old child to control a space robot using a controller they are already familiar with!”
Of course, the ultimate goal is not just to control a robot arm, but space robots as a whole. As the demo shows, JPL hopes to bring the same technology to machines like Robonaut 2, which is currently deployed aboard the ISS. “We want to carry this development forward, ultimately extending it to control robots such as Robonaut 2,” says Law. “There is work that is too dull, dirty or even dangerous for an astronaut to do, but we still basically want a human in control of the robot... If we can make the robot more effective for us, we can do more in less time.”

Original article: www.engadget.com/2013/12/23/nasa-jpl-control-robotic-arm-kinect-2

P.S. A thought of my own... this is already almost straight out of one of Lem's books. They're looking far ahead.
