MIT engineers use VR to put you in the head of a robot, like a real-life mech
Why it matters to you
This new telepresence tech lets humans carry out dangerous or unpleasant work remotely.
Researchers from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Lab (CSAIL) have found a clever way to combine two of the most exciting emerging technologies — virtual reality (VR) and robotics — in a new telepresence control system. Using one of Rethink Robotics’ Baxter robots, a VR headset, and maybe a page or two from mecha anime like the classic Mobile Suit Gundam Wing, they created a head-mounted “sensor display” system that puts human operators in the head of the robot they are controlling.
“A lot of jobs are difficult to do remotely, particularly in manufacturing and industry,” Jeffrey Lipton, one of the postdoctoral researchers who worked on the project, told Digital Trends. “A system like this could one day allow humans to supervise robots from a distance. This would enable employees to work from home, and could even open up manufacturing jobs to people with physical limitations, [such as those] who can’t lift heavy or bulky objects. Many industrial jobs are also terrible for human health — imagine servicing the inside of an airplane or working out on an oil rig. They can be dangerous, cramped and uncomfortable, but right now they need a human mind to understand, make decisions, and do movements. We think this model of teleoperation could allow us to keep humans safe and away from these sites while leveraging human mental capabilities.”
MIT’s system embeds the user in a virtual reality control room with multiple sensor displays, letting the operator see everything the robot sees at any given moment. To execute tasks, the human performs gestures — picked up by hand controllers — which the robot mirrors. The controls the operator interacts with are virtual rather than physical, so they can change depending on what the robot has to do at any given time.
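The mirroring step described above — hand-controller motion mapped onto the robot’s movements — might be sketched roughly as follows. This is an illustrative sketch only: the `Pose` type, motion scaling factor, and workspace limit are assumptions for the example, not details from the CSAIL system.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A simplified 3D position for a hand controller or robot gripper."""
    x: float
    y: float
    z: float

def mirror_pose(controller: Pose, scale: float = 1.0, limit: float = 0.5) -> Pose:
    """Map a hand-controller pose to a robot end-effector target.

    The operator's motion is scaled, then each axis is clamped to the
    robot's reachable workspace (here +/- `limit` metres, an assumed value)
    so a large human gesture can't command an unreachable position.
    """
    def clamp(value: float) -> float:
        return max(-limit, min(limit, value * scale))

    return Pose(clamp(controller.x), clamp(controller.y), clamp(controller.z))
```

In a real teleoperation loop, a function like this would run every frame, with the clamped target handed to the robot’s motion planner rather than applied directly.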
“We hope to extend this work to many different robots and scale up the trials to tasks beyond assembly,” Lipton continued, describing future research the CSAIL scientists hope to carry out. For more on the Baxter project, you can check out a research paper published earlier this year, titled “Baxter’s Homunculus: Virtual Reality Spaces for Teleoperation in Manufacturing.”