System uses penetrative radio frequency to pinpoint items, even when they’re hidden from view.

MIT researchers developed a picking robot that combines vision with radio frequency (RF) sensing to find and grasp objects, even if they’re hidden from view. The technology could aid fulfillment in e-commerce warehouses. Credit: Courtesy of the researchers

In recent years, robots have gained artificial vision, touch, and even smell. “Researchers have been giving robots human-like perception,” says MIT Associate Professor Fadel Adib. In a new paper, Adib’s team is pushing the technology a step further. “We’re trying to give robots superhuman perception,” he says.

The researchers have developed a robot that uses radio waves, which can pass through walls, to sense occluded objects. The robot, called RF-Grasp, combines this powerful sensing with more traditional computer vision to locate and grasp items that might otherwise be blocked from view. The advance could one day streamline e-commerce fulfillment in warehouses or help a machine pluck a screwdriver from a jumbled toolkit.

The research will be presented in May at the IEEE International Conference on Robotics and Automation. The paper’s lead author is Tara Boroushaki, a research assistant in the Signal Kinetics Group at the MIT Media Lab. Her MIT co-authors include Adib, who is the director of the Signal Kinetics Group; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. Other co-authors include Junshan Leng, a research engineer at Harvard University, and Ian Clester, a PhD student at Georgia Tech.