Clarkson Professors Receive Robot to Continue Human-Robot Handover Interaction Research


Potsdam, NY, Dec. 19, 2019 (GLOBE NEWSWIRE) -- Clarkson University Computer Science Assistant Professors Natasha and Sean Banerjee recently received a new mobile grasping robot through a joint Facebook and Carnegie Mellon grant; the robot will be used to further their research on intuitive human-robot handover interactions.

The Banerjees’ proposal was one of just 30 selected to receive one of the robots, known as a LoCoBot. The robot will help the professors continue their research on making robots human-aware: using deep learning to automatically detect where humans prefer to hold objects, so that a robot can provide assistance with that awareness built in.

“The driving force behind this research was that we are very rapidly moving toward a world where robots are going to be a part of our daily interactions, so it is really important for those robots to collaborate and cooperate with humans, because it does not make sense for them to just be independent,” said Natasha Banerjee. “We are spurring a new area of research on creating artificial intelligence (AI) algorithms for robots that are human-aware. There is a pretty broad research area on human-robot interaction, or HRI, but a lot of this research has focused on experimental or toy problems. My research makes novel contributions to HRI by assessing how to ensure a robot hands over an object to a human such that the human is comfortable holding it.”

Banerjee said she recently presented work on detecting where humans prefer to hold cups, research that can help determine where a robot should grip an object to best interact with humans.

“Let’s say you have an elderly individual and they want assistance. A cup is at a height where they are not able to reach it. If you had an assistive robot with a gripper arm, then the robot should hold the cup by the body so the person can grasp it by the handle, especially if there are hot contents. A robot’s gripper is able to handle that heat better than a human hand,” Banerjee said.

Banerjee said her research is beginning to differ in that no one else is taking a data-driven perspective, and most other researchers have looked at only one object at a time, such as a bottle or a screwdriver.

“If you want these robots to be universally acceptable, they have to be able to understand any object in your environment and predict where a human is likely to hold it,” she said.

Being able to predict this requires machine learning. Banerjee said she and her team are using a specialized class of neural networks that predict a distribution map indicating where humans are more likely to hold an object.
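As a rough illustration of the idea, and not the team’s actual model, the sketch below shows a minimal encoder-decoder network in PyTorch that maps a four-channel color-plus-depth image to a per-pixel probability that a human would grasp there. The class name GraspHeatmapNet, the layer sizes, and the 128x128 input are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class GraspHeatmapNet(nn.Module):
    """Illustrative (hypothetical) encoder-decoder: maps a 4-channel
    RGB-D image to a per-pixel human grasp-preference probability."""
    def __init__(self):
        super().__init__()
        # Downsample the image twice, then upsample back to full size.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, rgbd):
        # rgbd: (batch, 4, H, W) -- three color channels plus depth
        logits = self.decoder(self.encoder(rgbd))
        return torch.sigmoid(logits)  # values in [0, 1] per pixel

model = GraspHeatmapNet()
heatmap = model(torch.rand(1, 4, 128, 128))  # dummy RGB-D frame
print(heatmap.shape)  # torch.Size([1, 1, 128, 128])
```

Trained on images annotated with human grasp locations, a network like this would output the kind of distribution map Banerjee describes.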

The robot is equipped with a camera that captures images combining color and depth. From these images, the network tells the robot where a human would prefer to hold an object, and the robot then plans to grip the object in places a human would tend not to. The method is designed to generalize, producing predictions for ordinary, everyday objects.
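Continuing the sketch above, the complementary-grasp step might look like the following, where the robot picks the point with the lowest predicted human preference that still lies on the object. The function name and the toy data are hypothetical.

```python
import numpy as np

def pick_robot_grasp(human_pref_map, object_mask):
    """Choose a grasp pixel where predicted human preference is lowest,
    restricted to pixels that actually lie on the object."""
    scores = np.where(object_mask, human_pref_map, np.inf)
    y, x = np.unravel_index(np.argmin(scores), scores.shape)
    return int(x), int(y)  # image coordinates for the gripper target

# Toy example: preference rises left to right (say, toward a handle),
# so the robot targets the far-left region (say, the cup body).
pref = np.tile(np.linspace(0.1, 0.9, 8), (8, 1))
mask = np.ones((8, 8), dtype=bool)
print(pick_robot_grasp(pref, mask))  # -> (0, 0)
```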

The Banerjees work with three students on the project. Yijun Jiang, a computer science graduate student, conducts research and develops algorithms for the project. Elim Schenck, a double major in computer science and computer engineering, supports Jiang’s work by helping develop algorithms and has been in charge of learning the controls for the robot. Electrical engineering student Jack Lamuraglia has been in charge of assembling the robotic platform and getting it running.

Photo (left to right): Yijun Jiang, computer science graduate student; Elim Schenck ’21, computer science and computer engineering; and Assistant Professors of Computer Science Natasha and Sean Banerjee.
