Now Artificial Intelligence Can Feel Objects Like a Human

For humans, it is simple to predict how an object will feel by looking at it, or to tell what an object looks like by touching it; for machines, however, this remains a major challenge. A new robot developed by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is attempting to close that gap.

The group took a KUKA robot arm and added a tactile sensor known as GelSight, which was created by Ted Adelson’s group at CSAIL. The data collected by GelSight was then fed to an AI so it could learn the relationship between visual and tactile information.

To teach the AI to identify objects by touch, the group recorded 12,000 videos of 200 objects, including fabrics, tools, and household items, being touched. The videos were broken down into still images, and the AI used this dataset to connect tactile and visual data.
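As a rough illustration of how such a dataset might be organized, the sketch below pairs each extracted video frame with the GelSight image captured at the same moment. This is not CSAIL’s actual code; the directory layout, the file naming, and the PyTorch Dataset wrapper are all assumptions made for the example.

```python
# A minimal sketch (not CSAIL's code) of pairing visual frames with GelSight
# tactile images for cross-modal training. File layout here is hypothetical.
import os
from dataclasses import dataclass
from typing import List, Tuple

from PIL import Image
import torch
from torch.utils.data import Dataset


@dataclass
class TouchSample:
    visual_path: str   # camera frame of the arm touching the object
    tactile_path: str  # GelSight image captured at the same moment


class VisionTouchDataset(Dataset):
    """Yields (visual frame, tactile image) pairs extracted from the touch videos."""

    def __init__(self, root: str, transform=None):
        self.transform = transform
        self.samples: List[TouchSample] = []
        # Hypothetical layout: root/<object>/frame_0001.png and touch_0001.png
        for obj in sorted(os.listdir(root)):
            obj_dir = os.path.join(root, obj)
            frames = sorted(f for f in os.listdir(obj_dir) if f.startswith("frame_"))
            for f in frames:
                t = f.replace("frame_", "touch_")
                self.samples.append(
                    TouchSample(os.path.join(obj_dir, f), os.path.join(obj_dir, t))
                )

    def __len__(self) -> int:
        return len(self.samples)

    def __getitem__(self, idx: int) -> Tuple[torch.Tensor, torch.Tensor]:
        s = self.samples[idx]
        visual = Image.open(s.visual_path).convert("RGB")
        tactile = Image.open(s.tactile_path).convert("RGB")
        if self.transform:
            visual, tactile = self.transform(visual), self.transform(tactile)
        return visual, tactile
```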

“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” says Yunzhu Li, CSAIL Ph.D. student and lead author of a new paper about the system. “By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.”
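The two directions Li describes, predicting touch from sight and sight from touch, can be pictured as a pair of image-to-image networks. The toy sketch below is not the architecture from the paper; the layer sizes, the 64x64 resolution, and the image_translator helper are placeholders chosen only to show the idea of translating a camera frame into a predicted GelSight image and back.

```python
# A toy illustration (not the paper's model) of the two prediction directions:
# vision-to-touch maps a camera frame to a predicted tactile image, and
# touch-to-vision does the reverse. All layer sizes are arbitrary placeholders.
import torch
import torch.nn as nn


def image_translator() -> nn.Module:
    """A tiny encoder-decoder that maps one 3x64x64 image modality to another."""
    return nn.Sequential(
        nn.Conv2d(3, 16, 4, stride=2, padding=1),             # 64 -> 32
        nn.ReLU(),
        nn.Conv2d(16, 32, 4, stride=2, padding=1),            # 32 -> 16
        nn.ReLU(),
        nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1),   # 16 -> 32
        nn.ReLU(),
        nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1),    # 32 -> 64
        nn.Sigmoid(),
    )


vision_to_touch = image_translator()  # camera frame -> predicted tactile image
touch_to_vision = image_translator()  # tactile image -> predicted visual patch

frame = torch.rand(1, 3, 64, 64)          # dummy camera frame
predicted_touch = vision_to_touch(frame)  # what touching this spot might feel like
print(predicted_touch.shape)              # torch.Size([1, 3, 64, 64])
```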

For now, the robot can only identify objects in a controlled environment. The next step is to build a larger dataset so the robot can work in more diverse settings.

“Methods like this have the potential to be very useful for robotics, where you need to answer questions like ‘is this object hard or soft?’ or ‘if I lift this mug by its handle, how good will my grip be?’,” says Andrew Owens, a postdoctoral researcher at the University of California, Berkeley. “This is a very challenging problem, since the signals are so different, and this model has demonstrated great capability.”