MIT's New Artificial Intelligence – A System to 'See by Touching, Feel by Seeing'
What is Artificial Intelligence?
Simply put, Artificial Intelligence is intelligence demonstrated by machines.
What is new with MIT Artificial Intelligence?
The team at MIT CSAIL created a predictive artificial intelligence that equips robots to link multiple senses: it can learn to see using its sense of touch, and vice versa.
Until now, robots programmed to see or feel could not use these signals interchangeably. With MIT's artificial intelligence, that has changed: robots have learned to see by touching and feel by seeing, bridging the sensory gap.
The MIT team demonstrated this with a KUKA robot arm fitted with a tactile sensor called GelSight. Using the system, the arm can judge where an object is without seeing it, and then recognize it by touch alone. Imagine a robot reaching for a switch or a lever to pick up, and then verifying by touch that it has grasped the right object.
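The core idea is cross-modal prediction: the system learns from paired vision and touch data so that one sense can be predicted from the other. The MIT work uses deep generative models trained on real GelSight touch images, but as a loose, minimal sketch of the idea, here is a toy example that learns a linear mapping from synthetic "visual" features to "tactile" features and then predicts touch from sight alone (all data and names here are hypothetical, not from the MIT system):

```python
import numpy as np

# Toy cross-modal prediction: learn a mapping from "visual" feature
# vectors to "tactile" feature vectors using paired examples, then
# predict the tactile side from a new visual observation alone.
# (The real MIT system uses deep generative models on GelSight data;
# this linear least-squares version is only an illustration.)

rng = np.random.default_rng(0)

# Synthetic paired data: 200 samples, 8-dim visual, 4-dim tactile features.
true_map = rng.normal(size=(8, 4))           # hidden vision-to-touch relation
vision = rng.normal(size=(200, 8))           # stand-in visual features
touch = vision @ true_map + 0.01 * rng.normal(size=(200, 4))  # noisy touch

# "Training": fit the vision-to-touch mapping by least squares.
learned_map, *_ = np.linalg.lstsq(vision, touch, rcond=None)

# "Seeing by touching, feeling by seeing": predict tactile features
# for a new visual observation the model has never touched.
new_vision = rng.normal(size=(1, 8))
predicted_touch = new_vision @ learned_map

print(predicted_touch.shape)  # one 4-dim tactile prediction
```

The same paired data could be used to fit the reverse mapping (touch to vision), which is the symmetry the MIT system exploits.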
Yunzhu Li, a PhD student in computer science at MIT and lead author of the paper, said, "By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge."
What are the advantages of MIT Artificial Intelligence?
MIT's AI model can also infer how an object interacts with its environment purely from tactile signals, by blindly touching around. By bringing the two senses together, the researchers have made robots more capable and reduced the amount of data needed for tasks that involve grasping and manipulating objects. This can be seen as a significant advance in the field of machine learning and artificial intelligence.