Description

Advances in design and fabrication technologies are enabling the production and commercialization of sensor-rich robotic hands with skin-like sensor arrays. Robotic skin is poised to become a crucial interface between the robot's embodied intelligence and the external world. The need to fuse and make sense of data extracted from skin-like sensors is readily apparent. This paper presents a real-time sensor fusion algorithm that can be used to accurately estimate object position, translation, and rotation during grasping. When an object being grasped moves across the sensor array, it creates a sliding sensation; these spatio-temporal sensations are estimated by computing localized slip vectors using an optical flow approach. The results were benchmarked against an L-infinity norm approach using a nominal known object trajectory, generated by sliding and rotating an object over the sensor array with a second, high-accuracy industrial robot. The rotation and slip estimates can later be used to improve grasping quality and dexterity.
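To make the pipeline concrete, the sketch below shows one plausible way to compute localized slip vectors from consecutive tactile frames with dense optical flow, then aggregate them into a per-frame translation and rotation estimate. This is an illustration under assumptions, not the paper's implementation: the sensor geometry, Farneback parameters, and pressure-weighted aggregation are all choices made here for the example.

```python
# Hypothetical sketch of slip estimation from a skin-like sensor array.
# Assumes each frame is a 2-D array of pressure readings ("taxels").
import numpy as np
import cv2


def slip_from_tactile_frames(prev_frame: np.ndarray, next_frame: np.ndarray):
    """Estimate in-plane translation (taxels/frame) and rotation (rad/frame)
    between two consecutive pressure images."""

    def to_u8(f: np.ndarray) -> np.ndarray:
        # Normalize raw pressure to an 8-bit image for the flow routine.
        f = f.astype(np.float32)
        rng = float(f.max() - f.min())
        if rng <= 0:
            return np.zeros(f.shape, np.uint8)
        return np.uint8(255.0 * (f - f.min()) / rng)

    # Dense optical flow: one (dx, dy) slip vector per taxel.
    flow = cv2.calcOpticalFlowFarneback(
        to_u8(prev_frame), to_u8(next_frame), None,
        pyr_scale=0.5, levels=2, winsize=5, iterations=3,
        poly_n=5, poly_sigma=1.1, flags=0)

    # Weight each local slip vector by contact pressure so unloaded
    # taxels do not dilute the estimate (an assumed design choice).
    w = prev_frame.astype(np.float32)
    w /= max(float(w.sum()), 1e-9)
    translation = (flow * w[..., None]).sum(axis=(0, 1))  # mean (dx, dy)

    # Rotation: pressure-weighted angular component of the flow about
    # the contact centroid, omega = (r x v)_z / |r|^2 per taxel.
    h, wdt = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:wdt].astype(np.float32)
    cy, cx = float((ys * w).sum()), float((xs * w).sum())
    rx, ry = xs - cx, ys - cy
    r2 = rx**2 + ry**2 + 1e-9
    omega = float(((rx * flow[..., 1] - ry * flow[..., 0]) / r2 * w).sum())
    return translation, omega
```

In a benchmark like the one the abstract describes, estimates of this kind would be accumulated over the frame sequence and compared against the known trajectory commanded to the second industrial robot to quantify translation and rotation error.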
