In-Hand Rolling

Human hand rolling
  1. What and Why
  2. Previous Works
  3. Status
  4. Going Further

What and Why

This is a recent, exploratory side project. For another project, I trained a CNN to classify the objects grasped by a hand equipped with GelSight-based tactile sensors. It worked pretty well (~90% accuracy) on my test objects. Analysing the false classifications indicated that the tactile data may not be perfect during every grasp. We humans also get confused in the same way occasionally when we grab objects with just two fingers. We then either close the fingers to make more contact area with the object, or roll the object between our fingers to identify it. This gives us more data and increases our confidence in the classification.
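
The sketch below shows, in PyTorch, the general shape of such a tactile classifier; the class name, image size and layer sizes are my own placeholders, not the actual model used in this project.

```python
# Minimal sketch of a CNN that classifies objects from GelSight tactile images.
# Architecture and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn

class TactileCNN(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        # GelSight frames are ordinary RGB images, so a small conv stack suffices.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of tactile images, shape (B, 3, H, W)
        h = self.features(x).flatten(1)
        return self.classifier(h)  # raw logits; softmax gives class probabilities

# Usage: class probabilities for a single tactile frame
model = TactileCNN(num_classes=2)
frame = torch.rand(1, 3, 64, 64)           # placeholder GelSight image
probs = torch.softmax(model(frame), dim=1)
```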

Previous Works

Much research has been done on tactile object recognition as well as on in-hand manipulation [1], [2]. Unlike those approaches, this work explores learning a finger-movement repertoire that could maximize in-hand object recognition/localization capabilities.

Status

The following video shows the prototype gripper classifying two test objects (geodesic spheres with hexagonal and triangular faces, which are easier to distinguish by touch).

We can see that it misclassifies the objects once in a while.

A modular third axis, inserted between the finger and the gripper, can rotate the object in hand.

The classification probabilities collected during this motion are averaged to get a more accurate estimate of the object's identity.
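
A minimal sketch of this averaging step, assuming the classifier returns a per-class probability vector for every tactile frame captured while the object is rolled (function and variable names are illustrative):

```python
import numpy as np

def fuse_rolling_predictions(per_frame_probs):
    """Average per-frame class probabilities collected during one roll.

    per_frame_probs: array of shape (num_frames, num_classes), each row
    summing to 1 (e.g. softmax outputs of the tactile CNN).
    """
    probs = np.asarray(per_frame_probs)
    mean_probs = probs.mean(axis=0)           # average belief over the motion
    return mean_probs, int(np.argmax(mean_probs))

# Example: three frames of a two-class problem (hexagonal vs. triangular sphere)
frames = [[0.55, 0.45], [0.80, 0.20], [0.70, 0.30]]
belief, label = fuse_rolling_predictions(frames)
```

Averaging treats each frame as an independent vote, so a few noisy contacts during the roll get diluted rather than deciding the final label.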

(Note: this has been tested only with symmetric objects, which are easy to roll, and it is still an ongoing project.)

Going Further

More experimentation, and 3D reconstruction of the in-hand object using techniques such as ICP.
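
As a rough sketch of that direction, ICP could register successive tactile contact patches into a partial 3D model. The snippet below uses Open3D's point-to-point ICP; it assumes each contact patch can be converted to a small point cloud, and the names and threshold are placeholders, not project code.

```python
import numpy as np
import open3d as o3d

def to_cloud(points_xyz):
    """Wrap an (N, 3) array of contact points as an Open3D point cloud."""
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(np.asarray(points_xyz))
    return cloud

def align_patches(source_xyz, target_xyz, max_dist=0.005):
    """Register a new contact patch against the partial model with point-to-point ICP."""
    result = o3d.pipelines.registration.registration_icp(
        to_cloud(source_xyz), to_cloud(target_xyz), max_dist,
        np.eye(4),  # initial guess: identity transform
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 transform to merge the patch into the model
```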
