The sense of touch is one of the most powerful senses humans have. It provides rich information about the environment we make contact with. It is so powerful that visually impaired people explore the world mainly through touch. Deaf-blind people trained in Tadoma can even lip-read tactilely, picking up vibrations from the lips and throat with their fingers. Even sighted humans depend heavily, and mostly unconsciously, on tactile sensing for manipulation and grasping in visually inaccessible environments. One can grab a coin or a key from a pocket by touch alone, without even looking into it.
Project BlindGrasp aims to bring a similar skill, human-like grasping based on tactile sensing, to robots.
Unlike vision systems, tactile sensing is not widely used in robots. Heavy dependence on vision may have led us to ignore the rich information tactile sensing provides, and the unavailability of high-resolution tactile sensors may be another reason. Most existing approaches to tactile sensing are confined to detecting slip, assessing grasp quality, or estimating mechanical properties such as object hardness.
I came to know about GelSight at ICRA 2017 from a couple of MIT PhD students (Wenzhen Yuan, Greg Izzat & Geronimo Mirano) working on it. They were at ICRA to present their research on using GelSight to augment the 3D point clouds generated by vision systems.
BlindGrasp needs research into both the mechanical structure of the GelSight gripper and the software algorithms to make sense of its data. The following are the main areas of concentration.
1. GelSight Finger
The original GelSight gripper has a planar sensing area of 24 x 18 mm. It works well for tasks like estimating object hardness and generating point clouds for object recognition, but it is inefficient for exploration in clutter, since it can only sense forces arriving from the normal direction. The traditional two-finger gripper is good for grasping tasks, but would perform poorly when exploring through clutter because of its non-streamlined shape. Human fingers, in contrast, are elliptical cylinders in shape, with the most sensitive area on the flattened inner surface. So one of the hardware enhancements would be to model and design a curved GelSight sensor and gripper. It is best illustrated in the figure below.
In addition to the mechanical design, this would also include a nonlinear remapping of the touch information on the curved surface onto the planar camera sensor.
2. Fingerprint like features
The original GelSight depends on features on the object to generate tactile data from the camera, so it may fail on featureless smooth surfaces like glass. Adding ridges similar to human fingerprints on the outer sensing surface of GelSight can help in detecting contact with featureless surfaces. Such ridges could also generate vibrations when the finger is moved over a surface, giving details about the surface quality and texture of the object.
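The texture-from-vibration idea above can be sketched numerically. The numbers below are illustrative assumptions, not measured values: sliding a ridged finger at speed v over a surface with spatial period lam produces a vibration at temporal frequency v / lam, which a simple spectrum of the tactile signal recovers.

```python
import numpy as np

# Illustrative sketch: recover texture spatial period from sliding vibration.
fs = 1000.0            # sample rate of the tactile signal, Hz (assumed)
v = 0.05               # sliding speed, m/s (assumed)
lam = 0.5e-3           # texture spatial period, m (assumed)
f_expected = v / lam   # induced vibration frequency: 100 Hz

# Simulated tactile signal: the ridge-induced oscillation plus sensor noise.
t = np.arange(0.0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * f_expected * t) + 0.1 * rng.standard_normal(t.size)

# Peak of the amplitude spectrum gives back the vibration frequency,
# from which the texture period follows as v / f_peak.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
lam_estimated = v / f_peak
```

The same spectral peak shifts with sliding speed, so the spatial period stays identifiable as long as the finger velocity is known.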
3. Tactile exploration and non-prehensile manipulation using Deep Reinforcement Learning
This involves developing a tactile exploration policy for controlling a manipulator. The novel approach is to use a deep reinforcement learning agent to let the robot learn how to explore and reach the goal object, with a positive reward given when it makes contact with the goal. This also involves research into new reinforcement learning agents that can learn from high-dimensional data with sparse rewards. The exploration policy would generate control commands (joint torques) for the manipulator from tactile feedback, so that it can search through a cluttered environment and pick up the desired object.
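The sparse-reward setup can be made concrete with a toy environment. This is a minimal sketch, not the actual training code: the "finger" moves on a 2D grid standing in for the workspace, actions stand in for joint commands, and the reward is +1 only on contact with the goal, zero everywhere else.

```python
import random

class TactileSearchEnv:
    """Toy sparse-reward search task: find the goal cell by contact alone."""

    def __init__(self, size=8, seed=0):
        self.size = size
        self.rng = random.Random(seed)
        self.reset()

    def reset(self):
        self.finger = [0, 0]
        self.goal = [self.rng.randrange(self.size), self.rng.randrange(self.size)]
        return tuple(self.finger)

    def step(self, action):
        # action: 0=up, 1=down, 2=left, 3=right (stand-ins for joint torques)
        dx, dy = [(0, 1), (0, -1), (-1, 0), (1, 0)][action]
        self.finger[0] = min(self.size - 1, max(0, self.finger[0] + dx))
        self.finger[1] = min(self.size - 1, max(0, self.finger[1] + dy))
        contact = self.finger == self.goal
        reward = 1.0 if contact else 0.0   # sparse: zero until contact
        return tuple(self.finger), reward, contact

# Random-exploration baseline: a learned policy should beat this.
env = TactileSearchEnv()
obs = env.reset()
total = 0.0
for _ in range(500):
    obs, r, done = env.step(env.rng.randrange(4))
    total += r
    if done:
        break
```

The sparsity is the point: a random agent rarely sees any reward here, which is exactly the credit-assignment difficulty the research has to address.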
Once the robot is able to explore the environment and successfully reach the goal object, the next phase of research should focus on how to pick it up without losing contact, and thereby losing the positional information of the object. This would also involve developing novel physics-based grasp planning algorithms involving non-prehensile manipulation.
As I do not have access to a good-quality manipulator or tactile sensors, the first phase is planned to be done entirely in simulation. The Bullet physics engine was chosen as it provides the best customizability and functionality. The simulation environment consists of a Kuka iiwa manipulator, a high-resolution tactile sensor fitted to the gripper, and a tray filled with small spherical marbles. The tray also has a couple of gold coins buried under the marbles. The task for the robot is to explore the tray of marbles and pick up a coin using tactile senses only.
This is a good scenario for bringing out the strengths of tactile exploration, as it would be impossible to detect the coin with vision while it is buried under the marbles. The robot has to dig or poke its tactile finger into the marbles, move around, and explore until it makes contact with the gold coin.
Update-1 : Nov-22-2017: Simulation of the Original Planar GelSight Sensor
I tried to simulate GelSight right after the conference. The initial approach was to simulate it in Drake, as Greg suggested, but installing Drake and getting it working was a real mess. So I tried Gazebo, but that did not turn out well either, since I could not get the contact sensing to work reliably. Then I tried the Bullet physics engine. It has soft-body simulation, which I thought would be useful for simulating the elastomer of GelSight, but its soft-body support turned out to be basic and in need of much more development. So I modelled GelSight indirectly using the rayTest functionality in Bullet, which returns the depth at which a ray makes contact with a solid body.
The following is a video of the simulated GelSight gripper. The Kuka iiwa has a WSG 50 gripper fitted with a GelSight sensor. The sensor has a sensing area of 24 x 24 mm and a resolution of 256 x 256 pixels over the planar sensing area. It outputs standard ROS 3D point cloud data, which is displayed in RViz. The simulation ran on my laptop, generating point clouds at 5 Hz.
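The rayTest idea can be reproduced without Bullet at all. The sketch below, with the object simplified to an analytic sphere, casts a grid of rays from the 24 x 24 mm sensor plane and records the depth of first contact per ray, producing the same kind of depth map the simulated sensor turns into a point cloud.

```python
import numpy as np

def simulate_planar_sensor(center, radius, size_mm=24.0, res=256, max_depth=5.0):
    """Return a (res, res) depth map in mm; max_depth where no ray hits."""
    half = size_mm / 2.0
    xs = np.linspace(-half, half, res)
    ox, oy = np.meshgrid(xs, xs)        # ray origins on the sensor plane (z = 0)
    cx, cy, cz = center

    # Ray (ox, oy, 0) + t * (0, 0, 1) meets the sphere where
    # (z - cz)^2 = r^2 - ((ox - cx)^2 + (oy - cy)^2).
    dx2 = (ox - cx) ** 2 + (oy - cy) ** 2
    disc = radius ** 2 - dx2

    depth = np.full((res, res), max_depth)
    hit = disc > 0
    # First (nearest) intersection along +z.
    depth[hit] = np.clip(cz - np.sqrt(disc[hit]), 0.0, max_depth)
    return depth

# Sphere of radius 3 mm, 4 mm above the sensor plane: nearest point at depth 1 mm.
depth = simulate_planar_sensor(center=(0.0, 0.0, 4.0), radius=3.0)
```

Each depth value, together with its (x, y) ray origin, is one point of the cloud, so converting this map to a ROS point cloud is a flat reshape.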
Update-2 : Jan-29-2018: Prototype 1
Over the weekend I made a concept prototype of the curved finger and its camera-based data acquisition. A simple, crude setup was built from commonly available materials, a cheap webcam and a glass tumbler, as shown below.
The raw image from the camera is as follows
The camera output shows the curved glass surface and the contact on it. It can then be de-warped into a rectangular image, which is easier to interpret and may aid future processing.
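The de-warping step can be sketched in pure NumPy, under the assumed geometry that the camera sees the cylindrical surface as an annulus around the image center, which is then unwrapped into a rectangular (radius, angle) image.

```python
import numpy as np

def unwrap_annulus(img, center, r_inner, r_outer, out_w=360, out_h=64):
    """Sample the annulus between r_inner and r_outer into a rectangle.

    Rows run over radius, columns over angle (0 to 2*pi)."""
    cy, cx = center
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    r = np.linspace(r_inner, r_outer, out_h)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # Nearest-neighbour lookup of the polar sample points in the source image.
    ys = np.clip((cy + rr * np.sin(tt)).round().astype(int), 0, img.shape[0] - 1)
    xs = np.clip((cx + rr * np.cos(tt)).round().astype(int), 0, img.shape[1] - 1)
    return img[ys, xs]

# Synthetic example: a bright ring in a 200 x 200 image becomes a bright band.
img = np.zeros((200, 200))
yy, xx = np.mgrid[0:200, 0:200]
ring = np.hypot(yy - 100, xx - 100)
img[(ring > 60) & (ring < 70)] = 1.0
flat = unwrap_annulus(img, center=(100, 100), r_inner=40, r_outer=90)
```

A real implementation would calibrate the center and radii from the tube geometry and use interpolated sampling, but the mapping is the same.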
As this data consists of 360-degree imagery and the sensing surface has a much more streamlined shape, I feel this can be used for exploratory finger movements in cluttered environments. The next steps would be to:
1) Add the elastomer to the curved surface, coat it with reflective paint, and add three-color RGB LEDs to reconstruct the geometry of the contact surface.
2) Try a wide-angle lens for a better field of view.
3) Make it much smaller, comparable to the size of a human finger, preferably using a small laboratory test tube with a rounded bottom instead of the glass tumbler.
The next steps would be to model, simulate, and prototype the curved GelSight gripper. At the same time, I have to strengthen my reinforcement learning skills.
Update-3 : April-15-2018: Modular Tactile Sensing Gripper - Poster accepted for ICRA-18 ActiveTouch Workshop
A new gripper design is considered, in which the tactile finger consists of reconfigurable modules that can be assembled in various ways. The modules are cylindrical in shape, similar to human fingers; this streamlined shape greatly aids smooth exploratory motion in clutter. Each module consists of a cylindrical glass tube coated with a layer of transparent silicone elastomer, as in the figure below.
The deformable elastomer has colored markers on its outer surface, which are tracked by a wide-angle camera placed at one end of the glass tube. The camera captures a 360-degree image of the finger surface, which is de-warped into rectangular images for easier interpretation and processing. The other end of the tube has a spherical mirror. The reflected image of the markers in the mirror provides a view from another angle and is intended to be used in future for reconstructing the three-dimensional positions of the markers via stereo reconstruction.
Using off-the-shelf components keeps the cost of the modular tactile finger low. A 75 x 12 mm laboratory test tube serves as the cylindrical tube. The elastomer, with Shore A hardness of 15, is made from highly transparent platinum-cure silicone, molded into a thin sheet and wrapped over the test tube. The spherical mirror is made from a chrome-coated metal ball bearing. The finger is compatible with commonly available webcams, and better results were obtained with a wide-angle Raspberry Pi camera.
The modularity of the system is achieved with an interlocking mechanism that can couple multiple fingers serially or to a fixed base; the coupling can be flexible or rigid. Fig. 2 shows the prototype modular tactile finger. Two gripper configurations using the modular tactile fingers are explored: a simple two-finger parallel gripper using two modules, and a cable-driven underactuated gripper using four such modules, shown below.
The following image shows the raw output from the camera
It is then pre-processed and segmented into blobs, whose positions are tracked over time.
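The tracking-over-time step can be sketched as nearest-neighbour matching of blob centroids between frames. This is a hypothetical helper, not the project's actual tracker: given the previous frame's centroids and the current ones, each marker is matched to its nearest candidate, yielding a per-marker displacement field.

```python
import numpy as np

def match_markers(prev, curr, max_dist=10.0):
    """Return (prev_index, curr_index) nearest-neighbour pairs between frames."""
    prev = np.asarray(prev, dtype=float)
    curr = np.asarray(curr, dtype=float)
    # Pairwise distances between previous and current centroids.
    d = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    pairs = []
    for i in range(len(prev)):
        j = int(np.argmin(d[i]))
        if d[i, j] <= max_dist:      # reject implausible jumps between frames
            pairs.append((i, j))
    return pairs

# Markers moved slightly between frames and arrive in a different order.
prev = [(10, 10), (50, 50), (90, 10)]
curr = [(52, 48), (11, 12), (91, 11)]
pairs = match_markers(prev, curr)
```

Greedy nearest-neighbour matching works while inter-frame displacements stay small compared to marker spacing; denser markers or faster motion would call for a global assignment instead.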
The system may also need an active lighting system. The next step would be to replace the markers with technologies like GelSight, which provide high-resolution tactile information and could bring human-level tactile resolution to the system. In addition to developing whole-finger tactile sensing, novel algorithms must be developed to generate or learn exploratory movements of the manipulator from the huge amount of data generated by optical tactile sensors.
Follow me on Twitter or visit later for updates.