The robot finger can localize touch with high precision over a large, multi-curved surface. (Image Credit: Columbia Engineering)
Researchers at Columbia Engineering have developed a new robotic finger with a sense of touch. The finger can localize touch to within 1 mm over a large, multi-curved surface, and it could be used for robotic manipulation in a variety of tasks, such as manufacturing or nuclear decommissioning. The team published their findings in a paper on February 21, 2020.
“There has long been a gap between stand-alone tactile sensors and fully integrated tactile fingers--tactile sensing is still far from ubiquitous in robotic manipulation,” says Matei Ciocarlie, associate professor at Columbia Engineering. “In this paper, we have demonstrated a multi-curved robotic finger with accurate touch localization and normal force detection over complex 3D surfaces.”
Existing touch-sensing technologies have been difficult to integrate into robot fingers for several reasons: they are hard to wrap around multi-curved surfaces, they require large numbers of wires, and they are difficult to fit into small fingertips, which has kept them out of dexterous hands. The team took a different approach: overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.
By measuring the light transported between the emitters and receivers, the team can gather a very rich set of signals that change as the finger deforms under touch. They then showed that data-driven deep-learning methods can extract the key information from this data, including contact location and applied normal force, without relying on analytical models. The result is a fully integrated, sensor-laden robot finger with a low wire count, built with accessible manufacturing methods and designed for easy integration into dexterous hands.
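A minimal sketch of the kind of data-driven mapping described above, not the authors' published architecture: a small feed-forward network that regresses contact location and normal force directly from the raw vector of light measurements. The signal count (~1,000) and the layer sizes are assumptions for illustration.

```python
# Sketch only: a regressor from raw light signals to contact location + force.
# The architecture and dimensions are assumptions, not the paper's exact model.
import torch
import torch.nn as nn

NUM_SIGNALS = 1000   # approximate number of emitter-receiver measurements (assumed)

contact_model = nn.Sequential(
    nn.Linear(NUM_SIGNALS, 256),
    nn.ReLU(),
    nn.Linear(256, 64),
    nn.ReLU(),
    nn.Linear(64, 4),   # outputs: contact location (x, y, z) + normal force
)

# Inference on one frame of sensor readings (random placeholder data here).
signals = torch.rand(1, NUM_SIGNALS)
location_and_force = contact_model(signals)
print(location_and_force.shape)   # torch.Size([1, 4])
```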
For this project, the team used light to sense touch. More than 30 LEDs shine light into a layer of transparent silicone beneath the finger's skin, and more than 30 photodiodes measure how that light bounces around inside it. When the finger comes into contact with something, its skin deforms and the light shifts within the transparent layer. From these changes, the researchers collect approximately 1,000 signals that carry information about the contact. Because light can bounce around inside a curved volume, the signals can cover 3D shapes such as a fingertip.
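A conceptual sketch of how roughly 30 emitters and 30 receivers can yield about 1,000 signals: light one LED at a time and record every photodiode, giving one measurement per emitter-receiver pair (32 x 32 = 1,024 here). The scanning scheme, the exact counts, and the hardware helper below are illustrative assumptions, not the published firmware.

```python
# Sketch: build one ~1,000-dimensional signal frame from pairwise measurements.
import numpy as np

NUM_LEDS = 32          # assumed emitter count
NUM_PHOTODIODES = 32   # assumed receiver count

def read_photodiodes(active_led: int) -> np.ndarray:
    """Hypothetical hardware read; replaced here by synthetic data."""
    return np.random.rand(NUM_PHOTODIODES)

def capture_frame() -> np.ndarray:
    """Return one flattened frame of emitter-receiver light measurements."""
    frame = np.empty((NUM_LEDS, NUM_PHOTODIODES))
    for led in range(NUM_LEDS):
        frame[led] = read_photodiodes(led)  # turn on one LED, sample all receivers
    return frame.ravel()                    # ~1,024-dimensional signal vector

signals = capture_frame()
print(signals.shape)   # (1024,)
```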
The robotic finger going through its manufacturing stages. From left to right: 3D-printed finger, flexible circuit board, transparent silicone layer, and, lastly, the reflective skin. (Image Credit: Columbia Engineering)
“The human finger provides incredibly rich contact information--more than 400 tiny touch sensors in every square centimeter of skin!” says Ciocarlie. “That was the model that pushed us to try and get as much data as possible from our finger. It was critical to be sure all contacts on all sides of the finger were covered--we essentially built a tactile robot finger with no blind spots.”
The team also designed the finger's data to be processed by machine learning algorithms. Because the many signals partially overlap one another, the data is too complex for humans to interpret directly. Today's machine learning techniques, however, can be trained to extract the information researchers need most: where the finger is being touched, what is touching it, how much force is being applied, and so on.
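A hedged sketch of the supervised-learning setup this implies: pairs of signal vectors and ground-truth contact labels are used to fit a regressor. The dataset, loss, and optimizer settings below are placeholders for illustration, not the authors' training procedure.

```python
# Sketch: fit a regressor from sensor frames to labeled contact data.
import torch
import torch.nn as nn

NUM_SIGNALS, NUM_SAMPLES = 1000, 512
signals = torch.rand(NUM_SAMPLES, NUM_SIGNALS)   # placeholder sensor frames
labels = torch.rand(NUM_SAMPLES, 4)              # placeholder (x, y, z, force) labels

model = nn.Sequential(nn.Linear(NUM_SIGNALS, 256), nn.ReLU(), nn.Linear(256, 4))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(signals), labels)   # compare predictions to ground truth
    loss.backward()
    optimizer.step()
```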
“Our results show that a deep neural network can extract this information with very high accuracy,” says Ioannis Kymissis, professor of electrical engineering at Columbia and co-author of the study. “Our device is truly a tactile finger designed from the very beginning to be used in conjunction with AI algorithms.”
The team also built the finger so that it can easily be mounted on robotic hands. Although the finger gathers close to 1,000 signals, it needs only a 14-wire cable to connect it to the hand and requires no complex off-board electronics. Two dexterous hands in the lab, one with three fingers and one with four, are already being outfitted with these fingers. In the upcoming months, the team will showcase dexterous manipulation abilities with these hands, based on tactile and proprioceptive data.
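One plausible way a ~1,000-signal frame can travel over such a thin cable, shown purely as a conceptual assumption rather than the paper's documented wiring: digitize every reading on the finger itself and stream the whole frame as a single serial packet, so the wire count stays fixed no matter how many signals are collected.

```python
# Sketch: serialize one frame of readings so it can cross a thin cable.
# The packet format here is invented for illustration.
import struct
import numpy as np

def pack_frame(signals: np.ndarray) -> bytes:
    """Serialize one frame of readings as 16-bit values with a length header."""
    readings = np.clip(signals, 0.0, 1.0)
    counts = (readings * 65535).astype(np.uint16)   # 16-bit ADC-style values
    return struct.pack(f"<H{len(counts)}H", len(counts), *counts)

frame = pack_frame(np.random.rand(1024))
print(len(frame))   # 2-byte header + 2048 bytes of readings = 2050 bytes
```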
“Dexterous robotic manipulation is needed now in fields such as manufacturing and logistics, and is one of the technologies that, in the longer term, are needed to enable personal robotic assistance in other areas, such as healthcare or service domains,” Ciocarlie adds.
Have a story tip? Message me at: cabe(at)element14(dot)com