

Giving Touch to Soft Robot Fingertips Using Vision, Audio and Machine Learning

Our fingertip-sized vision-based tactile sensor Minsight outputs the force distribution over its omnidirectional sensing surface. We also explore audio-based touch for fabric recognition by integrating a microphone into the same soft fingertip design.

Members

Haptic Intelligence
  • Doctoral Researcher
Autonomous Learning
  • Doctoral Researcher
Haptic Intelligence
  • Intern
Haptic Intelligence
  • Guest Scientist
Haptic Intelligence
  • Intern
Haptic Intelligence
  • Postdoctoral Researcher
Empirical Inference, Autonomous Learning
  • Senior Research Scientist
Haptic Intelligence
  • Director

Publications

Haptic Intelligence Autonomous Learning Empirical Inference Conference Paper Adding Internal Audio Sensing to Internal Vision Enables Human-Like In-Hand Fabric Recognition with Soft Robotic Fingertips Andrussow, I., Solano, J., Richardson, B. A., Martius, G., Kuchenbecker, K. J. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots (Humanoids), 373-380, Seoul, South Korea, September 2025 (Published)
Distinguishing the feel of smooth silk from coarse cotton is a trivial everyday task for humans. When exploring such fabrics, fingertip skin senses both spatio-temporal force patterns and texture-induced vibrations that are integrated to form a haptic representation of the explored material. It is challenging to reproduce this rich, dynamic perceptual capability in robots because tactile sensors typically cannot achieve both high spatial resolution and high temporal sampling rate. In this work, we present a system that can sense both types of haptic information, and we investigate how each type influences robotic tactile perception of fabrics. Our robotic hand's middle finger and thumb each feature a soft tactile sensor: one is the open-source Minsight sensor that uses an internal camera to measure fingertip deformation and force at 50 Hz, and the other is our new sensor Minsound that captures vibrations through an internal MEMS microphone with a bandwidth from 50 Hz to 15 kHz. Inspired by the movements humans make to evaluate fabrics, our robot actively encloses and rubs folded fabric samples between its two sensitive fingers. Our results quantify the influence of each sensing modality on overall classification performance, revealing the high utility of the audio-based sensor. Our transformer-based method achieves a maximum fabric classification accuracy of 97% on a dataset of 20 common fabrics. Incorporating an external microphone away from Minsound increases our method's robustness in loud ambient noise conditions. To show that this audio-visual tactile sensing approach generalizes beyond the training data, we learn general representations of fabric stretchiness, thickness, and roughness.
DOI BibTeX
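The paper combines Minsound's vibration signal with Minsight's force maps before classification. As a rough illustration of that idea only (not the paper's transformer architecture), the sketch below builds a late-fusion feature vector from the two modalities; the band count, normalization, and map shape are illustrative assumptions.

```python
import numpy as np

def audio_features(wave, n_bands=16):
    """Log-magnitude spectrum pooled into coarse frequency bands
    (a hand-crafted stand-in for a learned audio embedding)."""
    spec = np.abs(np.fft.rfft(wave))
    bands = np.array_split(spec, n_bands)
    return np.log1p(np.array([b.mean() for b in bands]))

def tactile_features(force_map):
    """Flatten an (H, W, 3) map of 3D contact force vectors."""
    return force_map.reshape(-1)

def fuse(wave, force_map):
    """Late fusion: concatenate per-modality features after
    normalizing each to zero mean and unit variance."""
    a = audio_features(wave)
    t = tactile_features(force_map)
    a = (a - a.mean()) / (a.std() + 1e-8)
    t = (t - t.mean()) / (t.std() + 1e-8)
    return np.concatenate([a, t])
```

A downstream classifier (the paper uses a transformer) would then operate on such fused vectors, letting it weigh vibration cues against spatial force patterns.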

Haptic Intelligence Autonomous Learning Empirical Inference Miscellaneous Demonstration: Minsight - A Soft Vision-Based Tactile Sensor for Robotic Fingertips Andrussow, I., Sun, H., Martius, G., Kuchenbecker, K. J. Hands-on demonstration presented at the Conference on Robot Learning (CoRL), Munich, Germany, November 2024 (Published)
Beyond vision and hearing, tactile sensing enhances a robot's ability to dexterously manipulate unfamiliar objects and safely interact with humans. Giving touch sensitivity to robots requires compact, robust, affordable, and efficient hardware designs, especially for high-resolution tactile sensing. We present a soft vision-based tactile sensor engineered to meet these requirements. Comparable in size to a human fingertip, Minsight uses machine learning to output high-resolution directional contact force distributions at 60 Hz. Minsight's tactile force maps enable precise sensing of fingertip contacts, which we use in this hands-on demonstration to allow a 3-DoF robot arm to physically track contact with a user's finger. While observing the colorful image captured by Minsight's internal camera, attendees can experience how its ability to detect delicate touches in all directions facilitates real-time robot interaction.
BibTeX

Haptic Intelligence Autonomous Learning Empirical Inference Article Minsight: A Fingertip-Sized Vision-Based Tactile Sensor for Robotic Manipulation Andrussow, I., Sun, H., Kuchenbecker, K. J., Martius, G. Advanced Intelligent Systems, 5(8):2300042, August 2023, Inside back cover, DOI: 10.1002/aisy.202370035 (Published)
Intelligent interaction with the physical world requires perceptual abilities beyond vision and hearing; vibrant tactile sensing is essential for autonomous robots to dexterously manipulate unfamiliar objects or safely contact humans. Therefore, robotic manipulators need high-resolution touch sensors that are compact, robust, inexpensive, and efficient. The soft vision-based haptic sensor presented herein is a miniaturized and optimized version of the previously published sensor Insight. Minsight has the size and shape of a human fingertip and uses machine learning methods to output high-resolution maps of 3D contact force vectors at 60 Hz. Experiments confirm its excellent sensing performance, with a mean absolute force error of 0.07 N and contact location error of 0.6 mm across its surface area. Minsight's utility is shown in two robotic tasks on a 3-DoF manipulator. First, closed-loop force control enables the robot to track the movements of a human finger based only on tactile data. Second, the informative value of the sensor output is shown by detecting whether a hard lump is embedded within a soft elastomer with an accuracy of 98%. These findings indicate that Minsight can give robots the detailed fingertip touch sensing needed for dexterous manipulation and physical human–robot interaction.
DOI BibTeX
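Minsight's force maps support the closed-loop finger-tracking task described in the abstract. A minimal sketch of how a controller could consume such sensor output, assuming an (H, W, 3) force-vector map and a simple proportional law; the gain and sign conventions here are hypothetical, not taken from the paper:

```python
import numpy as np

def contact_summary(force_map):
    """Total normal force and its centroid (x, y) on the sensing
    surface, computed from the normal channel of the force map."""
    fz = force_map[..., 2]
    total = fz.sum()
    if abs(total) < 1e-9:
        return 0.0, np.zeros(2)  # no contact detected
    ys, xs = np.indices(fz.shape)
    centroid = np.array([(xs * fz).sum(), (ys * fz).sum()]) / total
    return total, centroid

def p_control(target_force, measured_force, gain=0.002):
    """Proportional velocity command along the contact normal:
    push in when force is too low, back off when too high."""
    return gain * (target_force - measured_force)
```

Running such a loop at the sensor's frame rate, the contact centroid drives lateral tracking while the proportional term regulates contact force against the user's finger.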

Haptic Intelligence Autonomous Learning Empirical Inference Miscellaneous A Soft Vision-Based Tactile Sensor for Robotic Fingertip Manipulation Andrussow, I., Sun, H., Kuchenbecker, K. J., Martius, G. Workshop paper (1 page) presented at the IROS Workshop on Large-Scale Robotic Skin: Perception, Interaction and Control, Kyoto, Japan, October 2022 (Published)
For robots to become fully dexterous, their hardware needs to provide rich sensory feedback. High-resolution haptic sensing similar to the human fingertip can enable robots to execute delicate manipulation tasks like picking up small objects, inserting a key into a lock, or handing a cup of coffee to a human. Many tactile sensors have emerged in recent years; one especially promising direction is vision-based tactile sensors due to their low cost, low wiring complexity and high-resolution sensing capabilities. In this work, we build on previous findings to create a soft fingertip-sized tactile sensor. It can sense normal and shear contact forces all around its 3D surface with an average prediction error of 0.05 N, and it localizes contact on its shell with an average prediction error of 0.5 mm. The software of this sensor uses a data-efficient machine-learning pipeline to run in real time on hardware with low computational power like a Raspberry Pi. It provides a maximum data frame rate of 60 Hz via USB.
URL BibTeX