Conference Paper 2025 · Haptic Intelligence · Autonomous Learning · Empirical Inference

Adding Internal Audio Sensing to Internal Vision Enables Human-Like In-Hand Fabric Recognition with Soft Robotic Fingertips

Iris Andrussow · Haptic Intelligence · Doctoral Researcher
Luis Jans Solano Vega · Haptic Intelligence · Intern
Benjamin A. Richardson · Haptic Intelligence · Postdoctoral Researcher
Georg Martius · Empirical Inference, Autonomous Learning · Senior Research Scientist
Katherine J. Kuchenbecker · Haptic Intelligence · Director

[Image: Minsound]

Distinguishing the feel of smooth silk from coarse cotton is a trivial everyday task for humans. When exploring such fabrics, fingertip skin senses both spatio-temporal force patterns and texture-induced vibrations that are integrated to form a haptic representation of the explored material. It is challenging to reproduce this rich, dynamic perceptual capability in robots because tactile sensors typically cannot achieve both high spatial resolution and high temporal sampling rate. In this work, we present a system that can sense both types of haptic information, and we investigate how each type influences robotic tactile perception of fabrics. Our robotic hand's middle finger and thumb each feature a soft tactile sensor: one is the open-source Minsight sensor that uses an internal camera to measure fingertip deformation and force at 50 Hz, and the other is our new sensor Minsound that captures vibrations through an internal MEMS microphone with a bandwidth from 50 Hz to 15 kHz. Inspired by the movements humans make to evaluate fabrics, our robot actively encloses and rubs folded fabric samples between its two sensitive fingers. Our experiments test the influence of each sensing modality on overall classification performance, showing the high utility of the audio-based sensor. Our transformer-based method achieves a maximum fabric classification accuracy of 97% on a dataset of 20 common fabrics. Incorporating an external microphone positioned away from Minsound increases our method's robustness to loud ambient noise. To show that this audio-visual tactile sensing approach generalizes beyond the training data, we learn general representations of fabric stretchiness, thickness, and roughness.
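
The paper's code is not reproduced here, but the abstract's description (50 Hz Minsight force/deformation images plus Minsound audio, fused by a transformer and classified over 20 fabrics) can be illustrated with a minimal PyTorch sketch. Everything below, including the module names, token layout, 64 mel bins, embedding size, and all hyperparameters, is an assumption made for illustration; it is not the authors' actual architecture.

# Hypothetical sketch of an audio-visual tactile fusion classifier,
# loosely following the paper's description. All dimensions and
# hyperparameters are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class FabricFusionClassifier(nn.Module):
    def __init__(self, n_classes: int = 20, d_model: int = 128):
        super().__init__()
        # Tactile-image branch: embed each 50 Hz Minsight frame.
        self.vision_embed = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, d_model),
        )
        # Audio branch: embed mel-spectrogram slices from Minsound
        # (64 mel bins assumed here).
        self.audio_embed = nn.Linear(64, d_model)
        # Learned modality tags so the transformer can tell tokens apart.
        self.modality_tag = nn.Parameter(torch.zeros(2, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, frames, spec):
        # frames: (B, T_v, 3, H, W) Minsight images; spec: (B, T_a, 64).
        B, T_v = frames.shape[:2]
        v = self.vision_embed(frames.flatten(0, 1)).view(B, T_v, -1)
        a = self.audio_embed(spec)
        tokens = torch.cat([
            self.cls_token.expand(B, -1, -1),
            v + self.modality_tag[0],
            a + self.modality_tag[1],
        ], dim=1)
        # Classify from the [CLS] token after joint attention.
        return self.head(self.encoder(tokens)[:, 0])


# Example: one 2 s rub -> 100 Minsight frames, 150 spectrogram slices.
model = FabricFusionClassifier()
logits = model(torch.randn(2, 100, 3, 40, 40), torch.randn(2, 150, 64))
print(logits.shape)  # torch.Size([2, 20])

In a real pipeline, the random tensors would be replaced by time-synchronized Minsight frames and Minsound mel-spectrogram slices recorded during one rubbing motion; the shared attention over both token streams is one plausible way to realize the audio-visual fusion the abstract describes.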

Author(s): Andrussow, Iris and Solano Vega, Luis Jans and Richardson, Benjamin A. and Martius, Georg and Kuchenbecker, Katherine J.
Book Title: Proceedings of the IEEE-RAS International Conference on Humanoid Robots (Humanoids)
Year: 2025
Month: September
BibTeX Type: Conference Paper (inproceedings)
Address: Seoul, Korea
State: Accepted

BibTeX

@inproceedings{Andrussow25-HR-Minsound,
  title = {Adding Internal Audio Sensing to Internal Vision Enables Human-Like In-Hand Fabric Recognition with Soft Robotic Fingertips},
  booktitle = {Proceedings of the IEEE-RAS International Conference on Humanoid Robots (Humanoids)},
  abstract = {Distinguishing the feel of smooth silk from coarse cotton is a trivial everyday task for humans. When exploring such fabrics, fingertip skin senses both spatio-temporal force patterns and texture-induced vibrations that are integrated to form a haptic representation of the explored material. It is challenging to reproduce this rich, dynamic perceptual capability in robots because tactile sensors typically cannot achieve both high spatial resolution and high temporal sampling rate. In this work, we present a system that can sense both types of haptic information, and we investigate how each type influences robotic tactile perception of fabrics. Our robotic hand's middle finger and thumb each feature a soft tactile sensor: one is the open-source Minsight sensor that uses an internal camera to measure fingertip deformation and force at 50 Hz, and the other is our new sensor Minsound that captures vibrations through an internal MEMS microphone with a bandwidth from 50 Hz to 15 kHz. Inspired by the movements humans make to evaluate fabrics, our robot actively encloses and rubs folded fabric samples between its two sensitive fingers. Our experiments test the influence of each sensing modality on overall classification performance, showing the high utility of the audio-based sensor. Our transformer-based method achieves a maximum fabric classification accuracy of 97% on a dataset of 20 common fabrics. Incorporating an external microphone positioned away from Minsound increases our method's robustness to loud ambient noise. To show that this audio-visual tactile sensing approach generalizes beyond the training data, we learn general representations of fabric stretchiness, thickness, and roughness.},
  address = {Seoul, Korea},
  month = sep,
  year = {2025},
  author = {Andrussow, Iris and Solano Vega, Luis Jans and Richardson, Benjamin A. and Martius, Georg and Kuchenbecker, Katherine J.},
  month_numeric = {9}
}