DEPARTMENTS

Empirical Inference

Haptic Intelligence

Modern Magnetic Systems

Perceiving Systems

Physical Intelligence

Robotic Materials

Social Foundations of Computation


Research Groups

Autonomous Vision

Autonomous Learning

Bioinspired Autonomous Miniature Robots

Dynamic Locomotion

Embodied Vision

Human Aspects of Machine Learning

Intelligent Control Systems

Learning and Dynamical Systems

Locomotion in Biorobotic and Somatic Systems

Micro, Nano, and Molecular Systems

Movement Generation and Control

Neural Capture and Synthesis

Physics for Inference and Optimization

Organizational Leadership and Diversity

Probabilistic Learning Group


Haptic Intelligence Robotics Materials Medical Systems Article Functional Gradients Facilitate Tactile Sensing in Elephant Whiskers Schulz, A. K., Kaufmann, L. V., Smith, L. T., Philip, D. S., David, H., Lazovic, J., Brecht, M., Richter, G., Kuchenbecker, K. J. Science, 391(6786):712-718, February 2026, Lena V. Kaufmann and Lawrence T. Smith contributed equally to this work (Published)
Keratin composites enable animals to hike with hooves, fly with feathers, and sense with skin. Mammalian whiskers are elongated keratin rods attached to tactile skin structures that extend the animal's sensory volume. We investigated the whiskers that cover Asian elephant (Elephas maximus) trunks and found that they are geometrically and mechanically tailored to facilitate tactile perception by encoding contact location in the amplitude and frequency of the vibrotactile signal felt at the whisker base. Elephant whiskers emerge from armored trunk skin and shift from a thick, circular, porous, stiff base to a thin, ovular, dense, soft tip. These functional gradients of geometry, porosity, and stiffness independently tune the neuromechanics of elephant trunk touch to facilitate highly dexterous manipulation while ensuring whisker durability.
MPI-IS News Article YouTube Video Highlight Whisker Simulation Toolkit Edmond Data Repository Download Paper for Free Press Coverage DOI BibTeX

Haptic Intelligence Robotics Article Open-Source Hardware and Software Platform for Vibrotactile Motion Guidance Rokhmanova, N., Martus, J., Faulkner, R., Fiene, J., Kuchenbecker, K. J. Device, 4(1):100966, January 2026 (Published)
Vibrotactile feedback can enhance motor learning, sports training, and rehabilitation, but a lack of standardized tools limits its adoption. We developed a modular open-source hardware and software platform for delivering vibrotactile feedback that is spatially and temporally precise. The prototype device uses medical adhesive, linear resonant actuators (LRAs), and rigid 3D-printed components to standardize skin contact, avoiding the variability introduced by straps. The platform was validated by using the device's built-in accelerometers to fit a dynamic model of mechanical actuator vibration and examine how the anatomical site and body composition affect perceived vibration strength in 20 participants. Then, the platform was integrated with an optical motion-capture system to teach six participants a toe-in gait, showing potential for real-time, tailored clinical studies. By openly sharing the platform's hardware and software, we provide tools for delivering standardized vibrations and benchmarking feedback strategies in diverse applications.
DOI BibTeX

Haptic Intelligence Robotics Embodied Vision Conference Paper ISyHand: A Dexterous Multi-finger Robot Hand with an Articulated Palm Richardson, B. A., Grüninger, F., Mack, L., Stueckler, J., Kuchenbecker, K. J. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots (Humanoids), 720-727, Seoul, South Korea, September 2025, Benjamin A. Richardson, Felix Grüninger and Lukas Mack contributed equally to this publication (Published) DOI BibTeX

Haptic Intelligence Robotics Miscellaneous Soft Magnetic Fingertip Devices for Clear Vibrotactile Feedback Gertler, I., Ballardini, G., Grüninger, F., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Suwon, South Korea, July 2025 (Published) BibTeX

Haptic Intelligence Embodied Vision Robotics Conference Paper Visuo-Tactile Object Pose Estimation for a Multi-Finger Robot Hand with Low-Resolution In-Hand Tactile Sensing Mack, L., Grüninger, F., Richardson, B. A., Lendway, R., Kuchenbecker, K. J., Stueckler, J. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 12401-12407, Atlanta, USA, May 2025 (Published)
Accurate 3D pose estimation of grasped objects is an important prerequisite for robots to perform assembly or in-hand manipulation tasks, but object occlusion by the robot's own hand greatly increases the difficulty of this perceptual task. Here, we propose that combining visual information with binary, low-resolution tactile contact measurements from across the interior surface of an articulated robotic hand can mitigate this issue. The visuo-tactile object-pose-estimation problem is formulated probabilistically in a factor graph. The pose of the object is optimized to align with the two kinds of measurements using a robust cost function to reduce the influence of outlier readings. The advantages of the proposed approach are first demonstrated in simulation: a custom 15-DOF robot hand with one binary tactile sensor per link grasps 17 YCB objects while observed by an RGB-D camera. This low-resolution in-hand tactile sensing significantly improves object-pose estimates under both high occlusion and high visual noise. We also show these benefits through grasping tests with a preliminary real version of our tactile hand, obtaining reasonable visuo-tactile estimates of object pose at approximately 12.9 Hz on average.
DOI BibTeX

Haptic Intelligence Robotics Miscellaneous Bio-Inspired Gradient (BIG) Whiskers: Stiffness-Shifting Structures Provide Dynamic Functional Benefits for Contact Sensing Schulz, A. K., Andrussow, I., Farsijani, F., Faulkner, R., Kuchenbecker, K. J. Extended abstract (3 pages) presented at the IEEE-RAS International Conference on Soft Robotics (RoboSoft), Lausanne, Switzerland, April 2025 (Published)
Mammal whiskers have inspired many sensors that can help robots find obstacles, identify textures, or sense flow. Though they vary in geometry, past bio-inspired whisker sensors were primarily constructed from homogeneous materials. Interestingly, animal whiskers tend to shift from a stiff root to a much softer point; this material stiffness gradient is hypothesized to provide functional benefits such as reduction of wear and amplification of contact sensations. We take inspiration from nature to fabricate bio-inspired gradient (BIG) whiskers via 3D printing, and we assess their performance compared to stiff, medium, and soft homogeneous artificial whiskers with the same geometry. Tests with controlled quasi-static and dynamic perturbations allow us to measure the whisker point deflection and the reaction torque at the stationary whisker root, respectively. The dynamic results reveal that BIG whiskers uniquely encode contact location along their length through torque magnitude and frequency, features that are not seen in the homogeneous whiskers. These exciting preliminary findings motivate further exploration of robotic whiskers and other sensing structures with bio-inspired stiffness gradients.
BibTeX

Haptic Intelligence Robotics Article Building Instructions You Can Feel: Edge-Changing Haptic Devices for Digitally Guided Construction Tashiro, N., Faulkner, R., Melnyk, S., Rosales Rodriguez, T., Javot, B., Tahouni, Y., Cheng, T., Wood, D., Menges, A., Kuchenbecker, K. J. ACM Transactions on Computer-Human Interaction, 32(1):1-40, April 2025 (Published)
Recent efforts to connect builders to digital designs during construction have primarily focused on visual augmented reality, which requires accurate registration and specific lighting, and which could prevent a user from noticing safety hazards. Haptic interfaces, on the other hand, can convey physical design parameters through tangible local cues that don't distract from the surroundings. We propose two edge-changing haptic devices that use small inertial measurement units (IMUs) and linear actuators to guide users to perform construction tasks in real time: Drangle gives feedback for angling a drill relative to gravity, and Brangle assists with orienting bricks in the plane. We conducted a study with 18 participants to evaluate user performance and gather qualitative feedback. All users understood the edge-changing cues from both devices with minimal training. Drilling holes with Drangle was somewhat less accurate but much faster and easier than with a mechanical guide; 89% of participants preferred Drangle over the mechanical guide. Users generally understood Brangle's feedback but found its hand-size-specific grip, palmar contact, and attractive tactile cues less intuitive than Drangle's generalized form factor, fingertip contact, and repulsive cues. After summarizing design considerations, we propose application scenarios and speculate how such devices could improve construction workflows.
DOI BibTeX

Autonomous Learning Robotics Conference Paper Learning Diverse Skills for Local Navigation under Multi-constraint Optimality Cheng, J., Vlastelica, M., Kolev, P., Li, C., Martius, G. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 5083-5089, October 2024 (Published)
Despite many successful applications of data-driven control in robotics, extracting meaningful diverse behaviors remains a challenge. Typically, task performance needs to be compromised in order to achieve diversity. In many scenarios, task requirements are specified as a multitude of reward terms, each requiring a different trade-off. In this work, we take a constrained optimization viewpoint on the quality-diversity trade-off and show that we can obtain diverse policies while imposing constraints on their value functions which are defined through distinct rewards. In line with previous work, further control of the diversity level can be achieved through an attract-repel reward term motivated by the Van der Waals force. We demonstrate the effectiveness of our method on a local navigation task where a quadruped robot needs to reach the target within a finite horizon. Finally, our trained policies transfer well to the real 12-DoF quadruped robot, Solo12, and exhibit diverse agile behaviors with successful obstacle traversal.
Website DOI URL BibTeX

Autonomous Learning Robotics Article Identifying Terrain Physical Parameters from Vision: Towards Physical-Parameter-Aware Locomotion and Navigation Chen, J., Frey, J., Zhou, R., Miki, T., Martius, G., Hutter, M. IEEE Robotics and Automation Letters, 9(11):9279-9286, August 2024 (Published)
Identifying the physical properties of the surrounding environment is essential for robotic locomotion and navigation to deal with non-geometric hazards, such as slippery and deformable terrains. It would be of great benefit for robots to anticipate these extreme physical properties before contact; however, estimating environmental physical parameters from vision is still an open challenge. Animals can achieve this by using their prior experience and knowledge of what they have seen and how it felt. In this work, we propose a cross-modal self-supervised learning framework for vision-based environmental physical parameter estimation, which paves the way for future physical-property-aware locomotion and navigation. We bridge the gap between existing policies trained in simulation and identification of physical terrain parameters from vision. We propose to train a physical decoder in simulation to predict friction and stiffness from multi-modal input. The trained network allows the labeling of real-world images with physical parameters in a self-supervised manner to further train a visual network during deployment, which can densely predict the friction and stiffness from image data. We validate our physical decoder in simulation and the real world using a quadruped ANYmal robot, outperforming an existing baseline method. We show that our visual network can predict the physical properties in indoor and outdoor experiments while allowing fast adaptation to new environments.
DOI URL BibTeX

Haptic Intelligence Robotics Miscellaneous Modeling Shank Tissue Properties and Quantifying Body Composition with a Wearable Actuator-Accelerometer Set Rokhmanova, N., Martus, J., Faulkner, R., Fiene, J., Kuchenbecker, K. J. Extended abstract (1 page) presented at the American Society of Biomechanics Annual Meeting (ASB), Madison, USA, August 2024 (Published) BibTeX

Haptic Intelligence Robotics Miscellaneous GaitGuide: A Wearable Device for Vibrotactile Motion Guidance Rokhmanova, N., Martus, J., Faulkner, R., Fiene, J., Kuchenbecker, K. J. Workshop paper (3 pages) presented at the ICRA Workshop on Advancing Wearable Devices and Applications Through Novel Design, Sensing, Actuation, and AI, Yokohama, Japan, May 2024 (Published)
Wearable vibrotactile devices can provide salient sensations that attract the user's attention or guide them to change their movements. The future integration of such feedback into medical or consumer devices would benefit from understanding how vibrotactile cues vary in amplitude and perceived strength across the heterogeneity of human skin. Here, we developed an adhesive vibrotactile device (the GaitGuide) that uses two individually mounted linear resonant actuators to deliver directional motion guidance. By measuring the mechanical vibrations of the actuators via small on-board accelerometers, we compared vibration amplitudes and perceived signal strength across 20 subjects at five signal voltages and four sites around the shank. Vibrations were consistently smallest in amplitude—but perceived to be strongest—at the site located over the tibia. We created a fourth-order linear dynamic model to capture differences in tissue properties across subjects and sites via optimized stiffness and damping parameters. The anterior site had significantly higher skin stiffness and damping; these values also correlate with subject-specific body-fat percentages. Surprisingly, our study shows that the perception of vibrotactile stimuli does not solely depend on the vibration magnitude delivered to the skin. These findings also help to explain the clinical practice of evaluating vibrotactile sensitivity over a bony prominence.
URL BibTeX

Haptic Intelligence Robotics Article IMU-Based Kinematics Estimation Accuracy Affects Gait Retraining Using Vibrotactile Cues Rokhmanova, N., Pearl, O., Kuchenbecker, K. J., Halilaj, E. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 32:1005-1012, February 2024 (Published)
Wearable sensing using inertial measurement units (IMUs) is enabling portable and customized gait retraining for knee osteoarthritis. However, the vibrotactile feedback that users receive directly depends on the accuracy of IMU-based kinematics. This study investigated how kinematic errors impact an individual's ability to learn a therapeutic gait using vibrotactile cues. Sensor accuracy was computed by comparing the IMU-based foot progression angle to marker-based motion capture, which was used as ground truth. Thirty subjects were randomized into three groups to learn a toe-in gait: one group received vibrotactile feedback during gait retraining in the laboratory, another received feedback outdoors, and the control group received only verbal instruction and proceeded directly to the evaluation condition. All subjects were evaluated on their ability to maintain the learned gait in a new outdoor environment. We found that subjects with high tracking errors exhibited more incorrect responses to vibrotactile cues and slower learning rates than subjects with low tracking errors. Subjects with low tracking errors outperformed the control group in the evaluation condition, whereas those with higher error did not. Errors were correlated with foot size and angle magnitude, which may indicate a non-random algorithmic bias. The accuracy of IMU-based kinematics has a cascading effect on feedback; ignoring this effect could lead researchers or clinicians to erroneously classify a patient as a non-responder if they did not improve after retraining. To use patient and clinician time effectively, future implementation of portable gait retraining will require assessment across a diverse range of patients.
DOI BibTeX

Haptic Intelligence Robotics Miscellaneous Strap Tightness and Tissue Composition Both Affect the Vibration Created by a Wearable Device Rokhmanova, N., Faulkner, R., Martus, J., Fiene, J., Kuchenbecker, K. J. Work-in-progress paper (1 page) presented at the IEEE World Haptics Conference (WHC), Delft, the Netherlands, July 2023 (Published)
Wearable haptic devices can provide salient real-time feedback (typically vibration) for rehabilitation, sports training, and skill acquisition. Although the body provides many sites for such cues, the influence of the mounting location on vibrotactile mechanics is commonly ignored. This study builds on previous research by quantifying how changes in strap tightness and local tissue composition affect the physical acceleration generated by a typical vibrotactile device.
BibTeX

Intelligent Control Systems Robotics Article The Wheelbot: A Jumping Reaction Wheel Unicycle Geist, A. R., Fiene, J., Tashiro, N., Jia, Z., Trimpe, S. IEEE Robotics and Automation Letters, 7(4):9683-9690, October 2022 (Published)
Combining off-the-shelf components with 3D printing, the Wheelbot is a symmetric reaction wheel unicycle that can jump onto its wheels from any initial position. With non-holonomic and under-actuated dynamics, as well as two coupled unstable degrees of freedom, the Wheelbot provides a challenging platform for nonlinear and data-driven control research. This letter presents the Wheelbot's mechanical and electrical design, its estimation and control algorithms, as well as experiments demonstrating both self-erection and disturbance rejection while balancing.
DOI URL BibTeX

Haptic Intelligence Robotics Article Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Frontiers in Robotics and AI, 9(840335):1-17, April 2022 (Published)
Social touch is essential to everyday interactions, but current socially assistive robots have limited touch-perception capabilities. Rather than build entirely new robotic systems, we propose to augment existing rigid-bodied robots with an external touch-perception system. This practical approach can enable researchers and caregivers to continue to use robotic technology they have already purchased and learned about, but with a myriad of new social-touch interactions possible. This paper presents a low-cost, easy-to-build, soft tactile-perception system that we created for the NAO robot, as well as participants' feedback on touching this system. We installed four of our fabric-and-foam-based resistive sensors on the curved surfaces of a NAO's left arm, including its hand, lower arm, upper arm, and shoulder. Fifteen adults then performed five types of affective touch-communication gestures (hitting, poking, squeezing, stroking, and tickling) at two force intensities (gentle and energetic) on the four sensor locations; we share this dataset of four time-varying resistances, our sensor patterns, and a characterization of the sensors' physical performance. After training, a gesture-classification algorithm based on a random forest identified the correct combined touch gesture and force intensity on windows of held-out test data with an average accuracy of 74.1%, which is more than eight times better than chance. Participants rated the sensor-equipped arm as pleasant to touch and liked the robot's presence significantly more after touch interactions. Our promising results show that this type of tactile-perception system can detect necessary social-touch communication cues from users, can be tailored to a variety of robot body parts, and can provide HRI researchers with the tools needed to implement social touch in their own systems.
DOI BibTeX

Haptic Intelligence Robotics Miscellaneous Sensor Patterns Dataset for Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Dataset published as a companion to the journal article "Endowing a NAO Robot with Practical Social-Touch Perception" in Frontiers in Robotics and AI, March 2022 (Published) DOI BibTeX

Haptic Intelligence Robotics Miscellaneous User Study Dataset for Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Dataset published as a companion to the journal article "Endowing a NAO Robot with Practical Social-Touch Perception" in Frontiers in Robotics and AI, March 2022 (Published) DOI BibTeX

Empirical Inference Robotics Miscellaneous A Robot Cluster for Reproducible Research in Dexterous Manipulation Wüthrich*, M., Widmaier*, F., Bauer*, S., Funk, N., Urain, J., Peters, J., Watson, J., Chen, C., Srinivasan, K., Zhang, J., Zhang, J., Walter, M. R., Madan, R., Schaff, C., Maeda, T., Yoneda, T., Yarats, D., Allshire, A., Gordon, E. K., Bhattacharjee, T., et al. 2021, *equal contribution (Published) arXiv BibTeX

Autonomous Learning Haptic Intelligence Robotics Patent Method for Force Inference of a Sensor Arrangement, Methods for Training Networks, Force Inference Module and Sensor Arrangement Sun, H., Martius, G., Lee, H., Spiers, A., Fiene, J. (PCT/EP2020/083261), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, November 2020
The present invention relates to a method for force inference of a sensor arrangement, to related methods for training of networks, to a force inference module for performing such methods, and to a sensor arrangement for sensing forces. When developing applications such as robots, sensing of forces applied on a robot hand or another part of a robot such as a leg or a manipulation device is crucial in giving robots increased capabilities to move around and/or manipulate objects. Known implementations for sensor arrangements that can be used in robotic applications in order to have feedback with regard to applied forces are quite expensive and do not have sufficient resolution. Sensor arrangements may be used to measure forces. However, known sensor arrangements need a high density of sensors to provide for a high spatial resolution. It is thus an object of the present invention to provide for a method for force inference of a sensor arrangement and related methods that are different or optimized with regard to the prior art. It is a further object to provide for a force inference module to perform such methods. It is a further object to provide for a sensor arrangement for sensing forces with such a force inference module.
BibTeX

Physical Intelligence Robotics Article Twisting and untwisting of twisted nematic elastomers Davidson, Z. S., Kapernaum, N., Fiene, J., Giesselmann, F., Sitti, M. Physical Review Materials, 4(10):105601, 2020 DOI URL BibTeX