Publications

Haptic Intelligence Autonomous Learning Empirical Inference Miscellaneous A Sequential Group VAE for Robot Learning of Haptic Representations Richardson, B. A., Kuchenbecker, K. J., Martius, G. 1-11, Workshop paper (8 pages) presented at the CoRL Workshop on Aligning Robot Representations with Humans, Auckland, New Zealand, December 2022 (Published)
Haptic representation learning is a difficult task in robotics because information can be gathered only by actively exploring the environment over time, and because different actions elicit different object properties. We propose a Sequential Group VAE that leverages object persistence to learn and update latent general representations of multimodal haptic data. As a robot performs sequences of exploratory procedures on an object, the model accumulates data and learns to distinguish between general object properties, such as size and mass, and trial-to-trial variations, such as initial object position. We demonstrate that after very few observations, the general latent representations are sufficiently refined to accurately encode many haptic object properties.
URL BibTeX
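The model above is a learned Sequential Group VAE; as a loose, hypothetical illustration of the accumulation idea it exploits (not the authors' method), the sketch below recursively fuses noisy haptic observations of one object property into a Gaussian belief, whose variance shrinks with each exploratory action. All names and values are invented for the example.

```python
import numpy as np

def update_belief(mu, var, obs, obs_var):
    """Precision-weighted fusion of one new observation into the current belief."""
    precision = 1.0 / var + 1.0 / obs_var
    new_var = 1.0 / precision
    new_mu = new_var * (mu / var + obs / obs_var)
    return new_mu, new_var

# Hypothetical sequence of noisy measurements of one object property (mass in kg)
rng = np.random.default_rng(0)
true_mass = 0.4
mu, var = 0.0, 10.0  # vague prior before any exploration
for _ in range(5):
    obs = true_mass + rng.normal(0.0, 0.05)
    mu, var = update_belief(mu, var, obs, 0.05 ** 2)
# After a few observations the belief concentrates near the true value
```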

Haptic Intelligence Ph.D. Thesis Multi-Timescale Representation Learning of Human and Robot Haptic Interactions Richardson, B. University of Stuttgart, Stuttgart, Germany, December 2022, Faculty of Computer Science, Electrical Engineering and Information Technology (Published)
The sense of touch is one of the most crucial components of the human sensory system. It allows us to safely and intelligently interact with the physical objects and environment around us. By simply touching or dexterously manipulating an object, we can quickly infer a multitude of its properties. For more than fifty years, researchers have studied how humans physically explore and form perceptual representations of objects. Some of these works proposed the paradigm through which human haptic exploration is presently understood: humans use a particular set of exploratory procedures to elicit specific semantic attributes from objects. Others have sought to understand how physically measured object properties correspond to human perception of semantic attributes. Few, however, have investigated how specific explorations are perceived. As robots become increasingly advanced and more ubiquitous in daily life, they are beginning to be equipped with haptic sensing capabilities and algorithms for processing and structuring haptic information. Traditional haptics research has so far strongly influenced the introduction of haptic sensation and perception into robots but has not proven sufficient to give robots the necessary tools to become intelligent autonomous agents. The work presented in this thesis seeks to understand how single and sequential haptic interactions are perceived by both humans and robots. In our first study, we depart from the more traditional methods of studying human haptic perception and investigate how the physical sensations felt during single explorations are perceived by individual people. We treat interactions as probability distributions over a haptic feature space and train a model to predict how similarly a pair of surfaces is rated, predicting perceived similarity with a reasonable degree of accuracy. Our novel method also allows us to evaluate how individual people weigh different surface properties when they make perceptual judgments. 
The method is highly versatile and presents many opportunities for further studies into how humans form perceptual representations of specific explorations. Our next body of work explores how to improve robotic haptic perception of single interactions. We use unsupervised feature-learning methods to derive powerful features from raw robot sensor data and classify robot explorations into numerous haptic semantic property labels that were assigned from human ratings. Additionally, we provide robots with more nuanced perception by learning to predict graded ratings of a subset of properties. Our methods outperform previous attempts that all used hand-crafted features, demonstrating the limitations of such traditional approaches. To push robot haptic perception beyond evaluation of single explorations, our final work introduces and evaluates a method to give robots the ability to accumulate information over many sequential actions; our approach essentially takes advantage of object permanence by conditionally and recursively updating the representation of an object as it is sequentially explored. We implement our method on a robotic gripper platform that performs multiple exploratory procedures on each of many objects. As the robot explores objects with new procedures, it gains confidence in its internal representations and classification of object properties, thus moving closer to the marvelous haptic capabilities of humans and providing a solid foundation for future research in this domain.
URL BibTeX

Haptic Intelligence Miscellaneous Semi-Automated Robotic Pleural Cavity Access in Space L’Orsa, R., de Lotbiniere-Bassett, M., Zareinia, K., Lama, S., Westwick, D., Sutherland, G., Kuchenbecker, K. J. Poster presented at the Canadian Space Health Research Symposium (CSHRS), Alberta, Canada, November 2022 (Published)
Astronauts are at risk for pneumothorax, a medical condition where air accumulating between the chest wall and the lungs impedes breathing and can result in fatality. Treatments include needle decompression (ND) and chest tube insertion (tube thoracostomy, TT). Unfortunately, the literature reports very high failure rates for ND and high complication rates for TT, especially when performed urgently, infrequently, or by inexperienced operators. These statistics are problematic in the context of skill retention for physician astronauts on long-duration exploration-class missions, or for non-medical astronauts if the physician astronaut is the one in need of treatment. We propose reducing the medical risk for exploration-class missions by improving ND/TT outcomes using a robot-based paradigm that automates tool depth control. Our goal is to produce a robotic system that improves the safety of pneumothorax treatments regardless of operator skill and without the use of ground resources. This poster provides an overview of our team's work toward this goal, including robot instrumentation schemes, tool-tissue interaction characterization, and automated puncture detection.
BibTeX

Haptic Intelligence Autonomous Learning Empirical Inference Miscellaneous A Soft Vision-Based Tactile Sensor for Robotic Fingertip Manipulation Andrussow, I., Sun, H., Kuchenbecker, K. J., Martius, G. Workshop paper (1 page) presented at the IROS Workshop on Large-Scale Robotic Skin: Perception, Interaction and Control, Kyoto, Japan, October 2022 (Published)
For robots to become fully dexterous, their hardware needs to provide rich sensory feedback. High-resolution haptic sensing similar to the human fingertip can enable robots to execute delicate manipulation tasks like picking up small objects, inserting a key into a lock, or handing a cup of coffee to a human. Many tactile sensors have emerged in recent years; one especially promising direction is vision-based tactile sensors due to their low cost, low wiring complexity and high-resolution sensing capabilities. In this work, we build on previous findings to create a soft fingertip-sized tactile sensor. It can sense normal and shear contact forces all around its 3D surface with an average prediction error of 0.05 N, and it localizes contact on its shell with an average prediction error of 0.5 mm. The software of this sensor uses a data-efficient machine-learning pipeline to run in real time on hardware with low computational power like a Raspberry Pi. It provides a maximum data frame rate of 60 Hz via USB.
URL BibTeX

Haptic Intelligence Miscellaneous Do-It-Yourself Whole-Body Social-Touch Perception for a NAO Robot Burns, R. B., Rosenthal, R., Garg, K., Kuchenbecker, K. J. Workshop paper (1 page) presented at the IROS Workshop on Large-Scale Robotic Skin: Perception, Interaction and Control, Kyoto, Japan, October 2022 (Published) Poster URL BibTeX

Haptic Intelligence Article Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions Richardson, B. A., Vardar, Y., Wallraven, C., Kuchenbecker, K. J. IEEE Transactions on Haptics, 15(4):705-717, October 2022, Benjamin A. Richardson and Yasemin Vardar contributed equally to this publication (Published)
Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.
DOI BibTeX
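The paper above evaluates its similarity predictions with Spearman's rank correlation. As a minimal, self-contained sketch of that evaluation step (the data and the no-ties simplification are assumptions, not the paper's), the function below correlates hypothetical model-predicted similarities with participant ratings:

```python
def spearman(x, y):
    """Spearman's rank correlation between two equal-length sequences (no tie handling)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for pos, idx in enumerate(order):
            r[idx] = pos + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mean = (n + 1) / 2
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)  # equals the rank variance of y when there are no ties
    return cov / var

# Hypothetical predicted vs. rated similarity for five surface pairs
predicted = [0.9, 0.4, 0.7, 0.2, 0.6]
rated = [0.8, 0.5, 0.9, 0.1, 0.4]
rho = spearman(predicted, rated)
```

In practice `scipy.stats.spearmanr` handles ties and p-values; the hand-rolled version just makes the rank-then-correlate logic explicit.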

Haptic Intelligence Ph.D. Thesis Understanding the Influence of Moisture on Fingerpad-Surface Interactions Nam, S. University of Tübingen, Tübingen, Germany, October 2022, Department of Computer Science (Published)
People frequently touch objects with their fingers. The physical deformation of a finger pressing an object surface stimulates mechanoreceptors, resulting in a perceptual experience. Through interactions between perceptual sensations and motor control, humans naturally acquire the ability to manage friction under various contact conditions. Many researchers have advanced our understanding of human fingers to this point, but their complex structure and the variations in friction they experience due to continuously changing contact conditions necessitate additional study. Moisture is a primary factor that influences many aspects of the finger. In particular, sweat excreted from the numerous sweat pores on the fingerprints modifies the finger's material properties and the contact conditions between the finger and a surface. Measuring changes of the finger's moisture over time and in response to external stimuli presents a challenge for researchers, as commercial moisture sensors do not provide continuous measurements. This dissertation investigates the influence of moisture on fingerpad-surface interactions from diverse perspectives. First, we examine the extent to which moisture on the finger contributes to the sensation of stickiness during contact with glass. Second, we investigate the representative material properties of a finger at three distinct moisture levels, since the softness of human skin varies significantly with moisture. The third perspective is friction; we examine how the contact conditions, including the moisture of a finger, determine the available friction force opposing lateral sliding on glass. Fourth, we have invented and prototyped a transparent in vivo moisture sensor for the continuous measurement of finger hydration. In the first part of this dissertation, we explore how the perceptual intensity of light stickiness relates to the physical interaction between the skin and the surface. 
We conducted a psychophysical experiment in which nine participants actively pressed their index finger on a flat glass plate with a normal force close to 1.5 N and then detached it after a few seconds. A custom-designed apparatus recorded the contact force vector and the finger contact area during each interaction as well as pre- and post-trial finger moisture. After detaching their finger, participants judged the stickiness of the glass using a nine-point scale. We explored how sixteen physical variables derived from the recorded data correlate with each other and with the stickiness judgments of each participant. These analyses indicate that stickiness perception mainly depends on the pre-detachment pressing duration, the time taken for the finger to detach, and the impulse in the normal direction after the normal force changes sign; finger-surface adhesion seems to build with pressing time, causing a larger normal impulse during detachment and thus a more intense stickiness sensation. We additionally found a strong between-subjects correlation between maximum real contact area and peak pull-off force, as well as between finger moisture and impulse. When a fingerpad presses into a hard surface, the development of the contact area depends on the pressing force and speed. Importantly, it also varies with the finger's moisture, presumably because hydration changes the tissue's material properties. Therefore, for the second part of this dissertation, we collected data from one finger repeatedly pressing a glass plate under three moisture conditions, and we constructed a finite element model that we optimized to simulate the same three scenarios. We controlled the moisture of the subject's finger to be dry, natural, or moist and recorded 15 pressing trials in each condition. The measurements include normal force over time plus finger-contact images that are processed to yield gross contact area. 
We defined the axially symmetric 3D model's lumped parameters to include an SLS-Kelvin model (spring in series with parallel spring and damper) for the bulk tissue, plus an elastic epidermal layer. Particle swarm optimization was used to find the parameter values that cause the simulation to best match the trials recorded in each moisture condition. The results show that the softness of the bulk tissue reduces as the finger becomes more hydrated. The epidermis of the moist finger model is softest, while the natural finger model has the highest viscosity. In the third part of this dissertation, we focused on friction between the fingerpad and the surface. The magnitude of finger-surface friction available at the onset of full slip is crucial for understanding how the human hand can grip and manipulate objects. Related studies revealed the significance of moisture and contact time in enhancing friction. Recent research additionally indicated that surface temperature may also affect friction. However, previously reported friction coefficients have been measured only in dynamic contact conditions, where the finger is already sliding across the surface. In this study, we repeatedly measured the initial friction before full slip under eight contact conditions with low and high finger moisture, pressing time, and surface temperature. Moisture and pressing time both independently increased finger-surface friction across our population of twelve participants, and the effect of surface temperature depended on the contact conditions. Furthermore, detailed analysis of the recorded measurements indicates that micro stick-slip during the partial-slip phase contributes to enhanced friction. For the fourth and final part of this dissertation, we designed a transparent moisture sensor for continuous measurement of fingerpad hydration. 
Because various stimuli cause the sweat pores on fingerprints to excrete sweat, many researchers want to quantify the flow and assess its impact on the formation of the contact area. Unfortunately, the most popular sensor for skin hydration is opaque and does not offer continuous measurements. Our capacitive moisture sensor consists of a pair of inter-digital electrodes covered by an insulating layer, enabling impedance measurements across a wide frequency range. This proposed sensor is made entirely of transparent materials, which allows us to simultaneously measure the finger's contact area. Electrochemical impedance spectroscopy identifies the equivalent electrical circuit and the electrical component parameters that are affected by the amount of moisture present on the surface of the sensor. Most notably, the impedance at 1 kHz seems to best reflect the relative amount of sweat.
DOI BibTeX
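The second part of the thesis above fits an SLS-Kelvin lumped model (a spring in series with a parallel spring and damper). As a hedged illustration of how such a model behaves (a generic forward-Euler simulation with invented parameters, not the thesis's optimized finite element model), the sketch below reproduces its characteristic force relaxation under a step displacement:

```python
import numpy as np

def sls_kelvin_relaxation(k1, k2, c, x0, t_end=1.0, dt=1e-4):
    """Force over time for an SLS-Kelvin solid (series spring k1, Kelvin element k2 || c)
    held at a constant step displacement x0. The damper locks the Kelvin element at t=0,
    so the force starts at k1*x0 and relaxes toward k1*k2*x0/(k1+k2)."""
    steps = int(t_end / dt)
    x2 = 0.0  # displacement of the Kelvin element
    forces = np.empty(steps)
    for i in range(steps):
        forces[i] = k1 * (x0 - x2)  # force carried by the series spring
        dx2 = (k1 * x0 - (k1 + k2) * x2) / c  # force balance on the Kelvin element
        x2 += dx2 * dt
    return forces

# Invented parameters: stiffnesses in N/mm-scale units, 1 mm step displacement
f = sls_kelvin_relaxation(k1=2.0, k2=1.0, c=0.5, x0=1e-3)
```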

Haptic Intelligence Conference Paper Towards Semi-Automated Pleural Cavity Access for Pneumothorax in Austere Environments L’Orsa, R., Lama, S., Westwick, D., Sutherland, G., Kuchenbecker, K. J. In Proceedings of the International Astronautical Congress (IAC), 1-7, Paris, France, September 2022 (Published)
Pneumothorax, a condition where injury or disease introduces air between the chest wall and lungs, can impede lung function and lead to respiratory failure and/or obstructive shock. Chest trauma from dynamic loads, hypobaric exposure from extravehicular activity, and pulmonary inflammation from celestial dust exposures could potentially cause pneumothoraces during spaceflight with or without exacerbation from deconditioning. On Earth, emergent cases are treated with chest tube insertion (tube thoracostomy, TT) when available, or needle decompression (ND) when not (i.e., pre-hospital). However, ND has high failure rates (up to 94%), and TT has high complication rates (up to 37.9%), especially when performed by inexperienced or intermittent operators. Thus, neither procedure is ideal for a pure just-in-time training or skill refreshment approach, and both may require adjuncts for safe inclusion in Level of Care IV (e.g., short duration lunar orbit) or V (e.g., Mars transit) missions. Insertional complications are of particular concern since they cause inadvertent tissue damage that, while surgically repairable in an operating room, could result in (preventable) fatality in a spacecraft or other isolated, confined, or extreme (ICE) environments. Tools must be positioned and oriented correctly to avoid accidental insertion into critical structures, and they must be inserted no further than the thin membrane lining the inside of the rib cage (i.e., the parietal pleura). Operators identify pleural puncture via loss-of-resistance sensations on the tool during advancement, but experienced surgeons anecdotally describe a wide range of membrane characteristics: robust tissues require significant force to perforate, while fragile tissues deliver little-to-no haptic sensation when pierced. Both extremes can lead to tool overshoot and may be representative of astronaut tissues at the beginning (healthy) and end (deconditioned) of long duration exploration class missions. 
Given uncertainty surrounding physician astronaut selection criteria, skill retention, and tissue condition, an adjunct for improved insertion accuracy would be of value. We describe experiments conducted with an intelligent prototype sensorized system aimed at semi-automating tool insertion into the pleural cavity. The assembly would integrate with an in-mission medical system and could be tailored to fully complement an autonomous medical response agent. When coupled with minimal just-in-time training, it has the potential to bestow expert pleural access skills on non-expert operators without the use of ground resources, in both emergent and elective treatment scenarios.
URL BibTeX

Haptic Intelligence Miscellaneous Predicting Knee Adduction Moment Response to Gait Retraining Rokhmanova, N., Kuchenbecker, K. J., Shull, P. B., Ferber, R., Halilaj, E. Extended abstract presented at North American Congress of Biomechanics (NACOB), Ottawa, Canada, August 2022 (Published)
Personalized gait retraining has shown promise as a conservative intervention for slowing knee osteoarthritis (OA) progression [1,2]. Changing the foot progression angle is an easy-to-learn gait modification that often reduces the knee adduction moment (KAM), a correlate of medial joint loading. Deployment to clinics is challenging, however, because customizing gait retraining still requires gait lab instrumentation. Innovation in wearable sensing and vision-based motion tracking could bring lab-level accuracy to the clinic, but current markerless motion-tracking algorithms cannot accurately assess if gait retraining will reduce someone's KAM by a clinically meaningful margin. To assist clinicians in determining if a patient will benefit from toe-in gait, we built a predictive model to estimate KAM reduction using only measurements that can be easily obtained in the clinic.
BibTeX

Haptic Intelligence Conference Paper Wrist-Squeezing Force Feedback Improves Accuracy and Speed in Robotic Surgery Training Machaca, S., Cao, E., Chi, A., Adrales, G., Kuchenbecker, K. J., Brown, J. D. In Proceedings of the IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob), Seoul, South Korea, August 2022 (Published)
Current robotic minimally invasive surgery (RMIS) platforms provide surgeons with no haptic feedback of the robot's physical interactions. This limitation forces surgeons to rely heavily on visual feedback and can make it challenging for surgical trainees to manipulate tissue gently. Prior research has demonstrated that haptic feedback can increase task accuracy in RMIS training. However, it remains unclear whether these improvements represent a fundamental improvement in skill, or if they simply stem from re-prioritizing accuracy over task completion time. In this study, we provide haptic feedback of the force applied by the surgical instruments using custom wrist-squeezing devices. We hypothesize that individuals receiving haptic feedback will increase accuracy (produce less force) while increasing their task completion time, compared to a control group receiving no haptic feedback. To test this hypothesis, N=21 novice participants were asked to repeatedly complete a ring rollercoaster surgical training task as quickly as possible. Results show that participants receiving haptic feedback apply significantly less force (0.67 N) than the control group, and they complete the task no faster or slower than the control group after twelve repetitions. Furthermore, participants in the feedback group decreased their task completion times significantly faster (7.68%) than participants in the control group (5.26%). This form of haptic feedback, therefore, has the potential to help trainees improve their technical accuracy without compromising speed.
DOI BibTeX

Haptic Intelligence Article Contact Evolution of Dry and Hydrated Fingertips at Initial Touch Serhat, G., Vardar, Y., Kuchenbecker, K. J. PLOS ONE, 17(7):e0269722, July 2022, Gokhan Serhat and Yasemin Vardar contributed equally to this publication (Published)
Pressing the fingertips into surfaces causes skin deformations that enable humans to grip objects and sense their physical properties. This process involves intricate finger geometry, non-uniform tissue properties, and moisture, complicating the underlying contact mechanics. Here we explore the initial contact evolution of dry and hydrated fingers to isolate the roles of governing physical factors. Two participants gradually pressed an index finger on a glass surface under three moisture conditions: dry, water-hydrated, and glycerin-hydrated. Gross and real contact area were optically measured over time, revealing that glycerin hydration produced strikingly higher real contact area, while gross contact area was similar for all conditions. To elucidate the causes for this phenomenon, we investigated the combined effects of tissue elasticity, skin-surface friction, and fingerprint ridges on contact area using simulation. Our analyses show the dominant influence of elastic modulus over friction and an unusual contact phenomenon, which we call friction-induced hinging.
DOI BibTeX

Haptic Intelligence Article Perceptual Space of Algorithms for Three-to-One Dimensional Reduction of Realistic Vibrations Lee, H., Tombak, G. I., Park, G., Kuchenbecker, K. J. IEEE Transactions on Haptics, 15(3):521-534, July 2022 (Published)
Haptics researchers often endeavor to deliver realistic vibrotactile feedback through broad-bandwidth actuators; however, these actuators typically generate only single-axis vibrations, not 3D vibrations like those that occur in natural tool-mediated interactions. Several three-to-one (321) dimensional reduction algorithms have thus been developed to combine 3D vibrations into 1D vibrations. Surprisingly, the perceptual quality of 321-converted vibrations has never been comprehensively compared to rendering of the original 3D signals. In this study, we develop a multi-dimensional vibration rendering system using a magnetic levitation haptic interface. We verify the system's ability to generate realistic 3D vibrations recorded in both tapping and dragging interactions with four surfaces. We then conduct a study with 15 participants to measure the perceived dissimilarities between five 321 algorithms (SAZ, SUM, VM, DFT, PCA) and the original recordings. The resulting perceptual space is investigated with multiple regression and Procrustes analysis to unveil the relationship between the physical and perceptual properties of 321-converted vibrations. Surprisingly, we found that participants perceptually discriminated the original 3D vibrations from all tested 1D versions. Overall, our results indicate that spectral, temporal, and directional attributes may all contribute to the perceived similarities of vibration signals.
DOI BibTeX
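Two of the simpler 321 algorithms named above, SUM and VM, can be sketched in a few lines. These are generic interpretations of per-sample axis summation and per-sample Euclidean norm (with an invented test signal), not the paper's exact implementations:

```python
import numpy as np

def reduce_sum(v):
    """SUM-style reduction: add the three axes sample-by-sample (preserves sign)."""
    return v.sum(axis=1)

def reduce_vm(v):
    """VM-style reduction: Euclidean norm per sample (magnitude only, sign is lost)."""
    return np.linalg.norm(v, axis=1)

# Hypothetical 3-axis vibration: 1 kHz sampling, a different tone on each axis
fs = 1000
t = np.arange(0, 0.1, 1 / fs)
v = np.stack([np.sin(2 * np.pi * 50 * t),
              0.5 * np.sin(2 * np.pi * 120 * t),
              0.2 * np.sin(2 * np.pi * 200 * t)], axis=1)
one_d_sum = reduce_sum(v)  # shape (100,)
one_d_vm = reduce_vm(v)    # shape (100,), nonnegative
```

Spectrum-preserving variants such as DFT321 instead combine the axes in the frequency domain so the 1D signal keeps the total spectral energy of the 3D original.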

Haptic Intelligence Miscellaneous A Sensorized Needle-Insertion Device for Characterizing Percutaneous Thoracic Tool-Tissue Interactions L’Orsa, R., Zareinia, K., Westwick, D., Sutherland, G., Kuchenbecker, K. J. 25-26, Short paper (2 pages) presented at the Hamlyn Symposium on Medical Robotics (HSMR), London, UK, June 2022 (Published)
Serious complications during chest tube insertion are relatively rare, but can have catastrophic repercussions. We propose semi-automating tool insertion to help protect against non-target tissue puncture, and report first steps collecting and characterizing needle-tissue interaction forces in a tissue phantom used for chest tube insertion training.
URL BibTeX

Haptic Intelligence Miscellaneous Dense 3D Reconstruction Through Lidar: A New Perspective on Computer-Integrated Surgery Caccianiga, G., Kuchenbecker, K. J. 63-64, Short paper (2 pages) presented at the Hamlyn Symposium on Medical Robotics (HSMR), London, UK, June 2022 (Published)
Technical innovations in sensing and computation are quickly advancing the field of computer-integrated surgery. In this fast-evolving panorama, we strongly believe there is still a need for robust geometric reconstruction of the surgical field. 3D reconstruction in surgery has been investigated almost exclusively in the space of mono- and stereoscopic visual imaging because surgeons always view the procedure through a clinical endoscope. Meanwhile, lidar (light detection and ranging) has greatly expanded in use, especially in SLAM for robotics, terrestrial vehicles, and drones. In parallel to these developments, the concept of multiple-viewpoint surgical imaging was proposed in the early 2010s in the context of magnetic actuation and micro-invasive surgery. Here we propose an approach in which each surgical cannula can potentially hold a miniature lidar. Direct comparison between lidar from different viewpoints and a state-of-the-art 3D reconstruction method on stereoendoscope images showed that lidar-generated point clouds achieve better accuracy and scene coverage. This experiment especially hints at the potential of lidar imaging when deployed in a multiple-viewpoint approach.
URL BibTeX

Haptic Intelligence Miscellaneous Comparing Two Grounded Force-Feedback Haptic Devices Fazlollahi, F., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Hamburg, Germany, May 2022 (Published)
Even when they are not powered, grounded force-feedback haptic devices apply forces on the user's hand. These undesired forces stem from gravity, friction, and other nonidealities, and they still exist when the device renders a virtual environment. This demo invites users to compare how the 3D Systems Touch and Touch X devices render the same haptic content. Participants will try both devices in free space and touch a stiff frictionless virtual surface. After reflecting on the differences between the two devices, each person will receive a booklet showing the quantitative performance criteria we measured for both devices using Haptify, our benchmarking system.
BibTeX

Haptic Intelligence Miscellaneous Finger Contact during Pressing and Sliding on a Glass Plate Nam, S., Gueorguiev, D., Kuchenbecker, K. J. Poster presented at the EuroHaptics Workshop on Skin Mechanics and its Role in Manipulation and Perception, Hamburg, Germany, May 2022 (Published)
Light contact between the finger and the surface of an object sometimes causes an unanticipated slip. However, the conditions that cause this slip are not fully understood, mainly because biological components interact in complex ways to generate the skin-surface frictional properties. We investigated how the contact area starts slipping in various conditions of moisture, occlusion, and temperature during a lateral motion performed while pressing lightly on the surface.
BibTeX

Haptic Intelligence Miscellaneous HuggieBot: A Human-Sized Haptic Interface Block, A. E., Seifi, H., Christen, S., Javot, B., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Hamburg, Germany, May 2022, Award for best hands-on demonstration (Published)
How many people have you hugged in these past two years of social distancing? Unfortunately, many people we interviewed exchanged fewer hugs with friends and family since the onset of the COVID-19 pandemic. Hugging has several health benefits, such as improved oxytocin levels, lowered blood pressure, and alleviated stress and anxiety. We created a human-sized haptic interface called HuggieBot to provide the benefits of hugs in situations when receiving a hug from another person is difficult or impossible. In this demonstration, participants of all shapes and sizes can walk up to HuggieBot, enter an embrace, perform several intra-hug gestures (hold still, rub, pat, or squeeze the robot) if desired, feel the robot's response, and leave the hug when they are ready.
BibTeX

Haptic Intelligence Conference Paper Larger Skin-Surface Contact Through a Fingertip Wearable Improves Roughness Perception Gueorguiev, D., Javot, B., Spiers, A., Kuchenbecker, K. J. In Haptics: Science, Technology, Applications, 13235:171-179, Lecture Notes in Computer Science, (Editors: Seifi, Hasti and Kappers, Astrid M. L. and Schneider, Oliver and Drewing, Knut and Pacchierotti, Claudio and Abbasimoshaei, Alireza and Huisman, Gijs and Kern, Thorsten A.), Springer, Hamburg, Germany, International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (EuroHaptics), May 2022 (Published)
With the aim of creating wearable haptic interfaces that allow the performance of everyday tasks, we explore how differently designed fingertip wearables change the sensory threshold for tactile roughness perception. Study participants performed the same two-alternative forced-choice roughness task with a bare finger and wearing three flexible fingertip covers: two with a square opening (64 and 36 mm2, respectively) and the third with no opening. The results showed that adding the large opening improved the 75% JND by a factor of two compared to the fully covered finger: the larger the skin-surface contact area, the better the roughness perception. Overall, the results show that even partial skin-surface contact through a fingertip wearable improves roughness perception, which opens design opportunities for haptic wearables that preserve natural touch.
DOI BibTeX
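As a rough illustration of the threshold measure reported above, the sketch below reads the 75% just-noticeable difference (JND) off a two-alternative forced-choice psychometric curve by linear interpolation. The stimulus values and proportions correct are hypothetical numbers invented for this example, not data from the paper; they are merely shaped so that the covered-finger JND comes out about twice the bare-finger JND.

```python
import numpy as np

def jnd_75(deltas, p_correct):
    """Interpolate the stimulus difference at 75% correct in a
    two-alternative forced-choice task (chance performance = 50%)."""
    deltas = np.asarray(deltas, dtype=float)
    p = np.asarray(p_correct, dtype=float)
    return float(np.interp(0.75, p, deltas))

# hypothetical roughness-discrimination data (stimulus differences, arbitrary units)
deltas = [0.05, 0.10, 0.20, 0.40, 0.80]
bare = [0.55, 0.65, 0.80, 0.95, 1.00]     # bare finger: steep psychometric curve
covered = [0.52, 0.55, 0.62, 0.78, 0.92]  # fully covered finger: shallow curve

ratio = jnd_75(deltas, covered) / jnd_75(deltas, bare)
print(f"JND ratio (covered / bare): {ratio:.1f}")
```

In practice a psychometric function (e.g., a cumulative Gaussian) would be fitted to the response proportions rather than interpolated linearly, but the 75%-correct point is read off in the same way.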

Haptic Intelligence Article Normal and Tangential Forces Combine to Convey Contact Pressure During Dynamic Tactile Stimulation Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J. Scientific Reports, 12(1):8215, May 2022 (Published)
Humans need to accurately process the contact forces that arise as they perform everyday haptic interactions such as sliding the fingers along a surface to feel for bumps, sticky regions, or other irregularities. Several different mechanisms are possible for how the forces on the skin could be represented and integrated in such interactions. In this study, we used a force-controlled robotic platform and simultaneous ultrasonic modulation of the finger-surface friction to independently manipulate the normal and tangential forces during passive haptic stimulation by a flat surface. When judging whether the contact pressure on their finger had briefly increased or decreased during individual trials in this broad stimulus set, participants did not rely solely on either the normal force or the tangential force. Instead, they integrated tactile cues induced by both components. Support-vector-machine analysis classified physical trial data with up to 75% accuracy and suggested a linear perceptual mechanism. In addition, the change in the amplitude of the force vector predicted participants' responses better than the change of the coefficient of dynamic friction, suggesting that intensive tactile cues are meaningful in this task. These results provide novel insights about how normal and tangential forces shape the perception of tactile contact.
DOI BibTeX
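The support-vector-machine analysis mentioned above can be sketched as follows: a linear SVM is trained to predict a binary perceptual response ("pressure increased" vs. "decreased") from the changes in normal and tangential force on each trial. Everything here, including the feature definitions, weights, and noise levels, is an assumption made for illustration; it is not the study's data or analysis code.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n = 400

# hypothetical per-trial features: change in normal force dFn and
# tangential force dFt (N), independently manipulated as in the study
dFn = rng.normal(0.0, 0.3, n)
dFt = rng.normal(0.0, 0.3, n)

# assume the perceived pressure change tracks a linear combination of
# both force components, plus decision noise (illustrative weights)
signal = 0.6 * dFn + 0.8 * dFt
label = (signal + rng.normal(0.0, 0.25, n) > 0).astype(int)  # 1 = "increased"

X = np.column_stack([dFn, dFt])
clf = LinearSVC().fit(X, label)
acc = clf.score(X, label)
print(f"classification accuracy: {acc:.2f}")
```

With noisy binary responses, even the ideal linear classifier tops out well below 100%; an accuracy in the 70-80% range, as simulated here, is consistent with a linear integration mechanism rather than reliance on a single force component.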

Haptic Intelligence Article Predicting Knee Adduction Moment Response to Gait Retraining with Minimal Clinical Data Rokhmanova, N., Kuchenbecker, K. J., Shull, P. B., Ferber, R., Halilaj, E. PLOS Computational Biology, 18(5):e1009500, May 2022 (Published)
Knee osteoarthritis is a progressive disease mediated by high joint loads. Foot progression angle modifications that reduce the knee adduction moment (KAM), a surrogate of knee loading, have demonstrated efficacy in alleviating pain and improving function. Although changes to the foot progression angle are overall beneficial, KAM reductions are not consistent across patients. Moreover, customized interventions are time-consuming and require instrumentation not commonly available in the clinic. We present a regression model that uses minimal clinical data (a set of six features easily obtained in the clinic) to predict the extent of first peak KAM reduction after toe-in gait retraining. For such a model to generalize, the training data must be large and variable. Given the lack of large public datasets that contain different gaits for the same patient, we generated this dataset synthetically. Insights learned from a ground-truth dataset with both baseline and toe-in gait trials (N = 12) enabled the creation of a large (N = 138) synthetic dataset for training the predictive model. On a test set of data collected by a separate research group (N = 15), the first peak KAM reduction was predicted with a mean absolute error of 0.134% body weight * height (%BW*HT). This error is smaller than the standard deviation of the first peak KAM during baseline walking averaged across test subjects (0.306%BW*HT). This work demonstrates the feasibility of training predictive models with synthetic data and provides clinicians with a new tool to predict the outcome of patient-specific gait retraining without requiring gait lab instrumentation.
DOI BibTeX
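The train-on-synthetic, test-on-external-data workflow described above can be sketched in a few lines: fit a regression on a synthetic cohort of the paper's size (N = 138) and evaluate mean absolute error on a held-out set of 15 subjects. The six features, their weights, the noise level, and the choice of ridge regression are all illustrative assumptions; the paper's actual features and model may differ.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 138  # size of the synthetic training cohort in the paper

# six hypothetical standardized clinical features per patient
X = rng.normal(size=(n, 6))
w_true = np.array([0.05, -0.03, 0.08, 0.02, -0.06, 0.04])  # invented weights

# target: first-peak KAM reduction in %BW*HT, with measurement noise
y = X @ w_true + rng.normal(0.0, 0.05, n)

# hold out 15 patients, mirroring the external test set size
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=15, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
mae = float(np.abs(model.predict(X_te) - y_te).mean())
print(f"mean absolute error: {mae:.3f} %BW*HT")
```

The key check, as in the paper, is whether the prediction error is smaller than the natural trial-to-trial variability of the quantity being predicted; otherwise the model adds no clinical value.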

Haptic Intelligence Article Design of Interactive Augmented Reality Functions for Robotic Surgery and Evaluation in Dry-Lab Lymphadenectomy Forte, M., Gourishetti, R., Javot, B., Engler, T., Gomez, E. D., Kuchenbecker, K. J. The International Journal of Medical Robotics and Computer Assisted Surgery, 18(2):e2351, April 2022 (Published)
Augmented reality (AR) has been widely researched for use in healthcare. Prior AR for robot-assisted minimally invasive surgery has mainly focused on superimposing preoperative 3D images onto patient anatomy. This paper presents alternative interactive AR tools for robotic surgery. We designed, built, and evaluated four voice-controlled functions: viewing a live video of the operating room, viewing two-dimensional preoperative images, measuring 3D distances, and warning about out-of-view instruments. This low-cost system was developed on a da Vinci Si, and it can be integrated into surgical robots equipped with a stereo camera and a stereo viewer. Eight experienced surgeons performed dry-lab lymphadenectomies and reported that the functions improved the procedure. They particularly appreciated the possibility of accessing the patient's medical records on demand, measuring distances intraoperatively, and interacting with the functions using voice commands. The positive evaluations garnered by these alternative AR functions and interaction methods provide support for further exploration.
DOI BibTeX

Haptic Intelligence Robotics Article Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Frontiers in Robotics and AI, 9(840335):1-17, April 2022 (Published)
Social touch is essential to everyday interactions, but current socially assistive robots have limited touch-perception capabilities. Rather than build entirely new robotic systems, we propose to augment existing rigid-bodied robots with an external touch-perception system. This practical approach can enable researchers and caregivers to continue to use robotic technology they have already purchased and learned about, but with a myriad of new social-touch interactions possible. This paper presents a low-cost, easy-to-build, soft tactile-perception system that we created for the NAO robot, as well as participants' feedback on touching this system. We installed four of our fabric-and-foam-based resistive sensors on the curved surfaces of a NAO's left arm, including its hand, lower arm, upper arm, and shoulder. Fifteen adults then performed five types of affective touch-communication gestures (hitting, poking, squeezing, stroking, and tickling) at two force intensities (gentle and energetic) on the four sensor locations; we share this dataset of four time-varying resistances, our sensor patterns, and a characterization of the sensors' physical performance. After training, a gesture-classification algorithm based on a random forest identified the correct combined touch gesture and force intensity on windows of held-out test data with an average accuracy of 74.1%, which is more than eight times better than chance. Participants rated the sensor-equipped arm as pleasant to touch and liked the robot's presence significantly more after touch interactions. Our promising results show that this type of tactile-perception system can detect necessary social-touch communication cues from users, can be tailored to a variety of robot body parts, and can provide HRI researchers with the tools needed to implement social touch in their own systems.
DOI BibTeX
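The random-forest pipeline described above can be illustrated with a minimal sketch: compute simple summary features over a window of multi-sensor resistance signals and classify the gesture. For brevity this sketch distinguishes only two invented gesture shapes ("poke" vs. "stroke") rather than the paper's ten gesture-intensity classes, and the signal shapes, feature set, and noise level are all assumptions, not the published method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def make_window(gesture, n_samples=50):
    """Simulate one window of resistance signals from 4 sensors."""
    t = np.linspace(0.0, 1.0, n_samples)
    if gesture == 0:  # "poke": one sharp, brief bump
        sig = np.exp(-((t - 0.5) ** 2) / 0.005)
    else:             # "stroke": slow sweep across the window
        sig = np.sin(np.pi * t)
    x = np.tile(sig, (4, 1)) + rng.normal(0.0, 0.2, (4, n_samples))
    # simple per-sensor window features: mean, std, min, max
    return np.concatenate([x.mean(1), x.std(1), x.min(1), x.max(1)])

labels = rng.integers(0, 2, 200)
X = np.array([make_window(g) for g in labels])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:150], labels[:150])
acc = clf.score(X[150:], labels[150:])
print(f"held-out accuracy: {acc:.2f}")
```

With ten classes and real fabric-sensor noise, accuracy naturally drops well below this toy two-class case, which is why the paper benchmarks its 74.1% against the chance level rather than against 100%.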

Haptic Intelligence Conference Paper Robot, Pass Me the Tool: Handle Visibility Facilitates Task-Oriented Handovers Ortenzi, V., Filipovica, M., Abdlkarim, D., Pardi, T., Takahashi, C., Wing, A. M., Di Luca, M., Kuchenbecker, K. J. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 256-264, Sapporo, Japan, March 2022, Valerio Ortenzi and Maija Filipovica contributed equally to this publication (Published)
A human handing over an object modulates their grasp and movements to accommodate their partner's capabilities, which greatly increases the likelihood of a successful transfer. State-of-the-art robot behavior lacks this level of user understanding, resulting in interactions that force the human partner to shoulder the burden of adaptation. This paper investigates how visual occlusion of the object being passed affects the subjective perception and quantitative performance of the human receiver. We performed an experiment in virtual reality where seventeen participants were tasked with repeatedly reaching to take a tool from the hand of a robot; each of the three tested objects (hammer, screwdriver, scissors) was presented in a wide variety of poses. We carefully analysed the user's hand and head motions, the time to grasp the object, and the chosen grasp location, as well as participants' ratings of the grasp they just performed. Results show that initial visibility of the handle significantly increases the reported holdability and immediate usability of a tool. Furthermore, a robot that offers objects so that their handles are more occluded forces the receiver to spend more time in planning and executing the grasp and also lowers the probability that the tool will be grasped by the handle. Together these findings indicate that robots can more effectively support their human work partners by increasing the visibility of the intended grasp location of objects being passed.
DOI BibTeX

Haptic Intelligence Robotics Miscellaneous Sensor Patterns Dataset for Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Dataset published as a companion to the journal article "Endowing a NAO Robot with Practical Social-Touch Perception" in Frontiers in Robotics and AI, March 2022 (Published) DOI BibTeX

Haptic Intelligence Robotics Miscellaneous User Study Dataset for Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Dataset published as a companion to the journal article "Endowing a NAO Robot with Practical Social-Touch Perception" in Frontiers in Robotics and AI, March 2022 (Published) DOI BibTeX

Autonomous Learning Haptic Intelligence Article A Soft Thumb-Sized Vision-Based Sensor with Accurate All-Round Force Perception Sun, H., Kuchenbecker, K. J., Martius, G. Nature Machine Intelligence, 4(2):135-145, February 2022 (Published)
Vision-based haptic sensors have emerged as a promising approach to robotic touch due to affordable high-resolution cameras and successful computer-vision techniques. However, their physical design and the information they provide do not yet meet the requirements of real applications. We present a robust, soft, low-cost, vision-based, thumb-sized 3D haptic sensor named Insight: it continually provides a directional force-distribution map over its entire conical sensing surface. Constructed around an internal monocular camera, the sensor has only a single layer of elastomer over-molded on a stiff frame to guarantee sensitivity, robustness, and soft contact. Furthermore, Insight is the first system to combine photometric stereo and structured light using a collimator to detect the 3D deformation of its easily replaceable flexible outer shell. The force information is inferred by a deep neural network that maps images to the spatial distribution of 3D contact force (normal and shear). Insight has an overall spatial resolution of 0.4 mm, force magnitude accuracy around 0.03 N, and force direction accuracy around 5 degrees over a range of 0.03 to 2 N for numerous distinct contacts with varying contact area. The presented hardware and software design concepts can be transferred to a wide variety of robot parts.
DOI URL BibTeX

Haptic Intelligence Article Adaptive Optimal Measurement Algorithm for ERT-Based Large-Area Tactile Sensors Park, K., Lee, H., Kuchenbecker, K. J., Kim, J. IEEE/ASME Transactions on Mechatronics, 27(1):304-314, February 2022 (Published)
Electrical resistance tomography (ERT) is an inferential imaging technique that has shown promising results for enabling large-area tactile sensors constructed from a piezoresistive sheet. The performance of such sensors is improved by increasing the number of electrodes, but the number of measurements and the computational cost also increase. In this article, we propose a new measurement algorithm for ERT-based tactile sensors: it adaptively changes the measurement pattern to be optimal for the present external stimulus. Regions of normal pressure are first detected by a base measurement pattern that maximizes the distinguishability of local conductivity changes. When a new contact is detected, a set of local patterns is selectively recruited near the pressed region to acquire more detailed information. For fast and parallel execution, the proposed algorithm is implemented with a field-programmable gate array. It is validated through indentation experiments on an ERT-based sensor that has 32 electrodes. The optimized base pattern of 100 measurements enabled a frame rate five times faster than before. Transmitting only detected contact events reduced the idle data rate to 0.5% of its original value. The pattern adapted to new contacts with a latency of only 80 μs and an accuracy of 99.5%, enabling efficient, high-quality real-time reconstruction of complex multicontact conditions.
DOI BibTeX

Haptic Intelligence Autonomous Learning MPI Year Book Fingerspitzengefühl für Roboter / Sensitive fingertips for robots Sun, H., Kuchenbecker, K. J., Martius, G. January 2022 (Published)
Striving to improve touch sensing in robotics, scientists at the Max Planck Institute for Intelligent Systems developed a thumb-shaped sensor with a camera hidden inside and trained a deep neural network to infer its haptic contact information. The system constantly constructs a force map – where and how things are touching the flexible outer shell of the sensor – and in this way “sees” the contact deformations. This research invention significantly improves a robot finger’s haptic perception, coming ever closer to the sense of touch of human skin, though it works in a completely different way.
URL BibTeX

Haptic Intelligence Article Evaluation of Vibrotactile Output from a Rotating Motor Actuator Gourishetti, R., Kuchenbecker, K. J. IEEE Transactions on Haptics, 15(1):39-44, January 2022, Presented at the IEEE Haptics Symposium (Published)
Specialized vibrotactile actuators are widely used to output haptic sensations due to their portability and robustness; some models are expensive and capable, while others are economical but weaker and less expressive. To increase the accessibility of high-quality haptics, we designed a cost-effective actuation approach called the rotating motor actuator (RMA): it uses a small DC motor to generate vibrotactile cues on a rigid stylus. We conducted a psychophysical experiment where eighteen volunteers matched the RMA's vibration amplitudes with those from a high-quality reference actuator (Haptuator Mark II) at twelve frequencies from 50 Hz to 450 Hz. The average error in matching acceleration magnitudes was 10.2%. More current was required for the RMA than the reference actuator; a stronger DC motor would require less current. Participants also watched a video of a real tool-mediated interaction with playback of recorded vibrotactile cues from each actuator. 94.4% of the participants agreed that the RMA delivered realistic vibrations and audio cues during this replay. 83.3% reported that the RMA vibrations were pleasant, compared to 66.7% for the reference. A possible cause for this significant difference may be that the reference actuator (which has a mechanical resonance) distorts low-frequency vibrations more than the RMA does.
DOI BibTeX