Publications

Haptic Intelligence Article Comparing Placement and Polarity Configurations of a Two-Magnet Fingertip Vibrotactile Device Gertler, I., Ballardini, G., Tangolar, D., Serhat, G., Kuchenbecker, K. J. Scientific Reports, March 2026 (Published)
Vibrotactile feedback enriches the use of wearable technologies for entertainment, navigation, and healthcare. The actuators of these portable systems, particularly fingertip devices, need to be compact, comfortable, and easy to integrate. Multiple vibrating elements could enhance perceptual realism, but how should they be arranged and oriented on the fingerpad? Here, we evaluate a simple approach that uses an audio input signal to drive an air coil that vibrates two magnets embedded in a soft fingertip sheath; the magnets are arranged in the radial-ulnar or proximal-distal direction with either the same or opposite polarity. We explore the effects of these new device configurations on both dynamic response and haptic perception. Experimental results indicate that the vibrations were perceived well across frequencies, with stronger sensations between 180 and 360 Hz, which aligns with the high vibration magnitudes our computational simulation predicts in this frequency range. Interestingly, perceptual responses showed that participants mainly classified vibrations based on the excitation frequency rather than the polarity of the magnets. Participants also rated vibrotactile feedback derived from recorded sounds and replayed for different interactions. Their evaluations offer promising evidence that this actuation approach could be used in extended-reality applications to improve transient user interactions with virtual objects.
DOI BibTeX

Haptic Intelligence Robotic Materials Medical Systems Article Functional Gradients Facilitate Tactile Sensing in Elephant Whiskers Schulz, A. K., Kaufmann, L. V., Smith, L. T., Philip, D. S., David, H., Lazovic, J., Brecht, M., Richter, G., Kuchenbecker, K. J. Science, 391(6786):712-718, February 2026, Lena V. Kaufmann and Lawrence T. Smith contributed equally to this work (Published)
Keratin composites enable animals to hike with hooves, fly with feathers, and sense with skin. Mammalian whiskers are elongated keratin rods attached to tactile skin structures that extend the animal's sensory volume. We investigated the whiskers that cover Asian elephant (Elephas maximus) trunks and found that they are geometrically and mechanically tailored to facilitate tactile perception by encoding contact location in the amplitude and frequency of the vibrotactile signal felt at the whisker base. Elephant whiskers emerge from armored trunk skin and shift from a thick, circular, porous, stiff base to a thin, ovular, dense, soft tip. These functional gradients of geometry, porosity, and stiffness independently tune the neuromechanics of elephant trunk touch to facilitate highly dexterous manipulation while ensuring whisker durability.
MPI-IS News Article YouTube Video Highlight Whisker Simulation Toolkit Edmond Data Repository Download Paper for Free Press Coverage DOI BibTeX

Haptic Intelligence Robotics Article Open-Source Hardware and Software Platform for Vibrotactile Motion Guidance Rokhmanova, N., Martus, J., Faulkner, R., Fiene, J., Kuchenbecker, K. J. Device, 4(1):100966, January 2026 (Published)
Vibrotactile feedback can enhance motor learning, sports training, and rehabilitation, but a lack of standardized tools limits its adoption. We developed a modular open-source hardware and software platform for delivering vibrotactile feedback that is spatially and temporally precise. The prototype device uses medical adhesive, linear resonant actuators (LRAs), and rigid 3D-printed components to standardize skin contact, avoiding the variability introduced by straps. The platform was validated by using the device's built-in accelerometers to fit a dynamic model of mechanical actuator vibration and examine how the anatomical site and body composition affect perceived vibration strength in 20 participants. Then, the platform was integrated with an optical motion-capture system to teach six participants a toe-in gait, showing potential for real-time, tailored clinical studies. By openly sharing the platform's hardware and software, we provide tools for delivering standardized vibrations and benchmarking feedback strategies in diverse applications.
DOI BibTeX

Haptic Intelligence Article Creating an Affective Robot That Feels Both Touch and Emotion Burns, R. B., Richardson, B. A., Klingenberg, J., Kuchenbecker, K. J. IEEE Transactions on Affective Computing, 1-18, December 2025, Rachael Bevill Burns and Benjamin A. Richardson contributed equally to this publication (Published)
Despite the importance of sensitive skin for living creatures, most robots can feel contact on only a tiny fraction of their exterior, if at all. Furthermore, typical robot reactions to touch are limited to event-based acknowledgments, lacking perceptual richness, lifelike positive/negative responses, and temporal dynamics. We address these gaps by introducing a practical full-body tactile-perception system for social robots, turning a NAO robot into the Haptic Empathetic Robot Animal (HERA). The sixteen main regions of the robot's body are instrumented with soft resistive tactile sensors covered by a tailored koala suit. Windows of each time-varying sensor output are continually classified into five gestures at two intensities via a two-stage machine-learning model. On challenging testing data containing simultaneous contacts, touch detection achieves an F1 score of 0.773, and gesture recognition achieves 52.2% accuracy (5.2 times chance); considering the temporal, spatial, and semantic adjacency of the applied touches increases these metrics to 0.896 and 86.6%, respectively. In turn, each detected contact drives a real-time emotion model that represents the robot's affective state as a second-order dynamic system analogous to a mass-spring-damper. This model's parameters control the robot's disposition, stoicism, and calmness. We explain the connections between HERA's hardware and software subsystems and demonstrate their combined ability to create an affective robot that feels both touch and emotion.
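The second-order emotion dynamics described in this abstract can be illustrated with a minimal sketch. This is a hedged example with assumed parameter values and a single scalar affective state; it is not the authors' implementation, and the mapping of mass, stiffness, and damping to disposition, stoicism, and calmness is only analogical.

```python
# Minimal sketch (not the authors' implementation) of an affective state
# modeled as a second-order mass-spring-damper system driven by touch.
# All parameter values below are illustrative assumptions.

class EmotionModel:
    def __init__(self, mass=1.0, stiffness=4.0, damping=1.0, dt=0.02):
        self.m, self.k, self.c, self.dt = mass, stiffness, damping, dt
        self.x = 0.0  # affective state, resting at neutral zero
        self.v = 0.0  # rate of change of the affective state

    def step(self, touch_force=0.0):
        # m*x'' + c*x' + k*x = touch_force, integrated with forward Euler
        a = (touch_force - self.c * self.v - self.k * self.x) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x

model = EmotionModel()
for _ in range(25):          # a brief positive touch (0.5 s of forcing)
    model.step(touch_force=1.0)
peak = model.x               # affect rises while the touch persists
for _ in range(500):         # 10 s with no touch: state decays to neutral
    model.step()
```

In a real-time system, this update would run at the controller's loop rate, with each detected touch gesture mapped to a signed force input so that pleasant and unpleasant contacts push the state in opposite directions.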
DOI BibTeX

Haptic Intelligence Robotic Materials Article Wearable Electrohydraulic Actuation for Salient Full-Fingertip Haptic Feedback Shao, Y., Shagan Shomron, A., Javot, B., Keplinger, C., Kuchenbecker, K. J. Advanced Materials Technologies, 10(12):2401525, June 2025, Yitian Shao and Alona Shagan Shomron contributed equally to this publication. This article was selected for the front cover. https://doi.org/10.1002/admt.202570062 (Published)
Although essential for an immersive experience in extended reality (XR), providing salient and versatile touch feedback remains a technical challenge. Existing solutions restrict hand movements with bulky rigid structures, require a tethered energy source to power actuators worn on the hand, or output vibrations that lack expressiveness. This study introduces a design strategy for compact, lightweight, untethered haptic feedback centering on a 30-µm-thick inflatable chamber that naturally conforms to the fingertip; to minimize fluidic losses and enable high bandwidth, a soft electrohydraulic pump mounted on the hand actuates the chamber via a mechanically transparent fluidic channel. A 15.2-mm-diameter prototypical actuation chamber achieves 8 N peak force, 3 N steady-state force, stroke up to 5 mm, and bandwidth from 0 to 500 Hz. In contrast to these salient fingertip cues, the entire hydraulic system has a weight less than 8 g and a thickness less than 2 mm. Additionally, this study presents a validation approach that uses a commercial fingertip sensor to confirm that the haptic feedback created by the device imitates the touch signals generated during typical hand interactions. Together, this design strategy and validation method can enable a broad spectrum of haptic activities in diverse XR applications, including medical training, online shopping, and social interactions.
DOI BibTeX

Haptic Intelligence Article Comparing Puncture-Detection Approaches for Manual Needle Insertions Through the Parietal Pleura L’Orsa, R., Zareinia, K., Sutherland, G. R., Westwick, D., Kuchenbecker, K. J. IEEE Transactions on Medical Robotics and Bionics, 7(2):455-468, May 2025 (Published)
Tube thoracostomy (chest tube insertion) is a surgical procedure that treats pneumothorax, a potentially life-threatening condition where air accumulates between the chest wall and the lungs. The literature reports high complication rates for this procedure, including accidental fatality due to poor manual depth control during tool insertion. We hypothesize that an instrumented needle-holder could help operators recognize pleural puncture and improve depth control, and we present a puncture-detection experiment that contributes toward this goal. An operator manually inserted a bevel-tip needle into ex vivo porcine ribs and through the parietal pleura via a sensorized percutaneous device that records position, force, and videos. We use this rich dataset of 63 insertions to thoroughly test four previously published data-driven puncture-detection (DDPD) algorithms against two new real-time algorithms: a custom recursive digital filter with coefficients optimized for our application, and a difference equation that compares standard deviations between adjacent sliding windows. Our algorithms achieve a precision (true positives over total identified punctures) of 23% and 22%, respectively, while the precision of existing DDPD algorithms ranges from 0% to 21%. Despite these performance improvements, our results show the limitations of DDPD algorithms and motivate new methods for detecting pleural membrane punctures in thoracostomy.
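The adjacent-sliding-window idea mentioned above can be sketched briefly. The window length, the use of a ratio test, and the threshold value are illustrative assumptions, not the parameters reported in the paper.

```python
# Minimal sketch (not the paper's algorithm parameters) of puncture
# detection by comparing standard deviations of adjacent sliding windows.
# The window length and ratio threshold are illustrative assumptions.
import statistics

def detect_punctures(force, window=50, ratio_threshold=3.0):
    """Return indices where the std of a window exceeds ratio_threshold
    times the std of the immediately preceding window."""
    events = []
    for i in range(window, len(force) - window + 1, window):
        prev = statistics.pstdev(force[i - window:i])
        curr = statistics.pstdev(force[i:i + window])
        if prev > 0 and curr / prev > ratio_threshold:
            events.append(i)
    return events

# Synthetic force trace: a smooth ramp (steady insertion) followed by
# an abrupt oscillation (simulated puncture) starting at sample 200.
trace = [0.02 * i for i in range(200)]
trace += [1.0 + ((-1) ** i) * 2.0 for i in range(200, 250)]
```

Because the comparison is relative rather than absolute, this style of detector tolerates the variable insertion speeds and force levels that arise in manual procedures.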
DOI BibTeX

Haptic Intelligence Article Enhancing Needle Puncture Detection Using High-Pass Filtering and Diffuse Reflectance L’Orsa, R., Bisht, A., Yu, L., Murari, K., Sutherland, G. R., Westwick, D. T., Kuchenbecker, K. J. Frontiers in Robotics and AI, 12(1429327):1-16, May 2025 (Published)
Chest trauma or disease progression can lead to tension pneumothorax, a condition where mounting pressurization of the pleural cavity (the space between the chest wall and the lungs) leads rapidly to cardiac arrest. In pre-hospital settings, tension pneumothorax is treated by venting the pleural cavity via a needle introduced through the chest wall. Very high failure rates (up to 94.1%) have been reported for pre-hospital needle decompression, however, and the procedure can result in the accidental puncture of critical thoracic tissues because it is performed blind. Instrumented needles could help operators more reliably identify when the tool has entered the target space. This paper investigates technical approaches to provide such support; we created an experimental system that acquires needle force and position signals, as well as the diffuse backscattered reflectance from white light carried to and collected from the needle's tip via two in-bore optical fibers. Data collection occurred while two experimenters inserted a bevel-tipped percutaneous needle into an ex vivo porcine rib section simulating human chest anatomy. Four data-driven puncture-detection (DDPD) algorithms from the literature, which are appropriate for use with the variable tool velocities produced by manual insertions, were applied to the resulting data set offline. Grid search was performed across key signal-processing parameters, high-pass filters (HPFs) were applied to examine their impact on puncture detection, and a first exploration of multimodal (ensemble) methods was performed. Combining high-pass filters with DDPD methods resulted in a 2.7-fold improvement (from 8.2% to 21.9%) in the maximum overall precision (MOP) produced by force signals. 
Applying this HPF + DDPD scheme to reflectance data streams yielded a peak MOP of 36.4%, and combining reflectance with force generated the best MOP overall (42.1%); these results represent 4.4-fold and 5.1-fold improvements, respectively, over the best MOP produced by the traditional application of DDPD algorithms to force signals alone. These results strongly support the utility of high-pass filters combined with both reflectance-only and multimodal reflectance-plus-force data-driven puncture-detection schemes for needle decompression applications.
DOI BibTeX

Haptic Intelligence Robotics Article Building Instructions You Can Feel: Edge-Changing Haptic Devices for Digitally Guided Construction Tashiro, N., Faulkner, R., Melnyk, S., Rosales Rodriguez, T., Javot, B., Tahouni, Y., Cheng, T., Wood, D., Menges, A., Kuchenbecker, K. J. ACM Transactions on Computer-Human Interaction, 32(1):1-40, April 2025 (Published)
Recent efforts to connect builders to digital designs during construction have primarily focused on visual augmented reality, which requires accurate registration and specific lighting, and which could prevent a user from noticing safety hazards. Haptic interfaces, on the other hand, can convey physical design parameters through tangible local cues that don't distract from the surroundings. We propose two edge-changing haptic devices that use small inertial measurement units (IMUs) and linear actuators to guide users to perform construction tasks in real time: Drangle gives feedback for angling a drill relative to gravity, and Brangle assists with orienting bricks in the plane. We conducted a study with 18 participants to evaluate user performance and gather qualitative feedback. All users understood the edge-changing cues from both devices with minimal training. Drilling holes with Drangle was somewhat less accurate but much faster and easier than with a mechanical guide; 89% of participants preferred Drangle over the mechanical guide. Users generally understood Brangle's feedback but found its hand-size-specific grip, palmar contact, and attractive tactile cues less intuitive than Drangle's generalized form factor, fingertip contact, and repulsive cues. After summarizing design considerations, we propose application scenarios and speculate how such devices could improve construction workflows.
DOI BibTeX

Haptic Intelligence Article Simulation Training with Haptic Feedback of Instrument Vibrations Reduces Resident Workload During Live Robot-Assisted Sleeve Gastrectomy Gomez, E. D., Mat Husin, H., Dumon, K. R., Williams, N. N., Kuchenbecker, K. J. Surgical Endoscopy, 39(3):1523-1535, April 2025 (Published)
Background: New surgeons experience heavy workload during robot-assisted surgery partially because they must use vision to compensate for the lack of haptic feedback. We hypothesize that providing realistic haptic feedback during dry-lab simulation training may accelerate learning and reduce workload during subsequent surgery on patients. Methods: We conducted a single-blinded study with twelve general surgery residents (third and seventh post-graduate year, PGY) randomized into haptic and control groups. Participants performed five simulated bariatric surgeries on a custom inanimate simulator followed by live robot-assisted sleeve gastrectomies (RASGs) using da Vinci robots. The haptic group received naturalistic haptic feedback of instrument vibrations during their first four simulated procedures. Participants completed pre-/post-procedure STAI and post-procedure NASA-TLX questionnaires in both simulation and the operating room (OR). Results: Higher PGY level (simulation: p<0.001, OR p=0.004), shorter operative time (simulation: p<0.001, OR: p=0.003), and lower pre-procedure STAI (simulation: p=0.003, OR: p<0.001) were significantly associated with lower self-reported overall workload in both operative settings; PGY-7s reported about 10% lower workload than PGY-3s. The haptic group had significantly lower overall covariate-adjusted NASA-TLX during the fourth (p=0.03) and fifth (p=0.04) simulated procedures and across all OR procedures (p=0.047), though not for only the first three OR procedures. Haptic feedback reduced physical demand (simulation: p<0.001, OR: p=0.001) and increased perceived performance (simulation: p=0.031, OR: p<0.001) in both settings. Conclusion: Haptic feedback of instrument vibrations provided during robotic surgical simulation reduces trainee workload during both simulation and live OR cases. The implications of workload reduction and its potential effects on patient safety warrant further investigation.
DOI BibTeX

Haptic Intelligence Perceiving Systems Article Wrist-to-Wrist Bioimpedance Can Reliably Detect Discrete Self-Touch Forte, M., Vardar, Y., Javot, B., Kuchenbecker, K. J. IEEE Transactions on Instrumentation and Measurement, 74(4006511):1-11, April 2025 (Published)
Self-touch is crucial in human communication, psychology, and disease transmission, yet existing methods for detecting self-touch are often invasive or limited in scope. This study systematically investigates the feasibility of using non-invasive electrical bioimpedance for detecting discrete self-touch poses across individuals. While previous research has focused on classifying defined self-touch poses, our work explores how various poses cause bioimpedance changes, providing insights into the underlying physiological mechanisms. We thus created a dataset of 27 genuine self-touch poses, including skin-to-skin contact between the hands and face and skin-to-clothing contact between the hands and chest, alongside six adversarial mid-air gestures. We then measured the wrist-to-wrist bioimpedance of 30 adults (15 female, 15 male) across these poses, with each measurement preceded by a no-touch pose serving as a baseline. Statistical analysis of the measurements showed that skin-to-skin contacts cause significant changes in bioimpedance magnitude between 237.8 kHz and 4.1 MHz, while adversarial gestures do not; skin-to-clothing contacts cause less-significant changes due to the influence and variability of the clothing material. Furthermore, our analysis highlights the sensitivity of bioimpedance to the body parts involved, the skin contact area, and the individual's characteristics. Our contributions are two-fold: (1) we demonstrate that bioimpedance offers a practical, non-invasive solution for detecting self-touch poses involving skin-to-skin contact, and (2) researchers can leverage insights from our study to determine whether a pose can be detected without extensive testing.
DOI BibTeX

Haptic Intelligence Article A Sleeve Alters the Pressure-Stretch Curve of a Hyperelastic Balloon to Enable Pre-Programmed Sequencing Gertler, I., Kuchenbecker, K. J. Advanced Materials Technologies, 10(6):2400993, March 2025 (Published)
Coupled hyperelastic balloons that anchor alternately against a lumen wall provide an appealing locomotion method for soft robots, especially for pipe inspection and medical interventions. However, it is still challenging to use a single fluid channel to obtain a practical balloon actuation sequence, where the rear anchor is both the first to inflate and the first to deflate. The common solution delays the front balloon's reaction using fluid dynamics, producing a slow and/or bulky system. This study presents a new method that utilizes an inextensible sleeve along with geometry and mechanical properties to set the pressure-stretch curve of two silicone-rubber balloons so they could serve as the rear and front anchors when driven from a single fluid supply. Experimental measurements and numerical simulations compare the characteristic curves of thin and thick spherical balloons with identical diameters to that of a thin balloon inside a rigid encasing sleeve that delays its initial expansion. Pairing this encased thin balloon with a non-encased thick balloon yields the desired asymmetric actuation sequence. A physical demonstration of the behavior needed for self-propelling robots is achieved by placing such balloons within rigid tubes, connecting them to a shared supply, and sequentially adding and removing fluid.
DOI BibTeX

Haptic Intelligence Robotic Materials Article Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues Sanchez-Tamayo, N., Yoder, Z., Rothemund, P., Ballardini, G., Keplinger, C., Kuchenbecker, K. J. Advanced Science, 11(48):2402461, December 2024, This article was selected for the inside front cover. https://doi.org/10.1002/advs.202470295 (Published)
By focusing on vibrations, current wearable haptic devices underutilize the skin's perceptual capabilities. Devices that provide richer haptic stimuli, including contact feedback and/or variable pressure, are typically heavy and bulky due to the underlying actuator technology and the low sensitivity of hairy skin, which covers most of the body. This paper presents a system architecture for compact wearable devices that deliver salient and pleasant broad-bandwidth haptic cues: Cutaneous Electrohydraulic (CUTE) devices combine a custom materials design for soft haptic electrohydraulic actuators that feature high stroke, high force, and electrical safety with a comfortable mounting strategy that places the actuator in a non-contact resting position. A prototypical wrist-wearable CUTE device produces rich tactile sensations by making and breaking contact with the skin (2.44 mm actuation stroke), applying high controllable forces (exceeding 2.3 N), and delivering vibrations at a wide range of amplitudes and frequencies (0-200 Hz). A perceptual study with fourteen participants achieved 97.9% recognition accuracy across six diverse cues and verified their pleasant and expressive feel. This system architecture for wearable devices gives unprecedented control over the haptic cues delivered to the skin, providing an elegant and discreet way to activate the user's sense of touch.
Video DOI BibTeX

Haptic Intelligence Empirical Inference Optics and Sensing Laboratory Software Workshop Article Fiber-Optic Shape Sensing Using Neural Networks Operating on Multispecklegrams Cao, C. G. L., Javot, B., Bhattarai, S., Bierig, K., Oreshnikov, I., Volchkov, V. V. IEEE Sensors Journal, 24(17):27532-27540, September 2024 (Published)
Application of machine learning techniques on fiber speckle images to infer fiber deformation allows the use of an unmodified multimode fiber to act as a shape sensor. This approach eliminates the need for complex fiber design or construction (e.g., Bragg gratings and time-of-flight). Prior work in shape determination using neural networks trained on a finite number of possible fiber shapes (formulated as a classification task), or trained on a few continuous degrees of freedom, has been limited to reconstruction of fiber shapes only one bend at a time. Furthermore, generalization to shapes that were not used in training is challenging. Our innovative approach improves generalization capabilities, using computer vision-assisted parameterization of the actual fiber shape to provide a ground truth, and multiple specklegrams per fiber shape obtained by controlling the input field. Results from experimenting with several neural network architectures, shape parameterization, number of inputs, and specklegram resolution show that fiber shapes with multiple bends can be accurately predicted. Our approach is able to generalize to new shapes that were not in the training set. This approach of end-to-end training on parameterized ground truth opens new avenues for fiber-optic sensor applications. We publish the datasets used for training and validation, as well as an out-of-distribution (OOD) test set, and encourage interested readers to access these datasets for their own model development.
DOI BibTeX

Haptic Intelligence Article Fingertip Dynamic Response Simulated Across Excitation Points and Frequencies Serhat, G., Kuchenbecker, K. J. Biomechanics and Modeling in Mechanobiology, 23(4):1369-1376, August 2024 (Published)
Predicting how the fingertip will mechanically respond to different stimuli can help explain human haptic perception and enable improvements to actuation approaches such as ultrasonic mid-air haptics. This study addresses this goal using high-fidelity 3D finite element analyses. We compute the deformation profiles and amplitudes caused by harmonic forces applied in the normal direction at four locations: the center of the finger pad, the side of the finger, the tip of the finger, and the oblique midpoint of these three sites. The excitation frequency is swept from 2.5 to 260 Hz. The simulated frequency response functions (FRFs) obtained for displacement demonstrate that the relative magnitudes of the deformations elicited by stimulating at each of these four locations greatly depends on whether only the excitation point or the entire finger is considered. The point force that induces the smallest local deformation can even cause the largest overall deformation at certain frequency intervals. Above 225 Hz, oblique excitation produces larger mean displacement amplitudes than the other three forces due to excitation of multiple modes involving diagonal deformation. These simulation results give novel insights into the combined influence of excitation location and frequency on the fingertip dynamic response, potentially facilitating the design of future vibration feedback devices.
DOI BibTeX

Haptic Intelligence Intelligent Control Systems Article Multimodal Multi-User Surface Recognition with the Kernel Two-Sample Test Khojasteh, B., Solowjow, F., Trimpe, S., Kuchenbecker, K. J. IEEE Transactions on Automation Science and Engineering, 21(3):4432-4447, July 2024 (Published)
Machine learning and deep learning have been used extensively to classify physical surfaces through images and time-series contact data. However, these methods rely on human expertise and entail the time-consuming processes of data processing and parameter tuning. To overcome these challenges, we propose an easily implemented framework that can directly handle heterogeneous data sources for classification tasks. Our data-versus-data approach automatically quantifies distinctive differences in distributions in a high-dimensional space via kernel two-sample testing between two sets extracted from multimodal data (e.g., images, sounds, haptic signals). We demonstrate the effectiveness of our technique by benchmarking against expertly engineered classifiers for visual-audio-haptic surface recognition due to the industrial relevance, difficulty, and competitive baselines of this application; ablation studies confirm the utility of key components of our pipeline. As shown in our open-source code, we achieve 97.2% accuracy on a standard multi-user dataset with 108 surface classes, outperforming the state-of-the-art machine-learning algorithm by 6% on a more difficult version of the task. The fact that our classifier obtains this performance with minimal data processing in the standard algorithm setting reinforces the powerful nature of kernel methods for learning to recognize complex patterns. Note to Practitioners: We demonstrate how to apply the kernel two-sample test to a surface-recognition task, discuss opportunities for improvement, and explain how to use this framework for other classification problems with similar properties. Automating surface recognition could benefit both surface inspection and robot manipulation. Our algorithm quantifies class similarity and therefore outputs an ordered list of similar surfaces. This technique is well suited for quality assurance and documentation of newly received materials or newly manufactured parts.
More generally, our automated classification pipeline can handle heterogeneous data sources including images and high-frequency time-series measurements of vibrations, forces and other physical signals. As our approach circumvents the time-consuming process of feature engineering, both experts and non-experts can use it to achieve high-accuracy classification. It is particularly appealing for new problems without existing models and heuristics. In addition to strong theoretical properties, the algorithm is straightforward to use in practice since it requires only kernel evaluations. Its transparent architecture can provide fast insights into the given use case under different sensing combinations without costly optimization. Practitioners can also use our procedure to obtain the minimum data-acquisition time for independent time-series data from new sensor recordings.
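The kernel two-sample test at the core of this data-versus-data approach can be sketched as follows. The Gaussian kernel, fixed bandwidth, and simple synthetic data are illustrative assumptions; the paper's multimodal pipeline and tuning are not reproduced here.

```python
# Minimal sketch of a kernel two-sample test via the squared Maximum Mean
# Discrepancy (MMD) with a Gaussian kernel. The bandwidth and synthetic
# data are illustrative; this is not the paper's full multimodal pipeline.
import math
import random

def gaussian_kernel(x, y, bandwidth=1.0):
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd_squared(X, Y, bandwidth=1.0):
    """Unbiased estimate of squared MMD between two sets of vectors."""
    k = lambda a, b: gaussian_kernel(a, b, bandwidth)
    xx = sum(k(a, b) for a in X for b in X if a is not b) / (len(X) * (len(X) - 1))
    yy = sum(k(a, b) for a in Y for b in Y if a is not b) / (len(Y) * (len(Y) - 1))
    xy = sum(k(a, b) for a in X for b in Y) / (len(X) * len(Y))
    return xx + yy - 2.0 * xy

random.seed(0)
same_a = [[random.gauss(0, 1)] for _ in range(100)]
same_b = [[random.gauss(0, 1)] for _ in range(100)]
shifted = [[random.gauss(3, 1)] for _ in range(100)]
# MMD is near zero when both samples come from the same distribution
# and clearly positive when the distributions differ.
```

For surface recognition, one could compute the MMD between a query recording's feature set and each class's reference set, then pick the class with the smallest discrepancy; sorting the classes by this value yields the ordered similarity list mentioned above.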
DOI BibTeX

Haptic Intelligence Article AiroTouch: Enhancing Telerobotic Assembly through Naturalistic Haptic Feedback of Tool Vibrations Gong, Y., Mat Husin, H., Erol, E., Ortenzi, V., Kuchenbecker, K. J. Frontiers in Robotics and AI, 11(1355205):1-15, May 2024 (Published)
Teleoperation allows workers to safely control powerful construction machines; however, its primary reliance on visual feedback limits the operator's efficiency in situations with stiff contact or poor visibility, hindering its use for assembly of pre-fabricated building components. Reliable, economical, and easy-to-implement haptic feedback could fill this perception gap and facilitate the broader use of robots in construction and other application areas. Thus, we adapted widely available commercial audio equipment to create AiroTouch, a naturalistic haptic feedback system that measures the vibration experienced by each robot tool and enables the operator to feel a scaled version of this vibration in real time. Accurate haptic transmission was achieved by optimizing the positions of the system's off-the-shelf accelerometers and voice-coil actuators. A study was conducted to evaluate how adding this naturalistic type of vibrotactile feedback affects the operator during telerobotic assembly. Thirty participants used a bimanual dexterous teleoperation system (Intuitive da Vinci Si) to build a small rigid structure under three randomly ordered haptic feedback conditions: no vibrations, one-axis vibrations, and summed three-axis vibrations. The results show that users took advantage of both tested versions of the naturalistic haptic feedback after gaining some experience with the task, causing significantly lower vibrations and forces in the second trial. Subjective responses indicate that haptic feedback increased the realism of the interaction and reduced the perceived task duration, task difficulty, and fatigue. As hypothesized, higher haptic feedback gains were chosen by users with larger hands and for the smaller sensed vibrations in the one-axis condition. 
These results elucidate important details for effective implementation of naturalistic vibrotactile feedback and demonstrate that our accessible audio-based approach could enhance user performance and experience during telerobotic assembly in construction and other application domains.
DOI BibTeX

Haptic Intelligence Article Closing the Loop in Minimally Supervised Human-Robot Interaction: Formative and Summative Feedback Mohan, M., Nunez, C. M., Kuchenbecker, K. J. Scientific Reports, 14(1):10564, May 2024 (Published)
Human instructors fluidly communicate with hand gestures, head and body movements, and facial expressions, but robots rarely leverage these complementary cues. A minimally supervised social robot with such skills could help people exercise and learn new activities. Thus, we investigated how nonverbal feedback from a humanoid robot affects human behavior. Inspired by the education literature, we evaluated formative feedback (real-time corrections) and summative feedback (post-task scores) for three distinct tasks: positioning in the room, mimicking the robot's arm pose, and contacting the robot's hands. Twenty-eight adults completed seventy-five 30-second-long trials with no explicit instructions or experimenter help. Motion-capture data analysis shows that both formative and summative feedback from the robot significantly aided user performance. Additionally, formative feedback improved task understanding. These results show the power of nonverbal cues based on human movement and the utility of viewing feedback through formative and summative lenses.
DOI BibTeX

Haptic Intelligence Robotics Article IMU-Based Kinematics Estimation Accuracy Affects Gait Retraining Using Vibrotactile Cues Rokhmanova, N., Pearl, O., Kuchenbecker, K. J., Halilaj, E. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 32:1005-1012, February 2024 (Published)
Wearable sensing using inertial measurement units (IMUs) is enabling portable and customized gait retraining for knee osteoarthritis. However, the vibrotactile feedback that users receive directly depends on the accuracy of IMU-based kinematics. This study investigated how kinematic errors impact an individual's ability to learn a therapeutic gait using vibrotactile cues. Sensor accuracy was computed by comparing the IMU-based foot progression angle to marker-based motion capture, which was used as ground truth. Thirty subjects were randomized into three groups to learn a toe-in gait: one group received vibrotactile feedback during gait retraining in the laboratory, another received feedback outdoors, and the control group received only verbal instruction and proceeded directly to the evaluation condition. All subjects were evaluated on their ability to maintain the learned gait in a new outdoor environment. We found that subjects with high tracking errors exhibited more incorrect responses to vibrotactile cues and slower learning rates than subjects with low tracking errors. Subjects with low tracking errors outperformed the control group in the evaluation condition, whereas those with higher error did not. Errors were correlated with foot size and angle magnitude, which may indicate a non-random algorithmic bias. The accuracy of IMU-based kinematics has a cascading effect on feedback; ignoring this effect could lead researchers or clinicians to erroneously classify a patient as a non-responder if they did not improve after retraining. To use patient and clinician time effectively, future implementation of portable gait retraining will require assessment across a diverse range of patients.
DOI BibTeX

Haptic Intelligence Article How Should Robots Exercise with People? Robot-Mediated Exergames Win with Music, Social Analogues, and Gameplay Clarity Fitter, N. T., Mohan, M., Preston, R. C., Johnson, M. J., Kuchenbecker, K. J. Frontiers in Robotics and AI, 10(1155837):1-18, January 2024 (Published)
The modern worldwide trend toward sedentary behavior comes with significant health risks. An accompanying wave of health technologies has tried to encourage physical activity, but these approaches often yield limited use and retention. Due to their unique ability to serve as both a health-promoting technology and a social peer, we propose robots as a game-changing solution for encouraging physical activity. This article analyzes the eight exergames we previously created for the Rethink Baxter Research Robot in terms of four key components that are grounded in the video-game literature: repetition, pattern matching, music, and social design. We use these four game facets to assess gameplay data from 40 adult users who each experienced the games in balanced random order. In agreement with prior research, our results show that relevant musical cultural references, recognizable social analogues, and gameplay clarity are good strategies for taking an otherwise highly repetitive physical activity and making it engaging and popular among users. Others who study socially assistive robots and rehabilitation robotics can benefit from this work by considering the presented design attributes to generate future hypotheses and by using our eight open-source games to pursue follow-up work on social-physical exercise with robots.
DOI BibTeX

Haptic Intelligence Article Robust Surface Recognition with the Maximum Mean Discrepancy: Degrading Haptic-Auditory Signals Through Bandwidth and Noise Khojasteh, B., Shao, Y., Kuchenbecker, K. J. IEEE Transactions on Haptics, 17(1):58-65, January 2024, Presented at the IEEE Haptics Symposium (Published)
Sliding a tool across a surface generates rich sensations that can be analyzed to recognize what is being touched. However, the optimal configuration for capturing these signals remains unclear. To bridge this gap, we consider haptic-auditory data recorded as a human explores surfaces with different steel tools, including accelerations of the tool and finger, force and torque applied to the surface, and contact sounds. Our classification pipeline uses the maximum mean discrepancy (MMD) to quantify differences in data distributions in a high-dimensional space for inference. With recordings from three hemispherical tool diameters and ten diverse surfaces, we conducted two degradation studies by decreasing sensing bandwidth and increasing added noise. We evaluate the haptic-auditory recognition performance achieved with the MMD to compare newly gathered data to each surface in our known library. The results indicate that acceleration signals alone have great potential for high-accuracy surface recognition and are robust against noise contamination. The optimal accelerometer bandwidth exceeds 1000 Hz, suggesting that useful vibrotactile information extends beyond the range of human perception. Finally, smaller tool tips generate contact vibrations with better noise robustness. The provided sensing guidelines may enable superhuman performance in portable surface recognition, which could benefit quality control, material documentation, and robotics.
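The MMD statistic at the heart of this pipeline is straightforward to sketch. Below is a minimal, illustrative computation (not the paper's pipeline) of a biased squared-MMD estimate with an RBF kernel; the Gaussian toy data, sample sizes, and kernel bandwidth are arbitrary choices for the example:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values between the rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    # Biased estimate of the squared maximum mean discrepancy between
    # the distributions that generated samples X and Y
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0, 1, (200, 3)), rng.normal(0, 1, (200, 3)))
diff = mmd2(rng.normal(0, 1, (200, 3)), rng.normal(2, 1, (200, 3)))
# Samples from the same distribution yield a much smaller MMD than
# samples from shifted distributions
```

In a recognition setting like the one described above, a query recording would be assigned to whichever library surface minimizes this discrepancy.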
DOI BibTeX

Haptic Intelligence Article Towards Semi-Automated Pleural Cavity Access for Pneumothorax in Austere Environments L’Orsa, R., Lama, S., Westwick, D., Sutherland, G., Kuchenbecker, K. J. Acta Astronautica, 212:48-53, November 2023 (Published)
Astronauts are at risk for pneumothorax, a condition where injury or disease introduces air between the chest wall and the lungs (i.e., the pleural cavity). In a worst-case scenario, it can rapidly lead to a fatality if left unmanaged and will require prompt treatment in situ if developed during spaceflight. Chest tube insertion is the definitive treatment for pneumothorax, but it requires a high level of skill and frequent practice for safe use. Physician astronauts may struggle to maintain this skill on medium- and long-duration exploration-class missions, and it is inappropriate for pure just-in-time learning or skill refreshment paradigms. This paper proposes semi-automating tool insertion to reduce the risk of complications in austere environments and describes preliminary experiments providing initial validation of an intelligent prototype system. Specifically, we showcase and analyse motion and force recordings from a sensorized percutaneous access needle inserted repeatedly into an ex vivo tissue phantom, along with relevant physiological data simultaneously recorded from the operator. When coupled with minimal just-in-time training and/or augmented reality guidance, the proposed system may enable non-expert operators to safely perform emergency chest tube insertion without the use of ground resources.
DOI BibTeX

Haptic Intelligence Perceiving Systems Article Learning to Estimate Palpation Forces in Robotic Surgery From Visual-Inertial Data Lee, Y., Mat Husin, H., Forte, M., Lee, S., Kuchenbecker, K. J. IEEE Transactions on Medical Robotics and Bionics, 5(3):496-506, August 2023, Young-Eun Lee and Haliza Mat Husin contributed equally to this work (Published)
Surgeons cannot directly touch the patient's tissue in robot-assisted minimally invasive procedures. Instead, they must palpate using instruments inserted into the body through trocars. This way of operating largely prevents surgeons from using haptic cues to localize visually undetectable structures such as tumors and blood vessels, motivating research on direct and indirect force sensing. We propose an indirect force-sensing method that combines monocular images of the operating field with measurements from IMUs attached externally to the instrument shafts. Our method is thus suitable for various robotic surgery systems as well as laparoscopic surgery. We collected a new dataset using a da Vinci Si robot, a force sensor, and four different phantom tissue samples. The dataset includes 230 one-minute-long recordings of repeated bimanual palpation tasks performed by four lay operators. We evaluated several network architectures and investigated the role of the network inputs. The DenseNet vision model combined with inertial data predicted palpation forces best (lowest average root-mean-square error and highest average coefficient of determination). Ablation studies revealed that video frames carry significantly more information than inertial signals. Finally, we demonstrated the model's ability to generalize to unseen tissue and predict shear contact forces.
DOI BibTeX

Haptic Intelligence Autonomous Learning Empirical Inference Article Minsight: A Fingertip-Sized Vision-Based Tactile Sensor for Robotic Manipulation Andrussow, I., Sun, H., Kuchenbecker, K. J., Martius, G. Advanced Intelligent Systems, 5(8):2300042, August 2023, Inside back cover, DOI: 10.1002/aisy.202370035 (Published)
Intelligent interaction with the physical world requires perceptual abilities beyond vision and hearing; vibrant tactile sensing is essential for autonomous robots to dexterously manipulate unfamiliar objects or safely contact humans. Therefore, robotic manipulators need high-resolution touch sensors that are compact, robust, inexpensive, and efficient. The soft vision-based haptic sensor presented herein is a miniaturized and optimized version of the previously published sensor Insight. Minsight has the size and shape of a human fingertip and uses machine learning methods to output high-resolution maps of 3D contact force vectors at 60 Hz. Experiments confirm its excellent sensing performance, with a mean absolute force error of 0.07 N and contact location error of 0.6 mm across its surface area. Minsight's utility is shown in two robotic tasks on a 3-DoF manipulator. First, closed-loop force control enables the robot to track the movements of a human finger based only on tactile data. Second, the informative value of the sensor output is shown by detecting whether a hard lump is embedded within a soft elastomer with an accuracy of 98%. These findings indicate that Minsight can give robots the detailed fingertip touch sensing needed for dexterous manipulation and physical human–robot interaction.
DOI BibTeX

Haptic Intelligence Article Generating Clear Vibrotactile Cues with a Magnet Embedded in a Soft Finger Sheath Gertler, I., Serhat, G., Kuchenbecker, K. J. Soft Robotics, 10(3):624-635, June 2023 (Published)
Haptic displays act on the user's body to stimulate the sense of touch and enrich applications from gaming and computer-aided design to rehabilitation and remote surgery. However, when crafted from typical rigid robotic components, they tend to be heavy, bulky, and expensive, while sleeker designs often struggle to create clear haptic cues. This article introduces a lightweight wearable silicone finger sheath that can deliver salient and rich vibrotactile cues using electromagnetic actuation. We fabricate the sheath on a ferromagnetic mandrel with a process based on dip molding, a robust fabrication method that is rarely used in soft robotics but is suitable for commercial production. A miniature rare-earth magnet embedded within the silicone layers at the center of the finger pad is driven to vibrate by the application of alternating current to a nearby air-coil. Experiments are conducted to determine the amplitude of the magnetic force and the frequency response function for the displacement amplitude of the magnet perpendicular to the skin. In addition, high-fidelity finite element analyses of the finger wearing the device are performed to investigate the trends observed in the measurements. The experimental and simulated results show consistent dynamic behavior from 10 to 1000 Hz, with the displacement decreasing after about 300 Hz. These results match the detection threshold profile obtained in a psychophysical study performed by 17 users, where more current was needed only at the highest frequency. A cue identification experiment and a demonstration in virtual reality validate the feasibility of this approach to fingertip haptics.
DOI BibTeX

Haptic Intelligence Article In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures Block, A. E., Seifi, H., Hilliges, O., Gassert, R., Kuchenbecker, K. J. ACM Transactions on Human-Robot Interaction, 12(2):1-49, June 2023, Special Issue on Designing the Robot Body: Critical Perspectives on Affective Embodied Interaction (Published)
Hugs are complex affective interactions that often include gestures like squeezes. We present six new guidelines for designing interactive hugging robots, which we validate through two studies with our custom robot. To achieve autonomy, we investigated robot responses to four human intra-hug gestures: holding, rubbing, patting, and squeezing. Thirty-two users each exchanged and rated sixteen hugs with an experimenter-controlled HuggieBot 2.0. A microphone and a pressure sensor in the robot's inflated torso collected data from these demonstrations, which we used to develop a perceptual algorithm that classifies user actions with 88% accuracy. Users enjoyed robot squeezes regardless of their own performed action, valued variety in the robot's responses, and appreciated robot-initiated intra-hug gestures. From average user ratings, we created a probabilistic behavior algorithm that chooses robot responses in real time. We implemented improvements to the robot platform to create HuggieBot 3.0 and then validated its gesture perception system and behavior algorithm with sixteen users. The robot's responses and proactive gestures were greatly enjoyed. Users found the robot more natural, enjoyable, and intelligent in the last phase of the experiment than in the first. After the study, they felt more understood by the robot and thought robots were nicer to hug.
DOI BibTeX

Haptic Intelligence Article Effects of Automated Skill Assessment on Robotic Surgery Training Brown, J. D., Kuchenbecker, K. J. The International Journal of Medical Robotics and Computer Assisted Surgery, 19(2):e2492, April 2023 (Published)
Background: Several automated skill-assessment approaches have been proposed for robotic surgery, but their utility is not well understood. This article investigates the effects of one machine-learning-based skill-assessment approach on psychomotor skill development in robotic surgery training. Methods: N=29 trainees (medical students and residents) with no robotic surgery experience performed five trials of inanimate peg transfer with an Intuitive Surgical da Vinci Standard robot. Half of the participants received no post-trial feedback. The other half received automatically calculated scores from five Global Evaluative Assessment of Robotic Skill (GEARS) domains post-trial. Results: There were no significant differences between the groups regarding overall improvement or skill improvement rate. However, participants who received post-trial feedback rated their overall performance improvement significantly lower than participants who did not receive feedback. Conclusions: These findings indicate that automated skill evaluation systems might improve trainee self-awareness but not accelerate early-stage psychomotor skill development in robotic surgery training.
DOI BibTeX

Haptic Intelligence Article Haptify: A Measurement-Based Benchmarking System for Grounded Force-Feedback Devices Fazlollahi, F., Kuchenbecker, K. J. IEEE Transactions on Robotics, 39(2):1622-1636, April 2023 (Published)
Grounded force-feedback (GFF) devices are an established and diverse class of haptic technology based on robotic arms. However, the number of designs and how they are specified make comparing devices difficult. We thus present Haptify, a benchmarking system that can thoroughly, fairly, and noninvasively evaluate GFF haptic devices. The user holds the instrumented device end-effector and moves it through a series of passive and active experiments. Haptify records the interaction between the hand, device, and ground with a seven-camera optical motion-capture system, a 60-cm-square custom force plate, and a customized sensing end-effector. We demonstrate six key ways to assess GFF device performance: workspace shape, global free-space forces, global free-space vibrations, local dynamic forces and torques, frictionless surface rendering, and stiffness rendering. We then use Haptify to benchmark two commercial haptic devices. With a smaller workspace than the 3D Systems Touch, the more expensive Touch X outputs smaller free-space forces and vibrations, smaller and more predictable dynamic forces and torques, and higher-quality renderings of a frictionless surface and high stiffness.
DOI BibTeX

Haptic Intelligence Article The S-BAN: Insights into the Perception of Shape-Changing Haptic Interfaces via Virtual Pedestrian Navigation Spiers, A. J., Young, E., Kuchenbecker, K. J. ACM Transactions on Computer-Human Interaction, 30(1):1-31, February 2023 (Published)
Screen-based pedestrian navigation assistance can be distracting or inaccessible to users. Shape-changing haptic interfaces can overcome these concerns. The S-BAN is a new handheld haptic interface that utilizes a parallel kinematic structure to deliver 2-DOF spatial information over a continuous workspace, with a form factor suited to integration with other travel aids. The ability to pivot, extend and retract its body opens possibilities and questions around spatial data representation. We present a static study to understand user perception of absolute pose and relative motion for two spatial mappings, showing highest sensitivity to relative motions in the cardinal directions. We then present an embodied navigation experiment in virtual reality. User motion efficiency when guided by the S-BAN was statistically equivalent to using a vision-based tool (a smartphone proxy). Although haptic trials were slower than visual trials, participants' heads were more elevated with the S-BAN, allowing greater visual focus on the environment.
DOI BibTeX

Autonomous Learning Haptic Intelligence Empirical Inference Article Predicting the Force Map of an ERT-Based Tactile Sensor Using Simulation and Deep Networks Lee, H., Sun, H., Park, H., Serhat, G., Javot, B., Martius, G., Kuchenbecker, K. J. IEEE Transactions on Automation Science and Engineering, 20(1):425-439, January 2023 (Published)
Electrical resistance tomography (ERT) can be used to create large-scale soft tactile sensors that are flexible and robust. Good performance requires a fast and accurate mapping from the sensor's sequential voltage measurements to the distribution of force across its surface. However, particularly with multiple contacts, this task is challenging for both previously developed approaches: physics-based modeling and end-to-end data-driven learning. Some promising results were recently achieved using sim-to-real transfer learning, but estimating multiple contact locations and accurate contact forces remains difficult because simulations tend to be less accurate with a high number of contact locations and/or high force. This paper introduces a modular hybrid method that combines simulation data synthesized from an electromechanical finite element model with real measurements collected from a new ERT-based tactile sensor. We use about 290,000 simulated and 90,000 real measurements to train two deep neural networks: the first (Transfer-Net) captures the inevitable gap between simulation and reality, and the second (Recon-Net) reconstructs contact forces from voltage measurements. The number of contacts, contact locations, force magnitudes, and contact diameters are evaluated for a manually collected multi-contact dataset of 150 measurements. Our modular pipeline's results outperform predictions by both a physics-based model and end-to-end learning.
DOI BibTeX

Haptic Intelligence Article The Utility of Synthetic Reflexes and Haptic Feedback for Upper-Limb Prostheses in a Dexterous Task Without Direct Vision Thomas, N., Fazlollahi, F., Kuchenbecker, K. J., Brown, J. D. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 31:169-179, January 2023 (Published)
Individuals who use myoelectric upper-limb prostheses often rely heavily on vision to complete their daily activities. They thus struggle in situations where vision is overloaded, such as multitasking, or unavailable, such as poor lighting conditions. Able-bodied individuals can easily accomplish such tasks due to tactile reflexes and haptic sensation guiding their upper-limb motor coordination. Based on these principles, we developed and tested two novel prosthesis systems that incorporate autonomous controllers and provide the user with touch-location feedback through either vibration or distributed pressure. These capabilities were made possible by installing a custom contact-location sensor on the fingers of a commercial prosthetic hand, along with a custom pressure sensor on the thumb. We compared the performance of the two systems against a standard myoelectric prosthesis and a myoelectric prosthesis with only autonomous controllers in a difficult reach-to-pick-and-place task conducted without direct vision. Results from 40 able-bodied participants in this between-subjects study indicated that vibrotactile feedback combined with synthetic reflexes proved significantly more advantageous than the standard prosthesis in several of the task milestones. In addition, vibrotactile feedback and synthetic reflexes improved grasp placement compared to only synthetic reflexes or pressure feedback combined with synthetic reflexes. These results indicate that autonomous controllers and haptic feedback together facilitate success in dexterous tasks without vision, and that the type of haptic display matters.
DOI BibTeX

Haptic Intelligence Article Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions Richardson, B. A., Vardar, Y., Wallraven, C., Kuchenbecker, K. J. IEEE Transactions on Haptics, 15(4):705-717, October 2022, Benjamin A. Richardson and Yasemin Vardar contributed equally to this publication (Published)
Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.
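The reported maximum Spearman's correlation of 0.7 refers to the rank correlation between model-predicted and participant-judged similarities. As a self-contained sketch with made-up numbers (and assuming no tied values, which the simple rank transform below does not handle), the statistic can be computed as:

```python
import numpy as np

def spearman(a, b):
    # Spearman's rho: Pearson correlation of the rank-transformed data
    # (argsort of argsort yields 0-based ranks; valid only without ties)
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb)))

# Hypothetical similarity scores for five surface pairs
predicted = np.array([0.10, 0.40, 0.35, 0.80, 0.90])
judged    = np.array([0.20, 0.50, 0.45, 0.70, 0.95])
rho = spearman(predicted, judged)  # perfectly monotonic, so rho == 1.0
```

For data with ties, `scipy.stats.spearmanr` applies the standard average-rank correction.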
DOI BibTeX

Haptic Intelligence Article Contact Evolution of Dry and Hydrated Fingertips at Initial Touch Serhat, G., Vardar, Y., Kuchenbecker, K. J. PLOS ONE, 17(7):e0269722, July 2022, Gokhan Serhat and Yasemin Vardar contributed equally to this publication (Published)
Pressing the fingertips into surfaces causes skin deformations that enable humans to grip objects and sense their physical properties. This process involves intricate finger geometry, non-uniform tissue properties, and moisture, complicating the underlying contact mechanics. Here we explore the initial contact evolution of dry and hydrated fingers to isolate the roles of governing physical factors. Two participants gradually pressed an index finger on a glass surface under three moisture conditions: dry, water-hydrated, and glycerin-hydrated. Gross and real contact area were optically measured over time, revealing that glycerin hydration produced strikingly higher real contact area, while gross contact area was similar for all conditions. To elucidate the causes for this phenomenon, we investigated the combined effects of tissue elasticity, skin-surface friction, and fingerprint ridges on contact area using simulation. Our analyses show the dominant influence of elastic modulus over friction and an unusual contact phenomenon, which we call friction-induced hinging.
DOI BibTeX

Haptic Intelligence Article Perceptual Space of Algorithms for Three-to-One Dimensional Reduction of Realistic Vibrations Lee, H., Tombak, G. I., Park, G., Kuchenbecker, K. J. IEEE Transactions on Haptics, 15(3):521-534, July 2022 (Published)
Haptics researchers often endeavor to deliver realistic vibrotactile feedback through broad-bandwidth actuators; however, these actuators typically generate only single-axis vibrations, not 3D vibrations like those that occur in natural tool-mediated interactions. Several three-to-one (321) dimensional reduction algorithms have thus been developed to combine 3D vibrations into 1D vibrations. Surprisingly, the perceptual quality of 321-converted vibrations has never been comprehensively compared to rendering of the original 3D signals. In this study, we develop a multi-dimensional vibration rendering system using a magnetic levitation haptic interface. We verify the system's ability to generate realistic 3D vibrations recorded in both tapping and dragging interactions with four surfaces. We then conduct a study with 15 participants to measure the perceived dissimilarities between five 321 algorithms (SAZ, SUM, VM, DFT, PCA) and the original recordings. The resulting perceptual space is investigated with multiple regression and Procrustes analysis to unveil the relationship between the physical and perceptual properties of 321-converted vibrations. Surprisingly, we found that participants perceptually discriminated the original 3D vibrations from all tested 1D versions. Overall, our results indicate that spectral, temporal, and directional attributes may all contribute to the perceived similarities of vibration signals.
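The simplest of these reduction algorithms can be sketched directly. The toy three-axis signal below is illustrative, not the study's recordings; SUM and VM follow their plain definitions, and the DFT-based reduction follows the published idea of preserving the combined spectral magnitude while borrowing phase from the summed signal:

```python
import numpy as np

# Toy 3-axis vibration: three sinusoids sampled at 1 kHz for 1 s
t = np.arange(0, 1, 0.001)
v3 = np.stack([1.0 * np.sin(2 * np.pi * 50 * t),
               0.5 * np.sin(2 * np.pi * 120 * t),
               0.2 * np.sin(2 * np.pi * 200 * t)])

sum_1d = v3.sum(axis=0)              # SUM: add the three axes sample by sample
vm_1d = np.linalg.norm(v3, axis=0)   # VM: instantaneous vector magnitude

# DFT-based reduction: keep the combined spectral magnitude of all three
# axes, and take the phase from the summed signal
dft = np.fft.rfft(v3, axis=1)
mag = np.sqrt((np.abs(dft) ** 2).sum(axis=0))
phase = np.angle(dft.sum(axis=0))
dft321_1d = np.fft.irfft(mag * np.exp(1j * phase), n=v3.shape[1])
```

Note how VM discards sign information (it is always nonnegative), while the DFT-based approach preserves the total spectral energy of the 3D signal, which is one reason these algorithms can feel perceptually different.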
DOI BibTeX

Haptic Intelligence Article Normal and Tangential Forces Combine to Convey Contact Pressure During Dynamic Tactile Stimulation Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J. Scientific Reports, 12(1):8215, May 2022 (Published)
Humans need to accurately process the contact forces that arise as they perform everyday haptic interactions such as sliding the fingers along a surface to feel for bumps, sticky regions, or other irregularities. Several different mechanisms are possible for how the forces on the skin could be represented and integrated in such interactions. In this study, we used a force-controlled robotic platform and simultaneous ultrasonic modulation of the finger-surface friction to independently manipulate the normal and tangential forces during passive haptic stimulation by a flat surface. To assess whether the contact pressure on their finger had briefly increased or decreased during individual trials in this broad stimulus set, participants did not rely solely on either the normal force or the tangential force. Instead, they integrated tactile cues induced by both components. Support-vector-machine analysis classified physical trial data with up to 75% accuracy and suggested a linear perceptual mechanism. In addition, the change in the amplitude of the force vector predicted participants' responses better than the change of the coefficient of dynamic friction, suggesting that intensive tactile cues are meaningful in this task. These results provide novel insights about how normal and tangential forces shape the perception of tactile contact.
DOI BibTeX

Haptic Intelligence Article Predicting Knee Adduction Moment Response to Gait Retraining with Minimal Clinical Data Rokhmanova, N., Kuchenbecker, K. J., Shull, P. B., Ferber, R., Halilaj, E. PLOS Computational Biology, 18(5):e1009500, May 2022 (Published)
Knee osteoarthritis is a progressive disease mediated by high joint loads. Foot progression angle modifications that reduce the knee adduction moment (KAM), a surrogate of knee loading, have demonstrated efficacy in alleviating pain and improving function. Although changes to the foot progression angle are overall beneficial, KAM reductions are not consistent across patients. Moreover, customized interventions are time-consuming and require instrumentation not commonly available in the clinic. We present a regression model that uses minimal clinical data (a set of six features easily obtained in the clinic) to predict the extent of first peak KAM reduction after toe-in gait retraining. For such a model to generalize, the training data must be large and variable. Given the lack of large public datasets that contain different gaits for the same patient, we generated this dataset synthetically. Insights learned from a ground-truth dataset with both baseline and toe-in gait trials (N = 12) enabled the creation of a large (N = 138) synthetic dataset for training the predictive model. On a test set of data collected by a separate research group (N = 15), the first peak KAM reduction was predicted with a mean absolute error of 0.134% body weight * height (%BW*HT). This error is smaller than the standard deviation of the first peak KAM during baseline walking averaged across test subjects (0.306%BW*HT). This work demonstrates the feasibility of training predictive models with synthetic data and provides clinicians with a new tool to predict the outcome of patient-specific gait retraining without requiring gait lab instrumentation.
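The overall modeling setup (train a regressor on a synthetic cohort, evaluate on held-out subjects) can be illustrated with a least-squares sketch. The six features, weights, and cohort sizes below are invented stand-ins; the actual clinical features and training procedure are described in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for six clinical features; weights are invented
X_train = rng.normal(size=(138, 6))          # synthetic training cohort
w_true = np.array([0.3, -0.2, 0.1, 0.05, -0.15, 0.25])
y_train = X_train @ w_true + rng.normal(scale=0.05, size=138)  # KAM reduction

# Ordinary least squares with an intercept column
A = np.hstack([X_train, np.ones((138, 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Evaluate on an unseen "test cohort" with mean absolute error
X_test = rng.normal(size=(15, 6))
y_pred = np.hstack([X_test, np.ones((15, 1))]) @ coef
mae = np.mean(np.abs(y_pred - X_test @ w_true))
```

The key idea carried over from the abstract is that the training cohort can be synthetic as long as the held-out evaluation uses independently collected data.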
DOI BibTeX

Haptic Intelligence Article Design of Interactive Augmented Reality Functions for Robotic Surgery and Evaluation in Dry-Lab Lymphadenectomy Forte, M., Gourishetti, R., Javot, B., Engler, T., Gomez, E. D., Kuchenbecker, K. J. The International Journal of Medical Robotics and Computer Assisted Surgery, 18(2):e2351, April 2022 (Published)
Augmented reality (AR) has been widely researched for use in healthcare. Prior AR for robot-assisted minimally invasive surgery has mainly focused on superimposing preoperative 3D images onto patient anatomy. This paper presents alternative interactive AR tools for robotic surgery. We designed, built, and evaluated four voice-controlled functions: viewing a live video of the operating room, viewing two-dimensional preoperative images, measuring 3D distances, and warning about out-of-view instruments. This low-cost system was developed on a da Vinci Si, and it can be integrated into surgical robots equipped with a stereo camera and a stereo viewer. Eight experienced surgeons performed dry-lab lymphadenectomies and reported that the functions improved the procedure. They particularly appreciated the possibility of accessing the patient's medical records on demand, measuring distances intraoperatively, and interacting with the functions using voice commands. The positive evaluations garnered by these alternative AR functions and interaction methods provide support for further exploration.
DOI BibTeX

Haptic Intelligence Robotics Article Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Frontiers in Robotics and AI, 9(840335):1-17, April 2022 (Published)
Social touch is essential to everyday interactions, but current socially assistive robots have limited touch-perception capabilities. Rather than build entirely new robotic systems, we propose to augment existing rigid-bodied robots with an external touch-perception system. This practical approach can enable researchers and caregivers to continue to use robotic technology they have already purchased and learned about, but with a myriad of new social-touch interactions possible. This paper presents a low-cost, easy-to-build, soft tactile-perception system that we created for the NAO robot, as well as participants' feedback on touching this system. We installed four of our fabric-and-foam-based resistive sensors on the curved surfaces of a NAO's left arm, including its hand, lower arm, upper arm, and shoulder. Fifteen adults then performed five types of affective touch-communication gestures (hitting, poking, squeezing, stroking, and tickling) at two force intensities (gentle and energetic) on the four sensor locations; we share this dataset of four time-varying resistances, our sensor patterns, and a characterization of the sensors' physical performance. After training, a gesture-classification algorithm based on a random forest identified the correct combined touch gesture and force intensity on windows of held-out test data with an average accuracy of 74.1%, which is more than eight times better than chance. Participants rated the sensor-equipped arm as pleasant to touch and liked the robot's presence significantly more after touch interactions. Our promising results show that this type of tactile-perception system can detect necessary social-touch communication cues from users, can be tailored to a variety of robot body parts, and can provide HRI researchers with the tools needed to implement social touch in their own systems.
DOI BibTeX
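A minimal sketch of window-based gesture classification with a random forest, in the spirit of the pipeline above. Everything here is a synthetic stand-in (the paper's dataset, sensor characteristics, features, and 74.1% result are not reproduced), and scikit-learn's RandomForestClassifier is assumed to be available.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in data: windows of four time-varying resistance
# channels, one label per window (5 gestures x 2 intensities = 10 classes).
n_samples, n_channels, window = 400, 4, 50
labels = rng.integers(0, 10, size=n_samples)
signals = rng.normal(size=(n_samples, n_channels, window))
signals += labels[:, None, None] * 0.2  # make the classes separable

# Simple per-channel summary features (mean, std, range), as one might
# compute on each window before feeding it to a random forest.
feats = np.concatenate([
    signals.mean(axis=2),
    signals.std(axis=2),
    signals.max(axis=2) - signals.min(axis=2),
], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(feats, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f} (chance = 0.10)")
```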

Autonomous Learning Haptic Intelligence Article A Soft Thumb-Sized Vision-Based Sensor with Accurate All-Round Force Perception Sun, H., Kuchenbecker, K. J., Martius, G. Nature Machine Intelligence, 4(2):135-145, February 2022 (Published)
Vision-based haptic sensors have emerged as a promising approach to robotic touch due to affordable high-resolution cameras and successful computer-vision techniques. However, their physical design and the information they provide do not yet meet the requirements of real applications. We present a robust, soft, low-cost, vision-based, thumb-sized 3D haptic sensor named Insight: it continually provides a directional force-distribution map over its entire conical sensing surface. Constructed around an internal monocular camera, the sensor has only a single layer of elastomer over-molded on a stiff frame to guarantee sensitivity, robustness, and soft contact. Furthermore, Insight is the first system to combine photometric stereo and structured light using a collimator to detect the 3D deformation of its easily replaceable flexible outer shell. The force information is inferred by a deep neural network that maps images to the spatial distribution of 3D contact force (normal and shear). Insight has an overall spatial resolution of 0.4 mm, force magnitude accuracy around 0.03 N, and force direction accuracy around 5 degrees over a range of 0.03–2 N for numerous distinct contacts with varying contact area. The presented hardware and software design concepts can be transferred to a wide variety of robot parts.
DOI URL BibTeX

Haptic Intelligence Article Adaptive Optimal Measurement Algorithm for ERT-Based Large-Area Tactile Sensors Park, K., Lee, H., Kuchenbecker, K. J., Kim, J. IEEE/ASME Transactions on Mechatronics, 27(1):304-314, February 2022 (Published)
Electrical resistance tomography (ERT) is an inferential imaging technique that has shown promising results for enabling large-area tactile sensors constructed from a piezoresistive sheet. The performance of such sensors is improved by increasing the number of electrodes, but the number of measurements and the computational cost also increase. In this article, we propose a new measurement algorithm for ERT-based tactile sensors: it adaptively changes the measurement pattern to be optimal for the present external stimulus. Regions of normal pressure are first detected by a base measurement pattern that maximizes the distinguishability of local conductivity changes. When a new contact is detected, a set of local patterns is selectively recruited near the pressed region to acquire more detailed information. For fast and parallel execution, the proposed algorithm is implemented with a field-programmable gate array. It is validated through indentation experiments on an ERT-based sensor that has 32 electrodes. The optimized base pattern of 100 measurements enabled a frame rate five times faster than before. Transmitting only detected contact events reduced the idle data rate to 0.5% of its original value. The pattern adapted to new contacts with a latency of only 80 μs and an accuracy of 99.5%, enabling efficient, high-quality real-time reconstruction of complex multicontact conditions.
DOI BibTeX
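The adaptive idea (scan with a fixed base pattern, then recruit extra local measurements near detected contacts) can be caricatured in a few lines. This sketch invents its own threshold and pattern layout; the actual algorithm optimizes distinguishability, uses a 100-measurement base pattern, and runs on an FPGA.

```python
import numpy as np

rng = np.random.default_rng(2)
N_ELECTRODES = 32  # matches the validated sensor in the paper

def base_measurements(conductivity_change):
    # Coarse base scan: here, one noisy aggregated reading per electrode
    # region (a placeholder for the optimized base pattern).
    return conductivity_change + rng.normal(scale=0.01, size=N_ELECTRODES)

def local_pattern(center, k=3):
    # Recruit injection/measurement pairs around electrode `center`
    # (an invented layout, purely for illustration).
    return [(center, (center + d) % N_ELECTRODES) for d in range(1, k + 1)]

# Simulate a press that raises conductivity near electrode 10.
change = np.zeros(N_ELECTRODES)
change[9:12] = 0.5

readings = base_measurements(change)
contacts = np.flatnonzero(readings > 0.1)  # threshold-based detection
extra = [p for c in contacts for p in local_pattern(c)]

print(f"contacts near electrodes {contacts.tolist()}")
print(f"recruited {len(extra)} local measurement pairs")
```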

Haptic Intelligence Article Evaluation of Vibrotactile Output from a Rotating Motor Actuator Gourishetti, R., Kuchenbecker, K. J. IEEE Transactions on Haptics, 15(1):39-44, January 2022, Presented at the IEEE Haptics Symposium (Published)
Specialized vibrotactile actuators are widely used to output haptic sensations due to their portability and robustness; some models are expensive and capable, while others are economical but weaker and less expressive. To increase the accessibility of high-quality haptics, we designed a cost-effective actuation approach called the rotating motor actuator (RMA): it uses a small DC motor to generate vibrotactile cues on a rigid stylus. We conducted a psychophysical experiment where eighteen volunteers matched the RMA's vibration amplitudes with those from a high-quality reference actuator (Haptuator Mark II) at twelve frequencies from 50 Hz to 450 Hz. The average error in matching acceleration magnitudes was 10.2%. More current was required for the RMA than the reference actuator; a stronger DC motor would require less current. Participants also watched a video of a real tool-mediated interaction with playback of recorded vibrotactile cues from each actuator. 94.4% of the participants agreed that the RMA delivered realistic vibrations and audio cues during this replay. 83.3% reported that the RMA vibrations were pleasant, compared to 66.7% for the reference. A possible cause for this significant difference may be that the reference actuator (which has a mechanical resonance) distorts low-frequency vibrations more than the RMA does.
DOI BibTeX

Haptic Intelligence Article Virtual Reality Treatment Displaying the Missing Leg Improves Phantom Limb Pain: A Small Clinical Trial Ambron, E., Buxbaum, L. J., Miller, A., Stoll, H., Kuchenbecker, K. J., Coslett, H. B. Neurorehabilitation and Neural Repair, 35(12):1100-1111, December 2021 (Published)
Background: Phantom limb pain (PLP) is a common and in some cases debilitating consequence of upper- or lower-limb amputation for which current treatments are inadequate. Objective: This small clinical trial tested whether game-like interactions with immersive VR activities can reduce PLP in subjects with transtibial lower-limb amputation. Methods: Seven participants attended 5–7 sessions in which they engaged in a visually immersive virtual reality experience that did not require leg movements (Cool!™), followed by 10–12 sessions of targeted lower-limb VR treatment consisting of custom games requiring leg movement. In the latter condition, they controlled an avatar with 2 intact legs viewed in a head-mounted display (HTC Vive™). A motion-tracking system mounted on the intact and residual limbs controlled the movements of both virtual extremities independently. Results: All participants except one experienced a reduction of pain immediately after VR sessions, and their pre-session pain levels also decreased over the course of the study. At a group level, PLP decreased by 28% after the treatment that did not include leg movements and 39.6% after the games requiring leg motions. Both treatments were successful in reducing PLP. Conclusions: This VR intervention appears to be an efficacious treatment for PLP in subjects with lower-limb amputation.
DOI BibTeX

Haptic Intelligence Article A Brake-Based Overground Gait Rehabilitation Device for Altering Propulsion Impulse Symmetry Hu, S., Fjeld, K., Vasudevan, E. V., Kuchenbecker, K. J. Sensors, 21(19):6617, October 2021 (Published)
This paper introduces a new device for gait rehabilitation, the gait propulsion trainer (GPT). It consists of two main components (a stationary device and a wearable system) that work together to apply periodic stance-phase resistance as the user walks overground. The stationary device provides the resistance forces via a cable that tethers the user's pelvis to a magnetic-particle brake. The wearable system detects gait events via foot switches to control the timing of the resistance forces. A hardware verification test confirmed that the GPT functions as intended. We conducted a pilot study in which one healthy adult and one stroke survivor walked with the GPT with increasing resistance levels. As hypothesized, the periodic stance-phase resistance caused the healthy participant to walk asymmetrically, with greatly reduced propulsion impulse symmetry; as GPT resistance increased, the walking speed also decreased, and the propulsion impulse appeared to increase for both legs. In contrast, the stroke participant responded to GPT resistance by walking faster and more symmetrically in terms of both propulsion impulse and step length. Thus, this paper shows promising results of short-term training with the GPT, and more studies will follow to explore its long-term effects on hemiparetic gait.
DOI BibTeX
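The timing logic described above (resistance applied only while the instrumented foot is on the ground, as detected by foot switches) reduces to a small rule. The sketch below is illustrative only, not the GPT's actual control firmware.

```python
# Illustrative stance-phase brake logic for a gait-training device:
# engage the brake whenever either foot switch reports ground contact.

def brake_command(heel_switch: bool, toe_switch: bool) -> bool:
    """Engage the brake while the instrumented foot is on the ground."""
    return heel_switch or toe_switch

# One simulated gait cycle: heel strike -> foot flat -> heel off -> swing.
cycle = [
    (True, False),   # heel strike
    (True, True),    # foot flat (mid-stance)
    (False, True),   # heel off, toe still down
    (False, False),  # swing: brake released
]
commands = [brake_command(h, t) for h, t in cycle]
print(commands)  # brake engaged for the stance samples only
```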

Haptic Intelligence Article Robotics for Occupational Therapy: Learning Upper-Limb Exercises From Demonstrations Hu, S., Mendonca, R., Johnson, M. J., Kuchenbecker, K. J. IEEE Robotics and Automation Letters, 6(4):7781-7788, October 2021 (Published)
We describe a learning-from-demonstration technique that enables a general-purpose humanoid robot to lead a user through object-mediated upper-limb exercises. It needs only tens of seconds of training data from a therapist teleoperating the robot to do the task with the user. We model the robot behavior as a regression problem, inferring the desired robot effort using the end-effector's state (position and velocity). Compared to the conventional approach of learning time-based trajectories, our strategy produces customized robot behavior and eliminates the need to tune gains to adapt to the user's motor ability. In our study, one occupational therapist and six people with stroke trained a Willow Garage PR2 on three example tasks (periodic 1D and 2D motions plus episodic pick-and-place). They then repeatedly did the tasks with the robot and blindly compared the state- and time-based controllers learned from the training data. Our results show that working models were reliably obtained to allow the robot to do the exercise with the user; that our state-based approach enabled users to be more actively involved, allowed larger excursion, and generated power outputs more similar to the therapist demonstrations; and that the therapist found our strategy more agreeable than the traditional time-based approach.
DOI BibTeX
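The core modeling choice above, regressing desired effort on end-effector state (position and velocity) rather than on time, can be illustrated with a toy linear model. The demonstration signal, coefficients, and regressor here are invented; the paper's learned regression and robot data are of course different.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "demonstration": a periodic 1D motion with an effort signal that
# depends on state, not on time (coefficients are made up).
t = np.linspace(0, 10, 500)
pos = np.sin(t)
vel = np.cos(t)
effort = -2.0 * pos - 0.5 * vel + rng.normal(scale=0.01, size=t.size)

# State-based regression: effort = f(position, velocity).
X = np.column_stack([pos, vel, np.ones_like(t)])
w, *_ = np.linalg.lstsq(X, effort, rcond=None)

# The learned map can be queried at any state, independent of time,
# which is what lets the robot adapt to the user's actual motion.
query = np.array([0.3, -0.2, 1.0])  # position, velocity, intercept
print(f"predicted effort at state (0.3, -0.2): {query @ w:.2f}")
```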

Haptic Intelligence Article Piezoresistive Textile Layer and Distributed Electrode Structure for Soft Whole-Body Tactile Skin Lee, H., Park, K., Kim, J., Kuchenbecker, K. J. Smart Materials and Structures, 30(8):085036, July 2021, Hyosang Lee and Kyungseo Park contributed equally to this publication (Published)
Tactile sensors based on electrical resistance tomography (ERT) provide pressure sensing over a large area using only a few electrodes, which is a promising property for robotic tactile skin. Most ERT-based tactile sensors employ electrodes only on the sensor's edge to avoid undesirable artifacts caused by electrode contact. The distribution of these electrodes is critical, as electrode location largely determines the sensitive regions, but only a few studies have positioned electrodes in the sensor's central region to improve the sensitivity. Establishing the use of internal electrodes on a stretchable textile needs further investigation into piezoresistive structure fabrication, measurement strategy, and calibration. This article presents a comprehensive study of an ERT-based tactile sensor with distributed electrodes. We describe key fabrication details of a layered textile-based piezoresistive structure, an iterative method for choosing the current injection pathways that yields pairwise optimal patterns, and a calibration process to account for the spatially varying sensitivity of such sensors. We demonstrate two sample sensors with electrodes located only on the boundary or distributed across the surface, and we evaluate their performance via three methods widely used to test tactile sensing in biological systems: single-point localization, two-point discrimination, and contact force estimation.
DOI BibTeX

Haptic Intelligence Article Free and Forced Vibration Modes of the Human Fingertip Serhat, G., Kuchenbecker, K. J. Applied Sciences, 11(12):5709, June 2021 (Published)
Computational analysis of free and forced vibration responses provides crucial information on the dynamic characteristics of deformable bodies. Although such numerical techniques are prevalently used in many disciplines, they have been underutilized in the quest to understand the form and function of human fingers. We addressed this opportunity by building DigiTip, a detailed three-dimensional finite element model of a representative human fingertip that is based on prior anatomical and biomechanical studies. Using the developed model, we first performed modal analyses to determine the free vibration modes with associated frequencies up to about 250 Hz, the frequency at which humans are most sensitive to vibratory stimuli on the fingertip. The modal analysis results reveal that this typical human fingertip exhibits seven characteristic vibration patterns in the considered frequency range. Subsequently, we applied distributed harmonic forces at the fingerprint centroid in three principal directions to predict forced vibration responses through frequency-response analyses; these simulations demonstrate that certain vibration modes are excited significantly more efficiently than the others under the investigated conditions. The results illuminate the dynamic behavior of the human fingertip in haptic interactions involving oscillating stimuli, such as textures and vibratory alerts, and they show how the modal information can predict the forced vibration responses of the soft tissue.
DOI BibTeX
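Modal analysis of the kind described above boils down to the generalized eigenproblem K v = w^2 M v for stiffness matrix K and mass matrix M. The sketch below uses a toy fixed-fixed spring-mass chain with invented stiffness and mass values as a stand-in for the fingertip finite element model; it is the computation, not the paper's model.

```python
import numpy as np

# Toy lumped-parameter stand-in for a finite element model: a fixed-fixed
# chain of point masses joined by springs (values are illustrative, not
# fingertip tissue properties).
n = 5
k, m = 1000.0, 0.001  # spring stiffness (N/m) and node mass (kg)

K = np.zeros((n, n))
for i in range(n):
    K[i, i] = 2 * k
    if i > 0:
        K[i, i - 1] = K[i - 1, i] = -k

# With a diagonal (lumped) mass matrix M = m * I, the generalized
# eigenproblem K v = w^2 M v reduces to a standard symmetric one on K / m.
omega_sq = np.linalg.eigvalsh(K) / m      # ascending w^2
freqs_hz = np.sqrt(omega_sq) / (2 * np.pi)
print("natural frequencies (Hz):", np.round(freqs_hz, 1))
```

A full frequency-response analysis would then project a harmonic forcing vector onto these mode shapes to see which modes a given stimulus excites efficiently.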

Haptic Intelligence Article Optimizing a Viscoelastic Finite Element Model to Represent the Dry, Natural, and Moist Human Finger Pressing on Glass Nam, S., Kuchenbecker, K. J. IEEE Transactions on Haptics, 14(2):303-309, IEEE, April 2021, Presented at the IEEE World Haptics Conference (WHC) (Published)
When a fingerpad presses into a hard surface, the development of the contact area depends on the pressing force and speed. Importantly, it also varies with the finger's moisture, presumably because hydration changes the tissue's material properties. Therefore, we collected data from one finger repeatedly pressing a glass plate under three moisture conditions, and we constructed a finite element model that we optimized to simulate the same three scenarios. We controlled the moisture of the subject's finger to be dry, natural, or moist and recorded 15 pressing trials in each condition. The measurements include normal force over time plus finger-contact images that are processed to yield gross contact area. We defined the axially symmetric 3D model's lumped parameters to include an SLS-Kelvin model (spring in series with parallel spring and damper) for the bulk tissue, plus an elastic epidermal layer. Particle swarm optimization was used to find the parameter values that cause the simulation to best match the trials recorded in each moisture condition. The results show that the softness of the bulk tissue reduces as the finger becomes more hydrated. The epidermis of the moist finger model is softest, while the natural finger model has the highest viscosity.
DOI BibTeX
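Particle swarm optimization itself, the optimizer named above, is easy to sketch. Here a minimal swarm fits a toy two-parameter relaxation curve; the paper's cost function instead compares finite element simulations of the pressing finger against recorded trials, and its swarm settings may differ.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy fitting target: a two-parameter relaxation curve a * exp(-b * t)
# (standing in for the simulated-vs-measured comparison in the paper).
t = np.linspace(0, 2, 100)
target = 1.5 * np.exp(-0.8 * t)

def cost(p):
    a, b = p
    return np.sum((a * np.exp(-b * t) - target) ** 2)

# Minimal PSO loop with standard inertia and attraction coefficients.
n_particles, n_iters = 30, 60
pos = rng.uniform(0.1, 3.0, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    c = np.array([cost(p) for p in pos])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(f"recovered parameters: a={gbest[0]:.2f}, b={gbest[1]:.2f}")
```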

Haptic Intelligence Article Finger Motion and Contact by a Second Finger Influence the Tactile Perception of Electrovibration Vardar, Y., Kuchenbecker, K. J. Journal of the Royal Society Interface, 18(176):20200783, March 2021 (Published)
Electrovibration holds great potential for creating vivid and realistic haptic sensations on touchscreens. Ideally, a designer should be able to control what users feel independent of the number of fingers they use, the movements they make, and how hard they press. We sought to understand the perception and physics of such interactions by determining the smallest 125 Hz electrovibration voltage that 15 participants could reliably feel when performing four different touch interactions at two normal forces. The results proved for the first time that both finger motion and contact by a second finger significantly affect what the user feels. At a given voltage, a single moving finger experiences much larger fluctuating electrovibration forces than a single stationary finger, making electrovibration much easier to feel during interactions involving finger movement. Indeed, only about 30% of participants could detect the stimulus without motion. Part of this difference comes from the fact that relative motion greatly increases the electrical impedance between a finger and the screen, as shown via detailed measurements from one individual. By contrast, threshold-level electrovibration did not significantly affect the coefficient of kinetic friction in any conditions. These findings help lay the groundwork for delivering consistent haptic feedback via electrovibration.
DOI BibTeX

Haptic Intelligence Article Getting in Touch with Children with Autism: Specialist Guidelines for a Touch-Perceiving Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Paladyn. Journal of Behavioral Robotics, 12(1):115-135, January 2021 (Published)
Children with autism need innovative solutions that help them learn to master everyday experiences and cope with stressful situations. We propose that socially assistive robot companions could better understand and react to a child's needs if they utilized tactile sensing. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with 11 experienced autism specialists from a variety of backgrounds. Thematic analysis of the comments shared by the specialists revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions. Using the interview study feedback, we refined our initial list into seven qualitative requirements that describe robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation attributes for the touch-perception system of a robot companion for children with autism. Lastly, by utilizing the literature and current best practices in tactile sensor development and signal processing, we transformed these qualitative requirements into quantitative specifications. We discuss the implications of these requirements for future HRI research in the sensing, computing, and user research communities.
DOI BibTeX

Haptic Intelligence Article Using a Variable-Friction Robot Hand to Determine Proprioceptive Features for Object Classification During Within-Hand-Manipulation Spiers, A. J., Morgan, A. S., Srinivasan, K., Calli, B., Dollar, A. M. IEEE Transactions on Haptics, 13(3):600-610, July 2020 (Published)
Interactions with an object during within-hand manipulation (WIHM) constitute an assortment of gripping, sliding, and pivoting actions. In addition to manipulation benefits, the re-orientation and motion of objects within the hand also provide the sensory organs of the hand with a rich array of additional haptic information. In this article, we utilize variable friction (VF) robotic fingers to execute a rolling WIHM on a variety of objects, while recording "proprioceptive" actuator data, which is then used for object classification (i.e., without tactile sensors). Rather than hand-picking a select group of features for this task, our approach begins with 66 general features, which are computed from actuator position and load profiles for each object-rolling manipulation, based on gradient changes. An Extra Trees classifier performs object classification while also ranking each feature's importance. Using only the six most-important "Key Features" from the general set, a classification accuracy of 86% was achieved for distinguishing the six geometric objects included in our data set. Comparatively, when all 66 features are used, the accuracy is 89.8%.
DOI BibTeX
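The rank-then-retrain pipeline above can be sketched with scikit-learn's ExtraTreesClassifier (assumed available). The 66-feature data here are synthetic stand-ins in which only six features are informative; the paper's actuator-derived features and accuracies are not reproduced.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(5)

# Synthetic stand-in dataset: 66 general features, six object classes,
# with only the first six features carrying class information.
n, n_feats, n_classes = 300, 66, 6
y = rng.integers(0, n_classes, size=n)
X = rng.normal(size=(n, n_feats))
X[:, :6] += y[:, None] * 0.8

# Train on all features and rank them by impurity-based importance.
clf = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
order = np.argsort(clf.feature_importances_)[::-1]
key_features = np.sort(order[:6])
print("top-6 features by importance:", key_features.tolist())

# Retrain using only the six "Key Features", as in the paper's comparison.
clf_small = ExtraTreesClassifier(n_estimators=200, random_state=0)
clf_small.fit(X[:, key_features], y)
```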

Haptic Intelligence Article Physical Variables Underlying Tactile Stickiness during Fingerpad Detachment Nam, S., Vardar, Y., Gueorguiev, D., Kuchenbecker, K. J. Frontiers in Neuroscience, 14:1-14, April 2020 (Published)
One may notice a relatively wide range of tactile sensations even when touching the same hard, flat surface in similar ways. Little is known about the reasons for this variability, so we decided to investigate how the perceptual intensity of light stickiness relates to the physical interaction between the skin and the surface. We conducted a psychophysical experiment in which nine participants actively pressed their finger on a flat glass plate with a normal force close to 1.5 N and detached it after a few seconds. A custom-designed apparatus recorded the contact force vector and the finger contact area during each interaction as well as pre- and post-trial finger moisture. After detaching their finger, participants judged the stickiness of the glass using a nine-point scale. We explored how sixteen physical variables derived from the recorded data correlate with each other and with the stickiness judgments of each participant. These analyses indicate that stickiness perception mainly depends on the pre-detachment pressing duration, the time taken for the finger to detach, and the impulse in the normal direction after the normal force changes sign; finger-surface adhesion seems to build with pressing time, causing a larger normal impulse during detachment and thus a more intense stickiness sensation. We additionally found a strong between-subjects correlation between maximum real contact area and peak pull-off force, as well as between finger moisture and impulse.
DOI BibTeX
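The correlation analysis described above can be sketched as computing, per participant, the correlation between each physical variable and the stickiness ratings. All values below are synthetic stand-ins (the study's sixteen derived variables, sample sizes, and effect sizes are not reproduced), and a simple Pearson coefficient stands in for the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic trials for one hypothetical participant: pressing duration
# drives the normal-direction impulse, which drives a 9-point rating.
n_trials = 40
pressing_duration = rng.uniform(1, 5, size=n_trials)
normal_impulse = 0.2 * pressing_duration + rng.normal(scale=0.1, size=n_trials)
stickiness = np.clip(np.round(2 * normal_impulse + 3), 1, 9)

def pearson(x, y):
    # Pearson correlation coefficient of two equal-length samples.
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

for name, var in [("pressing duration", pressing_duration),
                  ("normal impulse", normal_impulse)]:
    print(f"r(stickiness, {name}) = {pearson(stickiness, var):+.2f}")
```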