Publications

Haptic Intelligence Article Virtual Reality Treatment Displaying the Missing Leg Improves Phantom Limb Pain: A Small Clinical Trial Ambron, E., Buxbaum, L. J., Miller, A., Stoll, H., Kuchenbecker, K. J., Coslett, H. B. Neurorehabilitation and Neural Repair, 35(12):1100-1111, December 2021 (Published)
Background: Phantom limb pain (PLP) is a common and in some cases debilitating consequence of upper- or lower-limb amputation for which current treatments are inadequate. Objective: This small clinical trial tested whether game-like interactions with immersive VR activities can reduce PLP in subjects with transtibial lower-limb amputation. Methods: Seven participants attended 5–7 sessions in which they engaged in a visually immersive virtual reality experience that did not require leg movements (Cool!™), followed by 10–12 sessions of targeted lower-limb VR treatment consisting of custom games requiring leg movement. In the latter condition, they controlled an avatar with 2 intact legs viewed in a head-mounted display (HTC Vive™). A motion-tracking system mounted on the intact and residual limbs controlled the movements of both virtual extremities independently. Results: All participants except one experienced a reduction of pain immediately after VR sessions, and their pre-session pain levels also decreased over the course of the study. At a group level, PLP decreased by 28% after the treatment that did not include leg movements and 39.6% after the games requiring leg motions. Both treatments were successful in reducing PLP. Conclusions: This VR intervention appears to be an efficacious treatment for PLP in subjects with lower-limb amputation.
DOI BibTeX

Haptic Intelligence Article A Brake-Based Overground Gait Rehabilitation Device for Altering Propulsion Impulse Symmetry Hu, S., Fjeld, K., Vasudevan, E. V., Kuchenbecker, K. J. Sensors, 21(19):6617, October 2021 (Published)
This paper introduces a new device for gait rehabilitation, the gait propulsion trainer (GPT). It consists of two main components (a stationary device and a wearable system) that work together to apply periodic stance-phase resistance as the user walks overground. The stationary device provides the resistance forces via a cable that tethers the user's pelvis to a magnetic-particle brake. The wearable system detects gait events via foot switches to control the timing of the resistance forces. A hardware verification test confirmed that the GPT functions as intended. We conducted a pilot study in which one healthy adult and one stroke survivor walked with the GPT with increasing resistance levels. As hypothesized, the periodic stance-phase resistance caused the healthy participant to walk asymmetrically, with greatly reduced propulsion impulse symmetry; as GPT resistance increased, the walking speed also decreased, and the propulsion impulse appeared to increase for both legs. In contrast, the stroke participant responded to GPT resistance by walking faster and more symmetrically in terms of both propulsion impulse and step length. Thus, this paper shows promising results of short-term training with the GPT, and more studies will follow to explore its long-term effects on hemiparetic gait.
DOI BibTeX

Haptic Intelligence Article Robotics for Occupational Therapy: Learning Upper-Limb Exercises From Demonstrations Hu, S., Mendonca, R., Johnson, M. J., Kuchenbecker, K. J. IEEE Robotics and Automation Letters, 6(4):7781-7788, October 2021 (Published)
We describe a learning-from-demonstration technique that enables a general-purpose humanoid robot to lead a user through object-mediated upper-limb exercises. It needs only tens of seconds of training data from a therapist teleoperating the robot to do the task with the user. We model the robot behavior as a regression problem, inferring the desired robot effort using the end-effector's state (position and velocity). Compared to the conventional approach of learning time-based trajectories, our strategy produces customized robot behavior and eliminates the need to tune gains to adapt to the user's motor ability. In our study, one occupational therapist and six people with stroke trained a Willow Garage PR2 on three example tasks (periodic 1D and 2D motions plus episodic pick-and-place). They then repeatedly did the tasks with the robot and blindly compared the state- and time-based controllers learned from the training data. Our results show that working models were reliably obtained to allow the robot to do the exercise with the user; that our state-based approach enabled users to be more actively involved, allowed larger excursion, and generated power outputs more similar to the therapist demonstrations; and that the therapist found our strategy more agreeable than the traditional time-based approach.
DOI BibTeX
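The core idea of the abstract above, inferring the desired robot effort from the end-effector's state (position and velocity) rather than from time, can be illustrated with a generic regression sketch. All data and constants below are synthetic and illustrative; this is not the authors' code, training data, or model.

```python
import numpy as np

# Synthetic "demonstration" data: a hypothetical spring-damper-like
# therapist guiding a periodic 1-D exercise (illustrative values only).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
pos = 0.1 * np.sin(2.0 * np.pi * 0.5 * t)   # end-effector position [m]
vel = np.gradient(pos, t)                   # end-effector velocity [m/s]
k_true, b_true = 40.0, 3.0                  # assumed stiffness and damping
effort = -k_true * pos - b_true * vel + rng.normal(0.0, 0.05, t.size)

# State-based regression: desired effort as a function of (position,
# velocity), here a plain linear least-squares fit with an intercept.
X = np.column_stack([pos, vel, np.ones_like(pos)])
coef, *_ = np.linalg.lstsq(X, effort, rcond=None)

# At run time, the controller queries the model with the current state
# instead of replaying a time-indexed trajectory.
def predict_effort(p, v):
    return coef[0] * p + coef[1] * v + coef[2]

print(coef[:2])  # recovered spring-damper behavior, near [-40, -3]
```

Because the mapping depends only on state, the user can move slower or faster than the demonstration and still receive consistent assistance, which is the property the abstract contrasts with time-based trajectory playback.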

Haptic Intelligence Conference Paper Sensorimotor-Inspired Tactile Feedback and Control Improve Consistency of Prosthesis Manipulation in the Absence of Direct Vision Thomas, N., Fazlollahi, F., Brown, J. D., Kuchenbecker, K. J. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 6174-6181, Prague, Czech Republic, September 2021 (Published)
The lack of haptically aware upper-limb prostheses forces amputees to rely largely on visual cues to complete activities of daily living. In contrast, non-amputees inherently rely on conscious haptic perception and automatic tactile reflexes to govern volitional actions in situations that do not allow for constant visual attention. We therefore propose a myoelectric prosthesis system that reflects these concepts to aid manipulation performance without direct vision. To implement this design, we constructed two fabric-based tactile sensors that measure contact location along the palmar and dorsal sides of the prosthetic fingers and grasp pressure at the tip of the prosthetic thumb. Inspired by the natural sensorimotor system, we use the measurements from these sensors to provide vibrotactile feedback of contact location and implement a tactile grasp controller with reflexes that prevent over-grasping and object slip. We compare this tactile system to a standard myoelectric prosthesis in a challenging reach-to-pick-and-place task conducted without direct vision; 17 non-amputee adults took part in this single-session between-subjects study. Participants in the tactile group achieved more consistent high performance compared to participants in the standard group. These results show that adding contact-location feedback and reflex control increases the consistency with which objects can be grasped and moved without direct vision in upper-limb prosthetics.
DOI BibTeX

Haptic Intelligence Miscellaneous Teaching Safe Social Touch Interactions Using a Robot Koala Burns, R. B. Workshop paper (1 page) presented at the IROS Workshop on Proximity Perception in Robotics: Increasing Safety for Human-Robot Interaction Using Tactile and Proximity Perception, Prague, Czech Republic, September 2021 (Published) URL BibTeX

Haptic Intelligence Ph.D. Thesis HuggieBot: An Interactive Hugging Robot With Visual and Haptic Perception Block, A. E. ETH Zürich, Zürich, Switzerland, August 2021, Department of Computer Science (Published)
Hugs are one of the first forms of contact and affection humans experience. Receiving a hug is one of the best ways to feel socially supported, and the lack of social touch can have severe adverse effects on an individual's well-being. Due to the prevalence and health benefits of hugging, roboticists are interested in creating robots that can hug humans as seamlessly as humans hug other humans. However, hugs are complex affective interactions that need to adapt to the height, body shape, and preferences of the hugging partner, and they often include intra-hug gestures like squeezes. This dissertation aims to create a series of hugging robots that use visual and haptic perception to provide enjoyable interactive hugs. Each of the four presented HuggieBot versions is evaluated by measuring how users emotionally and behaviorally respond to hugging it; HuggieBot 4.0 is explicitly compared to a human hugging partner using physiological measures. Building on research both within and outside of human-robot interaction (HRI), this thesis proposes eleven tenets of natural and enjoyable robotic hugging. These tenets were iteratively crafted through a design process combining user feedback and experimenter observation, and they were evaluated through user studies. A good hugging robot should (1) be soft, (2) be warm, (3) be human-sized, (4) autonomously invite the user for a hug when it detects someone in its personal space, and then it should wait for the user to begin walking toward it before closing its arms to ensure a consensual and synchronous hugging experience. It should also (5) adjust its embrace to the user's size and position, (6) reliably release when the user wants to end the hug, and (7) perceive the user's height and adapt its arm positions accordingly to comfortably fit around the user at appropriate body locations. 
Finally, a hugging robot should (8) accurately detect and classify gestures applied to its torso in real time, regardless of the user's hand placement, (9) respond quickly to their intra-hug gestures, (10) adopt a gesture paradigm that blends user preferences with slight variety and spontaneity, and (11) occasionally provide unprompted, proactive affective social touch to the user through intra-hug gestures. We believe these eleven tenets are essential to delivering high-quality robot hugs. Their presence results in a hug that pleases the user, and their absence results in a hug that is likely to be inadequate. We present these tenets as guidelines for future hugging robot creators to follow when designing new hugging robots to ensure user acceptance. We tested the four versions of HuggieBot through six user studies. First, we analyzed data collected in a previous study with a modified Willow Garage Personal Robot 2 (PR2) to evaluate human responses to different robot physical characteristics and hugging behaviors. Participants experienced and evaluated twelve hugs with the robot, divided into three randomly ordered trials that focused on physical robot characteristics (single factor, three levels) and nine randomly ordered trials with low, medium, and high hug pressure and duration (two factors, three levels each). Second, we created an entirely new robotic platform, HuggieBot 2.0, according to our first six tenets. The new platform features a soft, warm, inflated body (HuggieChest) and uses visual and haptic sensing to deliver closed-loop hugging. We first verified the outward appeal of this platform compared to the previous PR2-based HuggieBot 1.0 via an online video-watching study involving 117 users. We then conducted an in-person experiment in which 32 users each exchanged eight hugs with HuggieBot 2.0, experiencing all combinations of visual hug initiation, haptic sizing, and haptic releasing. 
We then refine the original fourth tenet (visually perceive its user) and present the remaining five tenets for designing interactive hugging robots; we validate the full list of eleven tenets through more in-person studies with our custom robot. To enable perceptive and pleasing autonomous robot behavior, we investigated robot responses to four human intra-hug gestures: holding, rubbing, patting, and squeezing. The microphone and pressure sensor in the robot's inflated torso collected data from 32 people repeatedly demonstrating these gestures, which were used to develop a perceptual algorithm that classifies user actions with 88% accuracy. From user preferences, we created a probabilistic behavior algorithm that chooses robot responses in real time. We implemented improvements to the robot platform to create a third version of our robot, HuggieBot 3.0. We then validated its gesture perception system and behavior algorithm in a fifth user study with 16 users. Finally, we refined the quality and comfort of the embrace by adjusting the joint torques and joint angles of the closed pose position, we further improved the robot's visual perception to detect changes in user approach, we upgraded the robot's response to users who do not press on its back, and we had the robot respond to all intra-hug gestures with squeezes to create our final version of the robotic platform, HuggieBot 4.0. In our sixth user study, we investigated the emotional and physiological effects of hugging a robot compared to the effects of hugging a friendly but unfamiliar person. We continuously monitored participant heart rate and collected saliva samples at seven time points across the 3.5-hour study to measure the temporal evolution of cortisol and oxytocin. We used an adapted Trier Social Stress Test (TSST) protocol to reliably and ethically induce stress in the participants. They then experienced one of five different hug intervention methods before all interacting with HuggieBot 4.0.
The results of these six user studies validated our eleven hugging tenets and informed the iterative design of HuggieBot. We see that users enjoy robot softness, robot warmth, and being physically squeezed by the robot. Users dislike being released too soon from a hug and equally dislike being held by the robot for too long. Adding haptic reactivity definitively improves user perception of a hugging robot; the robot's responses and proactive intra-hug gestures were greatly enjoyed. In our last study, we learned that HuggieBot can positively affect users on a physiological level and is somewhat comparable to hugging a person. Participants have more favorable opinions about hugging robots after prolonged interaction with HuggieBot in all of our research studies.
DOI BibTeX

Haptic Intelligence Miscellaneous How Do Expert Hapticians Evaluate Grounded Force-Feedback Devices? Fazlollahi, F., Seifi, H., MacLean, K., Kuchenbecker, K. J. 341, Work-in-progress paper (1 page) presented at the IEEE World Haptics Conference (WHC), Montreal, Canada, July 2021 (Published)
The specifications typically reported for grounded force-feedback (GFF) devices do not capture performance quality in a consistent or meaningful way. We designed a study to identify the physical interrogations that expert hapticians employ when evaluating such a device. We report pilot data from one expert who tested three commercial GFF devices in unpowered and powered modes while demonstrating hand motions and describing the interactions. Finally, we outline how we will record expert interactions with a high-resolution apparatus and link measurements with interview data.
DOI BibTeX

Haptic Intelligence Article Piezoresistive Textile Layer and Distributed Electrode Structure for Soft Whole-Body Tactile Skin Lee, H., Park, K., Kim, J., Kuchenbecker, K. J. Smart Materials and Structures, 30(8):085036, July 2021, Hyosang Lee and Kyungseo Park contributed equally to this publication (Published)
Tactile sensors based on electrical resistance tomography (ERT) provide pressure sensing over a large area using only a few electrodes, which is a promising property for robotic tactile skin. Most ERT-based tactile sensors employ electrodes only on the sensor's edge to avoid undesirable artifacts caused by electrode contact. The distribution of these electrodes is critical, as electrode location largely determines the sensitive regions, but only a few studies have positioned electrodes in the sensor's central region to improve the sensitivity. Establishing the use of internal electrodes on a stretchable textile needs further investigation into piezoresistive structure fabrication, measurement strategy, and calibration. This article presents a comprehensive study of an ERT-based tactile sensor with distributed electrodes. We describe key fabrication details of a layered textile-based piezoresistive structure, an iterative method for choosing the current injection pathways that yields pairwise optimal patterns, and a calibration process to account for the spatially varying sensitivity of such sensors. We demonstrate two sample sensors with electrodes located only on the boundary or distributed across the surface, and we evaluate their performance via three methods widely used to test tactile sensing in biological systems: single-point localization, two-point discrimination, and contact force estimation.
DOI BibTeX

Haptic Intelligence Conference Paper PrendoSim: Proxy-Hand-Based Robot Grasp Generator Abdlkarim, D., Ortenzi, V., Pardi, T., Filipovica, M., Wing, A. M., Kuchenbecker, K. J., Di Luca, M. In Proceedings of the International Conference on Informatics in Control, Automation and Robotics (ICINCO), 60-68, (Editors: Gusikhin, Oleg and Nijmeijer, Henk and Madani, Kurosh), SciTePress, Virtual, July 2021 (Published)
The synthesis of realistic robot grasps in a simulated environment is pivotal in generating datasets that support sim-to-real transfer learning. In a step toward achieving this goal, we propose PrendoSim, an open-source grasp generator based on a proxy-hand simulation that employs NVIDIA's physics engine (PhysX) and the recently released articulated-body objects developed by Unity (https://prendosim.github.io). We present the implementation details, the method used to generate grasps, the approach to operationally evaluate stability of the generated grasps, and examples of grasps obtained with two different grippers (a parallel jaw gripper and a three-finger hand) grasping three objects selected from the YCB dataset (a pair of scissors, a hammer, and a screwdriver). Compared to simulators proposed in the literature, PrendoSim balances grasp realism and ease of use, displaying an intuitive interface and enabling the user to produce a large and varied dataset of stable grasps.
DOI BibTeX

Haptic Intelligence Miscellaneous Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures Gourishetti, R., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), July 2021 (Published) BibTeX

Haptic Intelligence Article Free and Forced Vibration Modes of the Human Fingertip Serhat, G., Kuchenbecker, K. J. Applied Sciences, 11(12):5709, June 2021 (Published)
Computational analysis of free and forced vibration responses provides crucial information on the dynamic characteristics of deformable bodies. Although such numerical techniques are prevalently used in many disciplines, they have been underutilized in the quest to understand the form and function of human fingers. We addressed this opportunity by building DigiTip, a detailed three-dimensional finite element model of a representative human fingertip that is based on prior anatomical and biomechanical studies. Using the developed model, we first performed modal analyses to determine the free vibration modes with associated frequencies up to about 250 Hz, the frequency at which humans are most sensitive to vibratory stimuli on the fingertip. The modal analysis results reveal that this typical human fingertip exhibits seven characteristic vibration patterns in the considered frequency range. Subsequently, we applied distributed harmonic forces at the fingerprint centroid in three principal directions to predict forced vibration responses through frequency-response analyses; these simulations demonstrate that certain vibration modes are excited significantly more efficiently than the others under the investigated conditions. The results illuminate the dynamic behavior of the human fingertip in haptic interactions involving oscillating stimuli, such as textures and vibratory alerts, and they show how the modal information can predict the forced vibration responses of the soft tissue.
DOI BibTeX

Haptic Intelligence Conference Paper Robot Interaction Studio: A Platform for Unsupervised HRI Mohan, M., Nunez, C. M., Kuchenbecker, K. J. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 3330-3336, Xi’an, China, May 2021 (Published)
Robots hold great potential for supporting exercise and physical therapy, but such systems are often cumbersome to set up and require expert supervision. We aim to solve these concerns by combining Captury Live, a real-time markerless motion-capture system, with a Rethink Robotics Baxter Research Robot to create the Robot Interaction Studio. We evaluated this platform for unsupervised human-robot interaction (HRI) through a 75-minute-long user study with seven adults who were given minimal instructions and no feedback about their actions. The robot used sounds, facial expressions, facial colors, head motions, and arm motions to sequentially present three categories of cues in randomized order while constantly rotating its face screen to look at the user. Analysis of the captured user motions shows that the cue type significantly affected the distance subjects traveled and the amount of time they spent within the robot’s reachable workspace, in alignment with the design of the cues. Heat map visualizations of the recorded user hand positions confirm that users tended to mimic the robot’s arm poses. Despite some initial frustration, taking part in this study did not significantly change user opinions of the robot. We reflect on the advantages of the proposed approach to unsupervised HRI as well as the limitations and possible future extensions of our system.
DOI BibTeX

Haptic Intelligence Master Thesis Robotic Surgery Training in AR: Multimodal Record and Replay Krauthausen, F. University of Stuttgart, Stuttgart, Germany, May 2021, Study Program in Software Engineering (Published) BibTeX

Haptic Intelligence Conference Paper Ungrounded Vari-Dimensional Tactile Fingertip Feedback for Virtual Object Interaction Young, E. M., Kuchenbecker, K. J. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, (217)1-14, Yokohama, Japan, May 2021 (Published)
Compared to grounded force feedback, providing tactile feedback via a wearable device can free the user and broaden the potential applications of simulated physical interactions. However, neither the limitations nor the full potential of tactile-only feedback have been precisely examined. Here we investigate how the dimensionality of cutaneous fingertip feedback affects user movements and virtual object recognition. We combine a recently invented 6-DOF fingertip device with motion tracking, a head-mounted display, and novel contact-rendering algorithms to enable a user to tactilely explore immersive virtual environments. We evaluate rudimentary 1-DOF, moderate 3-DOF, and complex 6-DOF tactile feedback during shape discrimination and mass discrimination, also comparing to interactions with real objects. Results from 20 naive study participants show that higher-dimensional tactile feedback may indeed allow completion of a wider range of virtual tasks, but that feedback dimensionality surprisingly does not greatly affect the exploratory techniques employed by the user.
DOI BibTeX

Haptic Intelligence Patent An electric machine with two-phase planar Lorentz coils and a ring-shaped Halbach array for high torque density and high-precision applications Nguyen, V., Javot, B., Kuchenbecker, K. J. (EP21170679.1), April 2021
An electric machine, in particular a motor or a generator, comprising a rotor and a stator, wherein the rotor comprises a planar, ring-shaped rotor base element and the stator comprises a planar ring-shaped stator base element, wherein the rotor base element and the stator base element are aligned along an axial axis (Z) of the electric machine, wherein a plurality of magnet elements are arranged around the circumference of the ring-shaped rotor base element forming a Halbach magnet-ring assembly, wherein the Halbach magnet-ring assembly generates a magnetic field (BR) with axial and azimuthal components, wherein a plurality of coils are arranged around the circumference (C) of the ring-shaped stator base element.
BibTeX

Haptic Intelligence Article Optimizing a Viscoelastic Finite Element Model to Represent the Dry, Natural, and Moist Human Finger Pressing on Glass Nam, S., Kuchenbecker, K. J. IEEE Transactions on Haptics, 14(2):303-309, IEEE, April 2021, Presented at the IEEE World Haptics Conference (WHC) (Published)
When a fingerpad presses into a hard surface, the development of the contact area depends on the pressing force and speed. Importantly, it also varies with the finger's moisture, presumably because hydration changes the tissue's material properties. Therefore, we collected data from one finger repeatedly pressing a glass plate under three moisture conditions, and we constructed a finite element model that we optimized to simulate the same three scenarios. We controlled the moisture of the subject's finger to be dry, natural, or moist and recorded 15 pressing trials in each condition. The measurements include normal force over time plus finger-contact images that are processed to yield gross contact area. We defined the axially symmetric 3D model's lumped parameters to include an SLS-Kelvin model (spring in series with parallel spring and damper) for the bulk tissue, plus an elastic epidermal layer. Particle swarm optimization was used to find the parameter values that cause the simulation to best match the trials recorded in each moisture condition. The results show that the softness of the bulk tissue reduces as the finger becomes more hydrated. The epidermis of the moist finger model is softest, while the natural finger model has the highest viscosity.
DOI BibTeX
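The fitting procedure named in the abstract above, particle swarm optimization of a lumped viscoelastic (SLS-type) model against recorded pressing data, can be sketched generically. Everything below is a toy: the relaxation curve, its "true" parameters, and the swarm settings are invented for illustration and are not the paper's model, data, or implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0, 200)

# Step-strain force relaxation of a standard-linear-solid element:
# F(t) = F_inf + (F_0 - F_inf) * exp(-t / tau). Parameters are made up.
def sls_force(params, t):
    f0, f_inf, tau = params
    return f_inf + (f0 - f_inf) * np.exp(-t / tau)

true = np.array([2.0, 0.8, 0.3])
measured = sls_force(true, t) + rng.normal(0.0, 0.01, t.size)

def loss(params):
    return np.mean((sls_force(params, t) - measured) ** 2)

# Minimal global-best particle swarm optimizer over the 3 parameters.
n_particles, n_iter = 30, 200
lo, hi = np.array([0.1, 0.1, 0.01]), np.array([5.0, 5.0, 2.0])
pos = rng.uniform(lo, hi, (n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 3)), rng.random((n_particles, 3))
    # Inertia plus attraction toward personal and global bests.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(gbest)  # swarm estimate, near the true parameters [2.0, 0.8, 0.3]
```

The appeal of this approach for such models is that the loss needs no gradients, so the simulator (here a one-line curve, in the paper a finite element simulation) can be treated as a black box.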

Haptic Intelligence Miscellaneous A Haptic Empathetic Robot Animal for Children with Autism Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Companion of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 583-585, Workshop paper (3 pages) presented at the HRI Pioneers Workshop, Virtual, March 2021 (Published)
Children with autism and their families could greatly benefit from increased support resources. While robots are already being introduced into autism therapy and care, we propose that these robots could better understand the child’s needs and provide enriched interaction if they utilize touch. We present our plans, both completed and ongoing, for a touch-perceiving robot companion for children with autism. We established and validated touch-perception requirements for an ideal robot companion through interviews with 11 autism specialists. Currently, we are evaluating custom fabric-based tactile sensors that enable the robot to detect and identify various touch communication gestures. Finally, our robot companion will react to the child’s touches through an emotion response system that will be customizable by a therapist or caretaker.
DOI BibTeX

Haptic Intelligence Miscellaneous Bimanual Wrist-Squeezing Haptic Feedback Changes Speed-Force Tradeoff in Robotic Surgery Training Cao, E., Machaca, S., Bernard, T., Chi, A., Wolfinger, B., Patterson, Z., Adrales, G. L., Kuchenbecker, K. J. Short paper presented at the ACS Surgeons and Engineers: A Dialogue on Surgical Simulation meeting, Virtual, March 2021 (Published) BibTeX

Haptic Intelligence Miscellaneous Dataset for Finger Motion and Contact by a Second Finger Influence the Tactile Perception of Electrovibration Vardar, Y., Kuchenbecker, K. J. Dataset, Dryad, March 2021 (Published)
Electrovibration holds great potential for creating vivid and realistic haptic sensations on touchscreens. Ideally, a designer should be able to control what users feel independent of the number of fingers they use, the movements they make, and how hard they press. We sought to understand the perception and physics of such interactions by determining the smallest 125 Hz electrovibration voltage that fifteen participants could reliably feel when performing four different touch interactions at two normal forces. The results proved for the first time that both finger motion and contact by a second finger significantly affect what the user feels. At a given voltage, a single moving finger experiences much larger fluctuating electrovibration forces than a single stationary finger, making electrovibration much easier to feel during interactions involving finger movement. Indeed, only about 30% of participants could detect the stimulus without motion. Part of this difference comes from the fact that relative motion greatly increases the electrical impedance between a finger and the screen, as shown via detailed measurements from one individual. In contrast, threshold-level electrovibration did not significantly affect the coefficient of kinetic friction in any conditions. These findings help lay the groundwork for delivering consistent haptic feedback via electrovibration.
DOI BibTeX

Haptic Intelligence Miscellaneous Evaluation of a Teleoperated Robotic Exercise Coach Mohan, M., Mat Husin, H., Kuchenbecker, K. J. Workshop paper (4 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published) BibTeX

Haptic Intelligence Miscellaneous Evaluation of a Touch-Perceiving, Responsive Robot Koala for Children with Autism Burns, R. B., Seifi, H., Kuchenbecker, K. J. Workshop paper (4 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published)
Social touch is a powerful component of human life, but current socially assistive robots have almost no touch-perception capabilities. In particular, there has been much interest in using socially assistive robots to help teach and assist children with autism. We propose that such robot companions could better understand and react to a child’s needs if they utilized augmented tactile sensing that captures the applied gesture and force intensity in addition to the more limited information measured by standard binary tactile sensors, which typically provide only contact location and timing. We present HERA, the Haptic Empathetic Robot Animal, as a touch-perceptive social robot for children with autism. In this paper, we propose a user study that aims to investigate whether HERA can help children with autism learn to use safe and appropriate touch behavior during social interaction.
BibTeX

Haptic Intelligence Article Finger Motion and Contact by a Second Finger Influence the Tactile Perception of Electrovibration Vardar, Y., Kuchenbecker, K. J. Journal of the Royal Society Interface, 18(176):20200783, March 2021 (Published)
Electrovibration holds great potential for creating vivid and realistic haptic sensations on touchscreens. Ideally, a designer should be able to control what users feel independent of the number of fingers they use, the movements they make, and how hard they press. We sought to understand the perception and physics of such interactions by determining the smallest 125 Hz electrovibration voltage that 15 participants could reliably feel when performing four different touch interactions at two normal forces. The results proved for the first time that both finger motion and contact by a second finger significantly affect what the user feels. At a given voltage, a single moving finger experiences much larger fluctuating electrovibration forces than a single stationary finger, making electrovibration much easier to feel during interactions involving finger movement. Indeed, only about 30% of participants could detect the stimulus without motion. Part of this difference comes from the fact that relative motion greatly increases the electrical impedance between a finger and the screen, as shown via detailed measurements from one individual. By contrast, threshold-level electrovibration did not significantly affect the coefficient of kinetic friction in any conditions. These findings help lay the groundwork for delivering consistent haptic feedback via electrovibration.
DOI BibTeX

Haptic Intelligence Miscellaneous Love, Actually? Robot Hugs, Oxytocin, and Cortisol Block, A. E., Kuchenbecker, S. Y., Lambercy, O., Gassert, R., Kuchenbecker, K. J. Workshop paper (5 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published) BibTeX

Haptic Intelligence Conference Paper The Six Hug Commandments: Design and Evaluation of a Human-Sized Hugging Robot with Visual and Haptic Perception Block, A. E., Christen, S., Gassert, R., Hilliges, O., Kuchenbecker, K. J. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 380-388, Virtual, March 2021 (Published)
Receiving a hug is one of the best ways to feel socially supported, and the lack of social touch can have severe negative effects on an individual's well-being. Based on previous research both within and outside of HRI, we propose six tenets (''commandments'') of natural and enjoyable robotic hugging: a hugging robot should be soft, be warm, be human-sized, visually perceive its user, adjust its embrace to the user's size and position, and reliably release when the user wants to end the hug. Prior work validated the first two tenets, and the final four are new. We followed all six tenets to create a new robotic platform, HuggieBot 2.0, that has a soft, warm, inflated body (HuggieChest) and uses visual and haptic sensing to deliver closed-loop hugging. We first verified the outward appeal of this platform in comparison to the previous PR2-based HuggieBot 1.0 via an online video-watching study involving 117 users. We then conducted an in-person experiment in which 32 users each exchanged eight hugs with HuggieBot 2.0, experiencing all combinations of visual hug initiation, haptic sizing, and haptic releasing. The results show that adding haptic reactivity definitively improves user perception of a hugging robot, largely verifying our four new tenets and illuminating several interesting opportunities for further improvement.
Block21-HRI-Commandments.pdf DOI BibTeX

Haptic Intelligence Article Getting in Touch with Children with Autism: Specialist Guidelines for a Touch-Perceiving Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Paladyn. Journal of Behavioral Robotics, 12(1):115-135, January 2021 (Published)
Children with autism need innovative solutions that help them learn to master everyday experiences and cope with stressful situations. We propose that socially assistive robot companions could better understand and react to a child's needs if they utilized tactile sensing. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with 11 experienced autism specialists from a variety of backgrounds. Thematic analysis of the comments shared by the specialists revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions. Using the interview study feedback, we refined our initial list into seven qualitative requirements that describe robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation attributes for the touch-perception system of a robot companion for children with autism. Lastly, by utilizing the literature and current best practices in tactile sensor development and signal processing, we transformed these qualitative requirements into quantitative specifications. We discuss the implications of these requirements for future HRI research in the sensing, computing, and user research communities.
DOI BibTeX

Autonomous Learning Haptic Intelligence Patent Method for force inference, method for training a feed-forward neural network, force inference module, and sensor arrangement Sun, H., Martius, G., Kuchenbecker, K. J. (PCT/EP2021/050231), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, January 2021
The invention relates to a method for force inference of a sensor arrangement for sensing forces, to a method for training a feed-forward neural network, to a force inference module, and to a sensor arrangement.
BibTeX

Autonomous Learning Haptic Intelligence Patent Sensor Arrangement for Sensing Forces and Methods for Fabricating a Sensor Arrangement and Parts Thereof Sun, H., Martius, G., Kuchenbecker, K. J. (PCT/EP2021/050230), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, January 2021
The invention relates to a vision-based haptic sensor arrangement for sensing forces, to a method for fabricating a top portion of a sensor arrangement, and to a method for fabricating a sensor arrangement.
BibTeX

Empirical Inference Haptic Intelligence Perceiving Systems Physical Intelligence Robotic Materials MPI Year Book Scientific Report 2016 - 2021 2021
This report presents research done at the Max Planck Institute for Intelligent Systems from January 2016 to November 2021. It is our fourth report since the founding of the institute in 2011. Due to the fact that the upcoming evaluation is an extended one, the report covers a longer reporting period. This scientific report is organized as follows: we begin with an overview of the institute, including an outline of its structure, an introduction of our latest research departments, and a presentation of our main collaborative initiatives and activities (Chapter 1). The central part of the scientific report consists of chapters on the research conducted by the institute's departments (Chapters 2 to 6) and its independent research groups (Chapters 7 to 24), as well as the work of the institute's central scientific facilities (Chapter 25). For entities founded after January 2016, the respective report sections cover work done from the date of the establishment of the department, group, or facility. These chapters are followed by a summary of selected outreach activities and scientific events hosted by the institute (Chapter 26). The scientific publications of the featured departments and research groups published during the 6-year review period complete this scientific report.
Scientific Report 2016 - 2021 BibTeX