
Haptic Empathetic Robot Animal (HERA)

HERA is a NAO robot fitted with sixteen custom fabric-based tactile sensors to detect and react to social touch gestures like tickling.

Members

  • Rachael Bevill Burns, Haptic Intelligence (Partner Group Leader)
  • Hasti Seifi, Haptic Intelligence
  • Hyosang Lee, Haptic Intelligence
  • Robotics (Technical Staff)
  • Haptic Intelligence
  • Haptic Intelligence (Guest Scientist)
  • Keshav Garg, Haptic Intelligence (Student Assistant)
  • Neha Thomas, Haptic Intelligence (Visitor)
  • Haptic Intelligence (Intern)
  • Haptic Intelligence (Intern)
  • Haptic Intelligence (Postdoctoral Researcher)
  • Katherine J. Kuchenbecker, Haptic Intelligence (Director)

Publications

Haptic Intelligence Ph.D. Thesis Creating a Haptic Empathetic Robot Animal That Feels Touch and Emotion Burns, R. B. University of Tübingen, Tübingen, Germany, February 2024, Department of Computer Science (Published)
Social touch, such as a hug or a poke on the shoulder, is an essential aspect of everyday interaction. Humans use social touch to gain attention, communicate needs, express emotions, and build social bonds. Despite its importance, touch sensing is very limited in most commercially available robots. By endowing robots with social-touch perception, one can unlock a myriad of new interaction possibilities. In this thesis, I present my work on creating a Haptic Empathetic Robot Animal (HERA), a koala-like robot for children with autism. I demonstrate the importance of establishing design guidelines based on one's target audience, which we investigated through interviews with autism specialists. I share our work on creating full-body tactile sensing for the NAO robot using low-cost, do-it-yourself (DIY) methods, and I introduce an approach to model long-term robot emotions using second-order dynamics.
BibTeX

Haptic Intelligence Conference Paper Wear Your Heart on Your Sleeve: Users Prefer Robots with Emotional Reactions to Touch and Ambient Moods Burns, R. B., Ojo, F., Kuchenbecker, K. J. In Proceedings of the IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 1914-1921, Busan, South Korea, August 2023 (Published)
Robots are increasingly being developed as assistants for household, education, therapy, and care settings. Such robots can use adaptive emotional behavior to communicate warmly and effectively with their users and to encourage interest in extended interactions. However, autonomous physical robots often lack a dynamic internal emotional state, instead displaying brief, fixed emotion routines to promote specific user interactions. Furthermore, despite the importance of social touch in human communication, most commercially available robots have limited touch sensing, if any at all. We propose that users' perceptions of a social robotic system will improve when the robot provides emotional responses on both shorter and longer time scales (reactions and moods), based on touch inputs from the user. We evaluated this proposal through an online study in which 51 diverse participants watched nine randomly ordered videos (a three-by-three full-factorial design) of the koala-like robot HERA being touched by a human. Users provided the highest ratings in terms of agency, ambient activity, enjoyability, and touch perceptivity for scenarios in which HERA showed emotional reactions and either neutral or emotional moods in response to social touch gestures. Furthermore, we summarize key qualitative findings about users' preferences for reaction timing, the ability of robot mood to show persisting memory, and perception of neutral behaviors as a curious or self-aware robot.
DOI BibTeX

Haptic Intelligence Miscellaneous Creating a Haptic Empathetic Robot Animal for Children with Autism Burns, R. B. Workshop paper (4 pages) presented at the RSS Pioneers Workshop, Daegu, South Korea, July 2023 (Published) URL BibTeX

Haptic Intelligence Miscellaneous A Lasting Impact: Using Second-Order Dynamics to Customize the Continuous Emotional Expression of a Social Robot Burns, R. B., Kuchenbecker, K. J. Workshop paper (5 pages) presented at the HRI Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI), Stockholm, Sweden, March 2023 (Published)
Robots are increasingly being developed as assistants for household, education, therapy, and care settings. Such robots need social skills to interact warmly and effectively with their users, as well as adaptive behavior to maintain user interest. While complex emotion models exist for chat bots and virtual agents, autonomous physical robots often lack a dynamic internal affective state, instead displaying brief, fixed emotion routines to promote or discourage specific user actions. We address this need by creating a mathematical emotion model that can easily be implemented in a social robot to enable it to react intelligently to external stimuli. The robot's affective state is modeled as a second-order dynamic system analogous to a mass connected to ground by a parallel spring and damper. The present position of this imaginary mass shows the robot's valence, which we visualize as the height of its displayed smile (positive) or frown (negative). Associating positive and negative stimuli with appropriately oriented and sized force pulses applied to the mass enables the robot to respond to social touch and other inputs with a valence that evolves over a longer timescale, capturing essential features of approach-avoidance theory. By adjusting the parameters of this emotion model, one can modify three main aspects of the robot's personality, which we term disposition, stoicism, and calmness.
URL BibTeX
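The mass-spring-damper emotion model described above can be simulated in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the parameter values, pulse shape, and integration scheme are assumptions chosen only to show how a force pulse from a positive stimulus produces a valence that rises and then settles back toward neutral.

```python
# Minimal sketch of a second-order (mass-spring-damper) emotion model:
# the robot's valence is the position x of a virtual mass, and stimuli
# apply force pulses F. All parameter values here are illustrative.

def simulate_valence(forces, m=1.0, b=0.8, k=2.0, dt=0.01):
    """Integrate m*x'' + b*x' + k*x = F(t) with semi-implicit Euler.

    forces: one external force sample per time step.
    Returns the valence trajectory x(t) as a list.
    """
    x, v = 0.0, 0.0              # valence and its rate of change
    trajectory = []
    for f in forces:
        a = (f - b * v - k * x) / m  # Newton's second law
        v += a * dt                   # update velocity first (semi-implicit)
        x += v * dt                   # then position, the displayed valence
        trajectory.append(x)
    return trajectory

# A positive stimulus: a brief force pulse, then 9.5 s of free settling.
pulse = [5.0] * 50 + [0.0] * 950      # dt = 0.01 s, so 10 s total
valence = simulate_valence(pulse)
```

Changing m, b, and k alters how strongly, how long, and how smoothly the valence responds, which is one plausible reading of how the personality parameters the abstract calls disposition, stoicism, and calmness could be exposed.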

Haptic Intelligence Miscellaneous Do-It-Yourself Whole-Body Social-Touch Perception for a NAO Robot Burns, R. B., Rosenthal, R., Garg, K., Kuchenbecker, K. J. Workshop paper (1 page) presented at the IROS Workshop on Large-Scale Robotic Skin: Perception, Interaction and Control, Kyoto, Japan, October 2022 (Published) Poster URL BibTeX

Haptic Intelligence Robotics Article Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Frontiers in Robotics and AI, 9(840335):1-17, April 2022 (Published)
Social touch is essential to everyday interactions, but current socially assistive robots have limited touch-perception capabilities. Rather than build entirely new robotic systems, we propose to augment existing rigid-bodied robots with an external touch-perception system. This practical approach can enable researchers and caregivers to continue to use robotic technology they have already purchased and learned about, but with a myriad of new social-touch interactions possible. This paper presents a low-cost, easy-to-build, soft tactile-perception system that we created for the NAO robot, as well as participants' feedback on touching this system. We installed four of our fabric-and-foam-based resistive sensors on the curved surfaces of a NAO's left arm, including its hand, lower arm, upper arm, and shoulder. Fifteen adults then performed five types of affective touch-communication gestures (hitting, poking, squeezing, stroking, and tickling) at two force intensities (gentle and energetic) on the four sensor locations; we share this dataset of four time-varying resistances, our sensor patterns, and a characterization of the sensors' physical performance. After training, a gesture-classification algorithm based on a random forest identified the correct combined touch gesture and force intensity on windows of held-out test data with an average accuracy of 74.1%, which is more than eight times better than chance. Participants rated the sensor-equipped arm as pleasant to touch and liked the robot's presence significantly more after touch interactions. Our promising results show that this type of tactile-perception system can detect necessary social-touch communication cues from users, can be tailored to a variety of robot body parts, and can provide HRI researchers with the tools needed to implement social touch in their own systems.
DOI BibTeX
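The classification pipeline sketched in the abstract — windows of time-varying sensor resistances, summarized into features and fed to a random forest — can be illustrated as follows. The synthetic signals, window length, and feature set below are assumptions for demonstration only, not the published pipeline or dataset.

```python
# Illustrative sketch of window-based social-touch classification with a
# random forest. Synthetic data stands in for the four fabric sensors'
# time-varying resistances; the features and gestures are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_window(gesture, n_samples=100, n_sensors=4):
    """Synthesize one window of four noisy sensor-resistance traces."""
    t = np.linspace(0, 1, n_samples)
    if gesture == "poke":        # brief, sharp dip in resistance
        sig = 1.0 - 0.8 * np.exp(-((t - 0.5) ** 2) / 0.002)
    else:                        # "stroke": slow, broad dip
        sig = 1.0 - 0.4 * np.exp(-((t - 0.5) ** 2) / 0.05)
    return sig[None, :] + 0.02 * rng.standard_normal((n_sensors, n_samples))

def features(window):
    """Per-channel summary statistics as the feature vector."""
    return np.concatenate([window.mean(1), window.std(1),
                           window.min(1), window.max(1)])

gestures = ["poke", "stroke"]
X = np.array([features(make_window(g)) for g in gestures * 100])
y = np.array(gestures * 100)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X[:150], y[:150])                  # train on most windows
accuracy = clf.score(X[150:], y[150:])     # evaluate on held-out windows
```

On real data, the classifier would additionally need to distinguish force intensities and more gesture classes, which is what makes the reported 74.1% over ten combined classes a strong result.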

Haptic Intelligence Robotics Miscellaneous Sensor Patterns Dataset for Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Dataset published as a companion to the journal article "Endowing a NAO Robot with Practical Social-Touch Perception" in Frontiers in Robotics and AI, March 2022 (Published) DOI BibTeX

Haptic Intelligence Robotics Miscellaneous User Study Dataset for Endowing a NAO Robot with Practical Social-Touch Perception Burns, R. B., Lee, H., Seifi, H., Faulkner, R., Kuchenbecker, K. J. Dataset published as a companion to the journal article "Endowing a NAO Robot with Practical Social-Touch Perception" in Frontiers in Robotics and AI, March 2022 (Published) DOI BibTeX

Haptic Intelligence Miscellaneous Teaching Safe Social Touch Interactions Using a Robot Koala Burns, R. B. Workshop paper (1 page) presented at the IROS Workshop on Proximity Perception in Robotics: Increasing Safety for Human-Robot Interaction Using Tactile and Proximity Perception, Prague, Czech Republic, September 2021 (Published) URL BibTeX

Haptic Intelligence Miscellaneous A Haptic Empathetic Robot Animal for Children with Autism Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Companion of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 583-585, Workshop paper (3 pages) presented at the HRI Pioneers Workshop, Virtual, March 2021 (Published)
Children with autism and their families could greatly benefit from increased support resources. While robots are already being introduced into autism therapy and care, we propose that these robots could better understand the child’s needs and provide enriched interaction if they utilize touch. We present our plans, both completed and ongoing, for a touch-perceiving robot companion for children with autism. We established and validated touch-perception requirements for an ideal robot companion through interviews with 11 autism specialists. Currently, we are evaluating custom fabric-based tactile sensors that enable the robot to detect and identify various touch communication gestures. Finally, our robot companion will react to the child’s touches through an emotion response system that will be customizable by a therapist or caretaker.
DOI BibTeX

Haptic Intelligence Miscellaneous Evaluation of a Touch-Perceiving, Responsive Robot Koala for Children with Autism Burns, R. B., Seifi, H., Kuchenbecker, K. J. Workshop paper (4 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published)
Social touch is a powerful component of human life, but current socially assistive robots have almost no touch-perception capabilities. In particular, there has been much interest in using socially assistive robots to help teach and assist children with autism. We propose that such robot companions could better understand and react to a child’s needs if they utilized augmented tactile sensing that captures the applied gesture and force intensity in addition to the more limited information measured by standard binary tactile sensors, which typically provide only contact location and timing. We present HERA, the Haptic Empathetic Robot Animal, as a touch-perceptive social robot for children with autism. In this paper, we propose a user study that aims to investigate whether HERA can help children with autism learn to use safe and appropriate touch behavior during social interaction.
BibTeX

Haptic Intelligence Article Getting in Touch with Children with Autism: Specialist Guidelines for a Touch-Perceiving Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Paladyn. Journal of Behavioral Robotics, 12(1):115-135, January 2021 (Published)
Children with autism need innovative solutions that help them learn to master everyday experiences and cope with stressful situations. We propose that socially assistive robot companions could better understand and react to a child's needs if they utilized tactile sensing. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with 11 experienced autism specialists from a variety of backgrounds. Thematic analysis of the comments shared by the specialists revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions. Using the interview study feedback, we refined our initial list into seven qualitative requirements that describe robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation attributes for the touch-perception system of a robot companion for children with autism. Lastly, by utilizing the literature and current best practices in tactile sensor development and signal processing, we transformed these qualitative requirements into quantitative specifications. We discuss the implications of these requirements for future HRI research in the sensing, computing, and user research communities.
DOI BibTeX

Haptic Intelligence Miscellaneous Utilizing Interviews and Thematic Analysis to Uncover Specifications for a Companion Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Workshop paper (2 pages) presented at the ICSR Workshop on Enriching HRI Research with Qualitative Methods, Virtual, November 2020 (Published)
We will share our experiences designing and conducting structured video-conferencing interviews with autism specialists and utilizing thematic analysis to create qualitative requirements and quantitative specifications for a touch-perceiving robot companion tailored for children with autism. We will also explain how we wrote about our qualitative approaches for a journal setting.
URL BibTeX

Haptic Intelligence Miscellaneous Tactile Textiles: An Assortment of Fabric-Based Tactile Sensors for Contact Force and Contact Location Burns, R. B., Thomas, N., Lee, H., Faulkner, R., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Leiden, the Netherlands, September 2020, Rachael Bevill Burns, Neha Thomas, and Hyosang Lee contributed equally to this publication (Published)
Fabric-based tactile sensors are promising for the construction of robotic skin due to their soft and flexible nature. Conductive fabric layers can be used to form piezoresistive structures that are sensitive to contact force and/or contact location. This demonstration showcases three diverse fabric-based tactile sensors we have created. The first detects dynamic tactile events anywhere within a region on a robot’s body. The second design measures the precise location at which a single low-force contact is applied. The third sensor uses electrical resistance tomography to output both the force and location of multiple simultaneous contacts applied across a surface.
BibTeX

Haptic Intelligence Miscellaneous A Fabric-Based Sensing System for Recognizing Social Touch Burns, R. B., Lee, H., Seifi, H., Kuchenbecker, K. J. Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, Crystal City, USA, March 2020 (Published)
We present a fabric-based piezoresistive tactile sensor system designed to detect social touch gestures on a robot. The unique sensor design utilizes three layers of low-conductivity fabric sewn together on alternating edges to form an accordion pattern and secured between two outer high-conductivity layers. This five-layer design demonstrates a greater resistance range and better low-force sensitivity than previous designs that use one layer of low-conductivity fabric with or without a plastic mesh layer. An individual sensor from our system can presently identify six different communication gestures – squeezing, patting, scratching, poking, hand resting without movement, and no touch – with an average accuracy of 90%. A layer of foam can be added beneath the sensor to make a rigid robot more appealing for humans to touch without inhibiting the system’s ability to register social touch gestures.
BibTeX

Haptic Intelligence Miscellaneous Designing a Haptic Empathetic Robot Animal for Children with Autism Burns, R., Kuchenbecker, K. J. Workshop paper (4 pages) presented at the Robotics: Science and Systems Workshop on Robot-Mediated Autism Intervention: Hardware, Software and Curriculum, Pittsburgh, USA, June 2018 (Published)
Children with autism often endure sensory overload, may be nonverbal, and have difficulty understanding and relaying emotions. These experiences result in heightened stress during social interaction. Animal-assisted intervention has been found to improve the behavior of children with autism during social interaction, but live animal companions are not always feasible. We are thus in the process of designing a robotic animal to mimic some successful characteristics of animal-assisted intervention while trying to improve on others. The over-arching hypothesis of this research is that an appropriately designed robot animal can reduce stress in children with autism and empower them to engage in social interaction.
URL BibTeX