Publications

DEPARTMENTS

Empirical Inference

Haptic Intelligence

Modern Magnetic Systems

Perceiving Systems

Physical Intelligence

Robotic Materials

Social Foundations of Computation


Research Groups

Autonomous Vision

Autonomous Learning

Bioinspired Autonomous Miniature Robots

Dynamic Locomotion

Embodied Vision

Human Aspects of Machine Learning

Intelligent Control Systems

Learning and Dynamical Systems

Locomotion in Biorobotic and Somatic Systems

Micro, Nano, and Molecular Systems

Movement Generation and Control

Neural Capture and Synthesis

Physics for Inference and Optimization

Organizational Leadership and Diversity

Probabilistic Learning Group


Haptic Intelligence Conference Paper PrendoSim: Proxy-Hand-Based Robot Grasp Generator Abdlkarim, D., Ortenzi, V., Pardi, T., Filipovica, M., Wing, A. M., Kuchenbecker, K. J., Di Luca, M. In Proceedings of the International Conference on Informatics in Control, Automation and Robotics (ICINCO), 60-68, (Editors: Gusikhin, Oleg and Nijmeijer, Henk and Madani, Kurosh), SciTePress, Virtual, July 2021 (Published)
The synthesis of realistic robot grasps in a simulated environment is pivotal in generating datasets that support sim-to-real transfer learning. In a step toward achieving this goal, we propose PrendoSim, an open-source grasp generator based on a proxy-hand simulation that employs NVIDIA's physics engine (PhysX) and the recently released articulated-body objects developed by Unity (https://prendosim.github.io). We present the implementation details, the method used to generate grasps, the approach to operationally evaluate stability of the generated grasps, and examples of grasps obtained with two different grippers (a parallel jaw gripper and a three-finger hand) grasping three objects selected from the YCB dataset (a pair of scissors, a hammer, and a screwdriver). Compared to simulators proposed in the literature, PrendoSim balances grasp realism and ease of use, displaying an intuitive interface and enabling the user to produce a large and varied dataset of stable grasps.
DOI BibTeX

Haptic Intelligence Miscellaneous Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures Gourishetti, R., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), July 2021 (Published) BibTeX

Haptic Intelligence Article Free and Forced Vibration Modes of the Human Fingertip Serhat, G., Kuchenbecker, K. J. Applied Sciences, 11(12):5709, June 2021 (Published)
Computational analysis of free and forced vibration responses provides crucial information on the dynamic characteristics of deformable bodies. Although such numerical techniques are prevalently used in many disciplines, they have been underutilized in the quest to understand the form and function of human fingers. We addressed this opportunity by building DigiTip, a detailed three-dimensional finite element model of a representative human fingertip that is based on prior anatomical and biomechanical studies. Using the developed model, we first performed modal analyses to determine the free vibration modes with associated frequencies up to about 250 Hz, the frequency at which humans are most sensitive to vibratory stimuli on the fingertip. The modal analysis results reveal that this typical human fingertip exhibits seven characteristic vibration patterns in the considered frequency range. Subsequently, we applied distributed harmonic forces at the fingerprint centroid in three principal directions to predict forced vibration responses through frequency-response analyses; these simulations demonstrate that certain vibration modes are excited significantly more efficiently than the others under the investigated conditions. The results illuminate the dynamic behavior of the human fingertip in haptic interactions involving oscillating stimuli, such as textures and vibratory alerts, and they show how the modal information can predict the forced vibration responses of the soft tissue.
DOI BibTeX
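The modal analyses described in this abstract reduce, in finite element terms, to the generalized eigenvalue problem K φ = ω² M φ. The sketch below shows that computation on a toy three-mass lumped chain; all parameter values are hypothetical and unrelated to the paper's DigiTip model.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical lumped model (not the DigiTip FE model): three masses in
# a fixed-free chain, masses in kg and stiffnesses in N/m.
m_mat = np.diag([1e-3, 1e-3, 1e-3])              # mass matrix M
k_mat = 1e3 * np.array([[ 2.0, -1.0,  0.0],
                        [-1.0,  2.0, -1.0],
                        [ 0.0, -1.0,  1.0]])     # stiffness matrix K

# Free vibration: solve K @ phi = omega**2 * M @ phi.
omega_sq, modes = eigh(k_mat, m_mat)
freqs_hz = np.sqrt(omega_sq) / (2.0 * np.pi)     # natural frequencies (Hz)
# Each column of `modes` is a mode shape; freqs_hz is sorted ascending.
```

A full FE model replaces these 3x3 matrices with the assembled mass and stiffness matrices of the meshed tissue, but the eigenproblem is the same.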

Haptic Intelligence Conference Paper Robot Interaction Studio: A Platform for Unsupervised HRI Mohan, M., Nunez, C. M., Kuchenbecker, K. J. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 3330-3336, Xi’an, China, May 2021 (Published)
Robots hold great potential for supporting exercise and physical therapy, but such systems are often cumbersome to set up and require expert supervision. We aim to solve these concerns by combining Captury Live, a real-time markerless motion-capture system, with a Rethink Robotics Baxter Research Robot to create the Robot Interaction Studio. We evaluated this platform for unsupervised human-robot interaction (HRI) through a 75-minute-long user study with seven adults who were given minimal instructions and no feedback about their actions. The robot used sounds, facial expressions, facial colors, head motions, and arm motions to sequentially present three categories of cues in randomized order while constantly rotating its face screen to look at the user. Analysis of the captured user motions shows that the cue type significantly affected the distance subjects traveled and the amount of time they spent within the robot’s reachable workspace, in alignment with the design of the cues. Heat map visualizations of the recorded user hand positions confirm that users tended to mimic the robot’s arm poses. Despite some initial frustration, taking part in this study did not significantly change user opinions of the robot. We reflect on the advantages of the proposed approach to unsupervised HRI as well as the limitations and possible future extensions of our system.
DOI BibTeX

Haptic Intelligence Master Thesis Robotic Surgery Training in AR: Multimodal Record and Replay Krauthausen, F. University of Stuttgart, Stuttgart, Germany, May 2021, Study Program in Software Engineering (Published) BibTeX

Haptic Intelligence Conference Paper Ungrounded Vari-Dimensional Tactile Fingertip Feedback for Virtual Object Interaction Young, E. M., Kuchenbecker, K. J. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, (217)1-14, Yokohama, Japan, May 2021 (Published)
Compared to grounded force feedback, providing tactile feedback via a wearable device can free the user and broaden the potential applications of simulated physical interactions. However, neither the limitations nor the full potential of tactile-only feedback have been precisely examined. Here we investigate how the dimensionality of cutaneous fingertip feedback affects user movements and virtual object recognition. We combine a recently invented 6-DOF fingertip device with motion tracking, a head-mounted display, and novel contact-rendering algorithms to enable a user to tactilely explore immersive virtual environments. We evaluate rudimentary 1-DOF, moderate 3-DOF, and complex 6-DOF tactile feedback during shape discrimination and mass discrimination, also comparing to interactions with real objects. Results from 20 naive study participants show that higher-dimensional tactile feedback may indeed allow completion of a wider range of virtual tasks, but that feedback dimensionality surprisingly does not greatly affect the exploratory techniques employed by the user.
DOI BibTeX

Haptic Intelligence Patent An electric machine with two-phase planar Lorentz coils and a ring-shaped Halbach array for high torque density and high-precision applications Nguyen, V., Javot, B., Kuchenbecker, K. J. (EP21170679.1), April 2021
An electric machine, in particular a motor or a generator, comprising a rotor and a stator, wherein the rotor comprises a planar, ring-shaped rotor base element and the stator comprises a planar ring-shaped stator base element, wherein the rotor base element and the stator base element are aligned along an axial axis (Z) of the electric machine, wherein a plurality of magnet elements are arranged around the circumference of the ring-shaped rotor base element forming a Halbach magnet-ring assembly, wherein the Halbach magnet-ring assembly generates a magnetic field (BR) with axial and azimuthal components, wherein a plurality of coils are arranged around the circumference (C) of the ring-shaped stator base element.
BibTeX

Haptic Intelligence Article Optimizing a Viscoelastic Finite Element Model to Represent the Dry, Natural, and Moist Human Finger Pressing on Glass Nam, S., Kuchenbecker, K. J. IEEE Transactions on Haptics, 14(2):303-309, IEEE, April 2021, Presented at the IEEE World Haptics Conference (WHC) (Published)
When a fingerpad presses into a hard surface, the development of the contact area depends on the pressing force and speed. Importantly, it also varies with the finger's moisture, presumably because hydration changes the tissue's material properties. Therefore, we collected data from one finger repeatedly pressing a glass plate under three moisture conditions, and we constructed a finite element model that we optimized to simulate the same three scenarios. We controlled the moisture of the subject's finger to be dry, natural, or moist and recorded 15 pressing trials in each condition. The measurements include normal force over time plus finger-contact images that are processed to yield gross contact area. We defined the axially symmetric 3D model's lumped parameters to include an SLS-Kelvin model (spring in series with parallel spring and damper) for the bulk tissue, plus an elastic epidermal layer. Particle swarm optimization was used to find the parameter values that cause the simulation to best match the trials recorded in each moisture condition. The results show that the softness of the bulk tissue reduces as the finger becomes more hydrated. The epidermis of the moist finger model is softest, while the natural finger model has the highest viscosity.
DOI BibTeX

Haptic Intelligence Miscellaneous A Haptic Empathetic Robot Animal for Children with Autism Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Companion of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 583-585, Workshop paper (3 pages) presented at the HRI Pioneers Workshop, Virtual, March 2021 (Published)
Children with autism and their families could greatly benefit from increased support resources. While robots are already being introduced into autism therapy and care, we propose that these robots could better understand the child’s needs and provide enriched interaction if they utilize touch. We present our plans, both completed and ongoing, for a touch-perceiving robot companion for children with autism. We established and validated touch-perception requirements for an ideal robot companion through interviews with 11 autism specialists. Currently, we are evaluating custom fabric-based tactile sensors that enable the robot to detect and identify various touch communication gestures. Finally, our robot companion will react to the child’s touches through an emotion response system that will be customizable by a therapist or caretaker.
DOI BibTeX

Haptic Intelligence Miscellaneous Bimanual Wrist-Squeezing Haptic Feedback Changes Speed-Force Tradeoff in Robotic Surgery Training Cao, E., Machaca, S., Bernard, T., Chi, A., Wolfinger, B., Patterson, Z., Adrales, G. L., Kuchenbecker, K. J. Short paper presented at the ACS Surgeons and Engineers: A Dialogue on Surgical Simulation meeting, Virtual, March 2021 (Published) BibTeX

Haptic Intelligence Miscellaneous Dataset for Finger Motion and Contact by a Second Finger Influence the Tactile Perception of Electrovibration Vardar, Y., Kuchenbecker, K. J. Dataset, Dryad, March 2021 (Published)
Electrovibration holds great potential for creating vivid and realistic haptic sensations on touchscreens. Ideally, a designer should be able to control what users feel independent of the number of fingers they use, the movements they make, and how hard they press. We sought to understand the perception and physics of such interactions by determining the smallest 125 Hz electrovibration voltage that fifteen participants could reliably feel when performing four different touch interactions at two normal forces. The results proved for the first time that both finger motion and contact by a second finger significantly affect what the user feels. At a given voltage, a single moving finger experiences much larger fluctuating electrovibration forces than a single stationary finger, making electrovibration much easier to feel during interactions involving finger movement. Indeed, only about 30% of participants could detect the stimulus without motion. Part of this difference comes from the fact that relative motion greatly increases the electrical impedance between a finger and the screen, as shown via detailed measurements from one individual. In contrast, threshold-level electrovibration did not significantly affect the coefficient of kinetic friction in any conditions. These findings help lay the groundwork for delivering consistent haptic feedback via electrovibration.
DOI BibTeX

Haptic Intelligence Miscellaneous Evaluation of a Teleoperated Robotic Exercise Coach Mohan, M., Mat Husin, H., Kuchenbecker, K. J. Workshop paper (4 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published) BibTeX

Haptic Intelligence Miscellaneous Evaluation of a Touch-Perceiving, Responsive Robot Koala for Children with Autism Burns, R. B., Seifi, H., Kuchenbecker, K. J. Workshop paper (4 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published)
Social touch is a powerful component of human life, but current socially assistive robots have almost no touch-perception capabilities. In particular, there has been much interest in using socially assistive robots to help teach and assist children with autism. We propose that such robot companions could better understand and react to a child’s needs if they utilized augmented tactile sensing that captures the applied gesture and force intensity in addition to the more limited information measured by standard binary tactile sensors, which typically provide only contact location and timing. We present HERA, the Haptic Empathetic Robot Animal, as a touch-perceptive social robot for children with autism. In this paper, we propose a user study that aims to investigate whether HERA can help children with autism learn to use safe and appropriate touch behavior during social interaction.
BibTeX

Haptic Intelligence Article Finger Motion and Contact by a Second Finger Influence the Tactile Perception of Electrovibration Vardar, Y., Kuchenbecker, K. J. Journal of the Royal Society Interface, 18(176):20200783, March 2021 (Published)
Electrovibration holds great potential for creating vivid and realistic haptic sensations on touchscreens. Ideally, a designer should be able to control what users feel independent of the number of fingers they use, the movements they make, and how hard they press. We sought to understand the perception and physics of such interactions by determining the smallest 125 Hz electrovibration voltage that 15 participants could reliably feel when performing four different touch interactions at two normal forces. The results proved for the first time that both finger motion and contact by a second finger significantly affect what the user feels. At a given voltage, a single moving finger experiences much larger fluctuating electrovibration forces than a single stationary finger, making electrovibration much easier to feel during interactions involving finger movement. Indeed, only about 30% of participants could detect the stimulus without motion. Part of this difference comes from the fact that relative motion greatly increases the electrical impedance between a finger and the screen, as shown via detailed measurements from one individual. By contrast, threshold-level electrovibration did not significantly affect the coefficient of kinetic friction in any conditions. These findings help lay the groundwork for delivering consistent haptic feedback via electrovibration.
DOI BibTeX
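Detection thresholds like the 125 Hz voltage thresholds reported above are commonly measured with an adaptive staircase. The paper's exact psychophysical procedure is not reproduced here; the sketch below shows a generic one-up/two-down staircase run against a simulated observer, with all numbers purely illustrative.

```python
import random

def run_staircase(detects, start_v=40.0, step_db=2.0, reversals_needed=8):
    """One-up/two-down staircase: two detections in a row lower the
    level, one miss raises it; returns the mean of the reversal levels
    (an estimate of the ~70.7% detection point)."""
    factor = 10.0 ** (step_db / 20.0)      # multiplicative step in dB
    v, streak, last_dir = start_v, 0, 0
    reversal_levels = []
    while len(reversal_levels) < reversals_needed:
        if detects(v):
            streak += 1
            if streak < 2:
                continue                   # same level until 2nd detection
            streak, direction = 0, -1      # two in a row: step down
        else:
            streak, direction = 0, +1      # miss: step up
        if last_dir and direction != last_dir:
            reversal_levels.append(v)      # direction change = reversal
        last_dir = direction
        v = v * factor if direction > 0 else v / factor
    return sum(reversal_levels) / len(reversal_levels)

# Simulated observer: reliably feels anything above 15 V, guesses 5%.
random.seed(0)
threshold = run_staircase(lambda v: v > 15.0 or random.random() < 0.05)
```

The staircase converges to oscillate around the simulated 15 V boundary, and averaging the reversal levels yields the threshold estimate.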

Haptic Intelligence Miscellaneous Love, Actually? Robot Hugs, Oxytocin, and Cortisol Block, A. E., Kuchenbecker, S. Y., Lambercy, O., Gassert, R., Kuchenbecker, K. J. Workshop paper (5 pages) presented at the HRI Workshop on Workshop YOUR study design! Participatory critique and refinement of participants’ studies, Virtual, March 2021 (Published) BibTeX

Haptic Intelligence Conference Paper The Six Hug Commandments: Design and Evaluation of a Human-Sized Hugging Robot with Visual and Haptic Perception Block, A. E., Christen, S., Gassert, R., Hilliges, O., Kuchenbecker, K. J. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 380-388, Virtual, March 2021 (Published)
Receiving a hug is one of the best ways to feel socially supported, and the lack of social touch can have severe negative effects on an individual's well-being. Based on previous research both within and outside of HRI, we propose six tenets ("commandments") of natural and enjoyable robotic hugging: a hugging robot should be soft, be warm, be human-sized, visually perceive its user, adjust its embrace to the user's size and position, and reliably release when the user wants to end the hug. Prior work validated the first two tenets, and the final four are new. We followed all six tenets to create a new robotic platform, HuggieBot 2.0, that has a soft, warm, inflated body (HuggieChest) and uses visual and haptic sensing to deliver closed-loop hugging. We first verified the outward appeal of this platform in comparison to the previous PR2-based HuggieBot 1.0 via an online video-watching study involving 117 users. We then conducted an in-person experiment in which 32 users each exchanged eight hugs with HuggieBot 2.0, experiencing all combinations of visual hug initiation, haptic sizing, and haptic releasing. The results show that adding haptic reactivity definitively improves user perception of a hugging robot, largely verifying our four new tenets and illuminating several interesting opportunities for further improvement.
Block21-HRI-Commandments.pdf DOI BibTeX

Haptic Intelligence Article Getting in Touch with Children with Autism: Specialist Guidelines for a Touch-Perceiving Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Paladyn. Journal of Behavioral Robotics, 12(1):115-135, January 2021 (Published)
Children with autism need innovative solutions that help them learn to master everyday experiences and cope with stressful situations. We propose that socially assistive robot companions could better understand and react to a child's needs if they utilized tactile sensing. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with 11 experienced autism specialists from a variety of backgrounds. Thematic analysis of the comments shared by the specialists revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions. Using the interview study feedback, we refined our initial list into seven qualitative requirements that describe robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation attributes for the touch-perception system of a robot companion for children with autism. Lastly, by utilizing the literature and current best practices in tactile sensor development and signal processing, we transformed these qualitative requirements into quantitative specifications. We discuss the implications of these requirements for future HRI research in the sensing, computing, and user research communities.
DOI BibTeX

Autonomous Learning Haptic Intelligence Patent Method for force inference, method for training a feed-forward neural network, force inference module, and sensor arrangement Sun, H., Martius, G., Kuchenbecker, K. J. (PCT/EP2021/050231), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, January 2021
The invention relates to a method for force inference of a sensor arrangement for sensing forces, to a method for training a feed-forward neural network, to a force inference module, and to a sensor arrangement.
BibTeX

Autonomous Learning Haptic Intelligence Patent Sensor Arrangement for Sensing Forces and Methods for Fabricating a Sensor Arrangement and Parts Thereof Sun, H., Martius, G., Kuchenbecker, K. J. (PCT/EP2021/050230), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, January 2021
The invention relates to a vision-based haptic sensor arrangement for sensing forces, to a method for fabricating a top portion of a sensor arrangement, and to a method for fabricating a sensor arrangement.
BibTeX

Empirical Inference Haptic Intelligence Perceiving Systems Physical Intelligence Robotic Materials MPI Year Book Scientific Report 2016 - 2021 2021
This report presents research done at the Max Planck Institute for Intelligent Systems from January 2016 to November 2021. It is our fourth report since the founding of the institute in 2011. Because the upcoming evaluation is an extended one, the report covers a longer reporting period. This scientific report is organized as follows: we begin with an overview of the institute, including an outline of its structure, an introduction of our latest research departments, and a presentation of our main collaborative initiatives and activities (Chapter 1). The central part of the scientific report consists of chapters on the research conducted by the institute's departments (Chapters 2 to 6) and its independent research groups (Chapters 7 to 24), as well as the work of the institute's central scientific facilities (Chapter 25). For entities founded after January 2016, the respective report sections cover work done from the date of the establishment of the department, group, or facility. These chapters are followed by a summary of selected outreach activities and scientific events hosted by the institute (Chapter 26). The scientific publications of the featured departments and research groups published during the 6-year review period complete this scientific report.
Scientific Report 2016 - 2021 BibTeX

Haptic Intelligence Ph.D. Thesis Delivering Expressive and Personalized Fingertip Tactile Cues Young, E. M. University of Pennsylvania, Philadelphia, PA, December 2020, Department of Mechanical Engineering and Applied Mechanics (Published)
Wearable haptic devices have seen growing interest in recent years, but providing realistic tactile feedback is not a challenge that is soon to be solved. Daily interactions with physical objects elicit complex sensations at the fingertips. Furthermore, human fingertips exhibit a broad range of physical dimensions and perceptive abilities, adding increased complexity to the task of simulating haptic interactions in a compelling manner. However, as the applications of wearable haptic feedback grow, concerns of wearability and generalizability often persuade tactile device designers to simplify the complexities associated with rendering realistic haptic sensations. As such, wearable devices tend to be optimized for particular uses and average users, rendering only the most salient dimensions of tactile feedback for a given task and assuming all users interpret the feedback in a similar fashion. We propose that providing more realistic haptic feedback will require in-depth examinations of higher-dimensional tactile cues and personalization of these cues for individual users. In this thesis, we aim to provide hardware- and software-based solutions for rendering more expressive and personalized tactile cues to the fingertip. We first explore the idea of rendering six-degree-of-freedom (6-DOF) tactile fingertip feedback via a wearable device, such that any possible fingertip interaction with a flat surface can be simulated. We highlight the potential of parallel continuum manipulators (PCMs) to meet the requirements of such a device, and we refine the design of a PCM for providing fingertip tactile cues. We construct a manually actuated prototype to validate the concept, and then continue to develop a motorized version, named the Fingertip Puppeteer, or Fuppeteer for short.
Various error reduction techniques are presented, and the resulting device is evaluated by analyzing system responses to step inputs, measuring forces rendered to a biomimetic finger sensor, and comparing intended sensations to perceived sensations of twenty-four participants in a human-subject study. Once the functionality of the Fuppeteer is validated, we begin to explore how the device can be used to broaden our understanding of higher-dimensional tactile feedback. One such application is using the 6-DOF device to simulate different lower-dimensional devices. We evaluate 1-, 3-, and 6-DOF tactile feedback during shape discrimination and mass discrimination in a virtual environment, also comparing to interactions with real objects. Results from 20 naive study participants show that higher-dimensional tactile feedback may indeed allow completion of a wider range of virtual tasks, but that feedback dimensionality surprisingly does not greatly affect the exploratory techniques employed by the user. To address alternative approaches to improving tactile rendering in scenarios where low-dimensional tactile feedback is appropriate, we then explore the idea of personalizing feedback for a particular user. We present two software-based approaches to personalize an existing data-driven haptic rendering algorithm for fingertips of different sizes. We evaluate our algorithms in the rendering of pre-recorded tactile sensations onto rubber casts of six different fingertips as well as onto the real fingertips of 13 human participants, all via a 3-DOF wearable device. Results show that both personalization approaches significantly reduced force error magnitudes and improved realism ratings.
BibTeX

Haptic Intelligence Conference Paper Synchronicity Trumps Mischief in Rhythmic Human-Robot Social-Physical Interaction Fitter, N. T., Kuchenbecker, K. J. In Robotics Research, 10:269-284, Springer Proceedings in Advanced Robotics, (Editors: Amato, Nancy M. and Hager, Greg and Thomas, Shawna and Torres-Torriti, Miguel), Springer Cham, International Symposium on Robotics Research (ISRR), December 2020 (Published)
Hand-clapping games and other forms of rhythmic social-physical interaction might help foster human-robot teamwork, but the design of such interactions has scarcely been explored. We leveraged our prior work to enable the Rethink Robotics Baxter Research Robot to competently play one-handed tempo-matching hand-clapping games with a human user. To understand how such a robot’s capabilities and behaviors affect user perception, we created four versions of this interaction: the hand clapping could be initiated by either the robot or the human, and the non-initiating partner could be either cooperative, yielding synchronous motion, or mischievously uncooperative. Twenty adults tested two clapping tempos in each of these four interaction modes in a random order, rating every trial on standardized scales. The study results showed that having the robot initiate the interaction gave it a more dominant perceived personality. Despite previous results on the intrigue of misbehaving robots, we found that moving synchronously with the robot almost always made the interaction more enjoyable, less mentally taxing, less physically demanding, and lower effort for users than asynchronous interactions caused by robot or human mischief. Taken together, our results indicate that cooperative rhythmic social-physical interaction has the potential to strengthen human-robot partnerships.
DOI BibTeX

Haptic Intelligence Patent System and Method for Simultaneously Sensing Contact Force and Lateral Strain Lee, H., Kuchenbecker, K. J. (EP20000480.2), December 2020
A tactile sensing system having a sensor component which comprises a plurality of layers stacked along a normal axis Z and a detection unit electrically connected to the sensor component, wherein the sensor component comprises a first layer, designed as a piezoresistive layer, a third layer, designed as a conductive layer which is electrically connected to the detection unit, and a second layer, designed as a spacing layer between the first layer and the third layer, wherein the first layer comprises a plurality of electrodes In electrically connected to the detection unit, wherein at least one contact force along the normal axis Z on the sensor component is detectable by the detection unit due to a change of a current distribution between the first layer and the third layer, wherein at least one lateral strain on the sensor component is detectable by the detection unit due to a change of the resistance distribution in the piezoresistive first layer.
BibTeX

Autonomous Learning Haptic Intelligence Robotics Patent Method for Force Inference of a Sensor Arrangement, Methods for Training Networks, Force Inference Module and Sensor Arrangement Sun, H., Martius, G., Lee, H., Spiers, A., Fiene, J. (PCT/EP2020/083261), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, November 2020
The present invention relates to a method for force inference of a sensor arrangement, to related methods for training of networks, to a force inference module for performing such methods, and to a sensor arrangement for sensing forces. When developing applications such as robots, sensing of forces applied on a robot hand or another part of a robot such as a leg or a manipulation device is crucial in giving robots increased capabilities to move around and/or manipulate objects. Known implementations for sensor arrangements that can be used in robotic applications in order to have feedback with regard to applied forces are quite expensive and do not have sufficient resolution. Sensor arrangements may be used to measure forces. However, known sensor arrangements need a high density of sensors to provide for a high special resolution. It is thus an object of the present invention to provide for a method for force inference of a sensor arrangement and related methods that are different or optimized with regard to the prior art. It is a further object to provide for a force inference module to perform such methods. It is a further object to provide for a sensor arrangement for sensing forces with such a force inference module.
BibTeX
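The two patents above center on inferring forces from raw sensor readings with a feed-forward neural network. As a hedged illustration of that idea only (the architecture, data, and training details below are our assumptions, not the patents'), here is a tiny numpy network trained to regress a scalar contact force from synthetic multi-channel readings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the sensor arrangement: 8 channels whose
# readings depend nonlinearly on a scalar contact force in [0, 5] N.
forces = rng.uniform(0.0, 5.0, size=(2000, 1))
gains = rng.normal(size=(1, 8))
readings = np.tanh(forces @ gains) + 0.01 * rng.normal(size=(2000, 8))

# One-hidden-layer feed-forward network trained by full-batch gradient
# descent on a mean-squared-error loss.
w1 = 0.5 * rng.normal(size=(8, 16)); b1 = np.zeros(16)
w2 = 0.5 * rng.normal(size=(16, 1)); b2 = np.zeros(1)
lr, n = 0.05, len(forces)
for _ in range(2000):
    h = np.tanh(readings @ w1 + b1)        # hidden activations
    err = h @ w2 + b2 - forces             # prediction error
    gh = (err @ w2.T) * (1.0 - h ** 2)     # backprop through tanh
    w2 -= lr * (h.T @ err) / n; b2 -= lr * err.mean(axis=0)
    w1 -= lr * (readings.T @ gh) / n; b1 -= lr * gh.mean(axis=0)

h = np.tanh(readings @ w1 + b1)
rmse = float(np.sqrt(np.mean((h @ w2 + b2 - forces) ** 2)))
# rmse should land well below the ~1.44 N of simply predicting the mean.
```

A real system would train on calibrated force-sensor labels rather than synthetic data, but the inference step is the same forward pass.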

Haptic Intelligence Miscellaneous Utilizing Interviews and Thematic Analysis to Uncover Specifications for a Companion Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Workshop paper (2 pages) presented at the ICSR Workshop on Enriching HRI Research with Qualitative Methods, Virtual, November 2020 (Published)
We will share our experiences designing and conducting structured video-conferencing interviews with autism specialists and utilizing thematic analysis to create qualitative requirements and quantitative specifications for a touch-perceiving robot companion tailored for children with autism. We will also explain how we wrote about our qualitative approaches for a journal setting.
URL BibTeX

Haptic Intelligence Miscellaneous A Framework for Analyzing Both Finger-Surface and Tool-Surface Interactions Khojasteh, B., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
We interact with surfaces both through our fingers and by means of tools every day. In this process, our tactile mechanoreceptors transduce rich contact-elicited signals to neuronal events, enabling ubiquitous tasks such as object recognition and surface-feature discrimination. Past research has shed light on the neural mechanisms of surface perception, but the involved interaction complexity tends to obfuscate the origin of the produced contact signals for all but the simplest interactions. The manner in which soft versus hard contact partners (skin versus tool) shape the dynamical signals is particularly elusive. To address this gap in our understanding about the mechanical basis of surface encoding, we designed a novel experimental apparatus that uses optical motion capture, miniature high-bandwidth accelerometers, and a six-axis force/torque sensor to capture relevant details of the contact interaction. We measured contact signals for finger and tool interactions with a set of diverse hard textures and analyzed the data with advanced signal-processing, stochastic time-series, and nonlinear time-series techniques. Our approach provides insights into several salient phenomena of finger- and tool-surface interaction. For example, segments of the signals relate to geometrical and mechanical properties of the contact pair. The results may not only elucidate our understanding of human skin as a complex soft matter, but they may also help in the design of prosthetics, electronic skin, human-machine interfaces and surgical robots.
BibTeX

Haptic Intelligence Miscellaneous Characterization of a Magnetic Levitation Haptic Interface for Realistic Tool-Based Interactions Lee, H., Tombak, G. I., Park, G., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
We introduce our recent study on the characterization of a commercial magnetic levitation haptic interface (MagLev 200, Butterfly Haptics LLC) for realistic high-bandwidth interactions. This device's haptic rendering scheme can provide strong 6-DoF (force and torque) feedback without friction at all poses in its small workspace. The objective of our study is to enable the device to accurately render realistic multidimensional vibrotactile stimuli measured from a stylus-like tool. Our approach is to characterize the dynamics between the commanded wrench and the resulting translational acceleration across the frequency range of interest. To this end, we first custom-designed and attached a pen-shaped manipulandum (11.5 cm, aluminum) to the top of the MagLev 200's end-effector to make it more comfortable to grasp. An accelerometer (ADXL354, Analog Devices) was rigidly mounted inside the manipulandum. Then, we collected a data set where the input is a 30-second-long force and/or torque signal commanded as a sweep function from 10 to 500 Hz; the output is the corresponding acceleration measurement, which we collected both with and without a user holding the handle. We succeeded in fitting both non-parametric and parametric versions of the transfer functions for both scenarios, with a fitting accuracy of about 95% for the parametric transfer functions. In the future, we plan to find the best method of applying the inverse parametric transfer function to our system. We will then employ that compensation method in a user study to evaluate the realism of different algorithms for reducing the dimensionality of tool-based vibrotactile cues.
BibTeX
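The characterization procedure described above (commanding a 10–500 Hz sweep and estimating the transfer function from commanded wrench to measured acceleration) can be sketched in Python. Everything below is a synthetic stand-in: the sample rate, the second-order "plant," and the signal names are assumptions for illustration, not values from the MagLev 200 study.

```python
import numpy as np
from scipy import signal

fs = 2000                      # sample rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)   # 30-second record, as in the poster

# Commanded force: linear sine sweep from 10 to 500 Hz
u = signal.chirp(t, f0=10, t1=t[-1], f1=500)

# Stand-in for the measured acceleration: here the "plant" is a known
# second-order resonance; in the real setup, the plant is what gets identified
b, a = signal.iirpeak(w0=120.0, Q=5.0, fs=fs)
y = signal.lfilter(b, a, u)

# Non-parametric estimate: |H1(f)| = |Pxy(f) / Pxx(f)| with Welch averaging
f, Pxy = signal.csd(u, y, fs=fs, nperseg=4096)
_, Pxx = signal.welch(u, fs=fs, nperseg=4096)
H = np.abs(Pxy / Pxx)

# Within the excited band, the estimated magnitude peaks at the resonance
band = (f >= 10) & (f <= 500)
f_peak = f[band][np.argmax(H[band])]
print(f_peak)
```

A parametric transfer function could then be fit to this non-parametric estimate, and its inverse applied as the compensation filter the poster describes.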

Haptic Intelligence Miscellaneous Do Touch Gestures Affect How Electrovibration Feels? Vardar, Y., Javot, B., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
Our interactions with current electronic devices involve different finger gestures such as tapping, sliding, and pinching. Hence, when electrovibration technology is used for generating tactile feedback on these devices, the interaction of the user will not be limited to only one sliding finger. Does the perception of an electrovibration stimulus depend on the gesture being used? This demonstration lets attendees answer this question for themselves by interacting with an electrostatic display using four representative gestures: one finger stationary, one finger sliding, two fingers sliding, and one finger stationary and another finger sliding.
BibTeX

Haptic Intelligence Miscellaneous Estimating Human Handshape by Feeling the Wrist Forte, M., Young, E. M., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
Hand gesture recognition has been widely studied for several applications, including sign language and touchless user interfaces. Sensing approaches for recognizing gestures range from cameras and sensorized gloves to electromyography and mechanomyography. Somewhat surprisingly, a human who places a finger on the inner wrist of another person can learn to perceive different handshapes, and in particular transitions between handshapes. Could this tactile sensing approach work for automatic gesture recognition? As proof of concept, we secured a finger-shaped biomimetic tactile sensor (SynTouch BioTac) to the palmar surface of a human wrist to gather wrist contour information. Typically used for robotic manipulation and surface characterization, this sensor outputs 19 spatially distributed finger pad deformations, DC and AC pressure, and DC and AC temperature. A user performed five gestures (the numbers 1 to 5 in American Sign Language, ASL), five times each with their dominant hand while BioTac data were collected from their wrist. We trained our model on 60% of the collected data, leaving the other 40% for testing. Using statistical features and ensembles of classifiers, we obtained a preliminary accuracy on the test set of 90%. Our short-term goals are to collect more data and classify the results considering the temporal evolution of the gestures. Our long-term goals are to more deeply investigate which sensing modalities included in the BioTac provide the most meaningful information for this application, to achieve similar results with a simpler wearable sensor, and to expand recognition to the entire range of nearly 40 ASL handshapes.
BibTeX

Haptic Intelligence Miscellaneous Haptify: A Comprehensive Benchmarking System for Grounded Force-Feedback Haptic Devices Fazlollahi, F., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
Over the past three decades, hundreds of grounded force-feedback (GFF) haptic devices have been invented. Our previous work on Haptipedia shows that there is no standard framework for reporting device attributes, and some crucial attributes are not stated in the literature. To capture important characteristics of haptic interfaces, we have built a benchmarking setup, Haptify. This poster presents our experimental setup, raw recorded data for a common GFF haptic interface, preliminary analysis of our haptic recordings, and our future goals.
BibTeX

Haptic Intelligence Miscellaneous Insights into Human Perception of Asymmetric Vibrations via Dynamic Modeling Nunez, C. M., Vardar, Y., Kuchenbecker, K. J. Work-in-progress poster presented at Eurohaptics, Leiden, the Netherlands, September 2020 (Published)
Certain ungrounded asymmetric vibrations create a unidirectional force that makes the user feel as though their fingers are being pulled in a particular direction. Although researchers have discovered this haptic feedback technique and showcased its success in a variety of applications, little is still understood about how different attributes of the asymmetric vibration signal affect the perceived pulling sensation. Our work aims to use dynamic modeling and measurement to bridge this gap between the design of the control signals and human perception. We present a new dynamic model of a common vibrotactile actuator (Haptuator Mark II) held between the soft, nonlinear fingers of a human user. After anecdotally observing that actuator acceleration strongly depends on grip force, we augmented this model so that grip force directly modifies the model parameters related to finger contact. We present results from driving this simulation with widely varying asymmetric vibrations that produce stronger and weaker pulling sensations. We also present preliminary data from a user study in which participants rated the perceived direction and strength of the same diverse range of asymmetric vibration cues; grip force and actuator acceleration were both recorded for all trials. Comparing the simulations with the physical measurements and perceptual results validates our dynamic model and provides insight into how different aspects of the asymmetric waveform affect the perception of the pulling sensation.
BibTeX
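A minimal version of the kind of lumped dynamic model described above, with an actuator held between fingertips whose contact stiffness and damping depend on grip force, might look like the following. The mass, the parameter laws, and the asymmetric drive waveform are all invented for illustration; the paper's actual model structure and Haptuator parameters are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical lumped-parameter model: a small actuator mass held between
# fingertips whose stiffness and damping grow with grip force (assumed laws)
m = 0.01                                 # moving mass in kg (assumed)

def finger_params(grip_force):
    k = 2000.0 + 1500.0 * grip_force     # contact stiffness, N/m (assumed)
    c = 2.0 + 1.0 * grip_force           # contact damping, N*s/m (assumed)
    return k, c

# Asymmetric drive: long weak push one way, short strong pull back (40 Hz),
# constructed to be zero-mean: 0.8 * 0.5 + 0.2 * (-2.0) = 0
def drive(t):
    phase = (40.0 * t) % 1.0
    return np.where(phase < 0.8, 0.5, -2.0)

def peak_acceleration(grip_force, t_end=0.5):
    k, c = finger_params(grip_force)
    def ode(t, y):
        x, v = y
        return [v, (drive(t) - c * v - k * x) / m]
    sol = solve_ivp(ode, (0, t_end), [0.0, 0.0],
                    t_eval=np.linspace(0, t_end, 5000), max_step=1e-4)
    accel = (drive(sol.t) - c * sol.y[1] - k * sol.y[0]) / m
    return np.max(np.abs(accel))

# Stronger grip stiffens the contact and changes the simulated acceleration,
# mirroring the anecdotal observation reported in the poster
light, firm = peak_acceleration(0.5), peak_acceleration(4.0)
print(light, firm)
```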

Haptic Intelligence Miscellaneous Intermediate Ridges Amplify Mechanoreceptor Strains in Static and Dynamic Touch Serhat, G., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published) BibTeX

Haptic Intelligence Miscellaneous Optimal Sensor Placement for Recording the Contact Vibrations of a Medical Tool Gourishetti, R., Serhat, G., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published) BibTeX

Haptic Intelligence Miscellaneous Seeing Through Touch: Contact-Location Sensing and Tactile Feedback for Prosthetic Hands Thomas, N., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
Locating and picking up an object without vision is a simple task for able-bodied people, due in part to their rich tactile perception capabilities. The same cannot be said for users of standard myoelectric prostheses, who must rely largely on visual cues to successfully interact with the environment. To enable prosthesis users to locate and grasp objects without looking at them, we propose two changes: adding specialized contact-location sensing to the dorsal and palmar aspects of the prosthetic hand’s fingers, and providing the user with tactile feedback of where an object touches the fingers. To evaluate the potential utility of these changes, we developed a simple, sensitive, fabric-based tactile sensor that provides continuous contact-location information through the output voltage of a voltage-divider circuit. This sensor was wrapped around the fingers of a commercial prosthetic hand (Ottobock SensorHand Speed). Using an ATI Nano17 force sensor, we characterized the tactile sensor’s response to normal force at distributed contact locations and obtained an average detection threshold of 0.63 ± 0.26 N. We also confirmed that the voltage-to-location mapping is linear (R² = 0.99). Sensor signals were adapted to the stationary vibrotactile funneling illusion to provide haptic feedback of contact location. These preliminary results indicate a promising system that imitates a key aspect of the sensory capabilities of the intact hand. Future work includes testing the system in a modified reach-grasp-and-lift study, in which participants must accomplish the task blindfolded.
BibTeX
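The linear voltage-to-location calibration reported above can be illustrated with a toy potentiometer-style model of the resistive strip: a press taps out a voltage roughly proportional to its position. The supply voltage, noise level, and calibration grid below are assumptions, not the real sensor's values.

```python
import numpy as np

rng = np.random.default_rng(0)
v_cc = 5.0                                  # supply voltage in volts (assumed)

def tap_voltage(location):                  # location in [0, 1] along the strip
    v_ideal = v_cc * location               # ideal potentiometer behavior
    return v_ideal + rng.normal(0.0, 0.02)  # small measurement noise (assumed)

# Calibration sweep: press at known locations and record the tap voltages
locs = np.linspace(0.05, 0.95, 19)
volts = np.array([tap_voltage(x) for x in locs])

# Fit the linear voltage-to-location map used at run time
slope, intercept = np.polyfit(volts, locs, 1)

# Goodness of fit (the poster reports R² = 0.99 for the real sensor)
pred = slope * volts + intercept
r2 = 1 - np.sum((locs - pred) ** 2) / np.sum((locs - locs.mean()) ** 2)

# Estimate where a new press landed from its measured voltage
loc_est = slope * tap_voltage(0.40) + intercept
print(round(r2, 3), round(loc_est, 2))
```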

Haptic Intelligence Miscellaneous Sweat Softens the Outermost Layer of the Human Finger Pad: Evidence from Simulations and Experiments Nam, S., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, the Netherlands, September 2020 (Published)
The softness of human finger pads renders them highly effective at grasping objects. It has recently been debated whether sweat secreted from the finger pad alters the softness of the stratum corneum, the outermost layer of skin. However, it is not feasible to mechanically test in vivo skin to understand the properties of only the stratum corneum layer. To address this open question, we created a finite element model of a finger pad touching a glass plate in COMSOL Multiphysics, and we tuned it to fit contact data for a particular human finger. The experimental data were collected using a previously developed apparatus that records gross contact area and normal force over time; one participant conducted a finger-pressing test 15 times at each of three moisture levels (dry, moderate, and highly moist). We then used repeated contact simulations to determine the most likely mechanical properties of the stratum corneum layer for each condition. Applying a one-term Ogden hyperelastic model with a fixed strain-hardening exponent (α=9), we found the best shear moduli (μ) by comparing the contact area as a function of normal force between the simulations and the experiments. Our results show that the stratum corneum of the highly moist finger is indeed significantly softer than that of the same finger when it is only moderately moist or dry.
BibTeX

Haptic Intelligence Miscellaneous Tactile Textiles: An Assortment of Fabric-Based Tactile Sensors for Contact Force and Contact Location Burns, R. B., Thomas, N., Lee, H., Faulkner, R., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Leiden, the Netherlands, September 2020, Rachael Bevill Burns, Neha Thomas, and Hyosang Lee contributed equally to this publication (Published)
Fabric-based tactile sensors are promising for the construction of robotic skin due to their soft and flexible nature. Conductive fabric layers can be used to form piezoresistive structures that are sensitive to contact force and/or contact location. This demonstration showcases three diverse fabric-based tactile sensors we have created. The first detects dynamic tactile events anywhere within a region on a robot’s body. The second design measures the precise location at which a single low-force contact is applied. The third sensor uses electrical resistance tomography to output both the force and location of multiple simultaneous contacts applied across a surface.
BibTeX

Haptic Intelligence Miscellaneous How Does Real-Time Feedback Affect Communicative Actions in Social-Physical Human-Robot Interaction? Mohan, M., Nunez, C. M., Kuchenbecker, K. J. Workshop paper (2 pages) presented at the ROMAN Workshop on Quality of Interaction in Socially Assistive Robots (QISAR), Virtual, August 2020 (Published)
Social robots are becoming more common, especially to motivate older adults to exercise and stay healthy. To increase the effectiveness of such robots, researchers need to develop autonomous interactions that are understandable to the user without help from a human operator. Motivated by this requirement, we have programmed a Baxter robot to play an exercise game via multi-modal non-verbal communication. When the user is confused or makes a mistake, the robot can optionally provide corrective feedback based on real-time measurements of user actions. We hypothesize that this feedback will improve both the user's physical performance and the user's opinion of the robot's social skills; a planned experiment will test this prediction.
BibTeX

Haptic Intelligence Ph.D. Thesis Modulating Physical Interactions in Human-Assistive Technologies Hu, S. University of Pennsylvania, Philadelphia, PA, August 2020, Department of Mechanical Engineering and Applied Mechanics (Published)
Many mechanical devices and robots operate in home environments, and they offer rich experiences and valuable functionalities for human users. When these devices interact physically with humans, additional care has to be taken in both hardware and software design to ensure that the robots provide safe and meaningful interactions. It is advantageous to have the robots be customizable so that users can tinker with them to meet their specific needs. There are many robot platforms that strive toward these goals, but the most successful robots in our world are either separated from humans (such as in factories and warehouses) or occupy the same space as humans but do not offer physical interactions (such as cleaning robots). In this thesis, we envision a suite of assistive robotic devices that assist people in their daily, physical tasks. Specifically, we begin with a hybrid force display that combines a cable, a brake, and a motor, which offers safe and powerful force output with a large workspace. Virtual haptic elements, including free space, constant force, springs, and dampers, can be simulated by this device. We then adapt the hybrid mechanism and develop the Gait Propulsion Trainer (GPT) for stroke rehabilitation, where we aim to reduce propulsion asymmetry by applying resistance at the user’s pelvis during the unilateral-stance phase of gait. Sensors underneath the user’s shoes and a wireless communication module are added to precisely control the timing of the resistance force. To address the effort of parameter tuning in determining the optimal training scheme, we then develop a learning-from-demonstration (LfD) framework where robot behavior can be obtained from data, thus bypassing some of the tuning effort while enabling customization and generalization for different task situations. This LfD framework is evaluated in simulation and in a user study, and results show improved objective performance and human perception of the robot.
Finally, we apply the LfD framework in an upper-limb therapy setting, where the robot directly learns the force output from a therapist when supporting stroke survivors in various physical exercises. Six stroke survivors and an occupational therapist provided demonstrations and tested the autonomous robot behaviors in a user study, and we obtain preliminary insights toward making the robot more intuitive and more effective for both therapists and clients of different impairment levels. This thesis thus considers both hardware and software design for robotic platforms, and we explore both direct and indirect force modulation for human-assistive technologies.
Hu20-PHDD-Modulating BibTeX

Haptic Intelligence Miscellaneous Sleep, Stress, and Experience Supersede Vibrotactile Haptic Feedback as Contributors to Workload During Robotic Surgical Skill Acquisition Gomez, E. D., Mat Husin, H., Dumon, K. R., Williams, N. N., Kuchenbecker, K. J. Extended abstract presented as an ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, USA, August 2020 (Published)
Introduction: How does the absence of haptic feedback in robotic surgery affect surgical skill acquisition? This study is a prospective single-blinded randomized controlled trial examining the effect of haptic feedback of instrument vibrations during simulation-based training on resident workload and performance during both simulated and live operating room Robotic-Assisted Sleeve Gastrectomy (RASG), which provides surgical trainees with significant robotic console experience. Methods: Twelve surgical residents (seven PGY-3, five PGY-7) were randomized to receive either haptic feedback or no haptic feedback during a proctored simulation session that took place before the first operative cases of a bariatric service rotation. Workload measures including pre- and post-procedure short-form State-Trait Anxiety Inventory (STAI) and NASA Task Load Index (TLX) were recorded in both the simulated and OR settings. Multivariable linear regression with backward selection was performed to examine potential associations between workload measures and factors including haptic feedback, PGY-level, case volume, robotic operative time, and hours of sleep. Results: Subjects performed a total of 60 simulated bariatric surgical procedures and 79 live patient RASGs. During the simulation session, PGY-7 status was associated with a 12.8% decrease in TLX score (p<0.001); one additional hour of sleep yielded a 4.43% decrease in TLX score (p=0.004); one additional point in pre-procedure STAI score yielded a 1.87% increase in TLX score (p=0.003); and a one percent increase in operative time yielded a 0.12% increase in TLX score (p<0.001). 
During live OR cases, one additional RASG case experience during the rotation was associated with a 1.1% decrease in TLX score (p=0.01); one additional point in pre-procedure STAI score was associated with a 2.64% increase in TLX score (p<0.001); and a one percent increase in robotic operative time was associated with a 0.17% increase in TLX score (p<0.001). Haptic feedback did not significantly affect workload in either setting. No factors had a significant association with pre- to post-procedural change in STAI score. Conclusion: Providing vibrotactile haptic feedback during training neither increased nor decreased resident workload during simulated or live robotic surgical cases, possibly because the utility of the feedback counterbalances the additional processing required. In contrast, PGY-level, baseline stress, operative time, sleep, and case experience all contribute to workload in robotic surgery; these factors can be potential targets of educational intervention. Finally, TLX may be a more robust workload measurement tool than STAI in the context of robotic surgery.
BibTeX

Haptic Intelligence Miscellaneous Vision-based Force Estimation for a da Vinci Instrument Using Deep Neural Networks Lee, Y., Mat Husin, H., Forte, M., Lee, S., Kuchenbecker, K. J. Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, Ohio, USA, August 2020 (Published) URL BibTeX

Haptic Intelligence Article Using a Variable-Friction Robot Hand to Determine Proprioceptive Features for Object Classification During Within-Hand-Manipulation Spiers, A. J., Morgan, A. S., Srinivasan, K., Calli, B., Dollar, A. M. IEEE Transactions on Haptics, 13(3):600-610, July 2020 (Published)
Interactions with an object during within-hand manipulation (WIHM) constitute an assortment of gripping, sliding, and pivoting actions. In addition to manipulation benefits, the re-orientation and motion of the object within the hand also provide a rich array of additional haptic information to the sensory organs of the hand. In this article, we utilize variable-friction (VF) robotic fingers to execute a rolling WIHM on a variety of objects while recording "proprioceptive" actuator data, which is then used for object classification (i.e., without tactile sensors). Rather than hand-picking a select group of features for this task, our approach begins with 66 general features, which are computed from actuator position and load profiles for each object-rolling manipulation based on gradient changes. An Extra Trees classifier performs object classification while also ranking each feature's importance. Using only the six most important "Key Features" from the general set, a classification accuracy of 86% was achieved for distinguishing the six geometric objects included in our data set. Comparatively, when all 66 features are used, the accuracy is 89.8%.
DOI BibTeX
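The feature-ranking pipeline described above, training an Extra Trees classifier on 66 general features and then retaining only the six most important "Key Features," can be sketched with scikit-learn. The dataset, hyperparameters, and resulting accuracies below are illustrative stand-ins, not the paper's proprioceptive data or results.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the WIHM data: 66 features, 6 object classes
X, y = make_classification(n_samples=600, n_features=66, n_informative=8,
                           n_classes=6, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train on all 66 features; the ensemble also ranks feature importance
clf = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
full_acc = clf.score(X_te, y_te)

# Keep the six highest-importance features and retrain on that subset
top6 = np.argsort(clf.feature_importances_)[::-1][:6]
clf6 = ExtraTreesClassifier(n_estimators=200, random_state=0)
clf6.fit(X_tr[:, top6], y_tr)
key_acc = clf6.score(X_te[:, top6], y_te)

print(round(full_acc, 2), round(key_acc, 2))
```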

Haptic Intelligence Conference Paper An ERT-Based Robotic Skin with Sparsely Distributed Electrodes: Structure, Fabrication, and DNN-Based Signal Processing Park, K., Park, H., Lee, H., Park, S., Kim, J. In 2020 IEEE International Conference on Robotics and Automation (ICRA 2020), 1617-1624, IEEE, Piscataway, NJ, IEEE International Conference on Robotics and Automation (ICRA 2020), May 2020 (Published)
Electrical resistance tomography (ERT) has previously been utilized to develop a large-scale tactile sensor because this approach enables the estimation of the conductivity distribution among the electrodes based on a known physical model. Such a sensor made with a stretchable material can conform to a curved surface. However, this sensor cannot fully cover a cylindrical surface because in such a configuration, the edges of the sensor must meet each other. The electrode configuration becomes irregular in this edge region, which may degrade the sensor performance. In this paper, we introduce an ERT-based robotic skin with evenly and sparsely distributed electrodes. For implementation, we sprayed a carbon nanotube (CNT)-dispersed solution to form a conductive sensing domain on a cylindrical surface. The electrodes were firmly embedded in the surface so that the wires were not exposed to the outside. The sensor output images were estimated using a deep neural network (DNN), which was trained with noisy simulation data. An indentation experiment revealed that the localization error of the sensor was 5.2 ± 3.3 mm, which is remarkable performance with only 30 electrodes. A frame rate of up to 120 Hz could be achieved with a sensing domain area of 90 cm². The proposed approach simplifies the fabrication of 3D-shaped sensors, allowing them to be easily applied to existing robot arms in a seamless and robust manner.
DOI BibTeX

Haptic Intelligence Autonomous Learning Conference Paper Calibrating a Soft ERT-Based Tactile Sensor with a Multiphysics Model and Sim-to-real Transfer Learning Lee, H., Park, H., Serhat, G., Sun, H., Kuchenbecker, K. J. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 1632-1638, Paris, France, May 2020 (Published)
Tactile sensors based on electrical resistance tomography (ERT) have shown many advantages for implementing a soft and scalable whole-body robotic skin; however, calibration is challenging because pressure reconstruction is an ill-posed inverse problem. This paper introduces a method for calibrating soft ERT-based tactile sensors using sim-to-real transfer learning with a finite element multiphysics model. The model is composed of three simple models that together map contact pressure distributions to voltage measurements. We optimized the model parameters to reduce the gap between the simulation and reality. As a preliminary study, we discretized the sensing points into a 6 by 6 grid and synthesized single- and two-point contact datasets from the multiphysics model. We obtained another single-point dataset using the real sensor with the same contact location and force used in the simulation. Our new deep neural network architecture uses a de-noising network to capture the simulation-to-real gap and a reconstruction network to estimate contact force from voltage measurements. The proposed approach achieved an 82% localization hit rate and a force-estimation error of 0.51 N in single-contact tests, and a 78.5% localization hit rate and a force-estimation error of 5.0 N in two-point contact tests. We believe this new calibration method has the potential to improve the sensing performance of ERT-based tactile sensors.
DOI BibTeX

Haptic Intelligence Miscellaneous Subject-Specific Biofeedback for Gait Retraining Outside of the Lab Rokhmanova, N., Shull, P. B., Kuchenbecker, K. J., Halilaj, E. Extended abstract (1 page) presented at the Dynamic Walking Conference, May 2020 (Published)
Knee osteoarthritis is a progressive degenerative disease that has been linked to knee loading. Targeted gait intervention with biofeedback to decrease joint loading is a potential conservative treatment strategy. Here we describe a method to evaluate the efficacy of vibrotactile feedback outside of a constrained laboratory setting.
URL BibTeX

Haptic Intelligence Conference Paper Capturing Experts’ Mental Models to Organize a Collection of Haptic Devices: Affordances Outweigh Attributes Seifi, H., Oppermann, M., Bullard, J., MacLean, K. E., Kuchenbecker, K. J. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), 1-12, Honolulu, USA, April 2020 (Published)
Humans rely on categories to mentally organize and understand sets of complex objects. One such set, haptic devices, has myriad technical attributes that affect user experience in complex ways. Seeking an effective navigation structure for a large online collection, we elicited expert mental categories for grounded force-feedback haptic devices: 18 experts (9 device creators, 9 interaction designers) reviewed, grouped, and described 75 devices according to their similarity in a custom card-sorting study. From the resulting quantitative and qualitative data, we identify prominent patterns of tagging versus binning, and we report 6 uber-attributes that the experts used to group the devices, favoring affordances over device specifications. Finally, we derive 7 device categories and 9 subcategories that reflect the imperfect yet semantic nature of the expert mental models. We visualize these device categories and similarities in the online haptic collection, and we offer insights for studying expert understanding of other human-centered technology.
DOI BibTeX

Haptic Intelligence Article Physical Variables Underlying Tactile Stickiness during Fingerpad Detachment Nam, S., Vardar, Y., Gueorguiev, D., Kuchenbecker, K. J. Frontiers in Neuroscience, 14:1-14, April 2020 (Published)
One may notice a relatively wide range of tactile sensations even when touching the same hard, flat surface in similar ways. Little is known about the reasons for this variability, so we decided to investigate how the perceptual intensity of light stickiness relates to the physical interaction between the skin and the surface. We conducted a psychophysical experiment in which nine participants actively pressed their finger on a flat glass plate with a normal force close to 1.5 N and detached it after a few seconds. A custom-designed apparatus recorded the contact force vector and the finger contact area during each interaction as well as pre- and post-trial finger moisture. After detaching their finger, participants judged the stickiness of the glass using a nine-point scale. We explored how sixteen physical variables derived from the recorded data correlate with each other and with the stickiness judgments of each participant. These analyses indicate that stickiness perception mainly depends on the pre-detachment pressing duration, the time taken for the finger to detach, and the impulse in the normal direction after the normal force changes sign; finger-surface adhesion seems to build with pressing time, causing a larger normal impulse during detachment and thus a more intense stickiness sensation. We additionally found a strong between-subjects correlation between maximum real contact area and peak pull-off force, as well as between finger moisture and impulse.
DOI BibTeX
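The kind of per-participant correlation analysis described above, relating physical variables from each trial to the stickiness ratings, can be sketched as follows. The data are synthetic; the variable names mirror the abstract (pressing duration, detachment time, normal impulse), but the effect sizes are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_trials = 90

# Synthetic per-trial physical variables (units noted; magnitudes assumed)
press_dur = rng.uniform(1.0, 5.0, n_trials)                      # s
detach_time = 0.05 + 0.02 * press_dur + rng.normal(0, 0.01, n_trials)  # s
impulse = 0.01 * press_dur + rng.normal(0, 0.005, n_trials)      # N*s

# Synthetic 9-point stickiness ratings that grow with pressing duration
rating = np.clip(1 + 1.5 * press_dur + rng.normal(0, 0.8, n_trials), 1, 9)

# Rank correlation of each physical variable with the perceptual ratings
results = {}
for name, var in [("pressing duration", press_dur),
                  ("detachment time", detach_time),
                  ("normal impulse", impulse)]:
    rho, p = stats.spearmanr(var, rating)
    results[name] = rho
    print(f"{name}: rho = {rho:.2f}, p = {p:.1e}")
```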

Haptic Intelligence Miscellaneous A Fabric-Based Sensing System for Recognizing Social Touch Burns, R. B., Lee, H., Seifi, H., Kuchenbecker, K. J. Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, Crystal City, USA, March 2020 (Published)
We present a fabric-based piezoresistive tactile sensor system designed to detect social touch gestures on a robot. The unique sensor design utilizes three layers of low-conductivity fabric sewn together on alternating edges to form an accordion pattern and secured between two outer high-conductivity layers. This five-layer design demonstrates a greater resistance range and better low-force sensitivity than previous designs that use one layer of low-conductivity fabric with or without a plastic mesh layer. An individual sensor from our system can presently identify six different communication gestures – squeezing, patting, scratching, poking, hand resting without movement, and no touch – with an average accuracy of 90%. A layer of foam can be added beneath the sensor to make a rigid robot more appealing for humans to touch without inhibiting the system’s ability to register social touch gestures.
BibTeX

Haptic Intelligence Conference Paper Changes in Normal Force During Passive Dynamic Touch: Contact Mechanics and Perception Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J. In Proceedings of the IEEE Haptics Symposium (HAPTICS), 746-752, Crystal City, USA, March 2020 (Published)
Using a force-controlled robotic platform, we investigated the contact mechanics and psychophysical responses induced by negative and positive modulations in normal force during passive dynamic touch. In the natural state of the finger, the applied normal force modulation induces a correlated change in the tangential force. In a second condition, we applied talcum powder to the fingerpad, which induced a significant modification in the slope of the correlated tangential change. In both conditions, the same ten participants had to detect the interval that contained a decrease or an increase in the pre-stimulation normal force of 1 N. In the natural state, the 75% just noticeable difference for this task was found to be a ratio of 0.19 and 0.18 for decreases and increases, respectively. With talcum powder on the fingerpad, the normal force thresholds remained stable, following the Weber law of constant just noticeable differences, while the tangential force thresholds changed in the same way as the correlation slopes. This result suggests that participants predominantly relied on the normal force changes to perform the detection task. In addition, participants were asked to report whether the force decreased or increased. Their performance was generally poor at this second task even for above-threshold changes. However, their accuracy slightly improved with the talcum powder, which might be due to the reduced finger-surface friction.
DOI BibTeX

Haptic Intelligence Miscellaneous Do Touch Gestures Affect How Electrovibration Feels? Vardar, Y., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE Haptics Symposium, Crystal City, USA, March 2020 (Published) BibTeX