Publications

Haptic Intelligence Ph.D. Thesis Delivering Expressive and Personalized Fingertip Tactile Cues Young, E. M. University of Pennsylvania, Philadelphia, PA, December 2020, Department of Mechanical Engineering and Applied Mechanics (Published)
Wearable haptic devices have seen growing interest in recent years, but providing realistic tactile feedback is not a challenge that is soon to be solved. Daily interactions with physical objects elicit complex sensations at the fingertips. Furthermore, human fingertips exhibit a broad range of physical dimensions and perceptive abilities, adding increased complexity to the task of simulating haptic interactions in a compelling manner. However, as the applications of wearable haptic feedback grow, concerns of wearability and generalizability often persuade tactile device designers to simplify the complexities associated with rendering realistic haptic sensations. As such, wearable devices tend to be optimized for particular uses and average users, rendering only the most salient dimensions of tactile feedback for a given task and assuming all users interpret the feedback in a similar fashion. We propose that providing more realistic haptic feedback will require in-depth examinations of higher-dimensional tactile cues and personalization of these cues for individual users. In this thesis, we aim to provide hardware and software-based solutions for rendering more expressive and personalized tactile cues to the fingertip. We first explore the idea of rendering six-degree-of-freedom (6-DOF) tactile fingertip feedback via a wearable device, such that any possible fingertip interaction with a flat surface can be simulated. We highlight the potential of parallel continuum manipulators (PCMs) to meet the requirements of such a device, and we refine the design of a PCM for providing fingertip tactile cues. We construct a manually actuated prototype to validate the concept, and then continue to develop a motorized version, named the Fingertip Puppeteer, or Fuppeteer for short. 
Various error reduction techniques are presented, and the resulting device is evaluated by analyzing system responses to step inputs, measuring forces rendered to a biomimetic finger sensor, and comparing intended sensations to perceived sensations of twenty-four participants in a human-subject study. Once the functionality of the Fuppeteer is validated, we begin to explore how the device can be used to broaden our understanding of higher-dimensional tactile feedback. One such application is using the 6-DOF device to simulate different lower-dimensional devices. We evaluate 1-, 3-, and 6-DOF tactile feedback during shape discrimination and mass discrimination in a virtual environment, also comparing to interactions with real objects. Results from 20 naive study participants show that higher-dimensional tactile feedback may indeed allow completion of a wider range of virtual tasks, but that feedback dimensionality surprisingly does not greatly affect the exploratory techniques employed by the user. To address alternative approaches to improving tactile rendering in scenarios where low-dimensional tactile feedback is appropriate, we then explore the idea of personalizing feedback for a particular user. We present two software-based approaches to personalize an existing data-driven haptic rendering algorithm for fingertips of different sizes. We evaluate our algorithms in the rendering of pre-recorded tactile sensations onto rubber casts of six different fingertips as well as onto the real fingertips of 13 human participants, all via a 3-DOF wearable device. Results show that both personalization approaches significantly reduced force error magnitudes and improved realism ratings.
BibTeX

Haptic Intelligence Conference Paper Synchronicity Trumps Mischief in Rhythmic Human-Robot Social-Physical Interaction Fitter, N. T., Kuchenbecker, K. J. In Robotics Research, 10:269-284, Springer Proceedings in Advanced Robotics, (Editors: Amato, Nancy M. and Hager, Greg and Thomas, Shawna and Torres-Torriti, Miguel), Springer Cham, International Symposium on Robotics Research (ISRR), December 2020 (Published)
Hand-clapping games and other forms of rhythmic social-physical interaction might help foster human-robot teamwork, but the design of such interactions has scarcely been explored. We leveraged our prior work to enable the Rethink Robotics Baxter Research Robot to competently play one-handed tempo-matching hand-clapping games with a human user. To understand how such a robot’s capabilities and behaviors affect user perception, we created four versions of this interaction: the hand clapping could be initiated by either the robot or the human, and the non-initiating partner could be either cooperative, yielding synchronous motion, or mischievously uncooperative. Twenty adults tested two clapping tempos in each of these four interaction modes in a random order, rating every trial on standardized scales. The study results showed that having the robot initiate the interaction gave it a more dominant perceived personality. Despite previous results on the intrigue of misbehaving robots, we found that moving synchronously with the robot almost always made the interaction more enjoyable, less mentally taxing, less physically demanding, and lower effort for users than asynchronous interactions caused by robot or human mischief. Taken together, our results indicate that cooperative rhythmic social-physical interaction has the potential to strengthen human-robot partnerships.
DOI BibTeX

Haptic Intelligence Patent System and Method for Simultaneously Sensing Contact Force and Lateral Strain Lee, H., Kuchenbecker, K. J. (EP20000480.2), December 2020
A tactile sensing system having a sensor component which comprises a plurality of layers stacked along a normal axis Z and a detection unit electrically connected to the sensor component, wherein the sensor component comprises a first layer, designed as a piezoresistive layer, a third layer, designed as a conductive layer which is electrically connected to the detection unit, and a second layer, designed as a spacing layer between the first layer and the third layer, wherein the first layer comprises a plurality of electrodes electrically connected to the detection unit, wherein at least one contact force along the normal axis Z on the sensor component is detectable by the detection unit due to a change of a current distribution between the first layer and the third layer, wherein at least one lateral strain on the sensor component is detectable by the detection unit due to a change of the resistance distribution in the piezoresistive first layer.
BibTeX

Autonomous Learning Haptic Intelligence Robotics Patent Method for Force Inference of a Sensor Arrangement, Methods for Training Networks, Force Inference Module and Sensor Arrangement Sun, H., Martius, G., Lee, H., Spiers, A., Fiene, J. (PCT/EP2020/083261), Max Planck Institute for Intelligent Systems, Max Planck Ring 4, November 2020
The present invention relates to a method for force inference of a sensor arrangement, to related methods for training of networks, to a force inference module for performing such methods, and to a sensor arrangement for sensing forces. When developing applications such as robots, sensing of forces applied on a robot hand or another part of a robot such as a leg or a manipulation device is crucial in giving robots increased capabilities to move around and/or manipulate objects. Known implementations for sensor arrangements that can be used in robotic applications in order to have feedback with regard to applied forces are quite expensive and do not have sufficient resolution. Sensor arrangements may be used to measure forces. However, known sensor arrangements need a high density of sensors to provide for a high spatial resolution. It is thus an object of the present invention to provide for a method for force inference of a sensor arrangement and related methods that are different or optimized with regard to the prior art. It is a further object to provide for a force inference module to perform such methods. It is a further object to provide for a sensor arrangement for sensing forces with such a force inference module.
BibTeX

Haptic Intelligence Miscellaneous Utilizing Interviews and Thematic Analysis to Uncover Specifications for a Companion Robot Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J. Workshop paper (2 pages) presented at the ICSR Workshop on Enriching HRI Research with Qualitative Methods, Virtual, November 2020 (Published)
We will share our experiences designing and conducting structured video-conferencing interviews with autism specialists and utilizing thematic analysis to create qualitative requirements and quantitative specifications for a touch-perceiving robot companion tailored for children with autism. We will also explain how we wrote about our qualitative approaches for a journal setting.
URL BibTeX

Haptic Intelligence Miscellaneous A Framework for Analyzing Both Finger-Surface and Tool-Surface Interactions Khojasteh, B., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
We interact with surfaces both through our fingers and by means of tools every day. In this process, our tactile mechanoreceptors transduce rich contact-elicited signals to neuronal events, enabling ubiquitous tasks such as object recognition and surface-feature discrimination. Past research has shed light on the neural mechanisms of surface perception, but the involved interaction complexity tends to obfuscate the origin of the produced contact signals for all but the simplest interactions. The manner in which soft versus hard contact partners (skin versus tool) shape the dynamical signals is particularly elusive. To address this gap in our understanding about the mechanical basis of surface encoding, we designed a novel experimental apparatus that uses optical motion capture, miniature high-bandwidth accelerometers, and a six-axis force/torque sensor to capture relevant details of the contact interaction. We measured contact signals for finger and tool interactions with a set of diverse hard textures and analyzed the data with advanced signal-processing, stochastic time-series, and nonlinear time-series techniques. Our approach provides insights into several salient phenomena of finger- and tool-surface interaction. For example, segments of the signals relate to geometrical and mechanical properties of the contact pair. The results may not only deepen our understanding of human skin as complex soft matter, but they may also help in the design of prosthetics, electronic skin, human-machine interfaces and surgical robots.
BibTeX

Haptic Intelligence Miscellaneous Characterization of a Magnetic Levitation Haptic Interface for Realistic Tool-Based Interactions Lee, H., Tombak, G. I., Park, G., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
We introduce our recent study on the characterization of a commercial magnetic levitation haptic interface (MagLev 200, Butterfly Haptics LLC) for realistic high-bandwidth interactions. This device's haptic rendering scheme can provide strong 6-DoF (force and torque) feedback without friction at all poses in its small workspace. The objective of our study is to enable the device to accurately render realistic multidimensional vibrotactile stimuli measured from a stylus-like tool. Our approach is to characterize the dynamics between the commanded wrench and the resulting translational acceleration across the frequency range of interest. To this end, we first custom-designed and attached a pen-shaped manipulandum (11.5 cm, aluminum) to the top of the MagLev 200's end-effector for better usability in grasping. An accelerometer (ADXL354, Analog Devices) was rigidly mounted inside the manipulandum. Then, we collected a data set where the input is a 30-second-long force and/or torque signal commanded as a sweep function from 10 to 500 Hz; the output is the corresponding acceleration measurement, which we collected both with and without a user holding the handle. We succeeded at fitting both non-parametric and parametric versions of the transfer functions for both scenarios, with a fitting accuracy of about 95% for the parametric transfer functions. In the future, we plan to find the best method of applying the inverse parametric transfer function to our system. We will then employ that compensation method in a user study to evaluate the realism of different algorithms for reducing the dimensionality of tool-based vibrotactile cues.
BibTeX
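The non-parametric characterization described above amounts to estimating a frequency-response function between the commanded wrench and the measured acceleration. The sketch below is illustrative only, using synthetic signals and an assumed 2 kHz sampling rate (not the authors' code, data, or fitted models); it shows the standard cross-spectral estimate H(f) = S_uy(f) / S_uu(f):

```python
import numpy as np
from scipy import signal

fs = 2000.0  # Hz, assumed sampling rate for this illustration
t = np.arange(0, 30, 1 / fs)

# Synthetic stand-in for the 30-second commanded-force sweep (10-500 Hz)
u = signal.chirp(t, f0=10, t1=30, f1=500)

# Stand-in "measured acceleration": a low-pass system response plus noise
b, a = signal.butter(2, 300, btype="low", fs=fs)
y = signal.lfilter(b, a, u) + 0.01 * np.random.default_rng(0).standard_normal(t.size)

# Non-parametric FRF estimate: H(f) = S_uy(f) / S_uu(f)
f, S_uu = signal.welch(u, fs=fs, nperseg=4096)
_, S_uy = signal.csd(u, y, fs=fs, nperseg=4096)
H = S_uy / S_uu

band = (f >= 10) & (f <= 500)
peak_f = f[band][np.argmax(np.abs(H[band]))]
print(f"peak response near {peak_f:.0f} Hz")
```

A parametric transfer function could then be fit to this estimate, and its inverse applied to pre-compensate the commanded wrench, as the abstract proposes.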

Haptic Intelligence Miscellaneous Do Touch Gestures Affect How Electrovibration Feels? Vardar, Y., Javot, B., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
Our interactions with current electronic devices involve different finger gestures such as tapping, sliding, and pinching. Hence, when electrovibration technology is used for generating tactile feedback on these devices, the interaction of the user will not be limited to only one sliding finger. Does the perception of an electrovibration stimulus depend on the gesture being used? This demonstration lets attendees answer this question for themselves by interacting with an electrostatic display using four representative gestures: one finger stationary, one finger sliding, two fingers sliding, and one finger stationary and another finger sliding.
BibTeX

Haptic Intelligence Miscellaneous Estimating Human Handshape by Feeling the Wrist Forte, M., Young, E. M., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
Hand gesture recognition has been widely studied for several applications, including sign language and touchless user interfaces. Sensing approaches for recognizing gestures range from cameras and sensorized gloves to electromyography and mechanomyography. Somewhat surprisingly, a human who places a finger on the inner wrist of another person can learn to perceive different handshapes, and in particular transitions between handshapes. Could this tactile sensing approach work for automatic gesture recognition? As proof of concept, we secured a finger-shaped biomimetic tactile sensor (SynTouch BioTac) to the palmar surface of a human wrist to gather wrist contour information. Typically used for robotic manipulation and surface characterization, this sensor outputs 19 spatially distributed finger pad deformations, DC and AC pressure, and DC and AC temperature. A user performed five gestures (the numbers 1 to 5 in American Sign Language, ASL), five times each with their dominant hand while BioTac data were collected from their wrist. We trained our model on 60% of the collected data, leaving the other 40% for testing. Using statistical features and ensembles of classifiers, we obtained a preliminary accuracy on the test set of 90%. Our short-term goals are to collect more data and classify the results considering the temporal evolution of the gestures. Our long-term goals are to more deeply investigate which sensing modalities included in the BioTac provide the most meaningful information for this application, to achieve similar results with a simpler wearable sensor, and to expand recognition to the entire range of nearly 40 ASL handshapes.
BibTeX
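The classification pipeline described above (statistical features, a 60/40 train/test split, and an ensemble of classifiers) can be sketched as follows. All data here are synthetic stand-ins with assumed dimensions (23 BioTac-like channels, 100 samples per trial), and a random forest stands in for the unspecified ensemble:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in: 25 trials (5 gestures x 5 repetitions), each a
# window of 23 channels over 100 time samples
n_trials, n_channels, n_samples = 25, 23, 100
labels = np.repeat(np.arange(5), 5)                 # ASL numbers 1 to 5
trials = rng.standard_normal((n_trials, n_channels, n_samples))
trials += labels[:, None, None] * 0.5               # make classes separable

def statistical_features(window):
    """Simple per-channel summary statistics, one feature row per trial."""
    return np.concatenate([window.mean(axis=-1), window.std(axis=-1),
                           window.min(axis=-1), window.max(axis=-1)])

X = np.array([statistical_features(w) for w in trials])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, train_size=0.6,
                                          stratify=labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.2f}")
```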

Haptic Intelligence Miscellaneous Haptify: A Comprehensive Benchmarking System for Grounded Force-Feedback Haptic Devices Fazlollahi, F., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
Over the past three decades, hundreds of grounded force-feedback (GFF) haptic devices have been invented. Our previous work on Haptipedia shows that there is no standard framework for reporting device attributes, and some crucial attributes are not stated in the literature. To capture important characteristics of haptic interfaces, we have built a benchmarking setup, Haptify. This poster presents our experimental setup, raw recorded data for a common GFF haptic interface, preliminary analysis of our haptic recordings, and our future goals.
BibTeX

Haptic Intelligence Miscellaneous Insights into Human Perception of Asymmetric Vibrations via Dynamic Modeling Nunez, C. M., Vardar, Y., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
Certain ungrounded asymmetric vibrations create a unidirectional force that makes the user feel as though their fingers are being pulled in a particular direction. However, although researchers have discovered this haptic feedback technique and showcased its success in a variety of applications, there is still little understanding about how different attributes of the asymmetric vibration signal affect the perceived pulling sensation. Our work aims to use dynamic modeling and measurement to bridge this gap between the design of the control signals and human perception. We present a new dynamic model of a common vibrotactile actuator (Haptuator Mark II) held between the soft, nonlinear fingers of a human user. After anecdotally observing that actuator acceleration strongly depends on grip force, we augmented this model so that grip force directly modifies the model parameters related to finger contact. We present results from driving this simulation with widely varying asymmetric vibrations that produce stronger and weaker pulling sensations. We also present preliminary data from a user study in which participants rated the perceived direction and strength of the same diverse range of asymmetric vibration cues; grip force and actuator acceleration were both recorded for all trials. Comparing the simulations with the physical measurements and perceptual results validates our dynamic model and provides insights on how different aspects of the asymmetric waveform affect the perception of the pulling sensation.
BibTeX
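A minimal lumped-parameter sketch of the kind of model described above: an actuator mass coupled to the finger pad through grip-dependent stiffness and damping, driven by a zero-mean asymmetric waveform. Every parameter value and the form of the grip-force dependence are assumptions for illustration, not the authors' fitted model:

```python
import numpy as np
from scipy.integrate import solve_ivp

m = 0.01            # kg, assumed actuator mass
grip_force = 2.0    # N, assumed grip force
k = 500.0 + 800.0 * grip_force   # N/m, assumed grip-dependent stiffness
c = 1.0 + 0.5 * grip_force       # N*s/m, assumed grip-dependent damping

def asymmetric_drive(t, f0=50.0):
    """Sawtooth-like drive at f0 Hz: gentle push one way, quick snap back.
    Mean force is zero (0.8 * 0.25 N - 0.2 * 1.0 N = 0)."""
    phase = (t * f0) % 1.0
    return np.where(phase < 0.8, 0.25, -1.0)

def dynamics(t, state):
    x, v = state
    return [v, (asymmetric_drive(t) - c * v - k * x) / m]

sol = solve_ivp(dynamics, (0, 0.2), [0.0, 0.0], max_step=1e-4)
accel = np.gradient(sol.y[1], sol.t)
print(f"accel range: {np.min(accel):.1f} to {np.max(accel):.1f} m/s^2")
```

In a model like this, the asymmetry of the resulting acceleration trace, rather than its mean, is what would be compared against the perceived pulling direction and strength.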

Haptic Intelligence Miscellaneous Intermediate Ridges Amplify Mechanoreceptor Strains in Static and Dynamic Touch Serhat, G., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published) BibTeX

Haptic Intelligence Miscellaneous Optimal Sensor Placement for Recording the Contact Vibrations of a Medical Tool Gourishetti, R., Serhat, G., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published) BibTeX

Haptic Intelligence Miscellaneous Seeing Through Touch: Contact-Location Sensing and Tactile Feedback for Prosthetic Hands Thomas, N., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
Locating and picking up an object without vision is a simple task for able-bodied people, due in part to their rich tactile perception capabilities. The same cannot be said for users of standard myoelectric prostheses, who must rely largely on visual cues to successfully interact with the environment. To enable prosthesis users to locate and grasp objects without looking at them, we propose two changes: adding specialized contact-location sensing to the dorsal and palmar aspects of the prosthetic hand’s fingers, and providing the user with tactile feedback of where an object touches the fingers. To evaluate the potential utility of these changes, we developed a simple, sensitive, fabric-based tactile sensor which provides continuous contact location information via a change in voltage of a voltage divider circuit. This sensor was wrapped around the fingers of a commercial prosthetic hand (Ottobock SensorHand Speed). Using an ATI Nano17 force sensor, we characterized the tactile sensor’s response to normal force at distributed contact locations and obtained an average detection threshold of 0.63 +/- 0.26 N. We also confirmed that the voltage-to-location mapping is linear (R squared = 0.99). Sensor signals were adapted to the stationary vibrotactile funneling illusion to provide haptic feedback of contact location. These preliminary results indicate a promising system that imitates a key aspect of the sensory capabilities of the intact hand. Future work includes testing the system in a modified reach-grasp-and-lift study, in which participants must accomplish the task blindfolded.
BibTeX
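The reported linear voltage-to-location mapping (R squared = 0.99) corresponds to an ordinary least-squares calibration. This sketch uses made-up calibration points purely to illustrate the procedure, not the sensor's actual readings:

```python
import numpy as np

# Hypothetical calibration data: contact locations along the finger (mm)
# and corresponding voltage-divider readings (V); values are invented.
location_mm = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)
voltage_v = np.array([0.52, 0.98, 1.51, 2.03, 2.49, 3.02, 3.47])

# Fit the linear voltage-to-location mapping
slope, intercept = np.polyfit(voltage_v, location_mm, deg=1)
predicted = slope * voltage_v + intercept

# Coefficient of determination for the fit
ss_res = np.sum((location_mm - predicted) ** 2)
ss_tot = np.sum((location_mm - location_mm.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"location = {slope:.2f} * V + {intercept:.2f}, R^2 = {r_squared:.3f}")
```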

Haptic Intelligence Miscellaneous Sweat Softens the Outermost Layer of the Human Finger Pad: Evidence from Simulations and Experiments Nam, S., Kuchenbecker, K. J. Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (Published)
The softness of human finger pads renders them highly effective at grasping objects. It has recently been debated whether sweat secreted from the finger pad alters the softness of the stratum corneum, the outermost layer of skin. However, it is not feasible to mechanically test in vivo skin to measure the properties of only the stratum corneum layer. To address this open question, we created a finite element model of a finger pad touching a glass plate in COMSOL Multiphysics, and we tuned it to fit contact data for a particular human finger. The experimental data were collected using a previously developed apparatus that records gross contact area and normal force over time; one participant conducted a finger-pressing test 15 times at each of three moisture levels (dry, moderate, and highly moist). We then used repeated contact simulations to determine the most likely mechanical properties of the stratum corneum layer for each condition. Applying a one-term Ogden hyper-elastic model with a fixed strain hardening exponent (α=9), we found the best shear moduli (μ) by comparing the contact area as a function of normal force between the simulations and the experiments. Our results show that the stratum corneum of the highly moist finger is indeed significantly softer than that of the same finger when it is only moderately moist or dry.
BibTeX
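In one common convention (where μ is the small-strain shear modulus; COMSOL and other packages scale the prefactor differently), the one-term Ogden strain-energy density referenced above is

```latex
W(\lambda_1, \lambda_2, \lambda_3) = \frac{2\mu}{\alpha^2}\left(\lambda_1^{\alpha} + \lambda_2^{\alpha} + \lambda_3^{\alpha} - 3\right), \qquad \alpha = 9,
```

where the λi are the principal stretches. With α fixed, the fit reduces to a one-parameter search for the shear modulus μ whose simulated contact-area-versus-force curve best matches the measurements in each moisture condition.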

Haptic Intelligence Miscellaneous Tactile Textiles: An Assortment of Fabric-Based Tactile Sensors for Contact Force and Contact Location Burns, R. B., Thomas, N., Lee, H., Faulkner, R., Kuchenbecker, K. J. Hands-on demonstration presented at EuroHaptics, Leiden, The Netherlands, September 2020, Rachael Bevill Burns, Neha Thomas, and Hyosang Lee contributed equally to this publication (Published)
Fabric-based tactile sensors are promising for the construction of robotic skin due to their soft and flexible nature. Conductive fabric layers can be used to form piezoresistive structures that are sensitive to contact force and/or contact location. This demonstration showcases three diverse fabric-based tactile sensors we have created. The first detects dynamic tactile events anywhere within a region on a robot’s body. The second design measures the precise location at which a single low-force contact is applied. The third sensor uses electrical resistance tomography to output both the force and location of multiple simultaneous contacts applied across a surface.
BibTeX

Haptic Intelligence Miscellaneous How Does Real-Time Feedback Affect Communicative Actions in Social-Physical Human-Robot Interaction? Mohan, M., Nunez, C. M., Kuchenbecker, K. J. Workshop paper (2 pages) presented at the ROMAN Workshop on Quality of Interaction in Socially Assistive Robots (QISAR), Virtual, August 2020 (Published)
Social robots are becoming more common, especially to motivate older adults to exercise and stay healthy. To increase the effectiveness of such robots, researchers need to develop autonomous interactions that are understandable to the user without help from a human operator. Motivated by this requirement, we have programmed a Baxter robot to play an exercise game via multi-modal non-verbal communication. When the user is confused or makes a mistake, the robot can optionally provide corrective feedback based on real-time measurements of user actions. We hypothesize feedback will improve both the user's physical performance and the user's opinion of the robot's social skills during a planned experiment.
BibTeX

Haptic Intelligence Ph.D. Thesis Modulating Physical Interactions in Human-Assistive Technologies Hu, S. University of Pennsylvania, Philadelphia, PA, August 2020, Department of Mechanical Engineering and Applied Mechanics (Published)
Many mechanical devices and robots operate in home environments, and they offer rich experiences and valuable functionalities for human users. When these devices interact physically with humans, additional care has to be taken in both hardware and software design to ensure that the robots provide safe and meaningful interactions. It is advantageous to have the robots be customizable so that users can tinker with them for their specific needs. There are many robot platforms that strive toward these goals, but the most successful robots in our world are either separated from humans (such as in factories and warehouses) or occupy the same space as humans but do not offer physical interactions (such as cleaning robots). In this thesis, we envision a suite of assistive robotic devices that assist people in their daily, physical tasks. Specifically, we begin with a hybrid force display that combines a cable, a brake, and a motor, which offers safe and powerful force output with a large workspace. Virtual haptic elements, including free space, constant force, springs, and dampers, can be simulated by this device. We then adapt the hybrid mechanism and develop the Gait Propulsion Trainer (GPT) for stroke rehabilitation, where we aim to reduce propulsion asymmetry by applying resistance at the user’s pelvis during the unilateral stance gait phase. Sensors underneath the user’s shoes and a wireless communication module are added to precisely control the timing of the resistance force. To address the effort of parameter tuning in determining the optimal training scheme, we then develop a learning-from-demonstration (LfD) framework where robot behavior can be obtained from data, thus bypassing some of the tuning effort while enabling customization and generalization for different task situations. This LfD framework is evaluated in simulation and in a user study, and results show improved objective performance and human perception of the robot. 
Finally, we apply the LfD framework in an upper-limb therapy setting, where the robot directly learns the force output from a therapist when supporting stroke survivors in various physical exercises. Six stroke survivors and an occupational therapist provided demonstrations and tested the autonomous robot behaviors in a user study, and we obtain preliminary insights toward making the robot more intuitive and more effective for both therapists and clients of different impairment levels. This thesis thus considers both hardware and software design for robotic platforms, and we explore both direct and indirect force modulation for human-assistive technologies.
Hu20-PHDD-Modulating BibTeX

Haptic Intelligence Miscellaneous Sleep, Stress, and Experience Supersede Vibrotactile Haptic Feedback as Contributors to Workload During Robotic Surgical Skill Acquisition Gomez, E. D., Mat Husin, H., Dumon, K. R., Williams, N. N., Kuchenbecker, K. J. Extended abstract presented as an ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, USA, August 2020 (Published)
Introduction: How does the absence of haptic feedback in robotic surgery affect surgical skill acquisition? This study is a prospective single-blinded randomized controlled trial examining the effect of haptic feedback of instrument vibrations during simulation-based training on resident workload and performance during both simulated and live operating room Robotic-Assisted Sleeve Gastrectomy (RASG), which provides surgical trainees with significant robotic console experience. Methods: Twelve surgical residents (seven PGY-3, five PGY-7) were randomized to receive either haptic feedback or no haptic feedback during a proctored simulation session that took place before the first operative cases of a bariatric service rotation. Workload measures including pre- and post-procedure short-form State-Trait Anxiety Inventory (STAI) and NASA Task Load Index (TLX) were recorded in both the simulated and OR settings. Multivariable linear regression with backward selection was performed to examine potential associations between workload measures and factors including haptic feedback, PGY-level, case volume, robotic operative time, and hours of sleep. Results: Subjects performed a total of 60 simulated bariatric surgical procedures and 79 live patient RASGs. During the simulation session, PGY-7 status was associated with a 12.8% decrease in TLX score (p<0.001); one additional hour of sleep yielded a 4.43% decrease in TLX score (p=0.004); one additional point in pre-procedure STAI score yielded a 1.87% increase in TLX score (p=0.003); and a one percent increase in operative time yielded a 0.12% increase in TLX score (p<0.001). 
During live OR cases, one additional RASG case experience during the rotation was associated with a 1.1% decrease in TLX score (p=0.01); one additional point in pre-procedure STAI score was associated with a 2.64% increase in TLX score (p<0.001); and a one percent increase in robotic operative time was associated with a 0.17% increase in TLX score (p<0.001). Haptic feedback did not significantly affect workload in either setting. No factors had a significant association with pre- to post-procedural change in STAI score. Conclusion: Providing vibrotactile haptic feedback during training neither increased nor decreased resident workload during simulated or live robotic surgical cases, possibly because the utility of the feedback counterbalances the additional processing required. In contrast, PGY-level, baseline stress, operative time, sleep, and case experience all contribute to workload in robotic surgery; these factors can be potential targets of educational intervention. Finally, TLX may be a more robust workload measurement tool than STAI in the context of robotic surgery.
BibTeX

Haptic Intelligence Miscellaneous Vision-based Force Estimation for a da Vinci Instrument Using Deep Neural Networks Lee, Y., Mat Husin, H., Forte, M., Lee, S., Kuchenbecker, K. J. Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, Ohio, USA, August 2020 (Published) URL BibTeX

Haptic Intelligence Article Using a Variable-Friction Robot Hand to Determine Proprioceptive Features for Object Classification During Within-Hand-Manipulation Spiers, A. J., Morgan, A. S., Srinivasan, K., Calli, B., Dollar, A. M. IEEE Transactions on Haptics, 13(3):600-610, July 2020 (Published)
Interactions with an object during within-hand manipulation (WIHM) constitute an assortment of gripping, sliding, and pivoting actions. In addition to manipulation benefits, the re-orientation and motion of the object within the hand also provide a rich array of additional haptic information to the sensory organs of the hand. In this article, we utilize variable friction (VF) robotic fingers to execute a rolling WIHM on a variety of objects, while recording "proprioceptive" actuator data, which is then used for object classification (i.e., without tactile sensors). Rather than hand-picking a select group of features for this task, our approach begins with 66 general features, which are computed from actuator position and load profiles for each object-rolling manipulation, based on gradient changes. An Extra Trees classifier performs object classification while also ranking each feature's importance. Using only the six most-important "Key Features" from the general set, a classification accuracy of 86% was achieved for distinguishing the six geometric objects included in our data set. Comparatively, when all 66 features are used, the accuracy is 89.8%.
DOI BibTeX
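The feature-ranking pipeline described above — fit an Extra Trees classifier on many candidate features, rank them by importance, then retrain on the top few — can be sketched with scikit-learn. The data below are synthetic stand-ins; the paper's actual features come from actuator position and load profiles during object rolling.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

# Synthetic stand-in data: 66 candidate features per rolling manipulation
# and 6 geometric object classes, as in the paper's setup.
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 66))
y = rng.integers(0, 6, size=300)
X[:, :6] += y[:, None] * 0.8  # make six features informative ("Key Features")

clf = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by impurity-based importance and keep only the top six,
# then retrain a compact classifier on that reduced feature set.
key_features = np.argsort(clf.feature_importances_)[::-1][:6]
clf_small = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(
    X[:, key_features], y)
print(sorted(key_features.tolist()))
```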

Haptic Intelligence Conference Paper An ERT-Based Robotic Skin with Sparsely Distributed Electrodes: Structure, Fabrication, and DNN-Based Signal Processing Park, K., Park, H., Lee, H., Park, S., Kim, J. In 2020 IEEE International Conference on Robotics and Automation (ICRA 2020), 1617-1624, IEEE, Piscataway, NJ, IEEE International Conference on Robotics and Automation (ICRA 2020), May 2020 (Published)
Electrical resistance tomography (ERT) has previously been utilized to develop a large-scale tactile sensor because this approach enables the estimation of the conductivity distribution among the electrodes based on a known physical model. Such a sensor made with a stretchable material can conform to a curved surface. However, this sensor cannot fully cover a cylindrical surface because in such a configuration, the edges of the sensor must meet each other. The electrode configuration becomes irregular in this edge region, which may degrade the sensor performance. In this paper, we introduce an ERT-based robotic skin with evenly and sparsely distributed electrodes. For implementation, we sprayed a carbon nanotube (CNT)-dispersed solution to form a conductive sensing domain on a cylindrical surface. The electrodes were firmly embedded in the surface so that the wires were not exposed to the outside. The sensor output images were estimated using a deep neural network (DNN), which was trained with noisy simulation data. An indentation experiment revealed that the localization error of the sensor was 5.2 ± 3.3 mm, which is remarkable performance with only 30 electrodes. A frame rate of up to 120 Hz could be achieved with a sensing domain area of 90 cm². The proposed approach simplifies the fabrication of 3D-shaped sensors, allowing them to be easily applied to existing robot arms in a seamless and robust manner.
DOI BibTeX

Haptic Intelligence Autonomous Learning Conference Paper Calibrating a Soft ERT-Based Tactile Sensor with a Multiphysics Model and Sim-to-real Transfer Learning Lee, H., Park, H., Serhat, G., Sun, H., Kuchenbecker, K. J. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 1632-1638, Paris, France, May 2020 (Published)
Tactile sensors based on electrical resistance tomography (ERT) have shown many advantages for implementing a soft and scalable whole-body robotic skin; however, calibration is challenging because pressure reconstruction is an ill-posed inverse problem. This paper introduces a method for calibrating soft ERT-based tactile sensors using sim-to-real transfer learning with a finite element multiphysics model. The model is composed of three simple models that together map contact pressure distributions to voltage measurements. We optimized the model parameters to reduce the gap between the simulation and reality. As a preliminary study, we discretized the sensing points into a 6 by 6 grid and synthesized single- and two-point contact datasets from the multiphysics model. We obtained another single-point dataset using the real sensor with the same contact location and force used in the simulation. Our new deep neural network architecture uses a de-noising network to capture the simulation-to-real gap and a reconstruction network to estimate contact force from voltage measurements. The proposed approach achieved an 82% hit rate for localization and a force estimation error of 0.51 N in single-contact tests, and a 78.5% hit rate for localization and a force estimation error of 5.0 N in two-point contact tests. We believe this new calibration method has the potential to improve the sensing performance of ERT-based tactile sensors.
DOI BibTeX
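The two-stage architecture described above — a de-noising network that maps real measurements toward the simulation domain, followed by a reconstruction network trained on simulated data — can be sketched with small MLPs. Everything below is an illustrative toy (a linear map stands in for the finite element multiphysics model; dimensions and noise levels are invented), not the paper's implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy stand-in for the multiphysics model: a fixed linear map from a
# 36-point (6x6) contact-pressure grid to 30 simulated voltage readings.
rng = np.random.default_rng(1)
n_taxels, n_volts = 36, 30
sim_map = rng.normal(size=(n_taxels, n_volts))

pressure = np.abs(rng.normal(size=(500, n_taxels)))
v_sim = pressure @ sim_map                        # clean simulated voltages
v_real = v_sim + rng.normal(0, 0.3, v_sim.shape)  # "real" sensor: sim + noise

# Stage 1: de-noising network maps real-like voltages back toward the
# simulation domain. Stage 2: reconstruction network maps simulated
# voltages to the contact-pressure grid. Both are small MLPs here.
denoiser = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                        random_state=0).fit(v_real, v_sim)
reconstructor = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                             random_state=0).fit(v_sim, pressure)

# Inference on new real measurements: denoise, then reconstruct.
v_test = pressure[:5] @ sim_map + rng.normal(0, 0.3, (5, n_volts))
p_hat = reconstructor.predict(denoiser.predict(v_test))
print(p_hat.shape)
```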

Haptic Intelligence Miscellaneous Subject-Specific Biofeedback for Gait Retraining Outside of the Lab Rokhmanova, N., Shull, P. B., Kuchenbecker, K. J., Halilaj, E. Extended abstract (1 page) presented at the Dynamic Walking Conference, May 2020 (Published)
Knee osteoarthritis is a progressive degenerative disease that has been linked to knee loading. Targeted gait intervention with biofeedback to decrease joint loading is a potential conservative treatment strategy. Here we describe a method to evaluate the efficacy of vibrotactile feedback outside of a constrained laboratory setting.
URL BibTeX

Haptic Intelligence Conference Paper Capturing Experts’ Mental Models to Organize a Collection of Haptic Devices: Affordances Outweigh Attributes Seifi, H., Oppermann, M., Bullard, J., MacLean, K. E., Kuchenbecker, K. J. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), 1-12, Honolulu, USA, April 2020 (Published)
Humans rely on categories to mentally organize and understand sets of complex objects. One such set, haptic devices, has myriad technical attributes that affect user experience in complex ways. Seeking an effective navigation structure for a large online collection, we elicited expert mental categories for grounded force-feedback haptic devices: 18 experts (9 device creators, 9 interaction designers) reviewed, grouped, and described 75 devices according to their similarity in a custom card-sorting study. From the resulting quantitative and qualitative data, we identify prominent patterns of tagging versus binning, and we report 6 uber-attributes that the experts used to group the devices, favoring affordances over device specifications. Finally, we derive 7 device categories and 9 subcategories that reflect the imperfect yet semantic nature of the expert mental models. We visualize these device categories and similarities in the online haptic collection, and we offer insights for studying expert understanding of other human-centered technology.
DOI BibTeX

Haptic Intelligence Article Physical Variables Underlying Tactile Stickiness during Fingerpad Detachment Nam, S., Vardar, Y., Gueorguiev, D., Kuchenbecker, K. J. Frontiers in Neuroscience, 14:1-14, April 2020 (Published)
One may notice a relatively wide range of tactile sensations even when touching the same hard, flat surface in similar ways. Little is known about the reasons for this variability, so we decided to investigate how the perceptual intensity of light stickiness relates to the physical interaction between the skin and the surface. We conducted a psychophysical experiment in which nine participants actively pressed their finger on a flat glass plate with a normal force close to 1.5 N and detached it after a few seconds. A custom-designed apparatus recorded the contact force vector and the finger contact area during each interaction as well as pre- and post-trial finger moisture. After detaching their finger, participants judged the stickiness of the glass using a nine-point scale. We explored how sixteen physical variables derived from the recorded data correlate with each other and with the stickiness judgments of each participant. These analyses indicate that stickiness perception mainly depends on the pre-detachment pressing duration, the time taken for the finger to detach, and the impulse in the normal direction after the normal force changes sign; finger-surface adhesion seems to build with pressing time, causing a larger normal impulse during detachment and thus a more intense stickiness sensation. We additionally found a strong between-subjects correlation between maximum real contact area and peak pull-off force, as well as between finger moisture and impulse.
DOI BibTeX
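A per-participant analysis like the one described — correlating trial-level physical variables with ordinal stickiness ratings — is typically done with rank correlation, since the ratings are on a nine-point scale. The sketch below uses synthetic data and hypothetical variable names; it is not the paper's analysis code.

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic single-participant data: per-trial physical variables and
# 1-9 stickiness ratings. Names and effect sizes are invented.
rng = np.random.default_rng(3)
n_trials = 40
pressing_duration = rng.uniform(1, 5, n_trials)  # s, pre-detachment press
normal_impulse = 0.05 * pressing_duration + rng.normal(0, 0.02, n_trials)
ratings = np.clip(np.round(2 + 1.2 * normal_impulse / 0.05
                           + rng.normal(0, 0.7, n_trials)), 1, 9)

# Spearman's rho handles the ordinal rating scale and any monotonic
# (not necessarily linear) relationship with the physical variable.
rho_imp, p_imp = spearmanr(normal_impulse, ratings)
rho_dur, p_dur = spearmanr(pressing_duration, ratings)
print(rho_imp, rho_dur)
```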

Haptic Intelligence Miscellaneous A Fabric-Based Sensing System for Recognizing Social Touch Burns, R. B., Lee, H., Seifi, H., Kuchenbecker, K. J. Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, Crystal City, USA, March 2020 (Published)
We present a fabric-based piezoresistive tactile sensor system designed to detect social touch gestures on a robot. The unique sensor design utilizes three layers of low-conductivity fabric sewn together on alternating edges to form an accordion pattern and secured between two outer high-conductivity layers. This five-layer design demonstrates a greater resistance range and better low-force sensitivity than previous designs that use one layer of low-conductivity fabric with or without a plastic mesh layer. An individual sensor from our system can presently identify six different communication gestures – squeezing, patting, scratching, poking, hand resting without movement, and no touch – with an average accuracy of 90%. A layer of foam can be added beneath the sensor to make a rigid robot more appealing for humans to touch without inhibiting the system’s ability to register social touch gestures.
BibTeX

Haptic Intelligence Conference Paper Changes in Normal Force During Passive Dynamic Touch: Contact Mechanics and Perception Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J. In Proceedings of the IEEE Haptics Symposium (HAPTICS), 746-752, Crystal City, USA, March 2020 (Published)
Using a force-controlled robotic platform, we investigated the contact mechanics and psychophysical responses induced by negative and positive modulations in normal force during passive dynamic touch. In the natural state of the finger, the applied normal force modulation induces a correlated change in the tangential force. In a second condition, we applied talcum powder to the fingerpad, which induced a significant modification in the slope of the correlated tangential change. In both conditions, the same ten participants had to detect the interval that contained a decrease or an increase in the pre-stimulation normal force of 1 N. In the natural state, the 75% just noticeable difference for this task was found to be a ratio of 0.19 for decreases and 0.18 for increases. With talcum powder on the fingerpad, the normal force thresholds remained stable, following the Weber law of constant just noticeable differences, while the tangential force thresholds changed in the same way as the correlation slopes. This result suggests that participants predominantly relied on the normal force changes to perform the detection task. In addition, participants were asked to report whether the force decreased or increased. Their performance was generally poor at this second task even for above-threshold changes. However, their accuracy slightly improved with the talcum powder, which might be due to the reduced finger-surface friction.
DOI BibTeX
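The Weber-law interpretation above can be made concrete with a short worked example: the 75% just noticeable difference (JND) is the reference force times the Weber fraction, so constant fractions predict absolute thresholds that scale with the reference level. The 2 N case below is hypothetical, not from the study.

```python
# Worked example of the reported just-noticeable differences (JNDs),
# using the 1 N pre-stimulation normal force from the experiment.
pre_force = 1.0  # N

weber_fraction_decrease = 0.19
weber_fraction_increase = 0.18

# Weber's law: the detectable change scales with the reference level,
# so the 75% JND is the reference force times the Weber fraction.
jnd_decrease = pre_force * weber_fraction_decrease  # 0.19 N
jnd_increase = pre_force * weber_fraction_increase  # 0.18 N

# At a hypothetical 2 N reference, constant Weber fractions would
# predict proportionally larger absolute thresholds.
print(2.0 * weber_fraction_decrease)  # 0.38 N
```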

Haptic Intelligence Miscellaneous Do Touch Gestures Affect How Electrovibration Feels? Vardar, Y., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE Haptics Symposium, Crystal City, USA, March 2020 (Published) BibTeX

Haptic Intelligence Conference Paper Haptic Object Parameter Estimation during Within-Hand-Manipulation with a Simple Robot Gripper Mohtasham, D., Narayanan, G., Calli, B., Spiers, A. J. In Proceedings of the IEEE Haptics Symposium (HAPTICS), 140-147, March 2020 (Published)
Though it is common for robots to rely on vision for object feature estimation, there are environments where optical sensing performs poorly, due to occlusion, poor lighting, or limited space for camera placement. Haptic sensing in robotics has a long history, but few approaches have combined this with within-hand manipulation (WIHM), in order to expose more features of an object to the tactile sensing elements of the hand. As in the human hand, these sensing structures are generally non-homogeneous in their coverage of a gripper's manipulation surfaces, as the sensitivity of some hand or finger regions is often different from that of other regions. In this work we use a modified version of the recently developed 2-finger Model VF (variable friction) robot gripper to acquire tactile information while rolling objects within the robot's grasp. This new gripper has one high-friction passive finger surface and one high-friction tactile sensing surface, equipped with 12 low-cost barometric force sensors encased in urethane. We have developed algorithms that use the data generated during these rolling actions to determine parametric aspects of the object under manipulation. Namely, two parameters are currently determined: 1) the location of an object within the grasp and 2) the object's shape (from three alternatives). The algorithms were first developed on a static test rig with passive object rolling and later evaluated with the robot gripper platform using active WIHM, which introduced artifacts into the data. With an object set consisting of 3 shapes and 5 sizes, overall shape estimation accuracies of 88% and 78% were achieved for the test rig and hand, respectively. Location estimation of each object's centroid during motion achieved a mean error of less than 2 mm along the 95 mm length of the tactile sensing finger.
DOI BibTeX

Haptic Intelligence Miscellaneous Intermediate Ridges Amplify Mechanoreceptor Strains in Static and Dynamic Touch Serhat, G., Kuchenbecker, K. J. Work-in-progress paper (2 pages) presented at the IEEE Haptics Symposium, Crystal City, USA, March 2020 (Published) BibTeX

Haptic Intelligence Miscellaneous Using Affective Touch for Emotional Support with a Hugging Robot Block, A. E., Hilliges, O., Gassert, R., Kuchenbecker, K. J. Workshop paper (2 pages) presented at the Human-Robot Interaction (HRI) Workshop on Affect and Embodiment, Cambridge, UK, March 2020 (Published) BibTeX

Haptic Intelligence Article Exercising with Baxter: Preliminary Support for Assistive Social-Physical Human-Robot Interaction Fitter, N. T., Mohan, M., Kuchenbecker, K. J., Johnson, M. J. Journal of NeuroEngineering and Rehabilitation, 17:1-22, February 2020 (Published)
Background: The worldwide population of older adults will soon exceed the capacity of assisted living facilities. Accordingly, we aim to understand whether appropriately designed robots could help older adults stay active at home. Methods: Building on related literature as well as guidance from experts in game design, rehabilitation, and physical and occupational therapy, we developed eight human-robot exercise games for the Baxter Research Robot, six of which involve physical human-robot contact. After extensive iteration, these games were tested in an exploratory user study including 20 younger adult and 20 older adult users. Results: Only socially and physically interactive games fell in the highest ranges for pleasantness, enjoyment, engagement, cognitive challenge, and energy level. Our games successfully spanned three different physical, cognitive, and temporal challenge levels. User trust and confidence in Baxter increased significantly between pre- and post-study assessments. Older adults experienced higher exercise, energy, and engagement levels than younger adults, and women rated the robot more highly than men on several survey questions. Conclusions: The results indicate that social-physical exercise with a robot is more pleasant, enjoyable, engaging, cognitively challenging, and energetic than similar interactions that lack physical touch. In addition to this main finding, researchers working in similar areas can build on our design practices, our open-source resources, and the age-group and gender differences that we found.
DOI BibTeX

Haptic Intelligence Article Learning to Predict Perceptual Distributions of Haptic Adjectives Richardson, B. A., Kuchenbecker, K. J. Frontiers in Neurorobotics, 13(116):1-16, February 2020 (Published)
When humans touch an object with their fingertips, they can immediately describe its tactile properties using haptic adjectives, such as hardness and roughness; however, human perception is subjective and noisy, with significant variation across individuals and interactions. Recent research has worked to provide robots with similar haptic intelligence but was focused on identifying binary haptic adjectives, ignoring both attribute intensity and perceptual variability. Combining ordinal haptic adjective labels gathered from human subjects for a set of 60 objects with features automatically extracted from raw multi-modal tactile data collected by a robot repeatedly touching the same objects, we designed a machine-learning method that incorporates partial knowledge of the distribution of object labels into training; then, from a single interaction, it predicts a probability distribution over the set of ordinal labels. In addition to analyzing the collected labels (10 basic haptic adjectives) and demonstrating the quality of our method's predictions, we hold out specific features to determine the influence of individual sensor modalities on the predictive performance for each adjective. Our results demonstrate the feasibility of modeling both the intensity and the variation of haptic perception, two crucial yet previously neglected components of human haptic perception.
DOI BibTeX
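One simple way to obtain a predicted probability distribution over ordinal labels, rather than a single hard label, is to train a probabilistic classifier on one row per (object, rater) pair so it sees the label distribution during training, then read off `predict_proba` at test time. The sketch below uses logistic regression on synthetic data purely as an illustrative stand-in; the paper's actual method and features differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy stand-in: each "object" has features and an empirical distribution
# of ordinal adjective ratings (e.g., hardness from 0 = soft to 4 = hard)
# collected from multiple human raters, mirroring the paper's setup.
rng = np.random.default_rng(7)
n_objects, n_raters, n_levels = 60, 12, 5

truth = rng.uniform(0, 4, n_objects)  # latent adjective intensity
features = np.column_stack([truth + rng.normal(0, 0.3, n_objects),
                            rng.normal(size=n_objects)])

# Noisy ordinal ratings per object; one training row per (object, rater)
# so the classifier is exposed to the label distribution, not just the mode.
ratings = np.clip(np.round(truth[:, None] +
                           rng.normal(0, 0.7, (n_objects, n_raters))),
                  0, n_levels - 1).astype(int)
X = np.repeat(features, n_raters, axis=0)
y = ratings.ravel()

clf = LogisticRegression(max_iter=1000).fit(X, y)

# For a new interaction, predict a probability distribution over the
# ordinal labels instead of a single hard label.
dist = clf.predict_proba(features[:1])
print(dist.round(2))
```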

Haptic Intelligence Article Compensating for Fingertip Size to Render Tactile Cues More Accurately Young, E. M., Gueorguiev, D., Kuchenbecker, K. J., Pacchierotti, C. IEEE Transactions on Haptics, 13(1):144-151, January 2020, Katherine J. Kuchenbecker and Claudio Pacchierotti contributed equally to this publication. Presented at the IEEE World Haptics Conference (WHC) (Published)
Fingertip haptic feedback offers advantages in many applications, including robotic teleoperation, gaming, and training. However, fingertip size and shape vary significantly across humans, making it difficult to design fingertip interfaces and rendering techniques suitable for everyone. This article starts with an existing data-driven haptic rendering algorithm that ignores fingertip size, and it then develops two software-based approaches to personalize this algorithm for fingertips of different sizes using either additional data or geometry. We evaluate our algorithms in the rendering of pre-recorded tactile sensations onto rubber casts of six different fingertips as well as onto the real fingertips of 13 human participants. Results on the casts show that both approaches significantly improve performance, reducing force error magnitudes by an average of 78% with respect to the standard non-personalized rendering technique. Congruent results were obtained for real fingertips, with subjects rating each of the two personalized rendering techniques significantly better than the standard non-personalized method.
DOI BibTeX

Haptic Intelligence Article How Does It Feel to Clap Hands with a Robot? Fitter, N. T., Kuchenbecker, K. J. International Journal of Social Robotics, 12:113-127, January 2020 (Published)
Future robots may need lighthearted physical interaction capabilities to connect with people in meaningful ways. To begin exploring how users perceive playful human–robot hand-to-hand interaction, we conducted a study with 20 participants. Each user played simple hand-clapping games with the Rethink Robotics Baxter Research Robot during a 1-h-long session involving 24 randomly ordered conditions that varied in facial reactivity, physical reactivity, arm stiffness, and clapping tempo. Survey data and experiment recordings demonstrate that this interaction is viable: all users successfully completed the experiment and mentioned enjoying at least one game without prompting. Hand-clapping tempo was highly salient to users, and human-like robot errors were more widely accepted than mechanical errors. Furthermore, perceptions of Baxter varied in the following statistically significant ways: facial reactivity increased the robot’s perceived pleasantness and energeticness; physical reactivity decreased pleasantness, energeticness, and dominance; higher arm stiffness increased safety and decreased dominance; and faster tempo increased energeticness and increased dominance. These findings can motivate and guide roboticists who want to design social–physical human–robot interactions.
DOI BibTeX