Publications


Haptic Intelligence Article Evaluation of High-Fidelity Simulation as a Training Tool in Transoral Robotic Surgery Bur, A. M., Gomez, E. D., Newman, J. G., Weinstein, G. S., O’Malley Jr., B. W., Rassekh, C. H., Kuchenbecker, K. J. Laryngoscope, 127(12):2790-2795, December 2017 (Published) DOI BibTeX

Haptic Intelligence Conference Paper Mechanics of pseudo-haptics with computer mouse Kumar, A., Gourishetti, R., Manivannan, M. In Proceedings of the IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE), 1-6, December 2017 (Published)
Haptic-illusion-based force feedback, known as pseudo-haptics, is used to simulate haptic properties, such as stiffness, without a force-feedback device. Many computer-mouse-based pseudo-haptics studies have been reported in the literature, but none has explored the mechanics of pseudo-haptics. The objective of this paper is to derive an analytical relation between the displacement of the mouse and that of a virtual spring, assuming equal work done in both cases (mouse and virtual-spring displacement), and to validate this relation experimentally. A psychophysical experiment was conducted on eight subjects to discriminate the stiffness of two virtual springs, using a two-alternative forced-choice (2AFC) discrimination task with the method of constant stimuli to measure the just noticeable difference (JND) for pseudo-stiffness. The mean pseudo-stiffness JND and the average Weber fraction were calculated to be 14% and 9.54%, respectively. The resulting JND and Weber fraction were comparable to psychophysical parameters reported in the literature. Currently, this study simulates the haptic illusion for 1 DOF; however, it can be extended to 6 DOF.
DOI BibTeX
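For reference, the two psychophysical quantities reported in the abstract above have standard textbook definitions (not taken from the paper itself): the just noticeable difference (JND) $\Delta I$ is the smallest detectable change in a stimulus of intensity $I$, and the Weber fraction is their ratio,

```latex
% Weber's law: the JND grows roughly in proportion to the reference
% intensity I, so the Weber fraction c is approximately constant
c = \frac{\Delta I}{I}
```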

Haptic Intelligence Article Using Contact Forces and Robot Arm Accelerations to Automatically Rate Surgeon Skill at Peg Transfer Brown, J. D., O’Brien, C. E., Leung, S. C., Dumon, K. R., Lee, D. I., Kuchenbecker, K. J. IEEE Transactions on Biomedical Engineering, 64(9):2263-2275, September 2017 (Published) DOI BibTeX

Haptic Intelligence Ph.D. Thesis Design and Evaluation of Interactive Hand-Clapping Robots Fitter, N. T. University of Pennsylvania, August 2017, Department of Mechanical Engineering and Applied Mechanics (Published)
Human friends commonly connect through handshakes and high fives, and children around the world rejoice at hand-clapping games. As robots enter everyday human spaces, they will have the opportunity to join in such physical interactions, but few current robots are intended to touch humans. How should robots move and react in playful hand-to-hand interactions with people? We conducted research in four main areas to address this design challenge. First, we implemented and tested an initial hand-clapping robotic system. This effort began by recording sensor data from people performing a variety of hand-clapping activities; the resulting accelerometer and position data taught us how to design appropriate hand-clapping robot motion and logic. Implementation on a Rethink Robotics Baxter Research Robot demonstrated that a robot could move like our human participants and reliably detect hand impacts through its wrist-mounted accelerometers. N = 20 study participants clapped hands with differently configured versions of this robot in random order: the robot’s facial animation, physical reactivity, arm stiffness, and clapping tempo all significantly affected how users perceived the robot. We next sought to create and evaluate more sophisticated robot hand-clapping behaviors. Data from people performing interactive clapping tasks at increasing and decreasing tempos helped us propose prospective timing models and implement adaptive-tempo Baxter play. In a subsequent experiment that involved N = 20 users, a mischievous Baxter was equipped with the top-performing tempo adaptation model and chose to play cooperatively or asynchronously with its human partner. Although a few participants reacted positively to Baxter’s mischief, users overwhelmingly preferred a synchronous, cooperative robot. Third, we set up and conducted a human-robot interaction experiment more similar to everyday human-human hand-clapping interactions.
A machine learning pipeline trained on inertial data from human motions demonstrated that linear support vector machines (SVMs) can classify a new person’s hand-clapping actions with an accuracy of about 95%. This technique succeeded for both hand- and wrist-mounted inertial sensors, enabling people to teach the Baxter robot new hand-clapping games. Evaluation of various two-handed clapping play activities by N = 24 users showed that learning games from Baxter was significantly easier than teaching Baxter games, but that the teaching role caused people to consider more teamwork aspects of the gameplay. Finally, to broaden the scope of these interactions, we began exploring applications of Baxter in socially assistive robotics. Using many of the same sensing and actuation strategies, we developed a set of six playful hand-to-hand contact-based exercise interactions to be jointly executed between a person and Baxter, along with two similar non-contact games. A proof-of-concept experiment using these exercise games enrolled N = 20 young adults and N = 14 healthy adults over age 53. The results demonstrated that people are willing and motivated to interact with the robot in this way and that different games promote unique physical and cognitive exercise effects. Overall, this research aims to help shape design processes for socially relevant physical human-robot interaction and reveal new opportunities for socially assistive robotics.
BibTeX
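The linear-SVM classification step described in the thesis abstract above can be illustrated with a toy sketch. Everything here is invented for illustration: the feature values, the class means, and the from-scratch Pegasos-style trainer. The thesis’s actual pipeline and its ~95% accuracy figure are not reproduced by this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for windowed inertial features (e.g. mean |accel|,
# dominant frequency, energy); the real features are not given in the text.
n = 200
clap = rng.normal([2.0, 4.0, 1.5], 0.4, size=(n, 3))    # label +1: clap
other = rng.normal([0.5, 1.0, 0.3], 0.4, size=(n, 3))   # label -1: other gesture
X = np.vstack([clap, other])
y = np.concatenate([np.ones(n), -np.ones(n)])

def train_linear_svm(X, y, lam=0.01, epochs=20):
    """Pegasos-style subgradient descent on the hinge loss."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # bias via augmented feature
    w = np.zeros(Xb.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (Xb[i] @ w) < 1.0:        # margin violated: push w
                w = (1 - eta * lam) * w + eta * y[i] * Xb[i]
            else:                                # margin ok: only shrink w
                w = (1 - eta * lam) * w
    return w

w = train_linear_svm(X, y)
pred = np.sign(np.hstack([X, np.ones((len(X), 1))]) @ w)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On this well-separated synthetic data the classifier should score nearly perfectly; the real task of generalizing to a new person is, of course, much harder.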

Haptic Intelligence Miscellaneous Physical and Behavioral Factors Improve Robot Hug Quality Block, A. E., Kuchenbecker, K. J. Workshop Paper (2 pages) presented at the RO-MAN Workshop on Social Interaction and Multimodal Expression for Socially Intelligent Robots, Lisbon, Portugal, August 2017 (Published)
A hug is one of the most basic ways humans can express affection. As hugs are so common, a natural progression of robot development is to have robots one day hug humans as seamlessly as these intimate human-human interactions occur. This project’s purpose is to evaluate human responses to different robot physical characteristics and hugging behaviors. Specifically, we aim to test the hypothesis that a warm, soft, touch-sensitive PR2 humanoid robot can provide humans with satisfying hugs by matching both their hugging pressure and their hugging duration. Thirty participants experienced and evaluated twelve hugs with the robot, divided into three randomly ordered trials that focused on physical robot characteristics and nine randomly ordered trials with varied hug pressure and duration. We found that people prefer soft, warm hugs over hard, cold hugs. Furthermore, users prefer hugs that physically squeeze them and release immediately when they are ready for the hug to end.
BibTeX

Haptic Intelligence Conference Paper Stiffness Perception during Pinching and Dissection with Teleoperated Haptic Forceps Ng, C., Zareinia, K., Sun, Q., Kuchenbecker, K. J. In Proceedings of the International Symposium on Robot and Human Interactive Communication (RO-MAN), 456-463, Lisbon, Portugal, August 2017 (Published) DOI BibTeX

Haptic Intelligence Article Ungrounded Haptic Augmented Reality System for Displaying Texture and Friction Culbertson, H., Kuchenbecker, K. J. IEEE/ASME Transactions on Mechatronics, 22(4):1839-1849, August 2017 (Published) DOI BibTeX

Haptic Intelligence Conference Paper A Wrist-Squeezing Force-Feedback System for Robotic Surgery Training Brown, J. D., Fernandez, J. N., Cohen, S. P., Kuchenbecker, K. J. In Proceedings of the IEEE World Haptics Conference (WHC), 107-112, Munich, Germany, June 2017 (Published)
Over time, surgical trainees learn to compensate for the lack of haptic feedback in commercial robotic minimally invasive surgical systems. Incorporating touch cues into robotic surgery training could potentially shorten this learning process if the benefits of haptic feedback were sustained after it is removed. In this paper, we develop a wrist-squeezing haptic feedback system and evaluate whether it holds the potential to train novice da Vinci users to reduce the force they exert on a bimanual inanimate training task. Subjects were randomly divided into two groups according to a multiple baseline experimental design. Each of the ten participants moved a ring along a curved wire nine times while the haptic feedback was conditionally withheld, provided, and withheld again. The real-time tactile feedback of applied force magnitude significantly reduced the integral of the force produced by the da Vinci tools on the task materials, and this result remained even when the haptic feedback was removed. Overall, our findings suggest that wrist-squeezing force feedback can play an essential role in helping novice trainees learn to minimize the force they exert with a surgical robot.
DOI BibTeX

Haptic Intelligence Miscellaneous An Interactive Augmented-Reality Video Training Platform for the da Vinci Surgical System Carlson, J., Kuchenbecker, K. J. Workshop paper (3 pages) presented at the ICRA Workshop on C4 Surgical Robots, Singapore, June 2017 (Published)
Teleoperated surgical robots such as the Intuitive da Vinci Surgical System facilitate minimally invasive surgeries, which decrease risk to patients. However, these systems can be difficult to learn, and existing training curricula on surgical simulators do not offer students the realistic experience of a full operation. This paper presents an augmented-reality video training platform for the da Vinci that will allow trainees to rehearse any surgery recorded by an expert. While the trainee operates a da Vinci in free space, they see their own instruments overlaid on the expert video. Tools are identified in the source videos via color segmentation and kernelized correlation filter tracking, and their depth is calculated from the da Vinci’s stereoscopic video feed. The user tries to follow the expert’s movements, and if any of their tools venture too far away, the system provides instantaneous visual feedback and pauses to allow the user to correct their motion. The trainee can also rewind the expert video by bringing either da Vinci tool very close to the camera. This combined and augmented video provides the user with an immersive and interactive training experience.
BibTeX

Haptic Intelligence Conference Paper Design of a Parallel Continuum Manipulator for 6-DOF Fingertip Haptic Display Young, E. M., Kuchenbecker, K. J. In Proceedings of the IEEE World Haptics Conference (WHC), 599-604, Munich, Germany, June 2017, Finalist for best poster paper (Published)
Despite rapid advancements in the field of fingertip haptics, rendering tactile cues with six degrees of freedom (6 DOF) remains an elusive challenge. In this paper, we investigate the potential of displaying fingertip haptic sensations with a 6-DOF parallel continuum manipulator (PCM) that mounts to the user's index finger and moves a contact platform around the fingertip. Compared to traditional mechanisms composed of rigid links and discrete joints, PCMs have the potential to be strong, dexterous, and compact, but they are also more complicated to design. We define the design space of 6-DOF parallel continuum manipulators and outline a process for refining such a device for fingertip haptic applications. Following extensive simulation, we obtain 12 designs that meet our specifications, construct a manually actuated prototype of one such design, and evaluate the simulation's ability to accurately predict the prototype's motion. Finally, we demonstrate the range of deliverable fingertip tactile cues, including a normal force into the finger and shear forces tangent to the finger at three extreme points on the boundary of the fingertip.
DOI BibTeX

Haptic Intelligence Article Evaluation of a Vibrotactile Simulator for Dental Caries Detection Kuchenbecker, K. J., Parajon, R. C., Maggio, M. P. Simulation in Healthcare, 12(3):148-156, June 2017 (Published) DOI BibTeX

Haptic Intelligence Conference Paper Handling Scan-Time Parameters in Haptic Surface Classification Burka, A., Kuchenbecker, K. J. In Proceedings of the IEEE World Haptics Conference (WHC), 424-429, Munich, Germany, June 2017 (Published) DOI BibTeX

Haptic Intelligence Conference Paper High Magnitude Unidirectional Haptic Force Display Using a Motor/Brake Pair and a Cable Hu, S., Kuchenbecker, K. J. In Proceedings of the IEEE World Haptics Conference (WHC), 394-399, Munich, Germany, June 2017 (Published)
Clever electromechanical design is required to make the force feedback delivered by a kinesthetic haptic interface both strong and safe. This paper explores a one-dimensional haptic force display that combines a DC motor and a magnetic particle brake on the same shaft. Rather than a rigid linkage, a spooled cable connects the user to the actuators to enable a large workspace, reduce the moving mass, and eliminate the sticky residual force from the brake. This design combines the high torque/power ratio of the brake and the active output capabilities of the motor to provide a wider range of forces than can be achieved with either actuator alone. A prototype of this device was built, its performance was characterized, and it was used to simulate constant force sources and virtual springs and dampers. Compared to the conventional design of using only a motor, the hybrid device can output higher unidirectional forces at the expense of free space feeling less free.
DOI BibTeX

Haptic Intelligence Miscellaneous How Should Robots Hug? Block, A. E., Kuchenbecker, K. J. Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (Published) BibTeX

Haptic Intelligence Miscellaneous Physically Interactive Exercise Games with a Baxter Robot Fitter, N. T., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (Published) BibTeX

Haptic Intelligence Miscellaneous Teaching a Robot to Collaborate with a Human Via Haptic Teleoperation Hu, S., Kuchenbecker, K. J. Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (Published) BibTeX

Haptic Intelligence Miscellaneous Proton Pack: Visuo-Haptic Surface Data Recording Burka, A., Kuchenbecker, K. J. Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (Published) BibTeX

Haptic Intelligence Article Haptic Feedback in Needle Insertion Modeling and Simulation Gourishetti, R., Manivannan, M. IEEE Reviews in Biomedical Engineering, 10:63-77, May 2017 (Published)
Needle insertion is the most basic skill in medical care, and training must be imparted not only to physicians but also to nurses and paramedics. In most needle insertion procedures, haptic feedback from the needle is the main stimulus for which novices need training. For better patient safety, the classical methods of training these haptic skills have to be replaced with simulators based on new robotic and graphics technologies. This paper reviews current advances in needle insertion modeling, classified into three sections: needle insertion models, tissue deformation models, and needle-tissue interaction models. Although understated in the literature, the classical and dynamic friction models, which are critical for needle insertion modeling, are also discussed. The experimental setups and needle simulators that have been developed to validate the models are described. Finally, the paper discusses the need for psychophysics in needle simulators and the psychophysical analysis of human perception in needle insertion, both of which are largely ignored in the literature.
DOI BibTeX

Haptic Intelligence Master Thesis How Should Robots Hug? Block, A. E. University of Pennsylvania, May 2017, Robotics Degree Program
A hug is one of the most basic ways humans can express affection. As hugs are so common, a natural progression of robot development is to have robots one day hug humans as seamlessly as these intimate human-human interactions occur. This project’s purpose is to evaluate human responses to different robot hugging techniques and behaviors. Specifically, we aim to test the hypothesis that a warm, soft, touch-sensitive PR2 humanoid robot can provide humans with satisfying hugs by matching both their hugging pressure and their hugging duration. Thirty participants experienced and evaluated twelve hugs with the robot, divided into three randomly ordered trials that focused on physical robot characteristics and nine randomly ordered trials with varied hug pressure and timing. We found that people prefer soft, warm hugs over hard, cold hugs. Furthermore, users prefer hugs that physically squeeze them and release immediately when they are ready for the hug to end. When comparing responses to a survey taken at the start and end of the hugging session, we found that after the experiment users felt significantly more understood by the robot, trusted it more, and thought it was easier to use than they initially anticipated.
BibTeX

Haptic Intelligence Conference Paper Proton 2: Increasing the Sensitivity and Portability of a Visuo-haptic Surface Interaction Recorder Burka, A., Rajvanshi, A., Allen, S., Kuchenbecker, K. J. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 439-445, Singapore, May 2017 (Published)
The Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short) is a new handheld visuo-haptic sensing system that records surface interactions. We previously demonstrated system calibration and a classification task using external motion tracking. This paper details improvements in surface classification performance and removal of the dependence on external motion tracking, necessary before embarking on our goal of gathering a vast surface interaction dataset. Two experiments were performed to refine data collection parameters. After adjusting the placement and filtering of the Proton's high-bandwidth accelerometers, we recorded interactions between two differently-sized steel tooling ball end-effectors (diameter 6.35 and 9.525 mm) and five surfaces. Using features based on normal force, tangential force, end-effector speed, and contact vibration, we trained multi-class SVMs to classify the surfaces using 50 ms chunks of data from each end-effector. Classification accuracies of 84.5% and 91.5% respectively were achieved on unseen test data, an improvement over prior results. In parallel, we pursued on-board motion tracking, using the Proton's camera and fiducial markers. Motion tracks from the external and onboard trackers agree within 2 mm and 0.01 rad RMS, and the accuracy decreases only slightly to 87.7% when using onboard tracking for the 9.525 mm end-effector. These experiments indicate that the Proton 2 is ready for portable data collection.
DOI BibTeX

Haptic Intelligence Miscellaneous Automatic OSATS Rating of Trainee Skill at a Pediatric Laparoscopic Suturing Task Oquendo, Y. A., Riddle, E. W., Hiller, D., Blinman, T. A., Kuchenbecker, K. J. Surgical Endoscopy, 31(Supplement 1):S28, Extended abstract presented as a podium presentation at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Springer, Houston, USA, March 2017 (Published)
Introduction: Minimally invasive surgery has revolutionized surgical practice, but challenges remain. Trainees must acquire complex technical skills while minimizing patient risk, and surgeons must maintain their skills for rare procedures. These challenges are magnified in pediatric surgery due to the smaller spaces, finer tissue, and relative dearth of both inanimate and virtual simulators. To build technical expertise, trainees need opportunities for deliberate practice with specific performance feedback, which is typically provided via tedious human grading. This study aimed to validate a novel motion-tracking system and machine learning algorithm for automatically evaluating trainee performance on a pediatric laparoscopic suturing task using a 1–5 OSATS Overall Skill rating. Methods: Subjects (n=14) ranging from medical students to fellows performed one or two trials of an intracorporeal suturing task in a custom pediatric laparoscopy training box (Fig. 1) after watching a video of ideal performance by an expert. The position and orientation of the tools and endoscope were recorded over time using Ascension trakSTAR magnetic motion-tracking sensors, and both instrument grasp angles were recorded over time using flex sensors on the handles. The 27 trials were video-recorded and scored on the OSATS scale by a senior fellow; ratings ranged from 1 to 4. The raw motion data from each trial was processed to calculate over 200 preliminary motion parameters. Regularized least-squares regression (LASSO) was used to identify the most predictive parameters for inclusion in a regression tree. Model performance was evaluated by leave-one-subject-out cross validation, wherein the automatic scores given to each subject’s trials (by a model trained on all other data) are compared to the corresponding human rater scores.
Results: The best-performing LASSO algorithm identified 14 predictive parameters for inclusion in the regression tree, including completion time, linear path length, angular path length, angular acceleration, grasp velocity, and grasp acceleration. The final model’s raw output showed a strong positive correlation of 0.87 with the reviewer-generated scores, and rounding the output to the nearest integer yielded a leave-one-subject-out cross-validation accuracy of 77.8%. Results are summarized in the confusion matrix (Table 1). Conclusions: Our novel motion-tracking system and regression model automatically gave previously unseen trials overall skill scores that closely match scores from an expert human rater. With additional data and further development, this system may enable creation of a motion-based training platform for pediatric laparoscopic surgery and could yield insights into the fundamental components of surgical skill.
DOI BibTeX
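The leave-one-subject-out protocol described in the abstract above can be sketched in a few lines. This toy version uses invented data and plain least squares in place of the paper’s LASSO-plus-regression-tree model; only the cross-validation structure (train on all other subjects, score the held-out subject’s trials, round to the nearest integer score) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: 14 subjects, 1-2 trials each, a few motion
# parameters (e.g. completion time, path length), and an integer 1-4
# score loosely driven by the first parameter. All values are invented.
subjects, X, y = [], [], []
for s in range(14):
    for _ in range(rng.integers(1, 3)):
        params = rng.normal(0.0, 1.0, size=3)
        score = int(np.clip(round(2.5 + params[0]), 1, 4))
        subjects.append(s)
        X.append(params)
        y.append(score)
subjects, X, y = np.array(subjects), np.array(X), np.array(y)

# Leave-one-subject-out: every trial by the held-out subject is scored
# by a model trained on all other subjects' trials (here ordinary least
# squares stands in for the paper's LASSO + regression tree).
correct = total = 0
for s in np.unique(subjects):
    train, test = subjects != s, subjects == s
    A = np.hstack([X[train], np.ones((train.sum(), 1))])
    coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
    pred = np.hstack([X[test], np.ones((test.sum(), 1))]) @ coef
    pred = np.clip(np.rint(pred), 1, 4)          # round to nearest score
    correct += (pred == y[test]).sum()
    total += test.sum()
print(f"leave-one-subject-out accuracy: {correct / total:.2f}")
```

Grouping the folds by subject, rather than by trial, is what keeps a subject’s own data from leaking into the model that scores them.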

Haptic Intelligence Miscellaneous Hand-Clapping Games with a Baxter Robot Fitter, N. T., Kuchenbecker, K. J. Hands-on demonstration presented at the ACM/IEEE International Conference on Human-Robot Interaction (HRI), Vienna, Austria, March 2017 (Published)
Robots that work alongside humans might be more effective if they could forge a strong social bond with their human partners. Hand-clapping games and other forms of rhythmic social-physical interaction may foster human-robot teamwork, but the design of such interactions has scarcely been explored. At the HRI 2017 conference, we will showcase several such interactions taken from our recent work with the Rethink Robotics Baxter Research Robot, including tempo-matching, Simon says, and Pat-a-cake-like games. We believe conference attendees will be both entertained and intrigued by this novel demonstration of social-physical HRI.
BibTeX

Haptic Intelligence Miscellaneous How Much Haptic Surface Data is Enough? Burka, A., Kuchenbecker, K. J. Workshop paper (5 pages) presented at the AAAI Spring Symposium on Interactive Multi-Sensory Object Perception for Embodied Agents, Stanford, USA, March 2017 (Published)
The Proton Pack is a portable visuo-haptic surface interaction recording device that will be used to collect a vast multimodal dataset, intended for robots to use as part of an approach to understanding the world around them. In order to collect a useful dataset, we want to pick a suitable interaction duration for each surface, noting the tradeoff between data collection resources and completeness of data. One interesting approach frames the data collection process as an online learning problem, building an incremental surface model and using that model to decide when there is enough data. Here we examine how to do such online surface modeling and when to stop collecting data, using kinetic friction as a first domain in which to apply online modeling.
URL BibTeX
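The abstract’s idea of online surface modeling with a data-driven stopping rule can be sketched for the kinetic-friction case it mentions. The Coulomb friction model, the noise level, and the stopping tolerance below are all invented for illustration; the actual Proton Pack modeling approach may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Coulomb-friction stand-in for the online surface model: tangential
# force f_t ~ mu * f_n. The value of mu and the noise are invented.
mu_true = 0.42

def sample():
    f_n = rng.uniform(1.0, 5.0)                 # normal force (N)
    f_t = mu_true * f_n + rng.normal(0, 0.05)   # tangential force (N)
    return f_n, f_t

# Incrementally fit mu by least squares through the origin, and stop
# collecting once the estimate has stabilized: its range over the last
# 50 samples falls below a tolerance.
sxx = sxy = 0.0
history = []
for t in range(1, 10_001):
    f_n, f_t = sample()
    sxx += f_n * f_n
    sxy += f_n * f_t
    mu_hat = sxy / sxx
    history.append(mu_hat)
    if t >= 50 and max(history[-50:]) - min(history[-50:]) < 1e-3:
        break

print(f"stopped after {t} samples, mu_hat = {mu_hat:.3f}")
```

The same pattern (update an incremental model, stop when new samples stop changing it) generalizes to richer surface models than this one-parameter example.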

Haptic Intelligence Article Effects of Grip-Force, Contact, and Acceleration Feedback on a Teleoperated Pick-and-Place Task Khurshid, R. P., Fitter, N. T., Fedalei, E. A., Kuchenbecker, K. J. IEEE Transactions on Haptics, 10(1):40-53, January 2017 (Published) DOI BibTeX