2019


High-Fidelity Multiphysics Finite Element Modeling of Finger-Surface Interactions with Tactile Feedback

Serhat, G., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
In this study, we develop a high-fidelity finite element (FE) analysis framework that enables multiphysics simulation of the human finger in contact with a surface that is providing tactile feedback. We aim to elucidate a variety of physical interactions that can occur at finger-surface interfaces, including contact, friction, vibration, and electrovibration. We also develop novel FE-based methods that will allow prediction of nonconventional features such as real finger-surface contact area and finger stickiness. We envision using the developed computational tools for efficient design and optimization of haptic devices by replacing expensive and lengthy experimental procedures with high-fidelity simulation.
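
As an illustration only: the abstract gives no equations, but the gross scale of the finger-surface contact being simulated can be sketched with a classical Hertzian sphere-on-flat model. The fingertip radius and moduli below are hypothetical placeholders; the paper's FE framework goes far beyond this closed form (friction, vibration, electrovibration, and real rather than apparent contact area):

    import math

    def hertz_contact_area(force_n, radius_m, e_skin, nu_skin, e_glass, nu_glass):
        """Gross (apparent) contact area in m^2 for an elastic sphere on a flat."""
        e_star = 1.0 / ((1 - nu_skin ** 2) / e_skin + (1 - nu_glass ** 2) / e_glass)
        a = (3.0 * force_n * radius_m / (4.0 * e_star)) ** (1.0 / 3.0)  # contact radius
        return math.pi * a ** 2

    # Hypothetical fingertip-like values: R = 8 mm, skin ~50 kPa, glass ~70 GPa.
    print(hertz_contact_area(1.0, 0.008, 50e3, 0.48, 70e9, 0.22))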

[BibTex]

Fingertip Friction Enhances Perception of Normal Force Changes

Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
Using a force-controlled robotic platform, we tested the human perception of positive and negative modulations in normal force during passive dynamic touch, which also induced a strong related change in the finger-surface lateral force. In a two-alternative forced-choice task, eleven participants had to detect brief variations in the normal force compared to a constant controlled pre-stimulation force of 1 N and report whether it had increased or decreased. The average 75% just noticeable difference (JND) was found to be around 0.25 N for detecting the peak change and 0.30 N for correctly reporting the increase or the decrease. Interestingly, the friction coefficient of a subject’s fingertip positively correlated with his or her performance at detecting the change and reporting its direction, which suggests that humans may use the lateral force as a sensory cue to perceive variations in the normal force.
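
A note on method, for readers unfamiliar with psychophysics: a 75% JND in a two-alternative forced-choice task is typically extracted by fitting a psychometric function to the proportion of correct responses. The sketch below fits a cumulative Gaussian to invented data; the paper's actual fitting procedure is not specified here:

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Hypothetical data: normal-force change magnitudes (N) vs. proportion correct.
    delta_f = np.array([0.05, 0.10, 0.20, 0.30, 0.40, 0.60])
    p_correct = np.array([0.52, 0.58, 0.70, 0.81, 0.90, 0.97])

    def psychometric(x, mu, sigma):
        # Cumulative Gaussian scaled to the 0.5-1.0 range of a 2AFC task.
        return 0.5 + 0.5 * norm.cdf(x, mu, sigma)

    (mu, sigma), _ = curve_fit(psychometric, delta_f, p_correct, p0=[0.2, 0.1])
    print(f"75% JND = {mu:.2f} N")  # psychometric(mu) = 0.75 by construction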

[BibTex]

Inflatable Haptic Sensor for the Torso of a Hugging Robot

Block, A. E., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
During hugs, humans naturally provide and intuit subtle non-verbal cues that signify the strength and duration of an exchanged hug. Personal preferences for this close interaction may vary greatly between people; robots do not currently have the abilities to perceive or understand these preferences. This work-in-progress paper discusses designing, building, and testing a novel inflatable torso that can simultaneously soften a robot and act as a tactile sensor to enable more natural and responsive hugging. Using PVC vinyl, a microphone, and a barometric pressure sensor, we created a small test chamber to demonstrate a proof of concept for the full torso. While contacting the chamber in several ways common in hugs (pat, squeeze, scratch, and rub), we recorded data from the two sensors. The preliminary results suggest that the complementary haptic sensing channels allow us to detect coarse and fine contacts typically experienced during hugs, regardless of user hand placement.
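
The paper does not detail its classification pipeline, but the complementary roles of the two sensing channels can be sketched as follows: the microphone picks up high-frequency texture from fine contacts (scratch, rub), while the barometric pressure sensor registers sustained compression from coarse contacts (squeeze). A minimal heuristic in Python, with entirely hypothetical thresholds:

    import numpy as np

    def classify_contact(mic, pressure, mic_thresh=0.02, squeeze_thresh=50.0):
        """Toy two-channel heuristic for one window of inflatable-chamber data.
        mic: microphone samples (arbitrary units); pressure: barometric samples (Pa)."""
        mic_rms = np.sqrt(np.mean(mic ** 2))     # fine-contact vibration energy
        p_rise = np.max(pressure) - pressure[0]  # coarse compression of the chamber
        if p_rise > squeeze_thresh:
            return "squeeze"
        return "scratch/rub" if mic_rms > mic_thresh else "pat"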

Project Page [BibTex]

Understanding the Pull-off Force of the Human Fingerpad

Nam, S., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
To understand the adhesive force that occurs when a finger pulls off of a smooth surface, we built an apparatus to measure the fingerpad’s moisture, normal force, and real contact area over time during interactions with a glass plate. We recorded a total of 450 trials (45 interactions by each of ten human subjects), capturing a wide range of values across the aforementioned variables. The experimental results showed that the pull-off force increases with larger finger contact area and faster detachment rate. Additionally, moisture generally increases the contact area of the finger, but too much moisture can restrict the increase in the pull-off force.

[BibTex]

The Haptician and the Alphamonsters

Forte, M. P., L’Orsa, R., Mohan, M., Nam, S., Kuchenbecker, K. J.

Student Innovation Challenge on Implementing Haptics in Virtual Reality Environment presented at the IEEE World Haptics Conference, Tokyo, Japan, July 2019; Maria Paola Forte, Rachael L'Orsa, Mayumi Mohan, and Saekwang Nam contributed equally to this publication (misc)

Abstract
Dysgraphia is a neurological disorder characterized by writing disabilities that affects between 7% and 15% of children. It presents itself in the form of unfinished letters, letter distortion, inconsistent letter size, letter collision, etc. Traditional therapeutic exercises require continuous assistance from teachers or occupational therapists. Autonomous partial or full haptic guidance can produce positive results, but children often become bored with the repetitive nature of such activities. Conversely, virtual rehabilitation with video games represents a new frontier for occupational therapy due to its highly motivational nature. Virtual reality (VR) adds an element of novelty and entertainment to therapy, thus motivating players to perform exercises more regularly. We propose leveraging the HTC VIVE Pro and the EXOS Wrist DK2 to create an immersive spellcasting “exergame” (exercise game) that helps motivate children with dysgraphia to improve writing fluency.

Student Innovation Challenge – Virtual Reality [BibTex]

Explorations of Shape-Changing Haptic Interfaces for Blind and Sighted Pedestrian Navigation

Spiers, A., Kuchenbecker, K. J.

Workshop paper (6 pages) presented at the CHI 2019 Workshop on Hacking Blind Navigation, May 2019 (misc) Accepted

Abstract
Since the 1960s, technologists have worked to develop systems that facilitate independent navigation by vision-impaired (VI) pedestrians. These devices vary in terms of conveyed information and feedback modality. Unfortunately, many such prototypes never progress beyond laboratory testing. Conversely, smartphone-based navigation systems for sighted pedestrians have grown in robustness and capabilities, to the point of now being ubiquitous. How can we leverage the success of sighted navigation technology, which is driven by a larger global market, as a way to progress VI navigation systems? We believe one possibility is to make common devices that benefit both VI and sighted individuals, by providing information in a way that does not distract either user from their tasks or environment. To this end, we have developed physical interfaces that eschew visual, audio, or vibratory feedback, instead relying on the natural human ability to perceive the shape of a handheld object.

[BibTex]

Bimanual Wrist-Squeezing Haptic Feedback Changes Speed-Force Tradeoff in Robotic Surgery Training

Cao, E., Machaca, S., Bernard, T., Wolfinger, B., Patterson, Z., Chi, A., Adrales, G. L., Kuchenbecker, K. J., Brown, J. D.

Extended abstract presented as an ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Baltimore, USA, April 2019 (misc) Accepted

[BibTex]

Interactive Augmented Reality for Robot-Assisted Surgery

Forte, M. P., Kuchenbecker, K. J.

Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Baltimore, Maryland, USA, April 2019 (misc)

Project Page [BibTex]

A Design Tool for Therapeutic Social-Physical Human-Robot Interactions

Mohan, M., Kuchenbecker, K. J.

Workshop paper (3 pages) presented at the HRI Pioneers Workshop, Daegu, South Korea, March 2019 (misc) Accepted

Abstract
We live in an aging society; social-physical human-robot interaction has the potential to keep older adults healthy by motivating them to exercise. After summarizing prior work, this paper proposes a tool that can be used to design exercise and therapy interactions to be performed by an upper-body humanoid robot. The interaction design tool comprises a teleoperation system that transmits the operator’s arm motions, head motions, and facial expressions, along with an interface to monitor and assess the motion of the user interacting with the robot. We plan to use this platform to create dynamic and intuitive exercise interactions.

Project Page [BibTex]

Toward Expert-Sourcing of a Haptic Device Repository

Seifi, H., Ip, J., Agrawal, A., Kuchenbecker, K. J., MacLean, K. E.

Glasgow, UK, 2019 (misc)

Abstract
Haptipedia is an online taxonomy, database, and visualization that aims to accelerate ideation of new haptic devices and interactions in human-computer interaction, virtual reality, haptics, and robotics. The current version of Haptipedia (105 devices) was created through iterative design, data entry, and evaluation by our team of experts. Next, we aim to greatly increase the number of devices and keep Haptipedia updated by soliciting data entry and verification from haptics experts worldwide.

link (url) [BibTex]

Visual-Inertial Mapping with Non-Linear Factor Recovery

Usenko, V., Demmel, N., Schubert, D., Stückler, J., Cremers, D.

2019, arXiv:1904.06504 (misc)

[BibTex]

2018


Reducing 3D Vibrations to 1D in Real Time

Park, G., Kuchenbecker, K. J.

Hands-on demonstration (4 pages) presented at AsiaHaptics, Incheon, South Korea, November 2018 (misc)

Abstract
For simple and realistic vibrotactile feedback, 3D accelerations from real contact interactions are usually rendered using a single-axis vibration actuator; this dimensional reduction can be performed in many ways. This demonstration implements a real-time conversion system that simultaneously measures 3D accelerations and renders corresponding 1D vibrations using a two-pen interface. In the demonstration, a user freely interacts with various objects using an In-Pen that contains a 3-axis accelerometer. The captured accelerations are converted to a single-axis signal, and an Out-Pen renders the reduced signal for the user to feel. We prepared seven conversion methods from the simple use of a single-axis signal to applying principal component analysis (PCA) so that users can compare the performance of each conversion method in this demonstration.
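
Of the seven conversion methods mentioned, the PCA variant is the simplest to sketch: project the 3-axis signal onto its first principal component, the direction carrying the most vibration energy. A minimal offline version in Python (the demonstration's real-time system would operate block-wise):

    import numpy as np

    def pca_reduce_to_1d(accel_xyz):
        """Project an (N x 3) acceleration recording onto its first principal
        component, i.e., the axis with the greatest vibration energy."""
        centered = accel_xyz - accel_xyz.mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        return centered @ vt[0]  # 1D signal for the single-axis actuator

    one_d = pca_reduce_to_1d(np.random.randn(1000, 3))  # e.g., 0.1 s at 10 kHz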

Project Page [BibTex]

A Large-Scale Fabric-Based Tactile Sensor Using Electrical Resistance Tomography

Lee, H., Park, K., Kim, J., Kuchenbecker, K. J.

Hands-on demonstration (3 pages) presented at AsiaHaptics, Incheon, South Korea, November 2018 (misc)

Abstract
Large-scale tactile sensing is important for household robots and human-robot interaction because contacts can occur all over a robot’s body surface. This paper presents a new fabric-based tactile sensor that is straightforward to manufacture and can cover a large area. The tactile sensor is made of conductive and non-conductive fabric layers, and the electrodes are stitched with conductive thread, so the resulting device is flexible and stretchable. The sensor utilizes internal array electrodes and a reconstruction method called electrical resistance tomography (ERT) to achieve a high spatial resolution with a small number of electrodes. The developed sensor shows that only 16 electrodes can accurately estimate single and multiple contacts over a square that measures 20 cm by 20 cm.
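
The paper's exact reconstruction algorithm is not reproduced here, but a standard one-step linearized ERT inversion (the family of methods the abstract refers to) can be sketched as a Tikhonov-regularized least-squares solve, assuming a sensitivity matrix obtained from a forward model of the fabric:

    import numpy as np

    def ert_reconstruct(jacobian, dv, lam=1e-2):
        """Recover a conductivity-change map from boundary voltage changes.
        jacobian: (n_measurements x n_pixels) sensitivity matrix.
        dv: voltage differences relative to a no-contact baseline.
        lam: regularization weight (hypothetical value)."""
        jtj = jacobian.T @ jacobian
        return np.linalg.solve(jtj + lam * np.eye(jtj.shape[0]), jacobian.T @ dv)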

Project Page [BibTex]

Statistical Modelling of Fingertip Deformations and Contact Forces during Tactile Interaction

Gueorguiev, D., Tzionas, D., Pacchierotti, C., Black, M. J., Kuchenbecker, K. J.

Extended abstract presented at the Hand, Brain and Technology conference (HBT), Ascona, Switzerland, August 2018 (misc)

Abstract
Little is known about the shape and properties of the human finger during haptic interaction, even though these are essential parameters for controlling wearable finger devices and delivering realistic tactile feedback. This study explores a framework for four-dimensional scanning (3D over time) and modelling of finger-surface interactions, aiming to capture the motion and deformations of the entire finger with high resolution while simultaneously recording the interfacial forces at the contact. Preliminary results show that when the fingertip is actively pressing a rigid surface, it undergoes lateral expansion and proximal/distal bending, deformations that cannot be captured by imaging of the contact area alone. Therefore, we are currently capturing a dataset that will enable us to create a statistical model of the finger’s deformations and predict the contact forces induced by tactile interaction with objects. This technique could improve current methods for tactile rendering in wearable haptic devices, which rely on general physical modelling of the skin’s compliance, by developing an accurate model of the variations in finger properties across the human population. The availability of such a model will also enable a more realistic simulation of virtual finger behaviour in virtual reality (VR) environments, as well as the ability to accurately model a specific user’s finger from lower-resolution data. It may also be relevant for inferring the physical properties of the underlying tissue from observing the surface mesh deformations, as previously shown for body tissues.

Project Page [BibTex]

A machine from machines

Fischer, P.

Nature Physics, 14, pages: 1072–1073, July 2018 (misc)

Abstract
Building spinning microrotors that self-assemble and synchronize to form a gear sounds like an impossible feat. However, it has now been achieved using only a single type of building block: a colloid that self-propels.

link (url) DOI [BibTex]

Reducing 3D Vibrations to 1D in Real Time

Park, G., Kuchenbecker, K. J.

Hands-on demonstration presented at EuroHaptics, Pisa, Italy, June 2018 (misc)

Abstract
In this demonstration, you will hold two pen-shaped modules: an in-pen and an out-pen. The in-pen is instrumented with a high-bandwidth three-axis accelerometer, and the out-pen contains a one-axis voice coil actuator. Use the in-pen to interact with different surfaces; the measured 3D accelerations are continually converted into 1D vibrations and rendered with the out-pen for you to feel. You can test conversion methods that range from simply selecting a single axis to applying a discrete Fourier transform or principal component analysis for realistic and brisk real-time conversion.

Project Page [BibTex]

Haptipedia: Exploring Haptic Device Design Through Interactive Visualizations

Seifi, H., Fazlollahi, F., Park, G., Kuchenbecker, K. J., MacLean, K. E.

Hands-on demonstration presented at EuroHaptics, Pisa, Italy, June 2018 (misc)

Abstract
How many haptic devices have been proposed in the last 30 years? How can we leverage this rich source of design knowledge to inspire future innovations? Our goal is to make historical haptic invention accessible through interactive visualization of a comprehensive library – a Haptipedia – of devices that have been annotated with designer-relevant metadata. In this demonstration, participants can explore Haptipedia’s growing library of grounded force feedback devices through several prototype visualizations, interact with 3D simulations of the device mechanisms and movements, and tell us about the attributes and devices that could make Haptipedia a useful resource for the haptic design community.

Project Page [BibTex]

Delivering 6-DOF Fingertip Tactile Cues

Young, E., Kuchenbecker, K. J.

Work-in-progress paper (5 pages) presented at EuroHaptics, Pisa, Italy, June 2018 (misc)

Project Page [BibTex]

Designing a Haptic Empathetic Robot Animal for Children with Autism

Burns, R., Kuchenbecker, K. J.

Workshop paper (4 pages) presented at the Robotics: Science and Systems Workshop on Robot-Mediated Autism Intervention: Hardware, Software and Curriculum, Pittsburgh, USA, June 2018 (misc)

Abstract
Children with autism often endure sensory overload, may be nonverbal, and have difficulty understanding and relaying emotions. These experiences result in heightened stress during social interaction. Animal-assisted intervention has been found to improve the behavior of children with autism during social interaction, but live animal companions are not always feasible. We are thus in the process of designing a robotic animal to mimic some successful characteristics of animal-assisted intervention while trying to improve on others. The over-arching hypothesis of this research is that an appropriately designed robot animal can reduce stress in children with autism and empower them to engage in social interaction.

link (url) Project Page [BibTex]

Soft Multi-Axis Boundary-Electrode Tactile Sensors for Whole-Body Robotic Skin

Lee, H., Kim, J., Kuchenbecker, K. J.

Workshop paper (2 pages) presented at the RSS Pioneers Workshop, Pittsburgh, USA, June 2018 (misc)

Project Page [BibTex]

Arm-Worn Tactile Displays

Kuchenbecker, K. J.

Cross-Cutting Challenge Interactive Discussion presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Fingertips and hands captivate the attention of most haptic interface designers, but humans can feel touch stimuli across the entire body surface. Trying to create devices that both can be worn and can deliver good haptic sensations raises challenges that rarely arise in other contexts. Most notably, tactile cues such as vibration, tapping, and squeezing are far simpler to implement in wearable systems than kinesthetic haptic feedback. This interactive discussion will present a variety of relevant projects to which I have contributed, attempting to pull out common themes and ideas for the future.

[BibTex]

Haptipedia: An Expert-Sourced Interactive Device Visualization for Haptic Designers

Seifi, H., MacLean, K. E., Kuchenbecker, K. J., Park, G.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Much of three decades of haptic device invention is effectively lost to today’s designers: dispersion across time, region, and discipline imposes an incalculable drag on innovation in this field. Our goal is to make historical haptic invention accessible through interactive navigation of a comprehensive library – a Haptipedia – of devices that have been annotated with designer-relevant metadata. To build this open resource, we will systematically mine the literature and engage the haptics community for expert annotation. In a multi-year broad-based initiative, we will empirically derive salient attributes of haptic devices, design an interactive visualization tool where device creators and repurposers can efficiently explore and search Haptipedia, and establish methods and tools to manually and algorithmically collect data from the haptics literature and our community of experts. This paper outlines progress in compiling an initial corpus of grounded force-feedback devices and their attributes, and it presents a concept sketch of the interface we envision.

Project Page [BibTex]

Exercising with Baxter: Design and Evaluation of Assistive Social-Physical Human-Robot Interaction

Fitter, N. T., Mohan, M., Kuchenbecker, K. J., Johnson, M. J.

Workshop paper (6 pages) presented at the HRI Workshop on Personal Robots for Exercising and Coaching, Chicago, USA, March 2018 (misc)

Abstract
The worldwide population of older adults is steadily increasing and will soon exceed the capacity of assisted living facilities. Accordingly, we aim to understand whether appropriately designed robots could help older adults stay active and engaged while living at home. We developed eight human-robot exercise games for the Baxter Research Robot with the guidance of experts in game design, therapy, and rehabilitation. After extensive iteration, these games were employed in a user study that tested their viability with 20 younger and 20 older adult users. All participants were willing to enter Baxter’s workspace and physically interact with the robot. User trust and confidence in Baxter increased significantly between pre- and post-experiment assessments, and one individual from the target user population supplied us with abundant positive feedback about her experience. The preliminary results presented in this paper indicate potential for the use of two-armed human-scale robots for social-physical exercise interaction.

link (url) Project Page [BibTex]

Emotionally Supporting Humans Through Robot Hugs

Block, A. E., Kuchenbecker, K. J.

Workshop paper (2 pages) presented at the HRI Pioneers Workshop, Chicago, USA, March 2018 (misc)

Abstract
Hugs are one of the first forms of contact and affection humans experience. Due to their prevalence and health benefits, we want to enable robots to safely hug humans. This research strives to create and study a high fidelity robotic system that provides emotional support to people through hugs. This paper outlines our previous work evaluating human responses to a prototype’s physical and behavioral characteristics, and then it lays out our ongoing and future work.

link (url) DOI Project Page [BibTex]

Towards a Statistical Model of Fingertip Contact Deformations from 4D Data

Gueorguiev, D., Tzionas, D., Pacchierotti, C., Black, M. J., Kuchenbecker, K. J.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Little is known about the shape and properties of the human finger during haptic interaction, even though this knowledge is essential to control wearable finger devices and deliver realistic tactile feedback. This study explores a framework for four-dimensional scanning and modeling of finger-surface interactions, aiming to capture the motion and deformations of the entire finger with high resolution. The results show that when the fingertip is actively pressing a rigid surface, it undergoes lateral expansion of about 0.2 cm and proximal/distal bending of about 30°, deformations that cannot be captured by imaging of the contact area alone. This project constitutes a first step towards an accurate statistical model of the finger’s behavior during haptic interaction.

link (url) Project Page [BibTex]

Can Humans Infer Haptic Surface Properties from Images?

Burka, A., Kuchenbecker, K. J.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Human children typically experience their surroundings both visually and haptically, providing ample opportunities to learn rich cross-sensory associations. To thrive in human environments and interact with the real world, robots also need to build models of these cross-sensory associations; current advances in machine learning should make it possible to infer models from large amounts of data. We previously built a visuo-haptic sensing device, the Proton Pack, and are using it to collect a large database of matched multimodal data from tool-surface interactions. As a benchmark to compare with machine learning performance, we conducted a human subject study (n = 84) on estimating haptic surface properties (here: hardness, roughness, friction, and warmness) from images. Using a 100-surface subset of our database, we showed images to study participants and collected 5635 ratings of the four haptic properties, which we compared with ratings made by the Proton Pack operator and with physical data recorded using motion, force, and vibration sensors. Preliminary results indicate weak correlation between participant and operator ratings, but potential for matching up certain human ratings (particularly hardness and roughness) with features from the literature.
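
The participant-versus-operator comparison reduces to a rank correlation between two sets of ratings; a sketch with invented numbers (the study's actual data are not reproduced here):

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical per-surface mean ratings for one property, e.g., roughness.
    participant = np.array([62, 40, 75, 20, 55, 88, 34, 49])
    operator = np.array([58, 45, 80, 25, 50, 90, 30, 60])

    rho, p = spearmanr(participant, operator)  # rank correlation ignores scale use
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")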

Project Page [BibTex]

2017


Physical and Behavioral Factors Improve Robot Hug Quality

Block, A. E., Kuchenbecker, K. J.

Workshop Paper (2 pages) presented at the RO-MAN Workshop on Social Interaction and Multimodal Expression for Socially Intelligent Robots, Lisbon, Portugal, August 2017 (misc)

Abstract
A hug is one of the most basic ways humans can express affection. As hugs are so common, a natural progression of robot development is to have robots one day hug humans as seamlessly as these intimate human-human interactions occur. This project’s purpose is to evaluate human responses to different robot physical characteristics and hugging behaviors. Specifically, we aim to test the hypothesis that a warm, soft, touch-sensitive PR2 humanoid robot can provide humans with satisfying hugs by matching both their hugging pressure and their hugging duration. Thirty participants experienced and evaluated twelve hugs with the robot, divided into three randomly ordered trials that focused on physical robot characteristics and nine randomly ordered trials with varied hug pressure and duration. We found that people prefer soft, warm hugs over hard, cold hugs. Furthermore, users prefer hugs that physically squeeze them and release immediately when they are ready for the hug to end.

Project Page [BibTex]

Physically Interactive Exercise Games with a Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]

Proton Pack: Visuo-Haptic Surface Data Recording

Burka, A., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]

Teaching a Robot to Collaborate with a Human Via Haptic Teleoperation

Hu, S., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]

How Should Robots Hug?

Block, A. E., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]

An Interactive Augmented-Reality Video Training Platform for the da Vinci Surgical System

Carlson, J., Kuchenbecker, K. J.

Workshop paper (3 pages) presented at the ICRA Workshop on C4 Surgical Robots, Singapore, May 2017 (misc)

Abstract
Teleoperated surgical robots such as the Intuitive da Vinci Surgical System facilitate minimally invasive surgeries, which decrease risk to patients. However, these systems can be difficult to learn, and existing training curricula on surgical simulators do not offer students the realistic experience of a full operation. This paper presents an augmented-reality video training platform for the da Vinci that will allow trainees to rehearse any surgery recorded by an expert. While the trainee operates a da Vinci in free space, they see their own instruments overlaid on the expert video. Tools are identified in the source videos via color segmentation and kernelized correlation filter tracking, and their depth is calculated from the da Vinci’s stereoscopic video feed. The user tries to follow the expert’s movements, and if any of their tools venture too far away, the system provides instantaneous visual feedback and pauses to allow the user to correct their motion. The trainee can also rewind the expert video by bringing either da Vinci tool very close to the camera. This combined and augmented video provides the user with an immersive and interactive training experience.
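
A hedged sketch of the two vision steps the abstract names, using OpenCV: HSV color segmentation to detect a tool and a kernelized correlation filter (KCF) tracker to follow it between detections. The HSV bounds are hypothetical, and the tracker constructor lives under cv2.legacy in some OpenCV builds:

    import cv2
    import numpy as np

    # Hypothetical HSV bounds for a color-marked instrument tip.
    LOWER, UPPER = np.array([35, 80, 80]), np.array([85, 255, 255])

    def detect_tool(frame_bgr):
        """Color segmentation: bounding box of the largest matching blob, or None."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return cv2.boundingRect(max(contours, key=cv2.contourArea)) if contours else None

    tracker = cv2.TrackerKCF_create()  # cv2.legacy.TrackerKCF_create in some builds
    # tracker.init(first_frame, detect_tool(first_frame))
    # ok, box = tracker.update(next_frame)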

[BibTex]

Hand-Clapping Games with a Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at ACM/IEEE International Conference on Human-Robot Interaction (HRI), Vienna, Austria, March 2017 (misc)

Abstract
Robots that work alongside humans might be more effective if they could forge a strong social bond with their human partners. Hand-clapping games and other forms of rhythmic social-physical interaction may foster human-robot teamwork, but the design of such interactions has scarcely been explored. At the HRI 2017 conference, we will showcase several such interactions taken from our recent work with the Rethink Robotics Baxter Research Robot, including tempo-matching, Simon says, and Pat-a-cake-like games. We believe conference attendees will be both entertained and intrigued by this novel demonstration of social-physical HRI.

Project Page [BibTex]

Automatic OSATS Rating of Trainee Skill at a Pediatric Laparoscopic Suturing Task

Oquendo, Y. A., Riddle, E. W., Hiller, D., Blinman, T. A., Kuchenbecker, K. J.

Surgical Endoscopy, 31(Supplement 1):S28, Extended abstract presented as a podium presentation at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Springer, Houston, USA, March 2017 (misc)

Abstract
Introduction: Minimally invasive surgery has revolutionized surgical practice, but challenges remain. Trainees must acquire complex technical skills while minimizing patient risk, and surgeons must maintain their skills for rare procedures. These challenges are magnified in pediatric surgery due to the smaller spaces, finer tissue, and relative dearth of both inanimate and virtual simulators. To build technical expertise, trainees need opportunities for deliberate practice with specific performance feedback, which is typically provided via tedious human grading. This study aimed to validate a novel motion-tracking system and machine learning algorithm for automatically evaluating trainee performance on a pediatric laparoscopic suturing task using a 1–5 OSATS Overall Skill rating.

Methods: Subjects (n=14) ranging from medical students to fellows performed one or two trials of an intracorporeal suturing task in a custom pediatric laparoscopy training box (Fig. 1) after watching a video of ideal performance by an expert. The position and orientation of the tools and endoscope were recorded over time using Ascension trakSTAR magnetic motion-tracking sensors, and both instrument grasp angles were recorded over time using flex sensors on the handles. The 27 trials were video-recorded and scored on the OSATS scale by a senior fellow; ratings ranged from 1 to 4. The raw motion data from each trial was processed to calculate over 200 preliminary motion parameters. Regularized least-squares regression (LASSO) was used to identify the most predictive parameters for inclusion in a regression tree. Model performance was evaluated by leave-one-subject-out cross-validation, wherein the automatic scores given to each subject’s trials (by a model trained on all other data) are compared to the corresponding human rater scores.

Results: The best-performing LASSO algorithm identified 14 predictive parameters for inclusion in the regression tree, including completion time, linear path length, angular path length, angular acceleration, grasp velocity, and grasp acceleration. The final model’s raw output showed a strong positive correlation of 0.87 with the reviewer-generated scores, and rounding the output to the nearest integer yielded a leave-one-subject-out cross-validation accuracy of 77.8%. Results are summarized in the confusion matrix (Table 1).

Conclusions: Our novel motion-tracking system and regression model automatically gave previously unseen trials overall skill scores that closely match scores from an expert human rater. With additional data and further development, this system may enable creation of a motion-based training platform for pediatric laparoscopic surgery and could yield insights into the fundamental components of surgical skill.
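
The modeling pipeline described (LASSO for parameter selection, a regression tree for scoring, leave-one-subject-out validation) maps naturally onto scikit-learn. In this sketch, X holds the motion parameters per trial, y the human OSATS ratings, and groups the subject IDs; the tree depth is a hypothetical choice:

    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict
    from sklearn.tree import DecisionTreeRegressor

    def automatic_osats(X, y, groups):
        lasso = LassoCV(cv=5).fit(X, y)            # sparse selection of parameters
        keep = np.flatnonzero(lasso.coef_)         # analogous to the paper's 14
        tree = DecisionTreeRegressor(max_depth=4)  # depth is a hypothetical choice
        splits = LeaveOneGroupOut().split(X[:, keep], y, groups)
        preds = cross_val_predict(tree, X[:, keep], y, cv=splits)
        return np.rint(preds), keep                # round to the nearest integer score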

[BibTex]

How Much Haptic Surface Data is Enough?

Burka, A., Kuchenbecker, K. J.

Workshop paper (5 pages) presented at the AAAI Spring Symposium on Interactive Multi-Sensory Object Perception for Embodied Agents, Stanford, USA, March 2017 (misc)

Abstract
The Proton Pack is a portable visuo-haptic surface interaction recording device that will be used to collect a vast multimodal dataset, intended for robots to use as part of an approach to understanding the world around them. In order to collect a useful dataset, we want to pick a suitable interaction duration for each surface, noting the tradeoff between data collection resources and completeness of data. One interesting approach frames the data collection process as an online learning problem, building an incremental surface model and using that model to decide when there is enough data. Here we examine how to do such online surface modeling and when to stop collecting data, using kinetic friction as a first domain in which to apply online modeling.
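
One concrete version of such a stopping rule, assuming a simple Coulomb model (tangential force roughly equal to mu times normal force): update a running least-squares estimate of mu with each new sample and stop once the estimate has stabilized. The tolerance and window size below are hypothetical:

    def collect_until_converged(sample_stream, tol=0.005, window=50):
        """Online kinetic-friction model: refine the least-squares estimate of mu
        (tangential ~ mu * normal) per sample; stop when it changes by less than
        tol over the last `window` samples."""
        sxx = sxy = 0.0
        history = []
        for f_normal, f_tangential in sample_stream:
            sxx += f_normal * f_normal
            sxy += f_normal * f_tangential
            history.append(sxy / sxx)  # closed-form regression through the origin
            if len(history) > window and abs(history[-1] - history[-window]) < tol:
                return history[-1], len(history)  # enough data for this surface
        return history[-1], len(history)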

link (url) Project Page [BibTex]

Enhancing Human-Computer Interaction via Electrovibration

Emgin, S. E., Sadia, B., Vardar, Y., Basdogan, C.

Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Abstract
We present a compact tablet that displays electrostatic haptic feedback to the user. We track the user's finger position via an infrared frame and then display haptic feedback through a capacitive touch screen based on his/her position. In order to demonstrate the practical utility of the proposed system, the following applications have been developed: (1) An online shopping application allows users to feel the cord density of two different fabrics. (2) An education application asks the user to add two numbers by dragging one number onto another to match the sum. After the first number is selected, haptic feedback assists the user in selecting the right pair. (3) A gaming/entertainment application presents a bike-riding experience on three different road textures: smooth, bumpy, and sandy. (4) A user-interface application asks users to drag two visually identical folders. While dragging, users can differentiate the amount of data in each folder based on haptic resistance.

[BibTex]

Reproduction of Textures Based on Electrovibration

Fiedler, T., Vardar, Y., Strese, M., Steinbach, E., Basdogan, C.

Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Abstract
This demonstration presents an approach to reproducing textures based on electrovibration. We collect the acceleration data that occurs while sliding a tool tip over a real texture surface. The prerecorded data were collected by an ADXL335 accelerometer mounted on a FALCON device moving along the x-axis with a regulated velocity. In replicating the same acceleration with electrovibration, we found two problems. First, each displayed sine wave shifts to double its frequency; this effect originates from the electrostatic force between the finger pad and the tactile display, as explained by Kaczmarek et al. [1], and taking the square root of the input signal corrects it, as also proposed earlier [1, 2, 3]. Second, if not one but multiple sine waves are displayed, interference occurs, and acceleration signals from real textures may not feel perceptually realistic. We therefore propose to display only the dominant frequencies from a real texture signal. Peak frequencies are determined with respect to the JND of 11 percent reported in earlier literature, and a new sine-wave signal containing the dominant frequencies is created. In the demo, we will let attendees feel the differences between prerecorded and artificially created textures.
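
A rough sketch of the two signal-processing steps just described: keep only the dominant spectral peaks of the recorded acceleration, resynthesize a sparse sum of sines, and apply the square-root correction. The JND-based grouping of peaks is omitted, and the offset before the square root is just one way to keep its argument non-negative:

    import numpy as np

    def resynthesize_for_electrovibration(accel, fs, n_peaks=5):
        """Rebuild a texture signal from its dominant frequencies and apply the
        square-root correction for the frequency-doubling electrostatic force."""
        spec = np.fft.rfft(accel)
        freqs = np.fft.rfftfreq(len(accel), 1.0 / fs)
        idx = np.argsort(np.abs(spec[1:]))[-n_peaks:] + 1  # top bins, skipping DC
        t = np.arange(len(accel)) / fs
        out = sum(np.abs(spec[i]) * np.cos(2 * np.pi * freqs[i] * t + np.angle(spec[i]))
                  for i in idx)
        out -= out.min()      # shift so the square-root argument is non-negative
        return np.sqrt(out)   # drive-signal envelope for the touchscreen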

[BibTex]

2016


Numerical Investigation of Frictional Forces Between a Finger and a Textured Surface During Active Touch

Khojasteh, B., Janko, M., Visell, Y.

Extended abstract presented in form of an oral presentation at the 3rd International Conference on BioTribology (ICoBT), London, England, September 2016 (misc)

Abstract
The biomechanics of the human finger pad has been investigated in relation to motor behaviour and sensory function in the upper limb. While the frictional properties of the finger pad are important for grip and grasp function, recent attention has also been given to the roles played by friction when perceiving a surface via sliding contact. Indeed, the mechanics of sliding contact greatly affect the stimuli felt by a finger scanning a surface. Past research has shed light on the neural mechanisms of haptic texture perception, but the relation with time-resolved frictional contact interactions is unknown. Current biotribological models cannot predict the time-resolved frictional forces felt by a finger as it slides on a rough surface. This constitutes a missing link in understanding the mechanical basis of texture perception. To ameliorate this, we developed a two-dimensional finite element numerical simulation of a human finger pad in sliding contact with a textured surface. Our model captures bulk mechanical properties, including hyperelasticity, dissipation, and tissue heterogeneity, as well as contact dynamics. To validate it, we utilized a database of measurements that we previously captured with a variety of human fingers and surfaces. By designing the simulations to match the measurements, we evaluated the ability of the FEM model to predict time-resolved sliding frictional forces. We varied surface texture wavelength, sliding speed, and normal force in the experiments. An analysis of the results indicated that both time- and frequency-domain features of forces produced during finger-surface sliding interactions were reproduced, including many of the phenomena that we observed in analyses of real measurements: quasiperiodicity, harmonic distortion and spectral decay in the frequency domain, and their dependence on kinetics and surface properties. The results shed light on frictional signatures of surface texture during active touch, and may inform understanding of the role played by friction in texture discrimination.

[BibTex]

Behavioral Learning and Imitation for Music-Based Robotic Therapy for Children with Autism Spectrum Disorder

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Workshop paper (5 pages) presented at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract
In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Motion Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our skeletal-tracking Kinect algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future work.

link (url) [BibTex]

Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

Extended abstract presented at the Biomechanics and Neural Control of Movement Conference (BANCOM), 2016. Poster presentation given by Fjeld (misc)

Project Page [BibTex]

One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)

[BibTex]

Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

Project Page [BibTex]

IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

[BibTex]

2015


Haptic Textures for Online Shopping

Culbertson, H., Kuchenbecker, K. J.

Interactive demonstrations in The Retail Collective exhibit, presented at the Dx3 Conference in Toronto, Canada, March 2015 (misc)

[BibTex]

2014


Teaching Forward and Inverse Kinematics of Robotic Manipulators Via MATLAB

Wong, D., Dames, P., Kuchenbecker, K. J.

Presented at the ICRA Workshop on MATLAB/Simulink for Robotics Education and Research, June 2014. Oral presentation given by Dames and Wong (misc)

[BibTex]

Control of a Virtual Robot with Fingertip Contact, Pressure, Vibrotactile, and Grip Force Feedback

Pierce, R. M., Fedalei, E. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]

A Modular Tactile Motion Guidance System

Kuchenbecker, K. J., Anon, A. M., Barkin, T., deVillafranca, K., Lo, M.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]

The Penn Haptic Texture Toolkit

Culbertson, H., Delgado, J. J. L., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]

2013


Jointonation: Robotization of the Human Body by Vibrotactile Feedback

Kurihara, Y., Hachisu, T., Kuchenbecker, K. J., Kajimoto, H.

Emerging Technologies Demonstration with Talk at ACM SIGGRAPH Asia, Hong Kong, November 2013. Hands-on demonstration given by Kurihara, Takei, and Nakai. Best Demonstration Award as voted by the Program Committee (misc)

[BibTex]
