

2020


Characterization of a Magnetic Levitation Haptic Interface for Realistic Tool-Based Interactions

Lee, H., Tombak, G. I., Park, G., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

Abstract
We introduce our recent study on the characterization of a commercial magnetic levitation haptic interface (MagLev 200, Butterfly Haptics LLC) for realistic high-bandwidth interactions. This device’s haptic rendering scheme can provide strong 6-DoF (force and torque) feedback without friction at all poses in its small workspace. The objective of our study is to enable the device to accurately render realistic multidimensional vibrotactile stimuli measured from a stylus-like tool. Our approach is to characterize the dynamics between the commanded wrench and the resulting translational acceleration across the frequency range of interest. To this end, we first custom-designed and attached a pen-shaped manipulandum (11.5 cm, aluminum) to the top of the MagLev 200’s end-effector for better usability in grasping. An accelerometer (ADXL354, Analog Devices) was rigidly mounted inside the manipulandum. Then, we collected a data set where the input is a 30-second-long force and/or torque signal commanded as a sweep function from 10 to 500 Hz; the output is the corresponding acceleration measurement, which we collected both with and without a user holding the handle. We succeeded at fitting both non-parametric and parametric versions of the transfer functions for both scenarios, with a fitting accuracy of about 95% for the parametric transfer functions. In the future, we plan to find the best method of applying the inverse parametric transfer function to our system. We will then employ that compensation method in a user study to evaluate the realism of different algorithms for reducing the dimensionality of tool-based vibrotactile cues.
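The characterization the abstract describes (commanding a 10–500 Hz sweep and estimating the transfer function from commanded wrench to measured acceleration) can be illustrated with a minimal non-parametric sketch. Everything below is our own assumption for illustration, not the study's code: the sample rate, the stand-in second-order dynamics, and all variable names.

```python
import numpy as np
from scipy import signal

fs = 2000.0                               # sample rate (Hz), assumed
t = np.arange(0.0, 30.0, 1.0 / fs)        # 30-second recording, as in the abstract
x = signal.chirp(t, f0=10.0, t1=t[-1], f1=500.0)   # 10-500 Hz sweep "command"

# Stand-in for the device dynamics (commanded wrench -> measured acceleration);
# in the real study this is the MagLev 200 plus the pen-shaped manipulandum.
b, a = signal.butter(2, 300.0, fs=fs)
y = signal.lfilter(b, a, x)

# Non-parametric (empirical) transfer function estimate: H(f) = Pxy(f) / Pxx(f)
f, Pxx = signal.welch(x, fs=fs, nperseg=4096)
_, Pxy = signal.csd(x, y, fs=fs, nperseg=4096)
H = Pxy / Pxx
```

A parametric model could then be fit to H and inverted for compensation, as the abstract proposes; outside the excited 10–500 Hz band the estimate is unreliable because the sweep provides almost no input energy there.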

link (url) [BibTex]


Tactile Textiles: An Assortment of Fabric-Based Tactile Sensors for Contact Force and Contact Location

Burns, R. B., Thomas, N., Lee, H., Faulkner, R., Kuchenbecker, K. J.

Hands-on demonstration presented at EuroHaptics, Leiden, The Netherlands, September 2020, Rachael Bevill Burns, Neha Thomas, and Hyosang Lee contributed equally to this publication (misc)

Abstract
Fabric-based tactile sensors are promising for the construction of robotic skin due to their soft and flexible nature. Conductive fabric layers can be used to form piezoresistive structures that are sensitive to contact force and/or contact location. This demonstration showcases three diverse fabric-based tactile sensors we have created. The first detects dynamic tactile events anywhere within a region on a robot’s body. The second design measures the precise location at which a single low-force contact is applied. The third sensor uses electrical resistance tomography to output both the force and location of multiple simultaneous contacts applied across a surface.

Project Page [BibTex]


Estimating Human Handshape by Feeling the Wrist

Forte, M., Young, E. M., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

[BibTex]


Intermediate Ridges Amplify Mechanoreceptor Strains in Static and Dynamic Touch

Serhat, G., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

[BibTex]


Seeing through Touch: Contact-Location Sensing and Tactile Feedback for Prosthetic Hands

Thomas, N., Kuchenbecker, K. J.

Work-in-progress abstract and poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

Abstract
Locating and picking up an object without vision is a simple task for able-bodied people, due in part to their rich tactile perception capabilities. The same cannot be said for users of standard myoelectric prostheses, who must rely largely on visual cues to successfully interact with the environment. To enable prosthesis users to locate and grasp objects without looking at them, we propose two changes: adding specialized contact-location sensing to the dorsal and palmar aspects of the prosthetic hand’s fingers, and providing the user with tactile feedback of where an object touches the fingers. To evaluate the potential utility of these changes, we developed a simple, sensitive, fabric-based tactile sensor which provides continuous contact location information via a change in voltage of a voltage divider circuit. This sensor was wrapped around the fingers of a commercial prosthetic hand (Ottobock SensorHand Speed). Using an ATI Nano17 force sensor, we characterized the tactile sensor’s response to normal force at distributed contact locations and obtained an average detection threshold of 0.63 +/- 0.26 N. We also confirmed that the voltage-to-location mapping is linear (R squared = 0.99). Sensor signals were adapted to the stationary vibrotactile funneling illusion to provide haptic feedback of contact location. These preliminary results indicate a promising system that imitates a key aspect of the sensory capabilities of the intact hand. Future work includes testing the system in a modified reach-grasp-and-lift study, in which participants must accomplish the task blindfolded.
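The linear voltage-to-location mapping the abstract reports (R squared = 0.99) amounts to a one-dimensional least-squares calibration of a voltage-divider output. The sketch below is purely illustrative: the calibration voltages, locations, and function names are our own assumptions, not values from the study.

```python
import numpy as np

# Hypothetical calibration data (assumed, not the study's values): divider
# output voltage measured while pressing at known locations along the fabric
# sensor. A resistive strip in a voltage divider produces a voltage that
# varies approximately linearly with contact location.
v_cal = np.array([0.62, 1.10, 1.58, 2.07, 2.55])  # volts
x_cal = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # contact location, mm

# Fit the linear voltage-to-location mapping by least squares.
slope, intercept = np.polyfit(v_cal, x_cal, 1)

def locate(v):
    """Estimate contact location (mm) from a divider voltage (V)."""
    return slope * v + intercept
```

In practice the mapping would be calibrated against a ground-truth force/position reference (the abstract uses an ATI Nano17) and its linearity verified before use.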

[BibTex]


Vision-based Force Estimation for a da Vinci Instrument Using Deep Neural Networks

Lee, Y., Husin, H. M., Forte, M., Lee, S., Kuchenbecker, K. J.

Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, Ohio, USA, August 2020 (misc) Accepted

[BibTex]


A Fabric-Based Sensing System for Recognizing Social Touch

Burns, R. B., Lee, H., Seifi, H., Kuchenbecker, K. J.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, Washington, DC, USA, March 2020 (misc)

Abstract
We present a fabric-based piezoresistive tactile sensor system designed to detect social touch gestures on a robot. The unique sensor design utilizes three layers of low-conductivity fabric sewn together on alternating edges to form an accordion pattern and secured between two outer high-conductivity layers. This five-layer design demonstrates a greater resistance range and better low-force sensitivity than previous designs that use one layer of low-conductivity fabric with or without a plastic mesh layer. An individual sensor from our system can presently identify six different communication gestures – squeezing, patting, scratching, poking, hand resting without movement, and no touch – with an average accuracy of 90%. A layer of foam can be added beneath the sensor to make a rigid robot more appealing for humans to touch without inhibiting the system’s ability to register social touch gestures.

Project Page [BibTex]


Do Touch Gestures Affect How Electrovibration Feels?

Vardar, Y., Kuchenbecker, K. J.

Hands-on demonstration (1 page) presented at the IEEE Haptics Symposium, Washington, DC, USA, March 2020 (misc)

[BibTex]

2019


Interactive Augmented Reality for Robot-Assisted Surgery

Forte, M., Kuchenbecker, K. J.

Workshop extended abstract presented as a podium presentation at the IROS Workshop on Legacy Disruptors in Applied Telerobotics, Macau, November 2019 (misc) Accepted

Project Page [BibTex]


High-Fidelity Multiphysics Finite Element Modeling of Finger-Surface Interactions with Tactile Feedback

Serhat, G., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
In this study, we develop a high-fidelity finite element (FE) analysis framework that enables multiphysics simulation of the human finger in contact with a surface that is providing tactile feedback. We aim to elucidate a variety of physical interactions that can occur at finger-surface interfaces, including contact, friction, vibration, and electrovibration. We also develop novel FE-based methods that will allow prediction of nonconventional features such as real finger-surface contact area and finger stickiness. We envision using the developed computational tools for efficient design and optimization of haptic devices by replacing expensive and lengthy experimental procedures with high-fidelity simulation.

Project Page [BibTex]


Fingertip Friction Enhances Perception of Normal Force Changes

Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
Using a force-controlled robotic platform, we tested the human perception of positive and negative modulations in normal force during passive dynamic touch, which also induced a strong related change in the finger-surface lateral force. In a two-alternative forced-choice task, eleven participants had to detect brief variations in the normal force compared to a constant controlled pre-stimulation force of 1 N and report whether it had increased or decreased. The average 75% just noticeable difference (JND) was found to be around 0.25 N for detecting the peak change and 0.30 N for correctly reporting the increase or the decrease. Interestingly, the friction coefficient of a subject’s fingertip positively correlated with his or her performance at detecting the change and reporting its direction, which suggests that humans may use the lateral force as a sensory cue to perceive variations in the normal force.

[BibTex]


Inflatable Haptic Sensor for the Torso of a Hugging Robot

Block, A. E., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
During hugs, humans naturally provide and intuit subtle non-verbal cues that signify the strength and duration of an exchanged hug. Personal preferences for this close interaction may vary greatly between people; robots do not currently have the abilities to perceive or understand these preferences. This work-in-progress paper discusses designing, building, and testing a novel inflatable torso that can simultaneously soften a robot and act as a tactile sensor to enable more natural and responsive hugging. Using PVC vinyl, a microphone, and a barometric pressure sensor, we created a small test chamber to demonstrate a proof of concept for the full torso. While contacting the chamber in several ways common in hugs (pat, squeeze, scratch, and rub), we recorded data from the two sensors. The preliminary results suggest that the complementary haptic sensing channels allow us to detect coarse and fine contacts typically experienced during hugs, regardless of user hand placement.

Project Page [BibTex]


Understanding the Pull-off Force of the Human Fingerpad

Nam, S., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
To understand the adhesive force that occurs when a finger pulls off of a smooth surface, we built an apparatus to measure the fingerpad’s moisture, normal force, and real contact area over time during interactions with a glass plate. We recorded a total of 450 trials (45 interactions by each of ten human subjects), capturing a wide range of values across the aforementioned variables. The experimental results showed that the pull-off force increases with larger finger contact area and faster detachment rate. Additionally, moisture generally increases the contact area of the finger, but too much moisture can restrict the increase in the pull-off force.

Project Page [BibTex]


The Haptician and the Alphamonsters

Forte, M., L’Orsa, R., Mohan, M., Nam, S., Kuchenbecker, K. J.

Student Innovation Challenge on Implementing Haptics in Virtual Reality Environment presented at the IEEE World Haptics Conference, Tokyo, Japan, July 2019, Maria Paola Forte, Rachael L'Orsa, Mayumi Mohan, and Saekwang Nam contributed equally to this publication (misc)

Abstract
Dysgraphia is a neurological disorder characterized by writing disabilities that affects between 7% and 15% of children. It presents itself in the form of unfinished letters, letter distortion, inconsistent letter size, letter collision, etc. Traditional therapeutic exercises require continuous assistance from teachers or occupational therapists. Autonomous partial or full haptic guidance can produce positive results, but children often become bored with the repetitive nature of such activities. Conversely, virtual rehabilitation with video games represents a new frontier for occupational therapy due to its highly motivational nature. Virtual reality (VR) adds an element of novelty and entertainment to therapy, thus motivating players to perform exercises more regularly. We propose leveraging the HTC VIVE Pro and the EXOS Wrist DK2 to create an immersive spellcasting “exergame” (exercise game) that helps motivate children with dysgraphia to improve writing fluency.

Student Innovation Challenge – Virtual Reality [BibTex]


Explorations of Shape-Changing Haptic Interfaces for Blind and Sighted Pedestrian Navigation

Spiers, A., Kuchenbecker, K. J.

Workshop paper (6 pages) presented at the CHI 2019 Workshop on Hacking Blind Navigation, May 2019 (misc) Accepted

Abstract
Since the 1960s, technologists have worked to develop systems that facilitate independent navigation by vision-impaired (VI) pedestrians. These devices vary in terms of conveyed information and feedback modality. Unfortunately, many such prototypes never progress beyond laboratory testing. Conversely, smartphone-based navigation systems for sighted pedestrians have grown in robustness and capabilities, to the point of now being ubiquitous. How can we leverage the success of sighted navigation technology, which is driven by a larger global market, as a way to progress VI navigation systems? We believe one possibility is to make common devices that benefit both VI and sighted individuals, by providing information in a way that does not distract either user from their tasks or environment. To this end we have developed physical interfaces that eschew visual, audio or vibratory feedback, instead relying on the natural human ability to perceive the shape of a handheld object.

[BibTex]


Bimanual Wrist-Squeezing Haptic Feedback Changes Speed-Force Tradeoff in Robotic Surgery Training

Cao, E., Machaca, S., Bernard, T., Wolfinger, B., Patterson, Z., Chi, A., Adrales, G. L., Kuchenbecker, K. J., Brown, J. D.

Extended abstract presented as an ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Baltimore, USA, April 2019 (misc) Accepted

[BibTex]


Interactive Augmented Reality for Robot-Assisted Surgery

Forte, M., Kuchenbecker, K. J.

Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Baltimore, Maryland, USA, April 2019 (misc) Accepted

Project Page [BibTex]


A Design Tool for Therapeutic Social-Physical Human-Robot Interactions

Mohan, M., Kuchenbecker, K. J.

Workshop paper (3 pages) presented at the HRI Pioneers Workshop, Daegu, South Korea, March 2019 (misc)

Abstract
We live in an aging society; social-physical human-robot interaction has the potential to keep our elderly adults healthy by motivating them to exercise. After summarizing prior work, this paper proposes a tool that can be used to design exercise and therapy interactions to be performed by an upper-body humanoid robot. The interaction design tool comprises a teleoperation system that transmits the operator’s arm motions, head motions and facial expression along with an interface to monitor and assess the motion of the user interacting with the robot. We plan to use this platform to create dynamic and intuitive exercise interactions.

DOI Project Page [BibTex]


Toward Expert-Sourcing of a Haptic Device Repository

Seifi, H., Ip, J., Agrawal, A., Kuchenbecker, K. J., MacLean, K. E.

Glasgow, UK, 2019 (misc)

Abstract
Haptipedia is an online taxonomy, database, and visualization that aims to accelerate ideation of new haptic devices and interactions in human-computer interaction, virtual reality, haptics, and robotics. The current version of Haptipedia (105 devices) was created through iterative design, data entry, and evaluation by our team of experts. Next, we aim to greatly increase the number of devices and keep Haptipedia updated by soliciting data entry and verification from haptics experts worldwide.

link (url) [BibTex]

2016


Quantifying Therapist Practitioner Roles Using Video-based Analysis: Can We Reliably Model Therapist-Patient Interactions During Task-Oriented Therapy?

Mendonca, R., Johnson, M. J., Laskin, S., Adair, L., Mohan, M.

pages: E55-E56, Abstract in the Archives of Physical Medicine and Rehabilitation, October 2016 (misc)

DOI [BibTex]


Numerical Investigation of Frictional Forces Between a Finger and a Textured Surface During Active Touch

Khojasteh, B., Janko, M., Visell, Y.

Extended abstract presented as an oral presentation at the 3rd International Conference on BioTribology (ICoBT), London, England, September 2016 (misc)

Abstract
The biomechanics of the human finger pad has been investigated in relation to motor behaviour and sensory function in the upper limb. While the frictional properties of the finger pad are important for grip and grasp function, recent attention has also been given to the roles played by friction when perceiving a surface via sliding contact. Indeed, the mechanics of sliding contact greatly affect stimuli felt by the finger scanning a surface. Past research has shed light on neural mechanisms of haptic texture perception, but the relation with time-resolved frictional contact interactions is unknown. Current biotribological models cannot predict time-resolved frictional forces felt by a finger as it slides on a rough surface. This constitutes a missing link in understanding the mechanical basis of texture perception. To ameliorate this, we developed a two-dimensional finite element numerical simulation of a human finger pad in sliding contact with a textured surface. Our model captures bulk mechanical properties, including hyperelasticity, dissipation, and tissue heterogeneity, and contact dynamics. To validate it, we utilized a database of measurements that we previously captured with a variety of human fingers and surfaces. By designing the simulations to match the measurements, we evaluated the ability of the FEM model to predict time-resolved sliding frictional forces. We varied surface texture wavelength, sliding speed, and normal forces in the experiments. An analysis of the results indicated that both time- and frequency-domain features of forces produced during finger-surface sliding interactions were reproduced, including many of the phenomena that we observed in analyses of real measurements, such as quasiperiodicity, harmonic distortion and spectral decay in the frequency domain, and their dependence on kinetics and surface properties. The results shed light on frictional signatures of surface texture during active touch, and may inform understanding of the role played by friction in texture discrimination.

[BibTex]


Behavioral Learning and Imitation for Music-Based Robotic Therapy for Children with Autism Spectrum Disorder

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Workshop paper (5 pages) at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract
In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Motion Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our skeletal tracking Kinect algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future works.

link (url) [BibTex]


Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

Extended abstract presented at the Biomechanics and Neural Control of Movement Conference (BANCOM), 2016, Poster presentation given by Fjeld (misc)

Project Page [BibTex]


One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)

[BibTex]


Multisensory robotic therapy to promote natural emotional interaction for children with ASD

Bevill, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 571, March 2016 (misc)

Abstract
In this video submission, we introduce two robots, Romo the penguin and Darwin Mini. We have programmed these robots to express a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.

link (url) DOI [BibTex]


Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Bevill, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

Extended abstract presented at the ACM/IEEE International Conference on Human Robot Interaction (HRI), March 2016 (misc)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.

link (url) DOI [BibTex]


Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

Project Page [BibTex]


Design of a Low-Cost Platform for Autonomous Mobile Service Robots

Eaton, E., Mucchiani, C., Mohan, M., Isele, D., Luná, J. M., Clingerman, C.

Workshop paper (7 pages) presented at the 25th International Joint Conference on Artificial Intelligence (IJCAI) Workshop on Autonomous Mobile Service Robots, New York, USA, 2016 (misc)

Abstract
Most current autonomous mobile service robots are either expensive commercial platforms or custom manufactured for research environments, limiting their availability. We present the design for a low-cost service robot based on the widely used TurtleBot 2 platform, with the goal of making service robots affordable and accessible to the research, educational, and hobbyist communities. Our design uses a set of simple and inexpensive modifications to transform the TurtleBot 2 into a 4.5 ft (1.37 m) tall tour-guide or telepresence-style robot, capable of performing a wide variety of indoor service tasks. The resulting platform provides a shoulder-height touchscreen and 3D camera for interaction, an optional low-cost arm for manipulation, enhanced onboard computation, autonomous charging, and up to 6 hours of runtime. This platform can support many of the tasks performed by significantly more expensive service robots. For compatibility with existing software packages, the service robot runs the Robot Operating System (ROS).

link (url) [BibTex]


IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

[BibTex]

2015


Haptic Textures for Online Shopping

Culbertson, H., Kuchenbecker, K. J.

Interactive demonstrations in The Retail Collective exhibit, presented at the Dx3 Conference in Toronto, Canada, March 2015 (misc)

[BibTex]

2014


Teaching Forward and Inverse Kinematics of Robotic Manipulators Via MATLAB

Wong, D., Dames, P., Kuchenbecker, K. J.

Presented at the ICRA Workshop on MATLAB/Simulink for Robotics Education and Research, June 2014, Oral presentation given by Dames and Wong (misc)

[BibTex]


Control of a Virtual Robot with Fingertip Contact, Pressure, Vibrotactile, and Grip Force Feedback

Pierce, R. M., Fedalei, E. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


A Modular Tactile Motion Guidance System

Kuchenbecker, K. J., Anon, A. M., Barkin, T., deVillafranca, K., Lo, M.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


The Penn Haptic Texture Toolkit

Culbertson, H., Delgado, J. J. L., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]

2009


Displaying Realistic Contact Accelerations Via a Dedicated Vibration Actuator

McMahan, W., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference, Proc. IEEE World Haptics Conference, pp. 613–614, Salt Lake City, Utah, USA, March 2009, Best Demonstration Award (misc)

[BibTex]


The iTorqU 1.0 and 2.0

Winfree, K. N., Gewirtz, J., Mather, T., Fiene, J., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Vibrotactile Feedback System for Intuitive Upper-Limb Rehabilitation

Kapur, P., Premakumar, S., Jax, S. A., Buxbaum, L. J., Dawson, A. M., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference, Proc. IEEE World Haptics Conference, pp. 621–622, Salt Lake City, Utah, USA, March 2009 (misc)

[BibTex]


The SlipGlove

Romano, J. M., Gray, S. R., Jacobs, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Real-Time Graphic and Haptic Simulation of Deformable Tissue Puncture

Romano, J. M., Safonova, A., Kuchenbecker, K. J.

Hands-on demonstration presented at Medicine Meets Virtual Reality, Long Beach, California, USA, January 2009 (misc)

[BibTex]

2008


The Touch Thimble

Kuchenbecker, K. J., Ferguson, D., Kutzer, M., Moses, M., Okamura, A. M.

Hands-on demonstration presented at IEEE Haptics Symposium, Reno, Nevada, USA, March 2008 (misc)

[BibTex]

2007


Comparing Visual and Haptic Position Feedback

Kuchenbecker, K. J., Gurari, N., Okamura, A. M.

Hands-on demonstration presented at the IEEE World Haptics Conference, Tsukuba, Japan, March 2007 (misc)

[BibTex]