

2016


Qualitative User Reactions to a Hand-Clapping Humanoid Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 317-327, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]


Designing and Assessing Expressive Open-Source Faces for the Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 340-350, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]


Rhythmic Timing in Playful Human-Robot Social Motor Coordination

Fitter, N. T., Hawkes, D. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 296-305, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]


An electro-active polymer based lens module for dynamically varying focal system

Yun, S., Park, S., Nam, S., Park, B., Park, S. K., Mun, S., Lim, J. M., Kyung, K.

Applied Physics Letters, 109(14):141908, October 2016 (article)

Abstract
We demonstrate a polymer-based active-lens module that enables a dynamically focus-controllable optical system with a wide tunable range. The active-lens module is composed of two parallelized active-lenses with a convex and a concave hemispherical lens structure, respectively. Under operation with dynamic input voltage signals, each active-lens produces bidirectional translational movement in response to a hybrid driving force that combines the electro-active response of a thin dielectric elastomer membrane with an electrostatic attraction force. Since the proposed active-lens module widely modulates the gap distance between lens elements, an optical system based on the active-lens module provides widely variable focusing for selective imaging of objects at arbitrary positions.

link (url) DOI [BibTex]


Using IMU Data to Demonstrate Hand-Clapping Games to a Robot

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pages: 851 - 856, October 2016, Interactive presentation given by Fitter (inproceedings)

[BibTex]


Wrinkle structures formed by formulating UV-crosslinkable liquid prepolymers

Park, S. K., Kwark, Y., Nam, S., Park, S., Park, B., Yun, S., Moon, J., Lee, J., Yu, B., Kyung, K.

Polymer, 99, pages: 447-452, September 2016 (article)

Abstract
Artificial wrinkles have recently been in the spotlight due to their potential use in high-tech applications. A spontaneously wrinkled film can be fabricated from UV-crosslinkable liquid prepolymers. Here, we controlled the wrinkle formation by simply formulating two UV-crosslinkable liquid prepolymers, tetraethylene glycol bis(4-ethenyl-2,3,5,6-tetrafluorophenyl) ether (TEGDSt) and tetraethylene glycol diacrylate (TEGDA). The wrinkles were formed from the TEGDSt/TEGDA formulated prepolymer layers containing up to 30 wt% of TEGDA. The wrinkle formation depended upon the rate of the photo-crosslinking reaction of the formulated prepolymers. The first-order apparent rate constant, k_app, was between ca. 5.7 × 10^−3 and 12.2 × 10^−3 s^−1 for wrinkle formation. The wrinkle structures were modulated within this k_app range, mainly due to variation in the extent of shrinkage of the formulated prepolymer layers with the content of TEGDA.

link (url) DOI [BibTex]


Numerical Investigation of Frictional Forces Between a Finger and a Textured Surface During Active Touch

Khojasteh, B., Janko, M., Visell, Y.

Extended abstract presented in the form of an oral presentation at the 3rd International Conference on BioTribology (ICoBT), London, England, September 2016 (misc)

Abstract
The biomechanics of the human finger pad has been investigated in relation to motor behaviour and sensory function in the upper limb. While the frictional properties of the finger pad are important for grip and grasp function, recent attention has also been given to the roles played by friction when perceiving a surface via sliding contact. Indeed, the mechanics of sliding contact greatly affect stimuli felt by the finger scanning a surface. Past research has shed light on neural mechanisms of haptic texture perception, but the relation with time-resolved frictional contact interactions is unknown. Current biotribological models cannot predict time-resolved frictional forces felt by a finger as it slides on a rough surface. This constitutes a missing link in understanding the mechanical basis of texture perception. To ameliorate this, we developed a two-dimensional finite element numerical simulation of a human finger pad in sliding contact with a textured surface. Our model captures bulk mechanical properties, including hyperelasticity, dissipation, and tissue heterogeneity, and contact dynamics. To validate it, we utilized a database of measurements that we previously captured with a variety of human fingers and surfaces. By designing the simulations to match the measurements, we evaluated the ability of the FEM model to predict time-resolved sliding frictional forces. We varied surface texture wavelength, sliding speed, and normal forces in the experiments. An analysis of the results indicated that both time- and frequency-domain features of forces produced during finger-surface sliding interactions were reproduced, including many of the phenomena that we observed in analyses of real measurements, including quasiperiodicity, harmonic distortion and spectral decay in the frequency domain, and their dependence on kinetics and surface properties. The results shed light on frictional signatures of surface texture during active touch, and may inform understanding of the role played by friction in texture discrimination.

[BibTex]


ProtonPack: A Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), pages: 58-65, 2016, Oral presentation given by Burka (inproceedings)

Project Page [BibTex]


Equipping the Baxter Robot with Human-Inspired Hand-Clapping Skills

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pages: 105-112, 2016 (inproceedings)

[BibTex]


Behavioral Analysis Automation for Music-Based Robotic Therapy for Children with Autism Spectrum Disorder

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Workshop paper (5 pages) at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract
In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Motion Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our skeletal tracking Kinect algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future works.

link (url) [BibTex]


Reproducing a Laser Pointer Dot on a Secondary Projected Screen

Hu, S., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Advanced Intelligent Mechatronics (AIM), pages: 1645-1650, 2016, Oral presentation given by Hu (inproceedings)

[BibTex]


Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

Extended abstract presented at the Biomechanics and Neural Control of Movement Conference (BANCOM), 2016, Poster presentation given by Fjeld (misc)

Project Page [BibTex]


Deep Learning for Tactile Understanding From Visual and Haptic Data

Gao, Y., Hendricks, L. A., Kuchenbecker, K. J., Darrell, T.

In Proceedings of the IEEE International Conference on Robotics and Automation, pages: 536-543, May 2016, Oral presentation given by Gao (inproceedings)

[BibTex]


Robust Tactile Perception of Artificial Tumors Using Pairwise Comparisons of Sensor Array Readings

Hui, J. C. T., Block, A. E., Taylor, C. J., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 305-312, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Hui (inproceedings)

[BibTex]


Data-Driven Comparison of Four Cutaneous Displays for Pinching Palpation in Robotic Surgery

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 147-154, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Brown (inproceedings)

[BibTex]


Multisensory Robotic Therapy through Motion Capture and Imitation for Children with ASD

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Proceedings of the ASEE Spring 2016 Middle Atlantic Section Conference, April 2016 (conference)

Abstract
It is known that children with autism have difficulty with emotional communication. As the population of children with autism increases, it is crucial we create effective therapeutic programs that will improve their communication skills. We present an interactive robotic system that delivers emotional and social behaviors for multisensory therapy for children with autism spectrum disorders. Our framework includes emotion-based robotic gestures and facial expressions, as well as tracking and understanding the child's responses through Kinect motion capture.

link (url) [BibTex]


Design and Implementation of a Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 350-352, April 2016, Work-in-progress paper. Poster presentation given by Burka (inproceedings)

Project Page [BibTex]


Objective assessment of robotic surgical skill using instrument contact vibrations

Gomez, E. D., Aggarwal, R., McMahan, W., Bark, K., Kuchenbecker, K. J.

Surgical Endoscopy, 30(4):1419-1431, 2016 (article)

[BibTex]


One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)

[BibTex]


Multisensory robotic therapy to promote natural emotional interaction for children with ASD

Burns, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 571-571, March 2016 (conference)

Abstract
In this video submission, we introduce two robots, Romo the penguin and Darwin Mini. We have programmed these robots to perform a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.

link (url) DOI [BibTex]


Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Burns, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

In Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 421-422, March 2016 (inproceedings)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.

link (url) DOI [BibTex]


Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

IEEE Transactions on Biomedical Engineering, 63(2):278-287, February 2016 (article)

[BibTex]


Structure modulated electrostatic deformable mirror for focus and geometry control

Nam, S., Park, S., Yun, S., Park, B., Park, S. K., Kyung, K.

Optics Express, 24(1):55-66, OSA, January 2016 (article)

Abstract
We suggest a way to electrostatically control the deformed geometry of an electrostatic deformable mirror (EDM) based on geometric modulation of a basement. The EDM is composed of a metal-coated elastomeric membrane (active mirror) and a polymeric basement with an electrode (ground). When an electrical voltage is applied across the components, the active mirror deforms toward the stationary basement in response to the electrostatic attraction force in an air gap. Since the differentiated gap distance can induce change in the electrostatic force distribution between the active mirror and the basement, the EDMs are capable of controlling the deformed geometry of the active mirror with different basement structures (concave, flat, and protrusive). The modulation of the deformed geometry leads to significant change in the range of the focal length of the EDMs. Even under dynamic operation, the EDM shows deformation that is fairly consistent and large enough to change the focal length over a wide frequency range (1–175 Hz). The geometric modulation of the active mirror with dynamic focus tunability can allow the EDM to serve as an active mirror lens for optical zoom devices as well as an optical component controlling field of view.

link (url) DOI [BibTex]


Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

2016, Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (misc)

Project Page [BibTex]


Psychophysical Power Optimization of Friction Modulation for Tactile Interfaces

Sednaoui, T., Vezzoli, E., Gueorguiev, D., Amberg, M., Chappaz, C., Lemaire-Semail, B.

In Haptics: Perception, Devices, Control, and Applications, pages: 354-362, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
Ultrasonic vibration and electrovibration can modulate the friction between a surface and a sliding finger. The power consumption of these devices is critical to their integration in modern mobile devices such as smartphones. This paper presents a simple control solution that reduces this power consumption by up to 68.8% by taking advantage of the limits of human perception.

[BibTex]


Effect of Waveform in Haptic Perception of Electrovibration on Touchscreens

Vardar, Y., Güçlü, B., Basdogan, C.

In Haptics: Perception, Devices, Control, and Applications, pages: 190-203, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
The perceived intensity of electrovibration can be altered by modulating the amplitude, frequency, and waveform of the input voltage signal applied to the conductive layer of a touchscreen. Even though the effect of the first two has been already investigated for sinusoidal signals, we are not aware of any detailed study investigating the effect of the waveform on our haptic perception in the domain of electrovibration. This paper investigates how input voltage waveform affects our haptic perception of electrovibration on touchscreens. We conducted absolute detection experiments using square wave and sinusoidal input signals at seven fundamental frequencies (15, 30, 60, 120, 240, 480 and 1920 Hz). Experimental results depicted the well-known U-shaped tactile sensitivity across frequencies. However, the sensory thresholds were lower for the square wave than the sinusoidal wave at fundamental frequencies less than 60 Hz while they were similar at higher frequencies. Using an equivalent circuit model of a finger-touchscreen system, we show that the sensation difference between the waveforms at low fundamental frequencies can be explained by frequency-dependent electrical properties of human skin and the differential sensitivity of mechanoreceptor channels to individual frequency components in the electrostatic force. As a matter of fact, when the electrostatic force waveforms are analyzed in the frequency domain based on human vibrotactile sensitivity data from the literature [15], the electrovibration stimuli caused by square-wave input signals at all the tested frequencies in this study are found to be detected by the Pacinian psychophysical channel.

vardar_eurohaptics_2016 [BibTex]


Peripheral vs. central determinants of vibrotactile adaptation

Klöcker, A., Gueorguiev, D., Thonnard, J. L., Mouraux, A.

Journal of Neurophysiology, 115(2):685-691, 2016, PMID: 26581868 (article)

Abstract
Long-lasting mechanical vibrations applied to the skin induce a reversible decrease in the perception of vibration at the stimulated skin site. This phenomenon of vibrotactile adaptation has been studied extensively, yet there is still no clear consensus on the mechanisms leading to vibrotactile adaptation. In particular, the respective contributions of 1) changes affecting mechanical skin impedance, 2) peripheral processes, and 3) central processes are largely unknown. Here we used direct electrical stimulation of nerve fibers to bypass mechanical transduction processes and thereby explore the possible contribution of central vs. peripheral processes to vibrotactile adaptation. Three experiments were conducted. In the first, adaptation was induced with mechanical vibration of the fingertip (51- or 251-Hz vibration delivered for 8 min, at 40× detection threshold). In the second, we attempted to induce adaptation with transcutaneous electrical stimulation of the median nerve (51- or 251-Hz constant-current pulses delivered for 8 min, at 1.5× detection threshold). Vibrotactile detection thresholds were measured before and after adaptation. Mechanical stimulation induced a clear increase of vibrotactile detection thresholds. In contrast, thresholds were unaffected by electrical stimulation. In the third experiment, we assessed the effect of mechanical adaptation on the detection thresholds to transcutaneous electrical nerve stimuli, measured before and after adaptation. Electrical detection thresholds were unaffected by the mechanical adaptation. Taken together, our results suggest that vibrotactile adaptation is predominantly the consequence of peripheral mechanoreceptor processes and/or changes in biomechanical properties of the skin.

link (url) DOI [BibTex]


Silent Expectations: Dynamic Causal Modeling of Cortical Prediction and Attention to Sounds That Weren’t

Chennu, S., Noreika, V., Gueorguiev, D., Shtyrov, Y., Bekinschtein, T. A., Henson, R.

Journal of Neuroscience, 36(32):8305-8316, Society for Neuroscience, 2016 (article)

Abstract
There is increasing evidence that human perception is realized by a hierarchy of neural processes in which predictions sent backward from higher levels result in prediction errors that are fed forward from lower levels, to update the current model of the environment. Moreover, the precision of prediction errors is thought to be modulated by attention. Much of this evidence comes from paradigms in which a stimulus differs from that predicted by the recent history of other stimuli (generating a so-called "mismatch response"). There is less evidence from situations where a prediction is not fulfilled by any sensory input (an "omission" response). This situation arguably provides a more direct measure of "top-down" predictions in the absence of confounding "bottom-up" input. We applied Dynamic Causal Modeling of evoked electromagnetic responses recorded by EEG and MEG to an auditory paradigm in which we factorially crossed the presence versus absence of "bottom-up" stimuli with the presence versus absence of "top-down" attention. Model comparison revealed that both mismatch and omission responses were mediated by increased forward and backward connections, differing primarily in the driving input. In both responses, modeling results suggested that the presence of attention selectively modulated backward "prediction" connections. Our results provide new model-driven evidence of the pure top-down prediction signal posited in theories of hierarchical perception, and highlight the role of attentional precision in strengthening this prediction.

SIGNIFICANCE STATEMENT: Human auditory perception is thought to be realized by a network of neurons that maintain a model of and predict future stimuli. Much of the evidence for this comes from experiments where a stimulus unexpectedly differs from previous ones, which generates a well-known "mismatch response." But what happens when a stimulus is unexpectedly omitted altogether? By measuring the brain's electromagnetic activity, we show that it also generates an "omission response" that is contingent on the presence of attention. We model these responses computationally, revealing that mismatch and omission responses only differ in the location of inputs into the same underlying neuronal network. In both cases, we show that attention selectively strengthens the brain's prediction of the future.

link (url) DOI [BibTex]


Touch uses frictional cues to discriminate flat materials

Gueorguiev, D., Bochereau, S., Mouraux, A., Hayward, V., Thonnard, J.

Scientific Reports, 6, pages: 25553, Nature Publishing Group, 2016 (article)

[BibTex]


IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

2016, Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (misc)

[BibTex]

2015


Reducing Student Anonymity and Increasing Engagement

Kuchenbecker, K. J.

University of Pennsylvania Almanac, 62(18):8, November 2015 (article)

[BibTex]


Surgeons and Non-Surgeons Prefer Haptic Feedback of Instrument Vibrations During Robotic Surgery

Koehn, J. K., Kuchenbecker, K. J.

Surgical Endoscopy, 29(10):2970-2983, October 2015 (article)

[BibTex]


Displaying Sensed Tactile Cues with a Fingertip Haptic Device

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 8(4):384-396, October 2015 (article)

[BibTex]


A thin film active-lens with translational control for dynamically programmable optical zoom

Yun, S., Park, S., Park, B., Nam, S., Park, S. K., Kyung, K.

Applied Physics Letters, 107(8):081907, AIP Publishing, August 2015 (article)

Abstract
We demonstrate a thin film active-lens for rapidly and dynamically controllable optical zoom. The active-lens is composed of a convex hemispherical polydimethylsiloxane (PDMS) lens structure working as an aperture and a dielectric elastomer (DE) membrane actuator, which is a combination of a thin DE layer made with PDMS and a compliant electrode pattern using silver nanowires. The active-lens is capable of dynamically changing the focal point of the soft aperture by as much as 18.4% through its vertical translational movement in response to the electrically induced bulged-up deformation of the DE membrane actuator. Under operation with various sinusoidal voltage signals, the movement responses are fairly consistent with those estimated from numerical simulation. The responses are not only fast, fairly reversible, and highly durable during continuous cyclic operations, but also large enough to impart dynamic focus tunability for optical zoom in microscopic imaging devices with a light-weight and ultra-slim configuration.

link (url) DOI [BibTex]


Toward a large-scale visuo-haptic dataset for robotic learning

Burka, A., Hu, S., Krishnan, S., Kuchenbecker, K. J., Hendricks, L. A., Gao, Y., Darrell, T.

In Proc. CVPR Workshop on the Future of Datasets in Vision, 2015 (inproceedings)

Project Page [BibTex]


Detecting Lumps in Simulated Tissue via Palpation with a BioTac

Hui, J., Block, A., Kuchenbecker, K. J.

In Proc. IEEE World Haptics Conference, 2015, Work-in-progress paper. Poster presentation given by Hui (inproceedings)

[BibTex]


Analysis of the Instrument Vibrations and Contact Forces Caused by an Expert Robotic Surgeon Doing FRS Tasks

Brown, J. D., O’Brien, C., Miyasaka, K., Dumon, K. R., Kuchenbecker, K. J.

In Proc. Hamlyn Symposium on Medical Robotics, pages: 75-76, London, England, June 2015, Poster presentation given by Brown (inproceedings)

[BibTex]


Should Haptic Texture Vibrations Respond to User Force and Speed?

Culbertson, H., Kuchenbecker, K. J.

In IEEE World Haptics Conference, pages: 106 - 112, Evanston, Illinois, USA, June 2015, Oral presentation given by Culbertson (inproceedings)

[BibTex]


Enabling the Baxter Robot to Play Hand-Clapping Games

Fitter, N. T., Neuburger, M., Kuchenbecker, K. J.

In Proc. IEEE World Haptics Conference, June 2015, Work-in-progress paper. Poster presentation given by Fitter (inproceedings)

[BibTex]


Data-Driven Motion Mappings Improve Transparency in Teleoperation

Khurshid, R. P., Kuchenbecker, K. J.

Presence: Teleoperators and Virtual Environments, 24(2):132-154, May 2015 (article)

[BibTex]


Using IMU Data to Teach a Robot Hand-Clapping Games

Fitter, N. T., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 353-355, April 2015, Work-in-progress paper. Poster presentation given by Fitter (inproceedings)

[BibTex]


Haptic Feedback in Transoral Robotic Surgery: A Feasibility Study

Bur, A. M., Gomez, E. D., Rassekh, C. H., Newman, J. G., Weinstein, G. S., Kuchenbecker, K. J.

In Proc. Annual Meeting of the Triological Society at COSM, April 2015, Poster presentation given by Bur (inproceedings)

[BibTex]


Haptic Textures for Online Shopping

Culbertson, H., Kuchenbecker, K. J.

Interactive demonstrations in The Retail Collective exhibit, presented at the Dx3 Conference in Toronto, Canada, March 2015 (misc)

[BibTex]


Design and Validation of a Practical Simulator for Transoral Robotic Surgery

Bur, A. M., Gomez, E. D., Chalian, A. A., Newman, J. G., Weinstein, G. S., Kuchenbecker, K. J.

In Proc. Society for Robotic Surgery Annual Meeting: Transoral Program, (T8), February 2015, Oral presentation given by Bur (inproceedings)

[BibTex]


Robotic Learning of Haptic Adjectives Through Physical Interaction

Chu, V., McMahon, I., Riano, L., McDonald, C. G., He, Q., Perez-Tejada, J. M., Arrigo, M., Darrell, T., Kuchenbecker, K. J.

Robotics and Autonomous Systems, 63(3):279-292, 2015, Vivian Chu, Ian McMahon, and Lorenzo Riano contributed equally to this publication. Corrigendum published in June 2016 (article)

[BibTex]


Effects of Vibrotactile Feedback on Human Motor Learning of Arbitrary Arm Motions

Bark, K., Hyman, E., Tan, F., Cha, E., Jax, S. A., Buxbaum, L. J., Kuchenbecker, K. J.

IEEE Transactions on Neural Systems and Rehabilitation Engineering, 23(1):51-63, January 2015 (article)

[BibTex]

2009


Image-Enabled Force Feedback for Robotic Teleoperation of a Flexible Tool

Lindsey, Q., Tenenholtz, N., Lee, D. I., Kuchenbecker, K. J.

In Proc. IASTED International Conference on Robotics and Applications, pages: 224-233, Boston, Massachusetts, November 2009, Oral presentation given by Lindsey (inproceedings)

[BibTex]


GPU Methods for Real-Time Haptic Interaction with 3D Fluids

Yang, M., Lu, J., Safonova, A., Kuchenbecker, K. J.

In Proc. IEEE International Workshop on Haptic Audio-Visual Environments and Games, pages: 24-29, Lecco, Italy, November 2009, Oral presentation given by Kuchenbecker (inproceedings)

[BibTex]


The AirWand: Design and Characterization of a Large-Workspace Haptic Device

Romano, J. M., Kuchenbecker, K. J.

In Proc. IEEE International Conference on Robotics and Automation, pages: 1461-1466, Kobe, Japan, May 2009, Oral presentation given by Romano (inproceedings)

[BibTex]


Stiffness Discrimination with Visual and Proprioceptive Cues

Gurari, N., Kuchenbecker, K. J., Okamura, A. M.

In Proc. IEEE World Haptics Conference, pages: 121-126, Salt Lake City, Utah, USA, March 2009, Poster presentation given by Gurari (inproceedings)

[BibTex]