

2016


Qualitative User Reactions to a Hand-Clapping Humanoid Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 317-327, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]


Designing and Assessing Expressive Open-Source Faces for the Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 340-350, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]


Rhythmic Timing in Playful Human-Robot Social Motor Coordination

Fitter, N. T., Hawkes, D. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 296-305, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)

[BibTex]


Using IMU Data to Demonstrate Hand-Clapping Games to a Robot

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pages: 851 - 856, October 2016, Interactive presentation given by Fitter (inproceedings)

[BibTex]


ProtonPack: A Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), pages: 58-65, 2016, Oral presentation given by Burka (inproceedings)

Project Page [BibTex]


Equipping the Baxter Robot with Human-Inspired Hand-Clapping Skills

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pages: 105-112, 2016 (inproceedings)

[BibTex]


Reproducing a Laser Pointer Dot on a Secondary Projected Screen

Hu, S., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Advanced Intelligent Mechatronics (AIM), pages: 1645-1650, 2016, Oral presentation given by Hu (inproceedings)

[BibTex]


Deep Learning for Tactile Understanding From Visual and Haptic Data

Gao, Y., Hendricks, L. A., Kuchenbecker, K. J., Darrell, T.

In Proceedings of the IEEE International Conference on Robotics and Automation, pages: 536-543, May 2016, Oral presentation given by Gao (inproceedings)

[BibTex]


Robust Tactile Perception of Artificial Tumors Using Pairwise Comparisons of Sensor Array Readings

Hui, J. C. T., Block, A. E., Taylor, C. J., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 305-312, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Hui (inproceedings)

[BibTex]


Data-Driven Comparison of Four Cutaneous Displays for Pinching Palpation in Robotic Surgery

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 147-154, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Brown (inproceedings)

[BibTex]


Multisensory Robotic Therapy through Motion Capture and Imitation for Children with ASD

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Proceedings of the American Society of Engineering Education, Mid-Atlantic Section, Spring Conference, April 2016 (conference)

Abstract
It is known that children with autism have difficulty with emotional communication. As the population of children with autism increases, it is crucial that we create effective therapeutic programs that will improve their communication skills. We present an interactive robotic system that delivers emotional and social behaviors for multisensory therapy for children with autism spectrum disorders. Our framework includes emotion-based robotic gestures and facial expressions, as well as tracking and understanding the child's responses through Kinect motion capture.

link (url) [BibTex]


Design and Implementation of a Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 350-352, April 2016, Work-in-progress paper. Poster presentation given by Burka (inproceedings)

Project Page [BibTex]


Multisensory Robotic Therapy to Promote Natural Emotional Interaction for Children with ASD

Burns, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 571-571, March 2016 (conference)

Abstract
In this video submission, we introduce two robots, Romo the penguin and Darwin Mini. We have programmed these robots to perform a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.

link (url) DOI [BibTex]


Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Burns, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

In Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 421-422, March 2016 (inproceedings)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.

link (url) DOI [BibTex]


Psychophysical Power Optimization of Friction Modulation for Tactile Interfaces

Sednaoui, T., Vezzoli, E., Gueorguiev, D., Amberg, M., Chappaz, C., Lemaire-Semail, B.

In Haptics: Perception, Devices, Control, and Applications, pages: 354-362, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
Ultrasonic vibration and electrovibration can modulate the friction between a surface and a sliding finger. The power consumption of these devices is critical to their integration in modern mobile devices such as smartphones. This paper presents a simple control solution that reduces this power consumption by up to 68.8% by taking advantage of the limits of human perception.

[BibTex]


Effect of Waveform in Haptic Perception of Electrovibration on Touchscreens

Vardar, Y., Güçlü, B., Basdogan, C.

In Haptics: Perception, Devices, Control, and Applications, pages: 190-203, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
The perceived intensity of electrovibration can be altered by modulating the amplitude, frequency, and waveform of the input voltage signal applied to the conductive layer of a touchscreen. Even though the effect of the first two has been already investigated for sinusoidal signals, we are not aware of any detailed study investigating the effect of the waveform on our haptic perception in the domain of electrovibration. This paper investigates how input voltage waveform affects our haptic perception of electrovibration on touchscreens. We conducted absolute detection experiments using square wave and sinusoidal input signals at seven fundamental frequencies (15, 30, 60, 120, 240, 480 and 1920 Hz). Experimental results depicted the well-known U-shaped tactile sensitivity across frequencies. However, the sensory thresholds were lower for the square wave than the sinusoidal wave at fundamental frequencies less than 60 Hz while they were similar at higher frequencies. Using an equivalent circuit model of a finger-touchscreen system, we show that the sensation difference between the waveforms at low fundamental frequencies can be explained by frequency-dependent electrical properties of human skin and the differential sensitivity of mechanoreceptor channels to individual frequency components in the electrostatic force. As a matter of fact, when the electrostatic force waveforms are analyzed in the frequency domain based on human vibrotactile sensitivity data from the literature [15], the electrovibration stimuli caused by square-wave input signals at all the tested frequencies in this study are found to be detected by the Pacinian psychophysical channel.

vardar_eurohaptics_2016 [BibTex]

2008


A GPU-Based Approach for Real-Time Haptic Rendering of 3D Fluids

Yang, M., Lu, J., Zhou, Z., Safonova, A., Kuchenbecker, K. J.

In Proc. SIGGRAPH Asia Conference, Singapore, December 2008, Oral presentation given by Yang (inproceedings)

[BibTex]


A Practice-Integrated Curriculum in Mechanical Engineering

Yim, M., Kuchenbecker, K. J., Arratia, P., Bassani, J., Fiene, J. P., Kumar, V., Lukes, J.

In Proc. ASEE Annual Conference and Exposition, Pittsburgh, Pennsylvania, USA, June 2008, Oral presentation given by Yim (inproceedings)

[BibTex]


Effects of Proprioceptive Motion Feedback on Sighted and Unsighted Control of a Virtual Hand Prosthesis

Blank, A., Okamura, A. M., Kuchenbecker, K. J.

In Proc. IEEE Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages: 141-142, Reno, Nevada, USA, March 2008, Poster presentation given by Blank (inproceedings)

[BibTex]


The Touch Thimble: Providing Fingertip Contact Feedback During Point-Force Haptic Interaction

Kuchenbecker, K. J., Ferguson, D., Kutzer, M., Moses, M., Okamura, A. M.

In Proc. IEEE Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages: 239-246, Reno, Nevada, USA, March 2008, Oral presentation given by Kuchenbecker (inproceedings)

[BibTex]


Haptography: Capturing the Feel of Real Objects to Enable Authentic Haptic Rendering

Kuchenbecker, K. J.

In Proc. Haptic in Ambient Systems (HAS) Workshop, in conjunction with the First International Conference on Ambient Media and Systems, Montreal, Canada, February 2008 (inproceedings)

[BibTex]

2006


Improving Telerobotic Touch Via High-Frequency Acceleration Matching

Kuchenbecker, K. J., Niemeyer, G.

In Proc. IEEE International Conference on Robotics and Automation, pages: 3893-3898, Orlando, Florida, USA, May 2006, Oral presentation given by Kuchenbecker (inproceedings)

[BibTex]


Event-Based Haptic Tapping with Grip Force Compensation

Fiene, J. P., Kuchenbecker, K. J., Niemeyer, G.

In Proc. IEEE Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages: 117-123, Arlington, Virginia, USA, March 2006, Oral presentation given by Fiene (inproceedings)

[BibTex]