

2018


Reducing 3D Vibrations to 1D in Real Time

Park, G., Kuchenbecker, K. J.

pages: 21-24, Hands-on demonstration (4 pages) presented at AsiaHaptics, Incheon, South Korea, November 2018 (misc)

Abstract
For simple and realistic vibrotactile feedback, 3D accelerations from real contact interactions are usually rendered using a single-axis vibration actuator; this dimensional reduction can be performed in many ways. This demonstration implements a real-time conversion system that simultaneously measures 3D accelerations and renders corresponding 1D vibrations using a two-pen interface. In the demonstration, a user freely interacts with various objects using an In-Pen that contains a 3-axis accelerometer. The captured accelerations are converted to a single-axis signal, and an Out-Pen renders the reduced signal for the user to feel. We prepared seven conversion methods from the simple use of a single-axis signal to applying principal component analysis (PCA) so that users can compare the performance of each conversion method in this demonstration.
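One of the conversion methods named in the abstract is PCA-based reduction. As a hedged illustration (not the authors' implementation; all names here are made up), a 3-axis acceleration signal can be collapsed to one axis by projecting it onto its direction of maximum variance:

```python
import numpy as np

def pca_reduce_3d_to_1d(accel_3d: np.ndarray) -> np.ndarray:
    """Project an (N, 3) acceleration signal onto its first principal axis."""
    centered = accel_3d - accel_3d.mean(axis=0)
    # Eigen-decomposition of the 3x3 covariance matrix.
    cov = centered.T @ centered / len(centered)
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal_axis = eigvecs[:, np.argmax(eigvals)]
    return centered @ principal_axis

# Example: a vibration dominated by the x-axis collapses to a 1D signal
# that preserves the dominant-axis energy (sign is arbitrary).
t = np.linspace(0, 1, 1000)
signal = np.column_stack([np.sin(2 * np.pi * 100 * t),
                          0.1 * np.sin(2 * np.pi * 150 * t),
                          np.zeros_like(t)])
reduced = pca_reduce_3d_to_1d(signal)
print(reduced.shape)  # (1000,)
```

In a real-time system this projection would be computed over a sliding window rather than the whole recording.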


DOI Project Page [BibTex]



A Large-Scale Fabric-Based Tactile Sensor Using Electrical Resistance Tomography

Lee, H., Park, K., Kim, J., Kuchenbecker, K. J.

pages: 107-109, Hands-on demonstration (3 pages) presented at AsiaHaptics, Incheon, South Korea, November 2018 (misc)

Abstract
Large-scale tactile sensing is important for household robots and human-robot interaction because contacts can occur all over a robot’s body surface. This paper presents a new fabric-based tactile sensor that is straightforward to manufacture and can cover a large area. The tactile sensor is made of conductive and non-conductive fabric layers, and the electrodes are stitched with conductive thread, so the resulting device is flexible and stretchable. The sensor utilizes internal array electrodes and a reconstruction method called electrical resistance tomography (ERT) to achieve high spatial resolution with a small number of electrodes. Experiments with the developed sensor show that only 16 electrodes can accurately estimate single and multiple contacts over a square that measures 20 cm by 20 cm.
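The ERT reconstruction mentioned above is an inverse problem: boundary-voltage changes are mapped back to a conductivity (contact) image. A minimal sketch, assuming a precomputed sensitivity (Jacobian) matrix and using a one-step Tikhonov-regularized least-squares estimate, which is one common linearized ERT formulation (not necessarily the authors'; the matrix here is a random toy stand-in):

```python
import numpy as np

def ert_reconstruct(J: np.ndarray, dv: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """One-step Tikhonov-regularized estimate of the conductivity change."""
    A = J.T @ J + lam * np.eye(J.shape[1])
    return np.linalg.solve(A, J.T @ dv)

# Toy example: ~208 boundary measurements, a 100-pixel conductivity grid.
rng = np.random.default_rng(0)
n_meas, n_pix = 208, 100
J = rng.standard_normal((n_meas, n_pix))   # stand-in sensitivity matrix
sigma_true = np.zeros(n_pix)
sigma_true[42] = 1.0                       # single simulated contact
dv = J @ sigma_true                        # noiseless forward measurement
sigma_est = ert_reconstruct(J, dv)
print(int(np.argmax(np.abs(sigma_est))))   # recovers pixel 42
```

In a physical sensor, J would come from a finite-element forward model of the conductive fabric, and the regularization weight would be tuned against measurement noise.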


DOI Project Page [BibTex]



Multi-objective Optimization of Nonconventional Laminated Composite Panels

Serhat, G.

Koc University, October 2018 (phdthesis)

Abstract
Laminated composite panels are extensively used in various industries due to their high stiffness-to-weight ratio and directional properties that allow stiffness characteristics to be optimized for specific applications. With recent improvements in manufacturing techniques, the technology trend has been shifting toward nonconventional composites. This work aims to develop new methods for the design and optimization of nonconventional laminated composites. The lamination parameters method is used to characterize laminate stiffness matrices in a compact form. An optimization framework based on finite element analysis was developed to compute solutions for different panel geometries, boundary conditions, and load cases. The first part of the work addresses the multi-objective optimization of composite laminates to simultaneously maximize dynamic and load-carrying performance. Conforming and conflicting behaviors of multiple objective functions are investigated by determining Pareto-optimal solutions, which provide valuable insight into multi-objective optimization problems. In the second part, the design of curved laminated panels for optimal dynamic response is studied in detail. First, the designs yielding maximum fundamental frequency values are computed. Next, optimal designs minimizing equivalent radiated power are obtained for panels under harmonic pressure excitation, and their effective frequency bands are shown. The relationship between these two design sets is investigated to assess the effectiveness of the frequency maximization technique. In the last part, a new method based on lamination parameters is proposed for the design of variable-stiffness composite panels. The results demonstrate that the proposed method provides manufacturable designs with smooth fiber paths that outperform constant-stiffness laminates while utilizing the advantages of the lamination parameters formulation.
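The Pareto-optimality idea in the first part of the thesis can be illustrated generically: from a set of candidate designs scored on two objectives to be maximized (say, fundamental frequency and buckling load), keep only the non-dominated ones. This is a hedged, self-contained sketch with made-up values, not the thesis's optimization framework:

```python
import numpy as np

def pareto_front(scores: np.ndarray) -> np.ndarray:
    """Return indices of non-dominated rows, with all objectives maximized.

    A design is dominated if some other design is at least as good in
    every objective and strictly better in at least one.
    """
    keep = []
    for i, s in enumerate(scores):
        dominated = np.any(np.all(scores >= s, axis=1) &
                           np.any(scores > s, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

designs = np.array([[1.0, 5.0],   # best in objective 2
                    [3.0, 3.0],   # a trade-off design
                    [5.0, 1.0],   # best in objective 1
                    [2.0, 2.0]])  # dominated by [3, 3]
print(pareto_front(designs))  # [0 1 2]
```

Plotting the retained designs in objective space traces the Pareto front along which the two objectives trade off.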


DOI [BibTex]


Statistical Modelling of Fingertip Deformations and Contact Forces during Tactile Interaction

Gueorguiev, D., Tzionas, D., Pacchierotti, C., Black, M. J., Kuchenbecker, K. J.

Extended abstract presented at the Hand, Brain and Technology conference (HBT), Ascona, Switzerland, August 2018 (misc)

Abstract
Little is known about the shape and properties of the human finger during haptic interaction, even though these are essential parameters for controlling wearable finger devices and delivering realistic tactile feedback. This study explores a framework for four-dimensional scanning (3D over time) and modelling of finger-surface interactions, aiming to capture the motion and deformations of the entire finger with high resolution while simultaneously recording the interfacial forces at the contact. Preliminary results show that when the fingertip is actively pressing a rigid surface, it undergoes lateral expansion and proximal/distal bending, deformations that cannot be captured by imaging of the contact area alone. Therefore, we are currently capturing a dataset that will enable us to create a statistical model of the finger’s deformations and predict the contact forces induced by tactile interaction with objects. This technique could improve current methods for tactile rendering in wearable haptic devices, which rely on general physical modelling of the skin’s compliance, by developing an accurate model of the variations in finger properties across the human population. The availability of such a model will also enable a more realistic simulation of virtual finger behaviour in virtual reality (VR) environments, as well as the ability to accurately model a specific user’s finger from lower resolution data. It may also be relevant for inferring the physical properties of the underlying tissue from observing the surface mesh deformations, as previously shown for body tissues.


Project Page [BibTex]



Instrumentation, Data, and Algorithms for Visually Understanding Haptic Surface Properties

Burka, A. L.

University of Pennsylvania, Philadelphia, USA, August 2018, Department of Electrical and Systems Engineering (phdthesis)

Abstract
Autonomous robots need to efficiently walk over varied surfaces and grasp diverse objects. We hypothesize that the association between how such surfaces look and how they physically feel during contact can be learned from a database of matched haptic and visual data recorded from various end-effectors' interactions with hundreds of real-world surfaces. Testing this hypothesis required the creation of a new multimodal sensing apparatus, the collection of a large multimodal dataset, and development of a machine-learning pipeline. This thesis begins by describing the design and construction of the Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short), an untethered handheld sensing device that emulates the capabilities of the human senses of vision and touch. Its sensory modalities include RGBD vision, egomotion, contact force, and contact vibration. Three interchangeable end-effectors (a steel tooling ball, an OptoForce three-axis force sensor, and a SynTouch BioTac artificial fingertip) allow for different material properties at the contact point and provide additional tactile data. We then detail the calibration process for the motion and force sensing systems, as well as several proof-of-concept surface discrimination experiments that demonstrate the reliability of the device and the utility of the data it collects. This thesis then presents a large-scale dataset of multimodal surface interaction recordings, including 357 unique surfaces such as furniture, fabrics, outdoor fixtures, and items from several private and public material sample collections. Each surface was touched with one, two, or three end-effectors, comprising approximately one minute per end-effector of tapping and dragging at various forces and speeds. We hope that the larger community of robotics researchers will find broad applications for the published dataset. 
Lastly, we demonstrate an algorithm that learns to estimate haptic surface properties given visual input. Surfaces were rated on hardness, roughness, stickiness, and temperature by the human experimenter and by a pool of purely visual observers. Then we trained an algorithm to perform the same task as well as infer quantitative properties calculated from the haptic data. Overall, the task of predicting haptic properties from vision alone proved difficult for both humans and computers, but a hybrid algorithm using a deep neural network and a support vector machine achieved correlations between expected and actual regression outputs of approximately ρ = 0.3 to ρ = 0.5 on previously unseen surfaces.


Project Page [BibTex]



Robust Visual Augmented Reality in Robot-Assisted Surgery

Forte, M. P.

Politecnico di Milano, Milan, Italy, July 2018, Department of Electronic, Information, and Biomedical Engineering (mastersthesis)

Abstract
The broader research objective of this line of research is to test the hypothesis that real-time stereo video analysis and augmented reality can increase safety and task efficiency in robot-assisted surgery. This master’s thesis aims to solve the first step needed to achieve this goal: the creation of a robust system that delivers the envisioned feedback to a surgeon while he or she controls a surgical robot that is identical to those used on human patients. Several approaches for applying augmented reality to da Vinci Surgical Systems have been proposed, but none of them entirely rely on a clinical robot; specifically, they require additional sensors, depend on access to the da Vinci API, are designed for a very specific task, or were tested on systems that are starkly different from those in clinical use. There has also been prior work that presents the real-world camera view and the computer graphics on separate screens, or not in real time. In other scenarios, the digital information is overlaid manually by the surgeons themselves or by computer scientists, rather than being generated automatically in response to the surgeon’s actions. We attempted to overcome the aforementioned constraints by acquiring input signals from the da Vinci stereo endoscope and providing augmented reality to the console in real time (less than 150 ms delay, including the 62 ms of inherent latency of the da Vinci). The potential benefits of the resulting system are broad because it was built to be general, rather than customized for any specific task. The entire platform is compatible with any generation of the da Vinci System and does not require a dVRK (da Vinci Research Kit) or access to the API. Thus, it can be applied to existing da Vinci Systems in operating rooms around the world.


Project Page [BibTex]



Reducing 3D Vibrations to 1D in Real Time

Park, G., Kuchenbecker, K. J.

Hands-on demonstration presented at EuroHaptics, Pisa, Italy, June 2018 (misc)

Abstract
In this demonstration, you will hold two pen-shaped modules: an in-pen and an out-pen. The in-pen is instrumented with a high-bandwidth three-axis accelerometer, and the out-pen contains a one-axis voice coil actuator. Use the in-pen to interact with different surfaces; the measured 3D accelerations are continually converted into 1D vibrations and rendered with the out-pen for you to feel. You can test conversion methods that range from simply selecting a single axis to applying a discrete Fourier transform or principal component analysis for realistic and brisk real-time conversion.
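The discrete-Fourier-transform conversion mentioned above can be sketched in a hedged way: combine the per-axis magnitude spectra in quadrature so the 1D output preserves the spectral energy of the 3D input. The phase handling here (reusing the most energetic axis's phase) is an illustrative choice, not necessarily the authors' implementation:

```python
import numpy as np

def dft_reduce(accel_3d: np.ndarray) -> np.ndarray:
    """Reduce an (N, 3) signal to (N,) while preserving total spectral energy."""
    spectra = np.fft.rfft(accel_3d, axis=0)                  # per-axis spectra
    magnitude = np.sqrt(np.sum(np.abs(spectra) ** 2, axis=1))  # combined magnitude
    dominant = np.argmax(np.sum(accel_3d ** 2, axis=0))      # most energetic axis
    phase = np.angle(spectra[:, dominant])
    return np.fft.irfft(magnitude * np.exp(1j * phase), n=len(accel_3d))

t = np.linspace(0, 1, 1000, endpoint=False)
accel = np.column_stack([np.sin(2 * np.pi * 50 * t),
                         np.cos(2 * np.pi * 50 * t),
                         np.zeros_like(t)])
out = dft_reduce(accel)
print(out.shape)  # (1000,)
```

By Parseval's theorem, the 1D output's total energy equals the summed energy of the three input axes, which is one way to make the rendered vibration feel as strong as the measured one.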


Project Page [BibTex]



Haptipedia: Exploring Haptic Device Design Through Interactive Visualizations

Seifi, H., Fazlollahi, F., Park, G., Kuchenbecker, K. J., MacLean, K. E.

Hands-on demonstration presented at EuroHaptics, Pisa, Italy, June 2018 (misc)

Abstract
How many haptic devices have been proposed in the last 30 years? How can we leverage this rich source of design knowledge to inspire future innovations? Our goal is to make historical haptic invention accessible through interactive visualization of a comprehensive library – a Haptipedia – of devices that have been annotated with designer-relevant metadata. In this demonstration, participants can explore Haptipedia’s growing library of grounded force feedback devices through several prototype visualizations, interact with 3D simulations of the device mechanisms and movements, and tell us about the attributes and devices that could make Haptipedia a useful resource for the haptic design community.


Project Page [BibTex]



Delivering 6-DOF Fingertip Tactile Cues

Young, E., Kuchenbecker, K. J.

Work-in-progress paper (5 pages) presented at EuroHaptics, Pisa, Italy, June 2018 (misc)


Project Page [BibTex]



Designing a Haptic Empathetic Robot Animal for Children with Autism

Burns, R., Kuchenbecker, K. J.

Workshop paper (4 pages) presented at the Robotics: Science and Systems Workshop on Robot-Mediated Autism Intervention: Hardware, Software and Curriculum, Pittsburgh, USA, June 2018 (misc)

Abstract
Children with autism often endure sensory overload, may be nonverbal, and have difficulty understanding and relaying emotions. These experiences result in heightened stress during social interaction. Animal-assisted intervention has been found to improve the behavior of children with autism during social interaction, but live animal companions are not always feasible. We are thus in the process of designing a robotic animal to mimic some successful characteristics of animal-assisted intervention while trying to improve on others. The over-arching hypothesis of this research is that an appropriately designed robot animal can reduce stress in children with autism and empower them to engage in social interaction.


link (url) Project Page [BibTex]



Soft Multi-Axis Boundary-Electrode Tactile Sensors for Whole-Body Robotic Skin

Lee, H., Kim, J., Kuchenbecker, K. J.

Workshop paper (2 pages) presented at the RSS Pioneers Workshop, Pittsburgh, USA, June 2018 (misc)


Project Page [BibTex]



Arm-Worn Tactile Displays

Kuchenbecker, K. J.

Cross-Cutting Challenge Interactive Discussion presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Fingertips and hands captivate the attention of most haptic interface designers, but humans can feel touch stimuli across the entire body surface. Trying to create devices that both can be worn and can deliver good haptic sensations raises challenges that rarely arise in other contexts. Most notably, tactile cues such as vibration, tapping, and squeezing are far simpler to implement in wearable systems than kinesthetic haptic feedback. This interactive discussion will present a variety of relevant projects to which I have contributed, attempting to pull out common themes and ideas for the future.


[BibTex]



Haptipedia: An Expert-Sourced Interactive Device Visualization for Haptic Designers

Seifi, H., MacLean, K. E., Kuchenbecker, K. J., Park, G.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Much of three decades of haptic device invention is effectively lost to today’s designers: dispersion across time, region, and discipline imposes an incalculable drag on innovation in this field. Our goal is to make historical haptic invention accessible through interactive navigation of a comprehensive library – a Haptipedia – of devices that have been annotated with designer-relevant metadata. To build this open resource, we will systematically mine the literature and engage the haptics community for expert annotation. In a multi-year broad-based initiative, we will empirically derive salient attributes of haptic devices, design an interactive visualization tool where device creators and repurposers can efficiently explore and search Haptipedia, and establish methods and tools to manually and algorithmically collect data from the haptics literature and our community of experts. This paper outlines progress in compiling an initial corpus of grounded force-feedback devices and their attributes, and it presents a concept sketch of the interface we envision.


Project Page [BibTex]



Exercising with Baxter: Design and Evaluation of Assistive Social-Physical Human-Robot Interaction

Fitter, N. T., Mohan, M., Kuchenbecker, K. J., Johnson, M. J.

Workshop paper (6 pages) presented at the HRI Workshop on Personal Robots for Exercising and Coaching, Chicago, USA, March 2018 (misc)

Abstract
The worldwide population of older adults is steadily increasing and will soon exceed the capacity of assisted living facilities. Accordingly, we aim to understand whether appropriately designed robots could help older adults stay active and engaged while living at home. We developed eight human-robot exercise games for the Baxter Research Robot with the guidance of experts in game design, therapy, and rehabilitation. After extensive iteration, these games were employed in a user study that tested their viability with 20 younger and 20 older adult users. All participants were willing to enter Baxter’s workspace and physically interact with the robot. User trust and confidence in Baxter increased significantly between pre- and post-experiment assessments, and one individual from the target user population supplied us with abundant positive feedback about her experience. The preliminary results presented in this paper indicate potential for the use of two-armed human-scale robots for social-physical exercise interaction.


link (url) Project Page [BibTex]



Defining Social Robot Roles in Rehabilitation Based on Observed Patient-Therapist Interactions in Stroke Therapy

Johnson, M., Mohan, M., Mendonca, R.

Workshop paper (2 pages) presented at the HRI Workshop on Social Robots in Therapy: Focusing on Autonomy and Ethical Challenges, Chicago, USA, March 2018 (misc)

Abstract
In this paper, we discuss how to improve robot-patient interactions in task-oriented stroke therapy, which currently do not accurately model real-world therapist-patient interactions. From observations of patient-therapist interactions captured in 8 videos of task-oriented stroke therapy, we describe three dyads of interactions in which the therapist and the patient take on a set of acting states or roles and are motivated to move from one role to another when certain physical or verbal stimuli or cues are sensed and received. We propose a possible model for robot-patient interaction and discuss challenges to its implementation, including the associated ethical concerns.


link (url) [BibTex]



Emotionally Supporting Humans Through Robot Hugs

Block, A. E., Kuchenbecker, K. J.

Workshop paper (2 pages) presented at the HRI Pioneers Workshop, Chicago, USA, March 2018 (misc)

Abstract
Hugs are one of the first forms of contact and affection humans experience. Due to their prevalence and health benefits, we want to enable robots to safely hug humans. This research strives to create and study a high fidelity robotic system that provides emotional support to people through hugs. This paper outlines our previous work evaluating human responses to a prototype’s physical and behavioral characteristics, and then it lays out our ongoing and future work.


link (url) DOI Project Page [BibTex]



Towards a Statistical Model of Fingertip Contact Deformations from 4D Data

Gueorguiev, D., Tzionas, D., Pacchierotti, C., Black, M. J., Kuchenbecker, K. J.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Little is known about the shape and properties of the human finger during haptic interaction, even though this knowledge is essential for controlling wearable finger devices and delivering realistic tactile feedback. This study explores a framework for four-dimensional scanning and modeling of finger-surface interactions, aiming to capture the motion and deformations of the entire finger with high resolution. The results show that when the fingertip is actively pressing a rigid surface, it undergoes lateral expansion of about 0.2 cm and proximal/distal bending of about 30°, deformations that cannot be captured by imaging of the contact area alone. This project constitutes a first step towards an accurate statistical model of the finger’s behavior during haptic interaction.


link (url) Project Page [BibTex]



Can Humans Infer Haptic Surface Properties from Images?

Burka, A., Kuchenbecker, K. J.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, San Francisco, USA, March 2018 (misc)

Abstract
Human children typically experience their surroundings both visually and haptically, providing ample opportunities to learn rich cross-sensory associations. To thrive in human environments and interact with the real world, robots also need to build models of these cross-sensory associations; current advances in machine learning should make it possible to infer models from large amounts of data. We previously built a visuo-haptic sensing device, the Proton Pack, and are using it to collect a large database of matched multimodal data from tool-surface interactions. As a benchmark to compare with machine learning performance, we conducted a human subject study (n = 84) on estimating haptic surface properties (here: hardness, roughness, friction, and warmness) from images. Using a 100-surface subset of our database, we showed images to study participants and collected 5635 ratings of the four haptic properties, which we compared with ratings made by the Proton Pack operator and with physical data recorded using motion, force, and vibration sensors. Preliminary results indicate weak correlation between participant and operator ratings, but potential for matching up certain human ratings (particularly hardness and roughness) with features from the literature.
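The "weak correlation between participant and operator ratings" reported above is the kind of statistic typically computed with a rank correlation. As a generic sketch (not the authors' analysis code, and with made-up ratings), Spearman's rho is just the Pearson correlation of the ranks:

```python
import numpy as np

def spearman_rho(x, y) -> float:
    """Spearman rank correlation (ties ignored for simplicity)."""
    rx = np.argsort(np.argsort(x)).astype(float)  # rank of each value
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical hardness ratings for five surfaces.
participant = [3.1, 4.0, 2.2, 5.0, 1.0]
operator    = [2.9, 4.5, 2.0, 4.8, 1.5]
print(round(spearman_rho(participant, operator), 2))  # 1.0 (same ordering)
```

Real survey data contain ties, so a production analysis would use average ranks (e.g. `scipy.stats.spearmanr`) rather than this simplified ranking.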


Project Page [BibTex]



Probabilistic Approaches to Stochastic Optimization

Mahsereci, M.

Eberhard Karls Universität Tübingen, Germany, 2018 (phdthesis)


link (url) Project Page [BibTex]



Tactile perception by electrovibration

Vardar, Y.

Koc University, 2018 (phdthesis)

Abstract
One approach to generating realistic haptic feedback on touch screens is electrovibration. In this technique, the friction force is altered via electrostatic forces, which are generated by applying an alternating voltage signal to the conductive layer of a capacitive touchscreen. Although the technology for rendering haptic effects on touch surfaces using electrovibration is already in place, our knowledge of the perception mechanisms behind these effects is limited. This thesis explores the mechanisms underlying haptic perception of electrovibration in two parts. In the first part, the effect of input signal properties on electrovibration perception is investigated. Our findings indicate that the perception of electrovibration stimuli depends on the frequency-dependent electrical properties of human skin and on human tactile sensitivity. When a voltage signal is applied to a touchscreen, it is filtered electrically by the human finger, and it generates electrostatic forces in the skin and mechanoreceptors. Depending on the spectral energy content of this electrostatic force signal, different psychophysical channels may be activated; the channel that mediates detection is determined by the frequency component whose energy exceeds the sensory threshold at that frequency. In the second part, the effect of masking on electrovibration perception is investigated. We show that detection thresholds are elevated as linear functions of masking level for both simultaneous and pedestal masking, with pedestal masking being the more effective of the two. Moreover, our results suggest that sharpness perception depends on the local contrast between background and foreground stimuli, which varies as a function of masking amplitude and the activation levels of frequency-dependent psychophysical channels.


[BibTex]


Probabilistic Ordinary Differential Equation Solvers — Theory and Applications

Schober, M.

Eberhard Karls Universität Tübingen, Germany, 2018 (phdthesis)


[BibTex]


2016


Quantifying Therapist Practitioner Roles Using Video-based Analysis: Can We Reliably Model Therapist-Patient Interactions During Task-Oriented Therapy?

Mendonca, R., Johnson, M. J., Laskin, S., Adair, L., Mohan, M.

pages: E55-E56, Abstract in the Archives of Physical Medicine and Rehabilitation, October 2016 (misc)


DOI [BibTex]



no image
Numerical Investigation of Frictional Forces Between a Finger and a Textured Surface During Active Touch

Khojasteh, B., Janko, M., Visell, Y.

Extended abstract presented in form of an oral presentation at the 3rd International Conference on BioTribology (ICoBT), London, England, September 2016 (misc)

Abstract
The biomechanics of the human finger pad has been investigated in relation to motor behaviour and sensory function in the upper limb. While the frictional properties of the finger pad are important for grip and grasp function, recent attention has also been given to the roles played by friction when perceiving a surface via sliding contact. Indeed, the mechanics of sliding contact greatly affect stimuli felt by the finger scanning a surface. Past research has shed light on neural mechanisms of haptic texture perception, but the relation with time-resolved frictional contact interactions is unknown. Current biotribological models cannot predict time-resolved frictional forces felt by a finger as it slides on a rough surface, which constitutes a missing link in understanding the mechanical basis of texture perception. To ameliorate this, we developed a two-dimensional finite element numerical simulation of a human finger pad in sliding contact with a textured surface. Our model captures bulk mechanical properties, including hyperelasticity, dissipation, and tissue heterogeneity, as well as contact dynamics. To validate it, we utilized a database of measurements that we previously captured with a variety of human fingers and surfaces. By designing the simulations to match the measurements, we evaluated the ability of the FEM model to predict time-resolved sliding friction forces. We varied surface texture wavelength, sliding speed, and normal force in the experiments. An analysis of the results indicated that both time- and frequency-domain features of forces produced during finger-surface sliding interactions were reproduced, including many of the phenomena that we observed in analyses of real measurements, such as quasiperiodicity, harmonic distortion, spectral decay in the frequency domain, and their dependence on kinetics and surface properties. The results shed light on frictional signatures of surface texture during active touch and may inform understanding of the role played by friction in texture discrimination.


[BibTex]



Behavioral Learning and Imitation for Music-Based Robotic Therapy for Children with Autism Spectrum Disorder

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Workshop paper (5 pages) at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract
In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Motion Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our skeletal tracking Kinect algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future works.


link (url) [BibTex]



Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

Extended abstract presented at the Biomechanics and Neural Control of Movement Conference (BANCOM), 2016, Poster presentation given by Fjeld (misc)


Project Page [BibTex]



One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)


[BibTex]



Multisensory robotic therapy to promote natural emotional interaction for children with ASD

Bevill, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 571, March 2016 (misc)

Abstract
In this video submission, we introduce two robots, Romo the penguin and Darwin Mini. We have programmed these robots to perform a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.


link (url) DOI [BibTex]



Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Bevill, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

Extended abstract presented at the ACM/IEEE International Conference on Human Robot Interaction (HRI), March 2016 (misc)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.

link (url) DOI [BibTex]


Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

2016, Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (misc)

Project Page [BibTex]


Design of a Low-Cost Platform for Autonomous Mobile Service Robots

Eaton, E., Mucchiani, C., Mohan, M., Isele, D., Luná, J. M., Clingerman, C.

Workshop paper (7 pages) presented at the 25th International Joint Conference on Artificial Intelligence (IJCAI) Workshop on Autonomous Mobile Service Robots, New York, USA, 2016 (misc)

Abstract
Most current autonomous mobile service robots are either expensive commercial platforms or custom manufactured for research environments, limiting their availability. We present the design for a low-cost service robot based on the widely used TurtleBot 2 platform, with the goal of making service robots affordable and accessible to the research, educational, and hobbyist communities. Our design uses a set of simple and inexpensive modifications to transform the TurtleBot 2 into a 4.5 ft (1.37 m) tall tour-guide or telepresence-style robot capable of performing a wide variety of indoor service tasks. The resulting platform provides a shoulder-height touchscreen and 3D camera for interaction, an optional low-cost arm for manipulation, enhanced onboard computation, autonomous charging, and up to 6 hours of runtime, and it can support many of the tasks performed by significantly more expensive service robots. For compatibility with existing software packages, the service robot runs the Robot Operating System (ROS).

link (url) [BibTex]


Extrapolation and learning equations

Martius, G., Lampert, C. H.

2016, arXiv preprint https://arxiv.org/abs/1610.02995 (misc)

Project Page [BibTex]


IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

2016, Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (misc)

[BibTex]

2012


Simon Game with Data-driven Visuo-audio-haptic Buttons

Castillo, P., Romano, J. M., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012 (misc)

[BibTex]


Haptic Vibration Feedback for a Teleoperated Ground Vehicle

Healey, S. K., McMahan, W., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012 (misc)

[BibTex]


A Biofidelic CPR Manikin With Programmable Pneumatic Damping

Stanley, A. A., Healey, S. K., Maltese, M. R., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012, Finalist for Best Hands-on Demonstration Award (misc)

[BibTex]


StrokeSleeve: Real-Time Vibrotactile Feedback for Motion Guidance

Bark, K., Cha, E., Tan, F., Jax, S. A., Buxbaum, L. J., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012 (misc)

[BibTex]


Pen Tablet Drawing Program with Haptic Textures

Castillo, P., Romano, J. M., Culbertson, H., Mintz, M., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012 (misc)

[BibTex]


Exploring Presentation Timing through Haptic Reminders

Tam, D., Kuchenbecker, K. J., MacLean, K., McGrenere, J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012 (misc)

[BibTex]


HALO: Haptic Alerts for Low-hanging Obstacles in White Cane Navigation

Wang, Y., Koch, E., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012, Finalist for Best Hands-on Demonstration Award (misc)

[BibTex]


VerroTeach: Visuo-audio-haptic Training for Dental Caries Detection

Maggio, M. P., Parajon, R., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Vancouver, Canada, March 2012, Best Demonstration Award (three-way tie) (misc)

[BibTex]


Estimation of MIMO Closed-Loop Poles using Transfer Function Data

Vardar, Y.

Eindhoven University of Technology, the Netherlands, 2012 (mastersthesis)

Abstract
For the development of high-tech systems such as lithographic positioning systems, throughput and accuracy are the main requirements. Nowadays, the trend for reaching the demanded accuracy and throughput levels is to design lightweight and consequently more flexible systems. To control these systems in a more effective and less conservative way, control design should go beyond traditional rigid-body control and cope with the flexibilities that limit the achievable bandwidth and performance. Conventional loop-shaping methods are therefore not sufficient to reach the performance criteria. Since obtaining an accurate parametric model is very complex and time-consuming for these high-tech systems, relying on well-developed model-based controller synthesis methods is also not an attractive option. One solution for achieving the desired performance criteria is to reduce the gap between model-based and data-based control synthesis methods. In previous research, a method was developed to characterize the dynamic behavior of the system without the need for a parametric model. With this method, transfer function data (TFD), which provides information on the whole s-plane, can be obtained from frequency response data (FRD) of the system. This innovation was an important step toward using data-based techniques in model-based controller synthesis methods. In this thesis, the standard technique for obtaining TFD defined in [2] is first extended. The standard technique is not compatible with systems containing pure integrators; to extend the methodology to such systems, two techniques are proposed: altering the contour and filtering the system. The accuracy of TFD is then investigated in detail, and it is shown that this accuracy depends on the quality of the obtained FRD and on the computation techniques used to calculate TFD. Next, a technique that enables determination of the closed-loop poles of a MIMO system using TFD is discussed.
The validity of the technique is proven with the help of complex function theory and calculus, and the factors that prevent determination of the closed-loop poles are discussed. In addition, it is observed that the accuracy of the closed-loop pole determination depends on the quality of the obtained TFD and on the computation techniques. The proposed theory for obtaining TFD and determining closed-loop poles is validated with experiments conducted on a prototype lightweight system. Using experimental frequency response data of the NXT-A7 test rig, the success of the proposed methodology is also validated for complex systems. From these experimental results, it can be concluded that this new technique could be very advantageous in terms of ease of use and accuracy for determining the closed-loop poles of a MIMO lightly damped system.

[BibTex]

2009


Displaying Realistic Contact Accelerations Via a Dedicated Vibration Actuator

McMahan, W., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, pages: 613-614, Salt Lake City, Utah, USA, March 2009, Best Demonstration Award (misc)

[BibTex]


The iTorqU 1.0 and 2.0

Winfree, K. N., Gewirtz, J., Mather, T., Fiene, J., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Vibrotactile Feedback System for Intuitive Upper-Limb Rehabilitation

Kapur, P., Premakumar, S., Jax, S. A., Buxbaum, L. J., Dawson, A. M., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, pages: 621-622, Salt Lake City, Utah, USA, March 2009 (misc)

[BibTex]


The SlipGlove

Romano, J. M., Gray, S. R., Jacobs, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Real-Time Graphic and Haptic Simulation of Deformable Tissue Puncture

Romano, J. M., Safonova, A., Kuchenbecker, K. J.

Hands-on demonstration presented at Medicine Meets Virtual Reality, Long Beach, California, USA, January 2009 (misc)

[BibTex]

2008


The Touch Thimble

Kuchenbecker, K. J., Ferguson, D., Kutzer, M., Moses, M., Okamura, A. M.

Hands-on demonstration presented at IEEE Haptics Symposium, Reno, Nevada, USA, March 2008 (misc)

[BibTex]