
Projects

Data-Driven Modeling

Haptic rendering requires a model of the object or sensation you would like to display. Traditional haptic rendering algorithms use a physics-based model to generate the forces or vibrations to be displayed to the user as a function of their position and/or movement. These traditional models are typically hand-tuned to create an approximation of the desired sensation, but are not able to completely match the richness of haptic feedback experienced during interactions with the physical world.

Instead of creating models to generate the touch signals, we take the inverse approach and model the touch signals directly. From neuroscience and perception, we know the types of signals that are generated when we interact with objects in our environment and how we perceive these signals. Therefore, in our research we take a data-driven approach of recording and modeling these signals using electromechanical analogues of our touch sensors, such as force sensors and accelerometers. This process of capturing the feel of interesting haptic phenomena has been coined “Haptography” (= haptic + photography). Haptography, like photography in the visual domain, enables us to quickly record the haptic feel of a real object so that the sensations can be reproduced later.

[1] Texture Rendering Device

In the past, this data-driven method has been successfully applied to modeling the feel of textured surfaces and recreating those sensations as a user interacts with a smooth tablet screen. We captured the haptic signals with a custom recording tool that measures the forces and high-frequency vibrations felt when dragging a pen-like probe across textured objects. These vibrations encode information about the object’s texture and roughness. We then created mathematical models of the signals, which are stored and later used to render virtual versions of the objects. During rendering, a voice coil vibration actuator attached to a stylus outputs the texture vibrations as the user drags the stylus across the tablet surface.
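The sketch below illustrates the core modeling idea under the assumption that a recorded vibration segment can be described by an autoregressive (AR) model: fit the model to the recording, then resynthesize the texture by driving the fitted filter with white noise. The function names, model order, and gain term are illustrative assumptions, not the exact implementation from [1].

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def fit_ar(vibration, order=10):
    """Fit autoregressive (AR) coefficients to a zero-mean recorded vibration
    segment using the Yule-Walker equations."""
    n = len(vibration)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(vibration[:n - k], vibration[k:]) / n
                  for k in range(order + 1)])
    a = solve_toeplitz(r[:-1], r[1:])       # AR coefficients a_1..a_order
    noise_var = r[0] - np.dot(a, r[1:])     # variance of the white-noise excitation
    return a, noise_var

def synthesize(a, noise_var, n_samples, gain=1.0):
    """Resynthesize a texture vibration by driving the all-pole AR filter with
    white noise; `gain` is a stand-in for force- and speed-dependent scaling."""
    excitation = gain * np.sqrt(noise_var) * np.random.randn(n_samples)
    return lfilter([1.0], np.concatenate(([1.0], -a)), excitation)

# e.g.: a, var = fit_ar(recorded_acceleration); playback = synthesize(a, var, 2048)
```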

 

Preference-Driven Modeling

Data-driven texture modeling and rendering has pushed the limits of realism in haptics. However, the lack of haptic texture databases and the difficulty of interpolating and extending existing models prevent data-driven methods from capturing a large variety of textures and from customizing models to suit specific output hardware or user needs.

Because the evaluation of haptic rendering performance ultimately relies on human perception, which serves as the ‘gold standard’, we propose to use a user’s perceptual preferences, expressed while interacting with a set of virtual texture candidates, to guide haptic texture modeling. We call this “preference-driven” modeling.

Preference-driven texture modeling is a two-stage process: texture generation and interactive texture search. The texture generation stage uses a generative adversarial network (GAN) to map a latent space into texture models. The texture search stage then evolves the texture models generated by the GAN with an evolutionary algorithm (CMA-ES), guided by the user’s preferences. Our study [3] showed that the proposed method can create realistic virtual counterparts of real textures without additional haptic data recording.

Preference-Driven Haptic Texture Modeling Framework
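A minimal sketch of the interactive search loop is shown below, assuming a trained GAN generator and the pycma package. The latent dimension, number of generations, and the two placeholder functions are illustrative assumptions, not the exact system in [3].

```python
import numpy as np
import cma  # pycma, assumed available: pip install cma

LATENT_DIM = 8  # hypothetical latent dimension of the texture GAN

def generate_texture(z):
    """Placeholder for the trained GAN generator mapping a latent vector to
    texture model parameters; here it simply passes the vector through."""
    return np.asarray(z)

def ask_user_to_rank(candidates):
    """Placeholder for the interactive step in which the user feels each rendered
    candidate and ranks it (0 = most preferred); random ranks stand in here."""
    return list(np.random.permutation(len(candidates)))

# CMA-ES evolves latent vectors; the user's preference ranking plays the role
# of the fitness function (lower rank = better, since CMA-ES minimizes).
es = cma.CMAEvolutionStrategy(np.zeros(LATENT_DIM), 0.5)
for generation in range(10):
    latents = es.ask()                                  # sample candidate latent vectors
    textures = [generate_texture(z) for z in latents]
    ranks = ask_user_to_rank(textures)                  # user preference as fitness
    es.tell(latents, ranks)
best_texture = generate_texture(es.result.xbest)        # most-preferred texture model
```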

 

Multi-modal Modeling in Tool-Surface Interactions

Humans depend on multisensory perception when interacting with textured surfaces. Vision can easily be tricked in roughness perception by printing a textured pattern on a surface, so its true roughness can be determined only by touch. Yet touch struggles to determine whether an object is hollow, which is more easily distinguished through sound. Neuroscience research has attempted to reduce multimodal perception to a single modality, but experiments have shown that a single modality limits a person’s understanding and perception of an object. Creating a multisensory experience in virtual environments is therefore critical, but challenging. In particular, auditory feedback has often been neglected in favor of vision and touch, although it plays a significant role in perception.

[2] Texture Sound Recording Devices

We propose a data-driven method of modeling and rendering, in real time, the sounds produced during tool-surface interactions with textured surfaces. We used a statistical learning method to model the tool-texture sounds, which is faster and more robust than other sound modeling methods and is capable of handling the grainy, bursty sounds of tool-surface interactions. The results of a user study showed that adding the virtual sound to the tool-surface interaction greatly increases the realism of the interaction. It also showed that, in the presence of haptic cues, virtual sound improves perception accuracy for texture roughness and hardness, while the perception of slipperiness depends mainly on touch.
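As a rough illustration of how such a rendering loop can depend on the user’s motion, the sketch below synthesizes one block of contact sound as force-scaled noise whose bandwidth grows with scanning speed. This is a generic stand-in, explicitly not the statistical model from [2], and all constants are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 44100  # audio sample rate (Hz)

def texture_sound_block(speed, force, n_samples=1024):
    """Synthesize one block of tool-surface contact sound for the current scanning
    speed (m/s) and normal force (N): force scales loudness, speed scales bandwidth.
    Generic stand-in, not the statistical model from [2]; constants are illustrative."""
    noise = np.random.randn(n_samples)
    cutoff_hz = np.clip(500.0 + 8000.0 * speed, 500.0, 0.45 * FS)  # faster strokes -> brighter sound
    b, a = butter(2, cutoff_hz / (FS / 2))
    return force * lfilter(b, a, noise)

# Called once per audio block as the stylus speed and force estimates are updated,
# e.g. texture_sound_block(speed=0.12, force=1.5)
```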

 

FALL-E: A Robotic Walker to Study Balance

Although walkers are often prescribed to improve stability, their use is also associated with falls. One possible reason for this association is the environmental perturbations created when the walker bumps into objects or rolls over cables or uneven terrain. By using a robotic walker that has been modified to provide environmental perturbations via force feedback, we can perturb balance in a controlled laboratory setting. In addition, load cells in the leg of the walker measure force, allowing us to give the user feedback about how much force they apply to the walker. The research question is: how do the force and magnitude of loading applied to a walker change stability in response to a perturbation? Answering this question could help explain why assistive device users face an elevated fall risk, inform the design of protocols to test multisensory integration in balance (particularly in older adults), and lead to better design criteria for assistive devices.

 

 

Wearable Devices

Traditional haptic devices are grounded, which means that they can display a force to the user because of their reaction force against the ground. However, the potential applications of these devices are limited by their small workspace (i.e., the range of motion through which the user can move). There has recently been a push toward wearable devices as the allure of virtual reality (VR) has grown. Our lab focuses on creating expressive, lightweight, and low-power wearable devices that can be worn at multiple sites on the body. We approach this problem from two directions: (1) creating novel actuators, and (2) designing novel methods of driving existing actuators to create new sensations.

We have been exploring the use of pneumatics (i.e., pressurized air) to display pressure cues to a user. We create these actuators by heat-sealing channels into pockets of thin thermoplastic material (LDPE). In contrast to traditional silicone pneumatic actuators, these actuators are faster to inflate, require lower pressures, and display a consistent sensation over a larger area of the body. Because the actuators are lightweight, the size and number of actuators are easily scalable for many different applications. This actuator design has been tested in a wristband configuration for a haptic guidance application, in which an individual actuator’s inflation was pulsed to indicate the direction. Similarly, we have used these actuators in a medical guidance application by attaching them to an ultrasound probe and pulsing their inflation to notify the user of errors in their probe position. Our work in [6] expands these pneumatic actuators to create the illusion of lateral motion on the arm using sequential inflation of the actuators.

Recent work [7] has expanded this actuator design to be driven with water of varying temperature instead of air. The change of actuation fluid allows us to provide simultaneous pressure and temperature cues. The system uses pumps to fill fabric actuators with water of varying temperature. The amount of water in the actuator determines the pressure cue that is displayed, and the temperature of the water determines the thermal cue. The device is capable of rapidly presenting large temperature changes to the skin, limited only by the fill and drain time of the actuators.

 

Multi-Modal Mediated Touch

Multimodal social touch refers to the combination of various sensory modalities, such as touch, visual cues, and auditory stimuli, to enhance social interactions and communication. It encompasses the rich interplay of physical contact, facial expressions, body language, and vocal intonations that are integral to human connection. When engaging in multimodal social touch, individuals may experience a range of sensations, from the warmth of a handshake to the comforting embrace of a loved one. These tactile interactions not only convey emotional support, trust, and empathy but also serve as important nonverbal cues that can deepen social bonds and foster a sense of belonging. By integrating touch with other modes of communication, multimodal social touch amplifies the depth and richness of interpersonal exchanges, facilitating understanding and emotional connection in profound ways.

We designed a crossmodal vocalization-haptic system [5] to allow users to communicate emotions to their partners. We explore affective context as a combination of user relationship (specifically the closeness between pairs of users) and user culture. We share the design and implementation of the crossmodal system, which takes up to ten seconds of vocal expression (including humming or singing) from one user and transposes it into haptic signals displayed on twelve vibration actuators worn on the forearm of the second user. Our method of transposing musical vocal inputs captures the key signal features of rhythm, amplitude, time, and frequency. We present the results of a human subject study (N=20) involving 10 pairs of users with varying levels of closeness (siblings, friends, and strangers) to understand how our system supports affective communication. Our results show that low-level audio features and rhythm most strongly influence our users’ affective responses. Additionally, the low-level vocal features are influenced by user demographics and by the closeness between the pairs of users. These results suggest the impact of user closeness on affective communication.
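A simplified sketch of this kind of vocal-to-haptic mapping is shown below, assuming the librosa library for feature extraction: pitch selects which of the twelve actuators vibrates and the amplitude envelope sets its intensity. The constants, function name, and mapping are illustrative assumptions and only a stand-in for the full rhythm/amplitude/time/frequency transposition described above.

```python
import numpy as np
import librosa  # assumed available for audio feature extraction

N_ACTUATORS = 12          # vibration actuators along the forearm
FMIN, FMAX = 80.0, 800.0  # hypothetical vocal pitch range (Hz)

def vocal_to_haptic(path, hop_length=512):
    """Map a short vocal clip to per-actuator vibration intensities over time."""
    y, sr = librosa.load(path, sr=None, duration=10.0)    # up to ten seconds of voice
    rms = librosa.feature.rms(y=y, hop_length=hop_length)[0]
    f0 = librosa.yin(y, fmin=FMIN, fmax=FMAX, sr=sr, hop_length=hop_length)

    n_frames = min(len(rms), len(f0))
    intensity = rms[:n_frames] / (rms.max() + 1e-9)       # normalized amplitude envelope
    # Map log-pitch onto position along the actuator array.
    pitch_pos = np.clip((np.log(f0[:n_frames]) - np.log(FMIN)) /
                        (np.log(FMAX) - np.log(FMIN)), 0.0, 1.0)
    actuator_idx = np.round(pitch_pos * (N_ACTUATORS - 1)).astype(int)

    # One row per audio frame: drive the selected actuator at the envelope intensity.
    commands = np.zeros((n_frames, N_ACTUATORS))
    commands[np.arange(n_frames), actuator_idx] = intensity
    return commands  # command frame rate = sr / hop_length
```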

 

Affective Communication

Touch as a modality in social communication has been getting more attention with recent developments in wearable technology and an increasing awareness of how limited physical contact can lead to touch starvation and feelings of depression. Although several mediated touch methods have been developed for conveying emotional support, the transfer of emotion through mediated touch has not been widely studied. This work addresses this need by exploring emotional communication through a novel wearable haptic system [4].

The system records physical touch patterns through an array of force sensors, processes the recordings using novel gesture-based algorithms to create actuator control signals, and generates mediated social touch through an array of voice coil actuators. We conducted a human subject study (N = 20) to understand the perception and emotional components of this mediated social touch for common social touch gestures, including poking, patting, massaging, squeezing, and stroking. Our results show that the speed of the virtual gesture significantly alters participants’ ratings of valence, arousal, realism, and comfort, with increased speed producing negative emotions and decreased realism. The findings from the study will allow us to better recognize general patterns in human perception of mediated touch and determine how mediated social touch can be used to convey emotion. Our system design, signal processing methods, and results can provide guidance for future mediated social touch design.
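Because gesture speed is the manipulated variable in this study, a minimal sketch of speed-varied playback is shown below: a recorded force-sensor sequence is resampled in time and mapped to voice-coil drive amplitudes. The helper names and the force-to-drive mapping are hypothetical, not the lab's exact processing pipeline.

```python
import numpy as np

def resample_gesture(force_frames, speed_factor):
    """Resample a recorded touch gesture (frames x sensors) so it plays back
    faster (speed_factor > 1) or slower (speed_factor < 1)."""
    n_in, n_sensors = force_frames.shape
    n_out = max(2, int(round(n_in / speed_factor)))
    t_in = np.linspace(0.0, 1.0, n_in)
    t_out = np.linspace(0.0, 1.0, n_out)
    return np.column_stack([np.interp(t_out, t_in, force_frames[:, s])
                            for s in range(n_sensors)])

def force_to_drive(force_frames, max_force=5.0):
    """Map sensed force (N) to a normalized voice-coil drive amplitude in [0, 1];
    the linear mapping and 5 N ceiling are illustrative assumptions."""
    return np.clip(force_frames / max_force, 0.0, 1.0)

# e.g.: drive = force_to_drive(resample_gesture(recorded_frames, speed_factor=2.0))
```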

 

Social Touch

[8] Social Haptic Device

Touch is an essential component of our interpersonal relationships, serving to communicate emotion, improve social bonding, and reduce feelings of isolation. Our lab studies how people naturally communicate with one another via touch, with the goal of creating technology that will allow people to communicate over long distances in our increasingly digital world.

Humans use a variety of different gestures in social interactions, including squeezes, pats, and strokes. We have created a wearable device that produces the sensation of a pleasant, calming stroke on the arm. Because it is mechanically difficult to create a long lateral motion in a wearable device, we instead create the illusion of lateral motion on the arm using an array of actuators that each have only a small amount of vertical movement. These actuators, which are re-purposed audio exciters, are controlled to sequentially press into the user’s arm. We control the duration of each actuator’s motion and the delay between the onset of motion for adjacent actuators. By carefully tuning the duration and delay, we can create a pleasant, continuous sensation that gives the illusion of a hand stroking along the arm.
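The sketch below builds the per-actuator drive envelopes for such a sequential-indentation stroke from exactly these two parameters, duration and onset delay; the actuator count, envelope shape, and timing values are illustrative assumptions rather than the tuned parameters of the published device.

```python
import numpy as np

def stroke_schedule(n_actuators=8, duration=0.6, delay=0.2, fs=1000):
    """Build per-actuator drive envelopes for a stroking illusion.

    Each actuator presses in for `duration` seconds, and adjacent actuators start
    `delay` seconds apart; overlapping raised-cosine envelopes give the impression
    of a continuous stroke along the arm."""
    total = delay * (n_actuators - 1) + duration
    t = np.arange(0, total, 1.0 / fs)
    envelopes = np.zeros((n_actuators, len(t)))
    for i in range(n_actuators):
        onset = i * delay
        active = (t >= onset) & (t < onset + duration)
        phase = (t[active] - onset) / duration
        envelopes[i, active] = 0.5 * (1 - np.cos(2 * np.pi * phase))  # smooth press-and-release
    return t, envelopes
```

Shortening the delay relative to the duration increases the overlap between neighboring presses, which is what makes the discrete indentations feel like one continuous stroke.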

 

Human-Robot Social Touch

Touch between people is essential for forming bonds and communicating emotions. However, it is currently missing in human-robot interactions due to issues with reliability and safety. As robotics moves into home and service settings, it is increasingly important to design guidelines and models for human-robot social touch. This research aims to determine how variations in the motions a robot uses while patting the user’s forearm or shoulder affect perception and acceptance of the interaction [9]. We are using a Sawyer robot with a hand-like end-effector, varying the force, speed, location, and hold duration of the pat, and measuring the perceived safety, valence, arousal, and dominance associated with each pat condition. Using these results, we propose guidelines for creating interactions that feel safe and non-dominant using low-speed, low-force trajectories. We also identify patting signals that create a mechanical (low speed, high force), human-like (slow speed, hold at the end), or attention-getting (high speed, low force) sensation. These results will help HRI designers create appropriate human-robot social touch interactions.
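A minimal sketch of how a single pat can be parameterized by these variables is shown below; the displacement profile, parameter values, and function name are hypothetical placeholders for the conditions varied in [9], not the robot's actual controller.

```python
import numpy as np

def pat_profile(press_depth_m=0.01, speed_m_s=0.05, hold_s=0.5, fs=100):
    """Generate a 1-D vertical displacement profile for a single robot pat:
    approach at the given speed, hold at the contact depth, then retract."""
    press_t = press_depth_m / speed_m_s
    down = np.linspace(0.0, press_depth_m, int(press_t * fs), endpoint=False)
    hold = np.full(int(hold_s * fs), press_depth_m)
    up = np.linspace(press_depth_m, 0.0, int(press_t * fs))
    return np.concatenate([down, hold, up])  # sampled at fs Hz

# e.g. a "human-like" pat: slow approach with a hold at the end of the press
human_like = pat_profile(speed_m_s=0.02, hold_s=1.0)
```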

 

HapPerBal: A Haptic Perturbation Balance Device for Overground Walking in a Clinical Environment

During rehabilitation studies, participants who are unstable while walking may take part in walking trials. These trials may deliberately disturb the participant’s balance during walking, through a perturbation, in order to study how they recover after their balance is upset. The perturbations are often applied while the participant walks on a treadmill, by rapidly accelerating or decelerating the treadmill belt. This method does not work when the participant is walking overground. There is therefore a need for a device that can provide perturbations during overground walking in a clinical environment, for use in walking and balance studies.

Haptics can play an important role in creating a device for perturbation during overground walking. In movement science, constructs have been developed to better understand movement control; for balance, the constructs most often studied are stability and effort. Using haptics provides the opportunity to test aspects of another construct in balance: tactile perception. This project will use a haptic device called HapPerBal to create perturbations during overground walking for clinical use. The device is in the prototyping phase; current design concepts include a wearable device, an immersive environment, and a haptically augmented floor. The device will use haptic actuators and provide multisensory stimulation.

References

[1] H. Culbertson, J. Unwin, and K. J. Kuchenbecker, "Modeling and Rendering Realistic Textures from Unconstrained Tool-Surface Interactions," IEEE Transactions on Haptics, vol. 7, no. 3, pp. 381-393, July-Sept. 2014.

[2] S. Lu, Y. Chen, and H. Culbertson, “Towards multisensory perception: modeling and rendering sounds of tool-surface interactions,” IEEE Transactions on Haptics, vol. 13, no. 1, pp. 94-101, 2020.

[3] S. Lu, M. Zheng, M. Fontaine, S. Nikolaidis, and H. Culbertson, “Preference-Driven Texture Modeling Through Interactive Generation and Search,” IEEE Transactions on Haptics, vol. 15, no. 3, pp. 508-520, 2022.

[4] X. Zhu, T. Feng, and H. Culbertson, "Understanding the effect of speed on human emotion perception in mediated social touch using voice coil actuators," Frontiers in Computer Science, vol. 4, p. 826637, 2022.

[5] P. Jalapati, S. Sweidan, X. Zhu, and H. Culbertson, "Vocalization for Emotional Communication in Crossmodal Affective Display," in Proc. AAAC Conference on Affective Computing and Intelligent Interaction, 2023.

[6] W. Wu and H. Culbertson, “Wearable haptic pneumatic device for creating the illusion of lateral motion on the arm,” in Proc. IEEE World Haptics Conference, 2019.

[7] D. T. Goetz, D. K. Owusu-Antwi, and H. Culbertson, "PATCH: Pump-Actuated Thermal Compression Haptics," in Proc. IEEE Haptics Symposium, March 2020.

[8] H. Culbertson, C. M. Nunez, A. Israr, F. Lau, F. Abnousi, and A. M. Okamura, “A Social Haptic Device to Create Continuous Lateral Motion Using Sequential Normal Indentation,” in Proc. IEEE Haptics Symposium, March 2018.

[9] N. Zamani, P. Moolchanchandani, N. T. Fitter, and H. Culbertson, "Effects of motion parameters on acceptability of human-robot social touch," in Proc. IEEE Haptics Symposium, March 2020.

