- The Penn Haptic Texture Toolkit (HaTT) is a collection of 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods needed to render these textures on an impedance-type haptic device such as a SensAble Phantom Omni. The toolkit was developed to give haptics researchers a common basis for comparing and validating their texture modeling and rendering methods. The included rendering code also allows others, both researchers and designers, to incorporate our textures into their virtual environments, leading to a richer experience for the user. A rough sketch of this style of rendering appears after this list.
- During social interactions, people use auditory, visual, and haptic cues to convey their thoughts, emotions, and intentions. Because of weight, energy, and other hardware constraints, it is difficult to build devices that fully capture the complexity of human touch. Here we explore whether a sparse representation of human touch is sufficient to convey social touch signals. To test this, we collected a dataset of social touch interactions using a soft wearable pressure sensor array, developed an algorithm to map the recorded data onto an array of actuators, and then applied the algorithm to create signals that drive an array of normal-indentation actuators placed on the arm (an illustrative mapping sketch appears after this list). Using this wearable, low-resolution, low-force device, we find that users can distinguish the intended social meaning of the signals, and we compare their performance to results obtained with direct human touch. As online communication becomes more prevalent, systems that convey haptic signals in this way could enable better distant socializing and more empathetic remote human-human interaction.
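
The HaTT description above mentions rendering code for impedance-type devices but does not spell out how such rendering works. The following is a minimal Python sketch, not the toolkit's actual API, assuming an autoregressive (AR) vibration model of the kind commonly used for data-driven textures; the class name, coefficients, friction law, and excitation scaling are all illustrative assumptions.

```python
import numpy as np

class ARTextureModel:
    """Hypothetical AR texture-vibration model (illustration only).

    Each call to step() produces one sample of synthetic texture vibration,
    excited by noise whose amplitude scales with the user's normal force
    and scanning speed.
    """

    def __init__(self, ar_coeffs, noise_std):
        self.a = np.asarray(ar_coeffs)        # AR coefficients a_1..a_p
        self.noise_std = noise_std            # baseline excitation level
        self.history = np.zeros(len(self.a))  # last p output samples

    def step(self, normal_force, speed):
        # Excitation grows with how hard and how fast the user strokes.
        excitation = np.random.randn() * self.noise_std * normal_force * speed
        sample = excitation - self.a @ self.history
        self.history = np.roll(self.history, 1)
        self.history[0] = sample
        return sample

def render_force(model, normal_force, speed, mu=0.5):
    """One tick of an impedance-style rendering loop (sketch only)."""
    vibration = model.step(normal_force, speed)  # synthetic texture vibration
    friction = mu * normal_force                 # simple Coulomb-style friction term
    return friction + vibration                  # tangential force commanded to the device

# Example tick at a 1 kHz haptic rate with made-up parameters.
model = ARTextureModel(ar_coeffs=[-1.6, 0.7], noise_std=0.01)
force = render_force(model, normal_force=1.2, speed=0.05)
```

In an actual impedance-type rendering loop, `normal_force` and `speed` would come from the device's position sensing at each servo tick, and the returned tangential force would be sent to the device's motors.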
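
The social-touch abstract above does not detail the sensor-to-actuator mapping. As one concrete but purely illustrative possibility, the sketch below average-pools a high-resolution pressure recording down to a coarse actuator grid and normalizes the result to the actuators' command range; the grid sizes, function name, and normalization scheme are assumptions, not the paper's algorithm.

```python
import numpy as np

def map_touch_to_actuators(pressure_frames, actuator_shape=(2, 3), max_command=1.0):
    """Map recorded pressure-sensor frames to coarse actuator commands (sketch).

    pressure_frames : array of shape (T, H, W), one pressure image per time step.
    actuator_shape  : rows x cols of the (much coarser) actuator array.
    Returns an array of shape (T, rows, cols) with commands in [0, max_command].
    """
    frames = np.asarray(pressure_frames, dtype=float)
    T, H, W = frames.shape
    rows, cols = actuator_shape
    bh, bw = H // rows, W // cols  # sensor patch feeding each actuator

    # Average-pool each frame down to the actuator grid.
    pooled = frames[:, :rows * bh, :cols * bw]
    pooled = pooled.reshape(T, rows, bh, cols, bw).mean(axis=(2, 4))

    # Scale so the peak recorded pressure maps to the peak actuator command.
    peak = pooled.max() or 1.0
    return np.clip(pooled / peak, 0.0, 1.0) * max_command

# Example: 100 frames from a hypothetical 8x12 sensor array driving a 2x3 actuator array.
recording = np.random.rand(100, 8, 12)
commands = map_touch_to_actuators(recording)
print(commands.shape)  # (100, 2, 3)
```

Downsampling plus renormalization is just one way to produce a "sparse representation" of a touch recording; the study's actual mapping may weight spatial regions, timing, or force dynamics differently.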