Projects

Social Touch

Researcher: Xin Zhu
Summary: Touch is an essential component of our interpersonal relationships, serving to communicate emotion, improve social bonding, and reduce feelings of isolation. Our lab studies how people naturally communicate with one another via touch, with the goal of creating technology that will allow people to communicate over long distances in our increasingly digital world. Humans use a variety of gestures in social interactions, including squeezes, pats, and strokes. The system records physical touch patterns through an array of force sensors, processes the recordings using novel gesture-based algorithms to create actuator control signals, and generates mediated social touch through an array of voice coil actuators. Click here for more info.
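As a rough illustration of this sense-classify-actuate pipeline, the sketch below (hypothetical feature rules and thresholds, not the lab's actual algorithms) labels a force-sensor recording as one of the three gestures and maps the label to per-actuator drive envelopes:

```python
import numpy as np

def classify_gesture(force_frames):
    """Toy rule-based classifier: label a recording as 'squeeze', 'pat',
    or 'stroke'. force_frames is a (T, N) array of T time samples from
    N force sensors laid out in a line."""
    idx = np.arange(force_frames.shape[1])
    contact = force_frames.sum(axis=1) > 1e-6
    # Fraction of time in contact: pats are brief, squeezes sustained.
    duty = contact.mean()
    # A contact centroid that travels across the array suggests a stroke.
    travel = 0.0
    if contact.any():
        w = force_frames[contact]
        centroid = (w * idx).sum(axis=1) / w.sum(axis=1)
        travel = np.ptp(centroid)
    if travel > 2.0:
        return "stroke"
    return "squeeze" if duty > 0.5 else "pat"

def actuator_signals(gesture, n_actuators=4, fs=1000, dur=1.0):
    """Map a gesture label to voice-coil drive envelopes, one row per
    actuator, sampled at fs Hz for dur seconds."""
    t = np.arange(int(fs * dur)) / fs
    if gesture == "squeeze":
        env = np.minimum(t / 0.3, 1.0)            # slow ramp on all coils
        return np.tile(env, (n_actuators, 1))
    if gesture == "pat":
        env = np.exp(-((t - 0.1) ** 2) / 0.001)   # single brief tap
        return np.tile(env, (n_actuators, 1))
    # stroke: the same pulse sweeps across the actuator array in sequence
    sigs = np.zeros((n_actuators, t.size))
    for i in range(n_actuators):
        center = (i + 0.5) / n_actuators * dur
        sigs[i] = np.exp(-((t - center) ** 2) / 0.005)
    return sigs
```

A real system would replace the hand-tuned rules with learned models, but the staged structure (recognize the gesture, then synthesize matched actuator waveforms) is the same.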

Surface Modeling

Researchers: Michael Qian and Shihan Lu
Summary: Haptic rendering requires a model of the object or sensation to be displayed. Traditional haptic rendering algorithms use a physics-based model to generate the forces or vibrations displayed to the user as a function of their position and/or movement. In our research we take a data-driven approach, recording and modeling these signals using electromechanical analogues of our touch sensors, such as force sensors and accelerometers. Click here for more info.
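One common data-driven technique for this kind of vibration modeling is to fit an autoregressive (AR) model to a recorded acceleration signal and resynthesize the sensation by filtering white noise through the fitted model. The sketch below is illustrative only (not this lab's specific method) and uses a plain least-squares fit:

```python
import numpy as np

def fit_ar(signal, order=10):
    """Fit an AR model x[n] = sum_k a[k] * x[n-k] + e[n] to a recorded
    vibration signal by least squares. Returns the coefficients and the
    residual (noise) standard deviation."""
    X = np.column_stack([signal[order - k - 1 : len(signal) - k - 1]
                         for k in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return coeffs, resid.std()

def synthesize(coeffs, noise_std, n_samples, rng=None):
    """Resynthesize a vibration by driving the AR filter with white noise."""
    rng = np.random.default_rng(rng)
    order = len(coeffs)
    x = np.zeros(n_samples + order)
    for n in range(order, len(x)):
        # Most recent samples first, to match the coefficient ordering.
        x[n] = coeffs @ x[n - order:n][::-1] + rng.normal(0, noise_std)
    return x[order:]
```

The appeal of the data-driven route is visible here: the model captures the spectral character of a recorded texture without any physics of the tool-surface contact.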

Assistive Technology and Gait

Researcher: Catherine Yunis
Summary: Although walkers are often prescribed to improve stability, their use is associated with falls. One possible explanation is the environmental perturbations created when a walker bumps into objects or rolls over cables or uneven terrain. By using a robotic walker modified to provide environmental perturbations via force feedback, we can perturb balance in a controlled laboratory setting. Click here for more info.
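A controlled perturbation of this kind can be as simple as a brief force pulse commanded to the walker's actuators at a chosen moment in the gait cycle. The sketch below (hypothetical parameters, not the study's actual protocol) generates a half-sine pulse approximating the jolt of a wheel striking an obstacle:

```python
import numpy as np

def perturbation_profile(fs=500, dur=2.0, onset=0.8, pulse_dur=0.15,
                         peak_n=40.0):
    """Half-sine force pulse mimicking a walker wheel striking an
    obstacle. Returns (t, force) sampled at fs Hz over dur seconds;
    the pulse starts at `onset` s and peaks at `peak_n` newtons."""
    t = np.arange(int(fs * dur)) / fs
    force = np.zeros_like(t)
    in_pulse = (t >= onset) & (t < onset + pulse_dur)
    force[in_pulse] = peak_n * np.sin(np.pi * (t[in_pulse] - onset) / pulse_dur)
    return t, force
```

Parameterizing the pulse this way lets the onset, duration, and amplitude be varied independently across trials while everything else about the walking environment stays fixed.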

Touch-Based Drones

Researcher: Yang Chen
Summary: Virtual reality (VR) provides an immersive experience, delivering visual and auditory stimuli through glasses or a head-mounted display (HMD). Given the realism of modern graphic rendering, it would be ideal if users could also physically feel the virtual objects they see. However, rendering haptics in such diverse environments is challenging: interactions can occur anywhere in the virtual space and involve forces of varying magnitudes. This project aims to use quadcopters fitted with safe-to-touch cages to provide haptic feedback to users. Click here for more info.

Open-Source Hardware

Researcher: Sandeep Kollannur
Summary: Click here for more info.

VR Surfing

Researcher: Premankur Banerjee
Summary: This project focuses on simulating the experience of surfing in Virtual Reality (VR) by integrating a 6-degree-of-freedom motion platform to recreate realistic aquatic motions. The system uses algorithms that map surfboard dynamics and interactive paddling, allowing users to feel the board's motion and various wave interactions as they move through the virtual environment. Designed in Unity 3D and enhanced by real-time kinematic feedback, the setup includes additional sensory cues such as wind simulation through a fan, making the experience both immersive and engaging. This technology has potential applications in surf training, recreational VR, and therapeutic interventions like surf therapy. Click here for more info.
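One simple way to derive motion-platform commands from a virtual sea state is to sample the wave surface under the board and take its height and local slopes as heave, pitch, and roll targets. The sketch below is a deliberate simplification with assumed parameters (single sinusoidal wave, finite-difference slopes), not the project's actual mapping:

```python
import math

def wave_height(x, y, t, amp=0.4, wavelength=8.0, period=4.0, heading=0.3):
    """Height (m) of a single sinusoidal wave travelling at angle
    `heading` (rad), evaluated at position (x, y) and time t."""
    k = 2 * math.pi / wavelength          # wavenumber
    w = 2 * math.pi / period              # angular frequency
    phase = k * (x * math.cos(heading) + y * math.sin(heading)) - w * t
    return amp * math.sin(phase)

def platform_pose(x, y, t, eps=1e-4):
    """Heave (m), pitch and roll (rad) targets for the motion platform,
    taken from the wave surface under the board: height gives heave,
    finite-difference slopes give the tilt angles."""
    h = wave_height(x, y, t)
    dhdx = (wave_height(x + eps, y, t) - wave_height(x - eps, y, t)) / (2 * eps)
    dhdy = (wave_height(x, y + eps, t) - wave_height(x, y - eps, t)) / (2 * eps)
    return h, math.atan(dhdx), math.atan(dhdy)
```

A production system would superimpose several wave components and filter the pose commands to respect the platform's travel and acceleration limits, but the surface-sampling idea is the same.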

Fidgeting Devices

Researcher: Mahta Pourebadi
Summary: This research explores how adaptive fidgeting devices can support attention and emotional regulation in adults with ADHD. By designing devices equipped with sensors and microcontrollers to track real-time fidgeting behaviors, the study aims to personalize fidget tools based on individual preferences. Through two phases, the research first identifies preferred fidgeting actions by analyzing user interactions with different widgets, and then evaluates the impact of personalized fidget devices on performance, attention, and emotional regulation during cognitive tasks. The findings hold potential for practical applications in therapeutic, educational, and occupational settings. Click here for more info.