How Haptic Patches are Turning Skin into a Sensory Interface
As a member of a Northwestern University team, Matthew Flavin, Ph.D., developed a haptic patch that combines LiDAR and wearable bioelectronics to assist people with neurological conditions, enhancing sensory feedback for vision and balance.
By Applied Technology Review | Friday, August 01, 2025
FREMONT, CA: A wearable bioelectronics lab at the Georgia Institute of Technology, continuing work begun at Northwestern University, is developing innovative haptic patches, termed epidermal VR, to help people with neurological conditions, especially those with early-onset vision impairments. These patches use sensors to transmit information to haptic devices on the skin, much as VR goggles replicate visual experiences.
The patches utilize actuators that operate at frequencies between 50 and 200 Hz, the band where the skin is most sensitive. These actuators can both vibrate and apply static pressure, the latter requiring more force than typical vibration mechanisms deliver. The small, battery-powered device achieves both functions using bistable magnetic materials and the skin's natural spring-like properties, making it more efficient than traditional, energy-heavy tethered devices. The bistable mechanism flips between states with a small burst of energy, similar to a light switch, and holds either state without continuous power.
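A back-of-the-envelope sketch illustrates why bistability matters for a battery-powered patch: a bistable press costs energy only at each flip, while a conventional actuator must be driven for as long as it holds the press. The figures below are illustrative assumptions, not measured device values.

```python
# Minimal sketch comparing a bistable actuator (short flip pulses) with a
# conventional actuator that must be driven continuously to hold a press.
# Both parameters are illustrative assumptions, not measured device values.

FLIP_PULSE_J = 0.002      # assumed energy per bistable state flip (joules)
HOLD_POWER_W = 0.15       # assumed power to hold a non-bistable press (watts)

def bistable_energy(presses: int) -> float:
    """Energy to deliver `presses` press/release cycles: two flips each."""
    return presses * 2 * FLIP_PULSE_J

def continuous_energy(presses: int, hold_s: float) -> float:
    """Energy if each press must be actively held for `hold_s` seconds."""
    return presses * hold_s * HOLD_POWER_W

if __name__ == "__main__":
    n, hold = 100, 1.0  # one hundred one-second presses
    print(f"bistable:   {bistable_energy(n):.2f} J")
    print(f"continuous: {continuous_energy(n, hold):.2f} J")
```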
The actuator combines vibration, pressing and rotation to convey information to the skin, and researchers are still exploring the optimal designs for these channels. In a visual sensory replacement system, for instance, indentation patterns created by the actuators can alert users to the presence of objects, warn of potential collisions and indicate the distance to obstacles, helping them navigate their surroundings. By integrating LiDAR systems with APIs that identify objects such as chairs, walls and doors, the system can also use vibration to guide users toward specific locations.
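As a rough illustration of how distance might be encoded, the sketch below maps obstacle proximity onto pulse frequency and intensity within the skin's 50-200 Hz sensitivity band; the specific mapping is a hypothetical example, not the team's published encoding.

```python
# Illustrative encoding of obstacle distance into a haptic stimulus:
# closer obstacles produce stronger, faster pulsing within the skin's
# most sensitive 50-200 Hz band. The mapping is a hypothetical example.

def encode_distance(distance_m: float, max_range_m: float = 4.0):
    """Map obstacle distance to (carrier_hz, intensity) for one actuator."""
    proximity = 1.0 - min(distance_m, max_range_m) / max_range_m  # 0 far .. 1 near
    carrier_hz = 50.0 + 150.0 * proximity    # 50 Hz far -> 200 Hz near
    intensity = proximity ** 2               # emphasize imminent collisions
    return carrier_hz, intensity

for d in (3.5, 2.0, 0.5):
    hz, amp = encode_distance(d)
    print(f"{d:.1f} m -> {hz:5.1f} Hz at intensity {amp:.2f}")
```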
This epidermal VR system maps the environment and detects obstructions using LiDAR technology found in smartphones. This information is transmitted via Bluetooth to the haptic device for non-visual perception. Utilizing Apple's LiDAR APIs simplifies app development, with the phone handling image categorization and 3D reconstruction. Cloud processing may be incorporated to enhance the system's capabilities.
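The phone-side pipeline might look something like the following sketch, which reduces a (here simulated) depth map to a nearest-obstacle reading and packs it into a compact payload for Bluetooth transmission. The 6-byte packet format is an assumption for illustration; on an iPhone the depth data would come from ARKit's scene-depth API rather than a synthetic array.

```python
import struct
import numpy as np

# Phone-side sketch: reduce a LiDAR depth map to a compact haptic command
# and pack it for Bluetooth transmission. The depth map here is simulated,
# and the 6-byte payload format is an assumption for illustration.

def nearest_obstacle(depth_m: np.ndarray) -> tuple:
    """Return (distance, bearing in [-1, 1]) of the closest depth pixel."""
    row, col = np.unravel_index(np.argmin(depth_m), depth_m.shape)
    bearing = 2.0 * col / (depth_m.shape[1] - 1) - 1.0  # -1 left .. +1 right
    return float(depth_m[row, col]), bearing

def pack_command(distance_m: float, bearing: float) -> bytes:
    """Pack: uint16 distance in mm, int16 scaled bearing, reserved field."""
    return struct.pack("<HhH", int(distance_m * 1000), int(bearing * 32767), 0)

depth = np.full((48, 64), 5.0)
depth[20:30, 10:18] = 0.8           # simulated chair 0.8 m away, left of center
dist, bear = nearest_obstacle(depth)
payload = pack_command(dist, bear)  # would be written to a BLE characteristic
print(dist, bear, payload.hex())
```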
A key innovation is the use of kirigami, a Japanese paper-cutting technique, to convert the actuator's linear motion into rotational, twisting motion at the skin. Positioning multiple actuators near one another then allows intricate mechanical stimuli to be composed on the skin, much as sub-pixels compose an image, enabling the delivery of more complex tactile information.
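The sub-pixel idea can be sketched as a small grid of actuators, each with independent vibrate, press and rotate channels, whose states are composed over time into a richer cue; the channel names and 2x2 layout below are illustrative assumptions.

```python
from dataclasses import dataclass

# Sketch of the "sub-pixel" idea: several closely spaced actuators, each with
# independent vibrate/press/rotate channels, are composed into one richer
# tactile pixel. Channel names and the 2x2 layout are illustrative assumptions.

@dataclass
class SubPixel:
    vibrate_hz: float = 0.0   # vibration carrier, 0 means off
    press: bool = False       # bistable indent state
    rotate_deg: float = 0.0   # twist delivered through the kirigami linkage

def sweep_right(rows: int = 2, cols: int = 2) -> list:
    """Composite cue: a column of presses that sweeps left to right over time."""
    return [
        [[SubPixel(press=(c == active)) for c in range(cols)] for _ in range(rows)]
        for active in range(cols)
    ]

for t, frame in enumerate(sweep_right()):
    print(f"t={t}:", [[int(p.press) for p in row] for row in frame])
```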
The research team is also exploring neuromorphic and edge computing to further enhance the device's capabilities in the future. Currently, the patch uses a commercial System-on-a-Chip (SoC) with an ARM processor, a Bluetooth stack and a communication antenna.
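On the patch side, the SoC's job reduces to decoding incoming Bluetooth packets and driving the actuators. The real firmware would run on the ARM core; the Python sketch below mimics that dispatch loop for readability, reusing the hypothetical packet format from the phone-side sketch above.

```python
import struct
from queue import Queue

# Hedged sketch of the patch-side dispatch loop: packets arrive from the SoC's
# Bluetooth stack and are decoded into actuator commands. The payload format
# mirrors the hypothetical 6-byte packet in the phone-side sketch above.

def handle_packet(payload: bytes) -> dict:
    """Decode one command: distance (mm), scaled bearing, reserved field."""
    dist_mm, bearing_raw, _ = struct.unpack("<HhH", payload)
    return {"distance_m": dist_mm / 1000.0, "bearing": bearing_raw / 32767.0}

def dispatch_loop(inbox: Queue) -> None:
    """Drain queued BLE packets and forward decoded commands to the drivers."""
    while not inbox.empty():
        cmd = handle_packet(inbox.get())
        print("drive actuators:", cmd)  # real firmware would set PWM/GPIO here

q = Queue()
q.put(struct.pack("<HhH", 800, -22381, 0))
dispatch_loop(q)
```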
The lab makes the stimuli intuitive by linking them to natural sensory experiences. This lets users quickly learn the system, often within a couple of hours, by associating specific stimuli with visual locations. With practice, users can automatically identify an object's location based solely on the sensation.
The lab aims to aid individuals who have lost sensation in their feet due to neurological conditions like stroke or spinal cord injuries. The haptic patches could assist gait and balance by enhancing sensory feedback, making walking easier and safer. This is achieved by delivering precise tactile cues to the feet, helping users regain awareness of their foot placement and improve their balance.
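A minimal sketch of such a feedback loop, assuming a simple force threshold on a (simulated) insole pressure sensor, might detect heel strikes and fire a haptic cue at each one; the threshold and sensor trace are illustrative assumptions.

```python
# Illustrative gait-feedback loop: insole pressure readings (simulated here)
# are thresholded to detect heel strikes, and each event triggers a tactile
# cue delivered where sensation is intact. Threshold and trace are assumed.

HEEL_STRIKE_N = 200.0  # assumed force threshold marking heel contact

def gait_cues(heel_force_trace):
    """Yield a cue each time heel force rises through the contact threshold."""
    was_loaded = False
    for t, force in enumerate(heel_force_trace):
        loaded = force >= HEEL_STRIKE_N
        if loaded and not was_loaded:
            yield t, "pulse"   # fire a brief haptic pulse at heel strike
        was_loaded = loaded

trace = [0, 50, 180, 260, 400, 310, 90, 0, 30, 220, 380, 120, 0]
for t, cue in gait_cues(trace):
    print(f"step event at sample {t}: {cue}")
```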