Danlin Huang
Augmented Soma with Intelligence







FeltSight


2025

Hyper-sensitizing the Surrounding through Mixed Reality Haptic Proximity Gloves




Concept


Vision sits atop the hierarchy of the human sensory system. This culturally embedded ocularcentrism privileges sight as the primary pathway to knowledge. Donna Haraway's concept of tentacular thinking critiques this anthropocentric, vision-dominated worldview, advocating instead for forms of knowing that emerge through touching, feeling, and probing.


FeltSight is a mixed reality sensory-substitution experience that reimagines human perception through an alternative tactile umwelt, inspired by the star-nosed mole's unique tactile navigation. Users engage in meditative wandering guided by extended-range sensing with haptic feedback and subtle visual cues. Wearable haptic gloves and XR devices provide an enhanced tactile and diminished visual experience. By shifting focus from visual to tactile perception, FeltSight challenges ocularcentric sensory hierarchies and elevates touch as a way to perceive the world.


Inspiration



The star-nosed mole is functionally blind yet "sees" through touch: its unusual star-shaped nose contains over 100,000 nerve endings, forming an extremely sensitive tactile organ through which the mole constructs its mental map of the world, its umwelt.
This creature's unique sensory system provided the primary biological inspiration for the design and artistic intent of FeltSight.



System


Reduce Visual, Extend Tactile
The FeltSight system augments pre-contact touch using a custom haptic glove paired with a Mixed Reality (MR) application running on an Apple Vision Pro headset. The glove and headset communicate via Bluetooth Low Energy (BLE). The MR app acts as the central device: it tracks 3D environment geometry using LiDAR, classifies materials with an AI model fed by real-time images from the Vision Pro camera, and tracks the spatial poses of the user's hands and fingertips. From finger pose, movement, and distance to the nearest surface, it computes motion speed, amplitude, and texture selection, then sends per-frame haptic commands. The glove acts as the peripheral device, receiving commands on its onboard microcontroller unit (MCU) for audio-haptic synthesis, amplification, and actuation.
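
As a rough sketch of this central/peripheral split, the headset side might pack one command per finger into a single BLE write each frame. The 3-byte field layout, the names, and the write pattern below are assumptions for illustration; the actual FeltSight wire protocol is not published.

import Foundation
import CoreBluetooth

// Hypothetical per-finger haptic command; the 3-byte layout is an assumption.
struct HapticCommand {
    var textureID: UInt8      // material class from the AI recognizer
    var amplitude: UInt8      // scaled from finger-to-surface distance
    var playbackRate: UInt8   // scaled from fingertip speed

    var packed: Data { Data([textureID, amplitude, playbackRate]) }
}

// Central (headset) side: pack five finger commands into one BLE write per
// frame, using write-without-response to keep per-frame latency low.
func sendFrame(_ commands: [HapticCommand],
               to glove: CBPeripheral,
               over characteristic: CBCharacteristic) {
    var frame = Data()
    for command in commands {
        frame.append(command.packed)
    }
    glove.writeValue(frame, for: characteristic, type: .withoutResponse)
}

On the glove side, the MCU would unpack the same fifteen bytes each frame and drive the audio-haptic synthesis accordingly.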




Glove Design


The FeltSight glove consists of a palm section and a wrist-mounted module. The main glove is cast from soft, flesh-colored silicone, with a form and texture inspired by the sensory organ of the star-nosed mole. Each of the user’s fingers controls a radial tentacle of the glove, embodying the concept of spatial exploration modeled on the mole’s tactile anatomy. The flexibility of the silicone lets the fingers move and bend freely during interaction, ensuring a smooth haptic experience.

All circuitry is embedded within the silicone structure and wristband. To accommodate different hand sizes, we designed adjustable tracks to align the actuators with the user’s fingertips. The glove is secured on the dorsal side of the hand using transparent TPU straps, allowing the XR headset to perform uninterrupted skeletal tracking.




Industrial Design of Glove: The main glove is cast in soft silicone and shaped to resemble the star-nosed mole’s sensory appendages, with each finger aligned to a radial tactile “tentacle.” The glove encourages a characteristic exploratory gesture in which the two hands are held close at the wrists, with all fingers splayed outward. The haptic actuators and audio amplifier modules are encapsulated in the silicone body, while the battery and the Teensy microcontroller board are housed in the wristband unit.


XR Experience Design



The FeltSight XR interface runs on Apple Vision Pro. Unlike conventional AR systems that emphasize visual overlays, it intentionally reduces visual cues. By default, the user sees a completely black environment. Only when an exploratory hand gesture is performed does the system display dynamic point clouds of nearby objects within a 1-meter radius, based on depth sensing. These point clouds gradually fade out after the hand is withdrawn, simulating short-term perceptual memory and preventing reliance on continuous visual input.
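
A minimal sketch of that fade behavior, assuming a per-point opacity that refreshes while the hand is within the reveal radius and decays once it leaves; the names and the three-second decay are illustrative guesses, not the shipped implementation.

import simd

// One revealed depth point and its current visibility.
struct RevealedPoint {
    var position: SIMD3<Float>
    var opacity: Float
}

// Refresh opacity for points near the hand; let the rest decay to zero,
// simulating the short-term perceptual memory described above.
func updateReveal(_ points: inout [RevealedPoint],
                  handPosition: SIMD3<Float>,
                  revealRadius: Float,       // 0.5–1.0 m, set by palm angle
                  deltaTime: Float,
                  fadeDuration: Float = 3.0) {
    for i in points.indices {
        if simd_distance(points[i].position, handPosition) <= revealRadius {
            points[i].opacity = 1.0
        } else {
            points[i].opacity = max(0, points[i].opacity - deltaTime / fadeDuration)
        }
    }
    points.removeAll { $0.opacity == 0 }   // fully faded points are forgotten
}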

The system uses RGB image data and an AI-based recognition module to identify object types and material properties in the environment. It also leverages the headset’s hand-tracking functionality to obtain the position and orientation of the user’s fingertips, visualized as rays extending from the hands. When a ray intersects with the surface of an object highlighted by the point cloud, the point of contact is visualized as a colored anchor, directing the user’s attention to the fingertip. At that moment, the corresponding actuator on the glove is activated to provide a matching tactile texture, creating a real-time visual–haptic association at the remote point of contact.
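
The fingertip-ray query could look roughly like the sketch below: cast along each finger's pointing direction and keep the nearest revealed point inside a narrow cone. This brute-force search over the point cloud is an assumption for illustration; the actual system presumably raycasts against the reconstructed scene geometry.

import simd

// Find the contact anchor for one finger: the nearest revealed point within
// `maxDistance` that lies inside a narrow cone around the pointing direction.
// Returns nil when the finger points at nothing in range.
func contactAnchor(fingertip: SIMD3<Float>,
                   pointing: SIMD3<Float>,           // normalized direction
                   revealedPoints: [SIMD3<Float>],
                   maxDistance: Float,
                   coneCosine: Float = 0.98) -> SIMD3<Float>? {
    var best: (point: SIMD3<Float>, distance: Float)? = nil
    for point in revealedPoints {
        let offset = point - fingertip
        let distance = simd_length(offset)
        guard distance > 0, distance <= maxDistance else { continue }
        if simd_dot(offset / distance, pointing) >= coneCosine,
           distance < (best?.distance ?? .infinity) {
            best = (point, distance)
        }
    }
    return best?.point   // drives the colored anchor and the actuator trigger
}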



To simulate the feel of a finger sliding across a material's surface, we dynamically adjust the playback speed and volume of the texture audio: the speed of the finger movement controls the playback speed, and the distance between the finger and the object controls the volume.
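
Under assumed constants, that mapping can be as simple as the sketch below. The nominal stroke speed and the rate clamp are guesses; the one-meter range matches the sensing radius described earlier.

// Fingertip speed scales playback rate: 1.0 replays the recorded texture at
// its original speed; faster strokes replay it faster.
func playbackRate(fingerSpeed: Float,             // m/s
                  nominalSpeed: Float = 0.2) -> Float {
    min(max(fingerSpeed / nominalSpeed, 0.25), 4.0)
}

// Finger-to-surface distance scales volume: full at contact, silent at the
// edge of the sensing range.
func volume(distanceToSurface: Float,             // meters
            maxRange: Float = 1.0) -> Float {
    min(max(1.0 - distanceToSurface / maxRange, 0.0), 1.0)
}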



The visible range is modulated by the angle between the user’s palms: a wider angle yields a larger spherical visible field (up to 1 meter in diameter), while a narrower angle reduces it (down to 0.5 meters).
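
As a sketch, the mapping might linearly interpolate the diameter between the two bounds given above; the 20°–120° angle endpoints are assumptions.

// Interpolate the visible-sphere diameter from the angle between the palms:
// 0.5 m when the palms are nearly parallel, 1.0 m when fully splayed.
func visibleDiameter(palmAngleDegrees: Float) -> Float {
    let t = min(max((palmAngleDegrees - 20) / 100, 0), 1)
    return 0.5 + 0.5 * t   // meters
}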


Haptic Experience Design


Each fingertip of the glove is equipped with a compact vibrotactile actuator. When the user’s hand approaches an object and the fingers move, the system plays back a vibration pattern corresponding to the material, creating a detailed tactile sensation as if the user were touching the surface with their fingertips. This interaction extends the user’s tactile range from the skin of the fingertip to all objects within a distance of up to one meter from the hand. Our design also incorporates a dynamic perception modulation mechanism that tightly links the haptic feedback to finger speed and distance to the surface, in order to simulate natural fingertip touch.
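
A sketch of the per-material pattern selection implied here, combined with the speed and distance modulation above; the material taxonomy and constants are invented for illustration, not the published design.

// Hypothetical material classes produced by the recognizer.
enum Material: UInt8 {
    case bark, leaf, stone, soil
}

// Map a classified material plus finger motion into actuator settings:
// which texture pattern to play, how fast, and how strongly.
func hapticSettings(material: Material,
                    fingerSpeed: Float,            // m/s
                    distanceToSurface: Float)      // meters, up to 1.0
    -> (textureID: UInt8, rate: Float, amplitude: Float) {
    let rate = min(max(fingerSpeed / 0.2, 0.25), 4.0)
    let amplitude = min(max(1.0 - distanceToSurface, 0.0), 1.0)
    return (material.rawValue, rate, amplitude)
}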

By combining haptic actuation with finger movement, FeltSight promotes an embodied, active sensory experience: users must move their fingers to “reveal” tactile information, engaging the perception–action loop.

Pipeline for texture-dependent haptic feedback: (a) sound acquisition; (b) playback and haptic rendering.




Experience & Discussion


We invited ten participants (aged 22–45, with balanced gender representation) into a mixed-deciduous forest to experience FeltSight. Each session lasted approximately 20 minutes. During each session, we collected: (a) video recordings (both wide-angle and close-up) to capture participants' movement strategies, and (b) post-session interviews using semi-structured questions focused on perceived sensations and emotional states. Participants displayed a range of emergent behaviors that closely echoed our conceptual framework, unfolding as a journey of sensory rebuilding.



Sensory Rupture
Many participants reported that upon first donning the Vision Pro headset, the initial darkness made them feel a loss of direction and control over their surroundings. The system severely narrowed their field of view, rendering the environment nearly pitch-black. One of the most immediate shifts was the adoption of slow, exploratory movements. One user noted that he could only see objects immediately around his palm by reaching out with his hand—an interaction that made him feel as if he were walking through a dark cave with a torch. He likened this to the way a star-nosed mole explores its surroundings in its natural habitat. Participants eventually began bending over, relying on continuous ground feedback in a manner reminiscent of burrowing animals. These movements demonstrated an emergent mode of exploratory, relational, and multisensory interaction that resonates with Haraway's critique of detached observation.

Extended Body Schema
As participants gradually adapted to the touch-first paradigm, one participant felt his body extending into the environment. He could not see his own hands in his field of vision, yet the haptic gloves allowed him to feel the textures of distant objects. This made him feel as if his fingers had elongated; “it reminded me of the rubber hand illusion,” he said. Participants also deviated from linear paths. Their navigation became aimless, nonlinear, and organic, guided by shifting haptic sensations that invited detours toward unseen textures. This behavior aligns with Merleau-Ponty’s account of perception as an active interplay between the sensing body and its environment, and evokes Abram’s vision of a sensorially engaged, more-than-human mode of knowing (Abram 1997). One user described it as “letting the forest guide me with its breath.”

XR Meditative Wandering
One participant, an avid practitioner of walking meditation, pointed out that the FeltSight experience gave him a completely new understanding of his practice. This was his first attempt to interact with the environment primarily through his fingers rather than the soles of his feet. He noted that Vision Pro’s immersive visual experience created a “magical tranquil space.” The nearly pitch-black environment and the gentle, slowly dissipating particles created an atmosphere of focus. “I realized that perhaps our overly developed vision sometimes hinders us from entering a flow state,” he reflected.

Sensory Reprioritization and Body Mechanics
Another striking behavior involved crouched, burrow-like exploration. One participant shared: “It was like playing hide-and-seek, carefully exploring in the dark to find where the tree trunks were.” This posture—bent over and close to the ground—emerged repeatedly as participants relied on subtle ground-level haptic feedback. These altered body mechanics suggest a redistribution of sensory priority: as vision receded, touch took precedence, leading to movement patterns evocative of nonhuman perceptual modes.



Exhibition & Publication



[P1] Danlin Huang, Botao Amber Hu, Dong Zhang, Yifei Liu, Takatoshi Yoshida, Rem Rungu Lin. 2025. “Becoming Mole with ‘FeltSight’: Hyper-sensitizing the Surrounding through Mixed Reality Haptic Proximity Gloves.” In SIGGRAPH Asia 2025 (SA ’25), Art Paper.



  • IEEE VISAP 2025, Art Gallery, Vienna, Austria




  • SIGGRAPH Asia 2025, Art Paper, Hong Kong