
FeltSight 


2025
Danlin Huang, Botao Amber Hu, Dong Zhang, Yifei Liu, Takatoshi Yoshida, Rem Rungu Lin





Key Words


Star-nosed Mole    Augmented Human   More-than-Human Design   Haptic   Meditative Wandering   Mixed Reality



Concept


FeltSight is a mixed reality sensory-substitution experience that reimagines human perception through an alternative tactile umwelt, inspired by the star-nosed mole's unique tactile navigation. Users engage in meditative wandering guided by extended-range sensing with haptic feedback and subtle visual cues. The system features a pair of wearable haptic gloves with high-precision vibrotactile actuators on each finger, driven by audio-responsive patterns. These actuators re-sensitize the sense of touch and simulate material textures, enabling users to "feel" environmental surfaces as if their fingertips could tele-touch surrounding objects at a distance. The mixed reality headset provides a deliberately subtle "reduced reality," showing nearby objects as dynamic point clouds that serve as a "short-term perceptual memory," visualizing touched textures on surrounding surfaces. By shifting focus from visual to tactile perception, FeltSight challenges ocularcentric sensory hierarchies and elevates touch as a way to perceive the world. This approach invites users to experience the Umgebung—the world that exists but typically goes unperceived—through substituted perception, exemplifying tentacular thinking for more-than-human awareness.





Inspiration: 
The Star-Nosed Mole's Tactile Umwelt



The star-nosed mole (Condylura cristata) possesses one of the most specialized touch organs in the animal kingdom. Its defining feature is its "star"—a radially symmetrical arrangement of 22 mobile, fleshy appendages, or rays, surrounding its nostrils. Having lost most of its vision through its long subterranean existence, the mole explores its surroundings with the tentacle-like rays on its nose instead of its eyes.

This creature's unique sensory system provided the primary biological inspiration for the design and artistic intent of FeltSight. 






Vision dominates the hierarchy of the human sensory system—a bias so deeply embedded in our culture that it has a name, ocularcentrism: the privileging of visual perception as the primary pathway to knowledge. Donna Haraway's concept of tentacular thinking critiques this anthropocentric, vision-dominated worldview, advocating instead for forms of knowing that emerge through touching, feeling, and probing. When we zoom out to more-than-human perception, we see that other species construct rich perceptual environments through different sensory hierarchies—some prioritize touch. The star-nosed mole, for example, is functionally blind yet "sees" through touch: its unusual star-shaped nose contains over 100,000 nerve endings, forming an extremely sensitive tactile organ that builds the mole's mental map—and with it, its umwelt—through touch.

Inspired by such more-than-human perception, we introduce FeltSight: a mixed reality experience that reorders the senses by prioritizing touch over sight. Using a custom haptic glove paired with Apple Vision Pro, users can “feel” distant objects through subtle vibrational feedback. Visual information is intentionally minimal—objects are shown only as sparse LiDAR point clouds within a small sphere around the hand. This design shifts attention from eyes to skin, encouraging slow, tactile engagement with materials like tree bark, fabric, or stone.

By dampening visual dominance, FeltSight creates a meditative, awareness-rich form of interaction. It draws on posthuman theory and bio-inspired design to expand the human sensorium, blurring the boundaries between vision and touch, human and animal, technology and embodiment. The result is an embodied journey into a tactile umwelt—an often unnoticed layer of reality—inviting reflection on alternative modes of perception.



Glove Design

The FeltSight glove consists of a palm section and a wrist-mounted module. The main glove is cast from soft, flesh-colored silicone, with a form and texture inspired by the sensory organ of the star-nosed mole. Each of the user’s fingers controls a radial tentacle of the glove, embodying the concept of spatial exploration modeled on the mole’s tactile anatomy. The flexibility of the silicone allows the fingers to move and bend freely during interaction, ensuring a smooth haptic experience.

All circuitry is embedded within the silicone structure and wristband. To accommodate different hand sizes, we designed adjustable tracks to align the actuators with the user’s fingertips. The glove is secured on the dorsal side of the hand using transparent TPU straps, allowing the XR headset to perform uninterrupted skeletal tracking.






Industrial Design of Glove: The main glove is cast in soft silicone and shaped to resemble the star-nosed mole’s sensory appendages, with each finger aligned to a radial tactile “tentacle.” The glove maintains a specific exploratory gesture in which the two hands are held close at the wrists, with all fingers splayed outward. The haptic actuators and audio amplifier modules are encapsulated in the silicone body, while the battery, microcontroller, and Teensy board are housed in the wristband unit.



XR Experience Design


The FeltSight XR interface is implemented on Apple Vision Pro. Unlike conventional AR systems that emphasize visual overlays, it intentionally reduces visual cues. By default, the user sees a completely black environment. Only when an exploratory hand gesture is performed does the system display dynamic point clouds of nearby objects within a 1-meter radius, based on depth sensing. These point clouds gradually fade out after the hand is withdrawn, simulating short-term perceptual memory and preventing reliance on continuous visual input.
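The fade-out behavior described above can be sketched as a simple decay of each point's opacity after the hand withdraws. This is an illustrative model only: the fade duration and the exponential curve are assumptions, not values from the project.

```python
import math

FADE_SECONDS = 3.0  # assumed fade-out time constant; not specified by the project


def point_alpha(time_since_hand_left: float) -> float:
    """Opacity of a revealed point-cloud point.

    Points stay fully visible while the hand is present (t <= 0) and
    decay exponentially afterwards, mimicking the fading of a
    short-term perceptual memory.
    """
    if time_since_hand_left <= 0.0:
        return 1.0
    return math.exp(-time_since_hand_left / FADE_SECONDS)
```

A renderer would evaluate this per point each frame and cull points once their alpha drops below a visibility threshold.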

The system uses RGB image data and an AI-based recognition module to identify object types and material properties in the environment. It also leverages the headset’s hand-tracking functionality to obtain the position and orientation of the user’s fingertips, visualized as rays extending from the hands. When a ray intersects with the surface of an object highlighted by the point cloud, the point of contact is visualized as a colored anchor, directing the user’s attention to the fingertip. At that moment, the corresponding actuator on the glove is activated to provide a matching tactile texture, creating a real-time visual–haptic association at the remote point of contact.
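The ray-to-surface contact test described above can be illustrated with a minimal sketch: find the nearest point-cloud point lying close to a fingertip ray, within the one-meter tele-touch reach. The perpendicular tolerance value and the brute-force search are assumptions for illustration; a real system would use the headset's spatial APIs and an accelerated structure.

```python
import math

REACH = 1.0  # maximum tele-touch distance in meters (from the text)


def first_contact(origin, direction, points, tolerance=0.02):
    """Index of the nearest point the fingertip ray 'touches', or None.

    origin, direction: fingertip position and orientation (3-tuples).
    points: list of (x, y, z) LiDAR points. A point counts as hit when
    it lies within `tolerance` meters of the ray and within REACH of
    the hand. `tolerance` is an assumed value.
    """
    ox, oy, oz = origin
    norm = math.sqrt(sum(c * c for c in direction))
    dx, dy, dz = (c / norm for c in direction)
    best, best_t = None, math.inf
    for i, (px, py, pz) in enumerate(points):
        rx, ry, rz = px - ox, py - oy, pz - oz
        t = rx * dx + ry * dy + rz * dz  # distance along the ray
        if not 0.0 < t < REACH:
            continue
        # perpendicular distance from the point to the ray
        perp = math.sqrt((rx - t * dx) ** 2 + (ry - t * dy) ** 2 + (rz - t * dz) ** 2)
        if perp < tolerance and t < best_t:
            best, best_t = i, t
    return best
```

On a hit, the system would place the colored anchor at the contact point and drive the corresponding finger's actuator.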



To simulate the feel of a finger sliding across a material's surface, we dynamically adjust the playback speed and volume of the texture audio: the speed of the finger's movement controls the playback speed, and the distance between the finger and the object controls the volume.
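The two mappings above can be sketched as a small function. The linear mappings and the reference constants (`max_speed`, and 1 m as the volume falloff range) are assumptions for illustration; the text specifies only which input controls which parameter.

```python
def audio_params(finger_speed, finger_distance,
                 max_speed=0.5, max_distance=1.0):
    """Map finger motion to texture-audio playback parameters.

    finger_speed: fingertip speed in m/s; faster sliding plays the
    recorded texture faster (capped at nominal rate 1.0).
    finger_distance: fingertip-to-object distance in meters; volume
    falls off linearly, reaching zero at max_distance.
    All constants are assumed values, not from the project.
    """
    rate = min(finger_speed / max_speed, 1.0)          # 0..1 playback rate
    volume = max(0.0, 1.0 - finger_distance / max_distance)  # 0..1 gain
    return rate, volume
```

Each frame, the audio engine would apply `rate` and `volume` to the material's looping texture recording before it drives the vibrotactile actuator.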



The visible range is modulated by the angle between the user’s palms: a wider angle yields a larger spherical visible field (up to 1 meter in diameter), while a narrower angle reduces it (down to 0.5 meters).
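This palm-angle control can be sketched as a clamped linear interpolation between the two diameters given above. The angle bounds and the linear mapping are assumptions; the text specifies only the 0.5 m and 1 m endpoints.

```python
def visible_diameter(palm_angle_deg,
                     min_deg=0.0, max_deg=180.0,
                     min_d=0.5, max_d=1.0):
    """Diameter (m) of the spherical visible field for a palm angle.

    A wider angle between the palms opens the field up to 1 m;
    a narrower angle shrinks it down to 0.5 m. The angle range and
    linear interpolation are assumed for illustration.
    """
    t = (palm_angle_deg - min_deg) / (max_deg - min_deg)
    t = max(0.0, min(1.0, t))  # clamp to the valid range
    return min_d + t * (max_d - min_d)
```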




Haptic Experience Design


Each fingertip of the glove is equipped with a compact vibrotactile actuator. When the user’s hand approaches an object and the fingers move, the system plays back a vibration pattern corresponding to the material, creating a detailed tactile sensation as if the user were touching the surface with their fingertips. This interaction extends the user’s tactile range from the skin of the fingertip to all objects within one meter of the hand. Our design also incorporates a dynamic perception modulation mechanism that tightly couples the haptic feedback to finger speed and fingertip–object distance, in order to simulate natural fingertip touch.

By combining haptic actuation with finger movement, FeltSight promotes an embodied, active sensory experience: users must move their fingers to “reveal” tactile information, engaging the perception–action loop.

Pipeline for texture-dependent haptic feedback: a) Sound acquisition. b) Playback and haptic rendering.




Experience and Discussion


We invited ten participants (aged 22–45, with balanced gender representation) into a mixed-deciduous forest to experience FeltSight. Each session lasted approximately 20 minutes, during which we collected: (a) video recordings (both wide-angle and close-up) to capture participants' movement strategies, and (b) post-session interviews using semi-structured questions focused on perceived sensations and emotional states. Participants displayed a range of emergent behaviors that closely echoed the theoretical concepts outlined in our conceptual framework. These behaviors unfolded as a journey of sensory rebuilding.



Sensory Rupture
Many participants reported that upon first donning the Vision Pro headset, the initial darkness made them feel a loss of direction and control over their surroundings. The system severely narrowed their field of view, rendering the environment nearly pitch-black. One of the most immediate shifts was the adoption of slow, exploratory movements. One user noted that he could only see objects immediately around his palm by reaching out with his hand—an interaction that made him feel as if he were walking through a dark cave with a torch. He likened this to the way a star-nosed mole explores its surroundings in its natural habitat. Participants eventually began bending over, relying on continuous ground feedback in a manner reminiscent of burrowing animals. These movements demonstrated an emergent mode of exploratory, relational, and multisensory interaction that resonates with Haraway's critique of detached observation.

Extended Body Schema
As participants gradually adapted to the touch-first paradigm, one participant felt his body extending into the environment. He could not see his own hands in his field of vision, yet the haptic gloves allowed him to feel the textures of distant objects. This made him feel as if his fingers had elongated—“it reminded me of the rubber hand illusion,” he said. Participants also deviated from linear paths. Their navigation became aimless, nonlinear, and organic—guided by shifting haptic sensations that invited detours toward unseen textures. This behavior aligns with Merleau-Ponty’s account of perception as an active interplay between the sensing body and its environment, and evokes Abram’s vision of a sensorially engaged, more-than-human mode of knowing (Abram, The Spell of the Sensuous, 1997). One user described it as “letting the forest guide me with its breath.”

XR Meditative Wandering
One participant, an avid practitioner of walking meditation, pointed out that the FeltSight experience gave him a completely new understanding of his practice. This was his first attempt to interact with the environment primarily through his fingers rather than the soles of his feet during walking meditation. He noted that Vision Pro’s immersive visual experience created a “magical tranquil space.” The nearly pitch-black environment and the gentle, slowly dissipating particles created an atmosphere of focus. “I realized that perhaps our overly developed vision sometimes hinders us from entering a flow state,” he reflected.

Sensory Reprioritization and Body Mechanics
Another striking behavior involved crouched, burrow-like exploration. One participant shared: “It was like playing hide-and-seek, carefully exploring in the dark to find where the tree trunks were.” This posture—bent over and close to the ground—emerged repeatedly as participants relied on subtle ground-level haptic feedback. These altered body mechanics suggest a redistribution of sensory priority: as vision receded, touch took precedence, leading to movement patterns evocative of nonhuman perceptual modes.



Exhibition & Publication


IEEE VISAP 2025, Art Gallery. Vienna, Austria.

[P1] Danlin Huang, Botao Amber Hu, Dong Zhang, Yifei Liu, Takatoshi Yoshida, Rem Rungu Lin. 2025. “Becoming Mole with ‘FeltSight’: Hyper-sensitizing the Surrounding through Mixed Reality Haptic Proximity Gloves.” In SIGGRAPH Asia 2025 (SA '25). Art Paper. [To Appear]