FeltSight
2025
Danlin Huang, Botao Amber Hu, Dong Zhang, Yifei Liu, Takatoshi Yoshida, Rem Rungu Lin
Keywords
Star-nosed Mole, Augmented Human, More-than-Human Design, Haptic, Meditative Wandering, Mixed Reality
Concept
FeltSight is a mixed reality haptic experience that reimagines human perception by drawing inspiration from the tactile navigation of the star-nosed mole. Moving beyond traditional, vision-dominated interaction paradigms, FeltSight enables users to engage in meditative wandering guided by extended-range haptics with subtle visual cues. The system comprises a wearable haptic glove paired with an extended reality interface. As users reach toward objects in their environment, the glove's vibration actuators, driven by audio-responsive patterns, simulate material textures, producing a sensation akin to touch. Meanwhile, the mixed reality interface offers a deliberately "reduced reality", presenting nearby objects as dynamic point clouds that materialize only in response to exploratory hand gestures. By shifting perceptual focus from the visual to the tactile, FeltSight challenges ocularcentric sensory hierarchies and foregrounds an embodied, relational, and more-than-human mode of sensing.
Inspiration: The Star-Nosed Mole's Tactile Umwelt
The star-nosed mole (Condylura cristata) possesses one of the most specialized touch organs in the animal kingdom. Its defining feature is the "star": a radially symmetrical arrangement of 22 mobile, fleshy appendages, or rays, surrounding its nostrils. Having become functionally blind over its long subterranean existence, the mole uses these nasal appendages instead of vision to explore its surroundings.
This creature's unique sensory system provided the primary biological inspiration for the design and artistic intent of FeltSight.
Vision sits at the top of the human sensory hierarchy, a bias so deeply embedded in our culture that it has a name: ocularcentrism, the privileging of visual perception as the primary pathway to knowledge. Donna Haraway's concept of tentacular thinking critiques this anthropocentric, vision-dominated worldview, advocating instead for forms of knowing that emerge through touching, feeling, and probing. Zooming out to more-than-human perception, we see that other species construct rich perceptual environments through different sensory hierarchies, and some prioritize touch. The star-nosed mole, for example, is functionally blind yet "sees" through touch: its unusual star-shaped nose contains over 100,000 nerve endings, forming an extremely sensitive tactile organ through which it constructs its mental map, its umwelt.
Inspired by such more-than-human perception, we introduce FeltSight: a mixed reality experience that reorders the senses by prioritizing touch over sight. Using a custom haptic glove paired with Apple Vision Pro, users can “feel” distant objects through subtle vibrational feedback. Visual information is intentionally minimal—objects are shown only as sparse LiDAR point clouds within a small sphere around the hand. This design shifts attention from eyes to skin, encouraging slow, tactile engagement with materials like tree bark, fabric, or stone.
By dampening visual dominance, FeltSight creates a meditative, awareness-rich form of interaction. It draws on posthuman theory and bio-inspired design to expand the human sensorium, blurring the boundaries between vision and touch, human and animal, technology and embodiment. The result is an embodied journey into a tactile umwelt—an often unnoticed layer of reality—inviting reflection on alternative modes of perception.
Glove Design
The FeltSight glove consists of a palm section and a wrist-mounted module. The main glove is cast from soft, flesh-colored silicone, with a form and texture inspired by the sensory organ of the star-nosed mole. Each of the user’s fingers controls a radial tentacle of the glove, embodying spatial exploration modeled on the mole’s tactile anatomy. The flexibility of the silicone allows the fingers to move and bend freely during interaction, ensuring a smooth haptic experience.
All circuitry is embedded within the silicone structure and wristband. To accommodate different hand sizes, we designed adjustable tracks to align the actuators with the user’s fingertips. The glove is secured on the dorsal side of the hand using transparent TPU straps, allowing the XR headset to perform uninterrupted skeletal tracking.
Industrial design of the glove: The main glove is cast in soft silicone and shaped to resemble the star-nosed mole’s sensory appendages, with each finger aligned to a radial tactile “tentacle.” The glove maintains a specific exploratory gesture in which the two hands are held close together at the wrists, with all fingers splayed outward. The haptic actuators and audio amplifier modules are encapsulated in the silicone body, while the battery and the Teensy microcontroller board are housed in the wristband unit.
XR Experience Design
The FeltSight XR interface is implemented on Apple Vision Pro. Unlike conventional AR systems that emphasize visual overlays, it intentionally reduces visual cues. By default, the user sees a completely black environment. Only when an exploratory hand gesture is performed does the system display dynamic point clouds of nearby objects within a 1-meter radius, based on depth sensing. These point clouds gradually fade out after the hand is withdrawn, simulating short-term perceptual memory and preventing reliance on continuous visual input.
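To make this concrete, the fade-out can be modeled as a per-point opacity that decays once the hand moves away. The Swift sketch below is illustrative rather than our production code: the EphemeralPoint type and the three-second fade duration are assumptions, and only the roughly one-meter reveal radius comes from the design described above.

import Foundation
import simd

// Sketch: gesture-gated point visibility with decay, imitating
// short-term perceptual memory. Names and fade time are assumed.
struct EphemeralPoint {
    var position: SIMD3<Float>
    var lastRevealed: TimeInterval   // when the hand last swept over this point
}

struct PerceptualMemory {
    var fadeDuration: TimeInterval = 3.0   // assumed decay time
    var revealRadius: Float = 1.0          // meters, per the description above

    // Opacity in [0, 1] for one point at the current time.
    func opacity(of point: EphemeralPoint, now: TimeInterval) -> Float {
        let age = now - point.lastRevealed
        return Float(max(0, 1 - age / fadeDuration))
    }

    // Mark points near the hand as freshly revealed.
    func reveal(points: inout [EphemeralPoint],
                handPosition: SIMD3<Float>,
                now: TimeInterval) {
        for i in points.indices
        where simd_distance(points[i].position, handPosition) <= revealRadius {
            points[i].lastRevealed = now
        }
    }
}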
The system uses RGB image data and an AI-based recognition module to identify object types and material properties in the environment. It also leverages the headset’s hand-tracking functionality to obtain the position and orientation of the user’s fingertips, visualized as rays extending from the hands. When a ray intersects with the surface of an object highlighted by the point cloud, the point of contact is visualized as a colored anchor, directing the user’s attention to the fingertip. At that moment, the corresponding actuator on the glove is activated to provide a matching tactile texture, creating a real-time visual–haptic association at the remote point of contact.
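The per-finger logic can be sketched as a raycast followed by actuator selection. In the Swift fragment below, SceneQuery and HapticGlove are hypothetical stand-ins for the headset’s scene reconstruction and the glove’s communication link, not real APIs, and the material set is illustrative.

import simd

enum Material { case bark, fabric, stone, unknown }

protocol SceneQuery {
    // Nearest surface hit along a ray, if any.
    func raycast(origin: SIMD3<Float>,
                 direction: SIMD3<Float>) -> (point: SIMD3<Float>, material: Material)?
}

protocol HapticGlove {
    // Start the vibration pattern for a material on one finger channel (0...4).
    func play(material: Material, onFinger finger: Int)
    func stop(finger: Int)
}

// Returns the contact point to visualize as the colored anchor, or nil.
func updateFinger(_ finger: Int,
                  tipPosition: SIMD3<Float>,
                  tipDirection: SIMD3<Float>,
                  scene: SceneQuery,
                  glove: HapticGlove) -> SIMD3<Float>? {
    guard let hit = scene.raycast(origin: tipPosition,
                                  direction: simd_normalize(tipDirection)) else {
        glove.stop(finger: finger)   // ray misses: no anchor, no vibration
        return nil
    }
    glove.play(material: hit.material, onFinger: finger)   // matching tactile texture
    return hit.point
}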
To simulate the feel of a finger sliding across a material’s surface, we dynamically adjust the playback speed and volume of the audio: the speed of the finger movement controls the audio playback speed, and the distance between the finger and the object controls the audio volume. The visible range is modulated by the angle between the user’s palms: a wider angle yields a larger spherical visible field (up to 1 meter in diameter), while a narrower angle reduces it (down to 0.5 meters).
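These mappings reduce to three simple transfer functions. In the Swift sketch below, the linear forms, clamp bounds, and reference speed are illustrative assumptions; only the endpoints stated above (a 0.5 m to 1 m visible field, speed-driven playback rate, distance-driven volume) come from the design.

import Foundation

// Sketch of the perception modulation rules; all constants are assumptions.
struct PerceptionModulation {
    // Playback rate scales with fingertip speed (m/s); 1.0 = normal speed.
    func playbackRate(fingerSpeed: Float, referenceSpeed: Float = 0.2) -> Float {
        min(max(fingerSpeed / referenceSpeed, 0.25), 4.0)   // assumed clamp range
    }

    // Volume falls off with finger-object distance (meters) out to 1 m.
    func volume(distance: Float, maxRange: Float = 1.0) -> Float {
        max(0, 1 - distance / maxRange)
    }

    // Visible-field diameter interpolates between 0.5 m and 1 m with palm angle.
    // palmAngle is in radians, assumed to span 0...pi as the palms open outward.
    func visibleDiameter(palmAngle: Float) -> Float {
        let t = min(max(palmAngle / .pi, 0), 1)
        return 0.5 + 0.5 * t
    }
}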
Haptic Experience Design
Each fingertip of the glove is equipped with a compact vibrotactile actuator. When the user’s hand approaches an object and the fingers move, the system plays back a vibration pattern corresponding to the material, creating a detailed tactile sensation as if the user were touching the surface with their fingertips. This interaction extends the user’s tactile range from the skin of the fingertip to all objects within one meter of the hand. Our design also incorporates a dynamic perception modulation mechanism that tightly links the haptic feedback to fingertip speed and finger–object distance, in order to simulate natural fingertip touch.
By combining haptic actuation with finger movement, FeltSight promotes an embodied, active sensory experience: users must move their fingers to “reveal” tactile information, engaging the perception–action loop.
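This gating can be captured in a single amplitude function: feedback is nonzero only while the fingertip moves, and it strengthens as the finger nears the surface. In the Swift sketch below, the movement threshold is an illustrative assumption; only the one-meter range comes from the description above.

// Sketch: perception-action gating of haptic amplitude. A still hand
// feels nothing; active exploration reveals the texture.
func hapticAmplitude(fingerSpeed: Float,          // m/s, from hand tracking
                     distanceToSurface: Float,    // meters, fingertip to contact point
                     maxRange: Float = 1.0,       // extended touch range, per the text
                     movementThreshold: Float = 0.02) -> Float {
    guard fingerSpeed > movementThreshold else { return 0 }
    return max(0, 1 - distanceToSurface / maxRange)
}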
Pipeline for texture-dependent haptic feedback: a) sound acquisition; b) playback and haptic rendering.