
HAPTIC TENTACLES

Haptic Tentacles is a vibrotactile XR interface that allows users to proximally and dimensionally interact with virtual objects.
(A Work in Progress)

What is HAPTIC TENTACLES?

Haptic Tentacles is a touch-centric alternative controller interface that uses vibrotactile haptic feedback and ray casting within virtual reality (VR) and augmented reality (AR) simulations to present non-ocular-centric ways for users to experience and interact with virtual objects both proximally and dimensionally. The system prioritizes interaction at a distance, which distinguishes it from other haptic devices, such as haptic gloves, that are typically built for direct, contact-based interaction. In addition, the Haptic Tentacles system deliberately withholds a screen-based interface and visually rendered content in order to prioritize the user's haptic perception over the other senses. Lastly, and importantly, the Haptic Tentacles system takes inspiration from, critiques, and reimagines the “virtual cane” assistive device produced through sensory substitution research.

Ray Casting, Proximity, and Haptic Feedback

Haptic Tentacles was designed and built in the Unity Engine and runs natively on a Meta Quest 3 headset. Unity’s XR Interaction Toolkit, which was used to build the virtual environments, allows controllers to haptically “detect” GameObjects in a given scene by default: when a controller’s ray caster “hovers” over a GameObject, the controller produces a palpable vibrotactile “bump” that haptically corroborates the event. Haptic Tentacles uses ray casting to measure the distance between the controller and virtual objects in a 3D environment, and that measurement drives the haptic feedback in the hand controllers. When the ray cast comes into contact with a customized “haptic material” (described below) on a virtual object, it produces vibrotactile feedback. Each Meta Quest 3 hand controller employs three haptic actuators to produce this feedback: one Linear Resonant Actuator (LRA) on the thumb rest, a second LRA on the trigger button, and one Voice Coil Motor (VCM) on the grip. The intensity of the haptic feedback increases as the distance between controller and virtual object decreases, and falls off as that distance grows. The far limit of the ray cast range can be set to any distance but defaults to two meters (approximately six and a half feet).
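To make the mechanism concrete, here is a minimal sketch of how distance-scaled haptics can be wired up in Unity. The class name, fields, and structure are illustrative assumptions rather than the project's actual code; only the raycasting and haptic-impulse calls are standard Unity APIs.

    using UnityEngine;
    using UnityEngine.XR;

    // Illustrative sketch: vibration strength scales inversely with the
    // distance a controller's ray travels before hitting a virtual object.
    public class HapticProximityScanner : MonoBehaviour
    {
        [SerializeField] private float maxRange = 2f;        // far limit of the ray (project default: two meters)
        [SerializeField] private float pulseDuration = 0.05f;
        private InputDevice controller;

        void Start()
        {
            // Grab the right-hand controller; the left hand would mirror this.
            controller = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        }

        void Update()
        {
            // Cast a ray forward from the controller's pose into the scene.
            if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxRange))
            {
                // Amplitude approaches 1 at contact and falls to 0 at maxRange.
                float amplitude = 1f - (hit.distance / maxRange);
                controller.SendHapticImpulse(0, amplitude, pulseDuration);
            }
        }
    }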

A Custom Haptic "Material"

Unity “Materials” are extremely useful in XR applications; typically, they define an object's visual appearance. A custom “haptic material” repurposes this concept as touch-centric rather than ocular-centric, and it provides a key component of the system's haptic functionality. A haptic material can be configured with a particular haptic pulse or pattern, stored, and then applied to any digital object in the simulation. In this way, different GameObjects can produce different types of haptic feedback when the ray cast from the controller collides with them, which matters whenever a scene contains more than one digital object. The feature is especially useful in the VR courses and mazes, where distinct haptic patterns can be applied to the floor, the walls, and the stairs. In addition, some of the courses contain objects that can be “found” or located; these give off numeric pulses as a wayfinding cue.
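As a rough sketch of how such an asset might be structured, the ScriptableObject below stores a reusable pulse pattern that can be authored once and assigned to any GameObject, much as a visual Material is. All names here are hypothetical; the project's actual implementation may differ.

    using UnityEngine;

    // Hypothetical "haptic material" asset: a stored, reusable pulse pattern.
    [CreateAssetMenu(menuName = "Haptics/Haptic Material")]
    public class HapticMaterial : ScriptableObject
    {
        [Range(0f, 1f)] public float amplitude = 0.5f;  // pulse strength
        public float pulseDuration = 0.1f;              // seconds per pulse
        public float pulseInterval = 0.2f;              // gap between pulses
        public int pulseCount = 1;                      // e.g. numeric pulses for wayfinding objects
    }

    // Attached to a floor, wall, or stair object so the ray cast handler
    // can look up which pattern to play on hover.
    public class HapticSurface : MonoBehaviour
    {
        public HapticMaterial hapticMaterial;
    }

A controller script like the scanner sketched earlier could then call hit.collider.GetComponent<HapticSurface>() and play that surface's pattern instead of a flat pulse.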

Participation: A Haptic-Centered Experience

I designed the first round of simulation experiments to foreground the haptic stimulation. This round included only sighted participants, who were blinded during the simulation. I ran these tests first in order to observe users' reactions when the haptic interface was foregrounded in their subjective sensory experience.


In the fifteen scenarios where participants were blinded, they generally expressed some form of disorientation, along with stress and, in some cases, fear. The lack of environmental information beyond what came through the haptic interface drew the most complaints. Two layers burdened their task: learning to interpret the medium of haptic feedback through their sense of touch, and then trying to finish the task while adapting to the burden of that interpretation. More explicitly, what the haptics told them about their environment differed sharply from their ordinary sense of touch, forcing them to translate between the wave input of the haptic feedback and the pressure experience of immediate touch. Each participant would “tell a story” about their experience, typically laced with metaphors and emotional descriptions. For example, several participants used the word “tentacle” to describe what it felt like to “touch” a virtual object in a vibrotactile way over distance (which is why I eventually included the metaphor in the project's title), an interesting example of Won, Lanier, and Bailenson's concept of “Ipsimodal Remapping” (discussed later).


Some participants wanted additional information vectors, such as sound, to complement the haptic feedback. Others were interested in the ability to “grab” a virtual object with one hand and scan it with the other controller for more control over how they conducted the task. Of the four participants who attempted the VR course simulation, only one was actually able to complete it. Here there seemed to be a rather significant difference between participants who had previous experience with XR media and those who did not.


The unblinded scenarios differed significantly. These users showed far less cognitive dissonance and stress while attempting to interact with and identify the virtual objects. In most unblinded scenarios the participants approached the tasks more lightheartedly and with a positive attitude, indicated through facial expressions, excited movement, and emotive performance. Across fifteen scenarios, approximately two thirds of the participants were able to confidently locate and dimensionally identify the virtual objects. In the AR object-identification tasks, users moved around the room on their own and seemed to handle the controllers with more confidence while locating and identifying the virtual objects. Judging from their demeanor, their interactions, and the overall differences between blinded and unblinded scenarios, unblinded participants seemed more interested in and more willing to treat the Haptic Tentacles experience as a kind of game, making sense of it through play.

Training vs. First-Time Use

Generally, participants with previous XR experience found the simulation more intuitive than those without, especially during the blinded tests. However, all participants noticeably adapted to the Haptic Tentacles system over time, which suggests the experience could be normalized given enough time with the interface. The unblinded tests revealed that even when sighted participants could not see the object, having the normal use of their eyes significantly enhanced their mood, emotional outlook, confidence, positive curiosity, willingness to see the task through, and mental “modeling” abilities.


Vibrotactile Overstimulation

In about a quarter of the scenarios (both VR and AR), participants complained of tactile overstimulation caused by extended exposure to “intense vibrations” in the hands. Because haptic vibration is the privileged vector for transmitting information, users felt compelled to keep the controllers engaged to maintain a steady stream of environmental information. One user related that when they turned off the controller's ray casters, it was like they “went blind in a way that wasn't like losing sight, but another kind of blindness.” The tension between the need to find and identify the virtual object (AR) or navigate the course (VR) and the need to soothe their hands from vibrational overstimulation produced a steadily escalating fatigue feedback loop.


Next Steps...

I am currently building the next version of this experience, which will function strictly in an AR context. VR is a great medium with a lot of flexibility. For this project, however, the ability to place a digital object at a specific location in physical space, making the boundaries of material space the container for the virtual object, is important. Though the system is built for participants to have a unique, non-ocular, touch-centric experience of digital objects, the physical space plays a key role: it is the cradle in which the virtual and the material coalesce. Because haptic experience is prioritized, the embodied state or experience is prioritized as well. Normative VR typically attempts to replace the material with the virtual through visual prioritization, and in normative bodies this imposes a sensory hierarchy that directs the experience in specific ways. AR, by contrast, considers, includes, and utilizes the material space. The virtual emphasis has served its purpose as both a testing ground and a production platform; moving into AR evolves the experience.
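For the AR placement described above, one plausible approach (assuming Meta's XR Core SDK, whose OVRSpatialAnchor component pins a GameObject to a pose in the physical room) might look like the sketch below; the class, field, and method names are illustrative, not the project's actual code.

    using UnityEngine;

    // Illustrative sketch: fixing a haptically scannable object to a spot
    // in the physical room, so material space contains the virtual object.
    public class AnchoredHapticObject : MonoBehaviour
    {
        public GameObject hapticObjectPrefab;  // prefab carrying a haptic material/surface

        // Call with a real-world pose, e.g. from a controller ray against the room mesh.
        public void PlaceAt(Vector3 position, Quaternion rotation)
        {
            GameObject obj = Instantiate(hapticObjectPrefab, position, rotation);
            obj.AddComponent<OVRSpatialAnchor>();  // anchors the pose in physical space
        }
    }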

 

More to come...


© 2025 Adam Wright
