Exploring How Haptic Feedback Can Make Guidance and Navigation in Virtual Reality Accessible for Users with Visual Impairments

The current reliance on visual feedback in virtual reality head-mounted display technologies can make the hardware inaccessible for users with visual impairments. Existing assistive technology devices are largely incompatible with virtual reality hardware, primarily because of the bespoke three-dimensional interactions designed for virtual reality experiences. As a result, users with visual impairments feel excluded from the technology, and our preliminary research highlights that commercial developers currently offer few options to tailor experiences to the individual needs of users.

Our project is studying how vibrations from a wearable vest can help guide users through virtual worlds when they cannot rely on vision for navigation. We are exploring the degree to which users with visual impairments can be successfully guided to haptic-only targets, with directional cues and proximity-based vibration intensity leading them to targets placed to either side of the user as well as at a distance, in a fully three-dimensional haptic-only virtual world. We are also exploring how these body-worn haptic cues can provide clarity during artificial movement in virtual worlds, such as teleportation or handheld controller joystick locomotion, potentially reducing the veering that visually impaired users currently experience in virtual reality.
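As a simple illustration of how this kind of guidance could work (a hypothetical sketch only, not the project's implementation; the motor layout, guidance range, and function names are assumptions), the code below selects the vest motor closest to a target's bearing from the wearer and scales vibration intensity with proximity:

```python
import math

# Illustrative sketch: map a target's position relative to the wearer to
# (a) which vest motor to pulse (direction cue) and (b) how strongly
# (proximity cue). Motor count, guidance range, and scaling are assumed values.

NUM_MOTORS = 8            # motors assumed evenly spaced around the torso
MAX_GUIDANCE_RANGE = 5.0  # metres beyond which no vibration is felt (assumed)

def direction_to_motor(user_pos, user_yaw_deg, target_pos):
    """Pick the vest motor closest to the target's bearing from the user."""
    dx = target_pos[0] - user_pos[0]
    dz = target_pos[2] - user_pos[2]
    # Bearing of the target relative to the direction the user is facing.
    bearing = (math.degrees(math.atan2(dx, dz)) - user_yaw_deg) % 360.0
    return int(round(bearing / (360.0 / NUM_MOTORS))) % NUM_MOTORS

def proximity_to_intensity(user_pos, target_pos):
    """Scale vibration intensity (0..1) up as the user approaches the target."""
    dist = math.dist(user_pos, target_pos)
    if dist >= MAX_GUIDANCE_RANGE:
        return 0.0
    return 1.0 - (dist / MAX_GUIDANCE_RANGE)

# Example: user at the origin facing +z, target 2 m ahead and slightly to the right.
user, yaw, target = (0.0, 0.0, 0.0), 0.0, (1.0, 0.0, 2.0)
print(direction_to_motor(user, yaw, target), round(proximity_to_intensity(user, target), 2))
```

In practice, the number of motors, the intensity curve, and the guidance range would depend on the specific vest hardware and would need to be tuned with users.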

Project Aims:

  • Analysing the current level of accessibility in mainstream virtual reality experiences, including the options available to tailor navigation to individual needs.
  • Assessing whether haptic cues from a wearable vest can be used to direct users to specific points in a virtual environment without any visual or audio feedback.
  • Understanding user performance with, and preferences for, different haptic feedback patterns.
  • Exploring the degree to which body-worn haptic feedback can both direct users with visual impairments to targets and reduce veering during the most popular virtual reality movement methods.

Project Team:

Craig Anderton – Project Lead – PhD Candidate

Dr Arthur Theil – Director of Studies 

Prof Chris Creed – Second Supervisor

Dr Sayan Sarcar – Second Supervisor 

Project Impacts:

This project addresses the concerns of visually impaired users that mainstream virtual reality experiences do not accommodate their individual requirements. Our preliminary published results reinforce these concerns, highlighting that users are largely unable to tailor applications to their sensory needs, with few applications providing customisation options or haptic feedback controls during movement. Our research will establish best-practice guidelines for the design of accessible haptic cues for commercially available wearable haptic vests. Distributing these guidelines to industry practitioners will additionally help make guidance and navigation tasks in mainstream virtual reality more accessible for visually impaired users. This research aims to break down barriers in virtual reality for people with visual impairments, allowing them to navigate virtual environments successfully.

Funder:

This PhD project is funded through the Computing, Engineering and the Built Environment Dean's Scholarship.