Daydream Labs: Accessibility in VR


Virtual reality offers the ability to explore new worlds and have adventures without leaving home. We love the sense of freedom that VR offers, but it’s a technology that still relies mainly on visual cues—which makes it inaccessible to people with visual impairments. To bring these incredible experiences to visually impaired people, the technology needs to offer new tools. So we’ve been exploring how spatial audio cues can be used for navigating and interacting with virtual environments.

Compared to VR, other technologies offer more tools to help visually impaired people. One example is the alternative text found on website images; another is Google TalkBack, which adds spoken, audible and haptic feedback to help visually impaired people interact with their devices. With these technologies as inspiration, we created an audio tool aimed at making VR more accessible.

Video: Accessibility in VR (10:25)

Using an HTC Vive, we built a prototype of a 1:1 scale virtual room, recorded the name of every object in the room, and linked these audio labels to the individual objects—including the floor, walls and other features. Then, we made the user’s field of vision entirely black to simulate complete blindness. To enable navigation in the pitch-black room, we created a 3D audio laser system that includes a laser pointer extending from the Vive controller to select and play the audio labels, and an audio location control (touchpad click) to provide distance and direction to the last object aimed at by the laser pointer.
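
For readers curious how the selection step might be wired up, here is a minimal sketch of the audio-label logic. It is not the actual Unity/Vive implementation; the object fields, the bounding-sphere hit test and the play_audio stub are illustrative assumptions.

```python
# Minimal illustrative sketch of audio-label selection (not the actual
# Unity/SteamVR implementation). Object names, the sphere proxy, and the
# play_audio stub are assumptions for demonstration only.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class LabeledObject:
    name: str          # e.g. "window", "floor", "toy laser gun"
    position: Vec3     # object centre in room coordinates (metres)
    radius: float      # bounding-sphere proxy for ray hit testing
    audio_label: str   # recorded name of the object, e.g. "labels/window.wav"

def play_audio(clip: str) -> None:
    """Stand-in for the audio engine; a real system would play the clip."""
    print(f"playing {clip}")

def ray_sphere_distance(origin: Vec3, direction: Vec3,
                        obj: LabeledObject) -> Optional[float]:
    """Distance along a unit-length ray to obj's bounding sphere, or None if missed."""
    to_obj = [obj.position[i] - origin[i] for i in range(3)]
    proj = sum(to_obj[i] * direction[i] for i in range(3))
    closest_sq = sum(c * c for c in to_obj) - proj * proj
    if proj < 0 or closest_sq > obj.radius ** 2:
        return None
    return proj

def select_with_laser(origin: Vec3, direction: Vec3,
                      scene: List[LabeledObject]) -> Optional[LabeledObject]:
    """Pick the nearest labelled object along the controller's laser pointer
    and speak its recorded audio label."""
    hits = []
    for obj in scene:
        dist = ray_sphere_distance(origin, direction, obj)
        if dist is not None:
            hits.append((dist, obj))
    if not hits:
        return None
    _, target = min(hits, key=lambda h: h[0])
    play_audio(target.audio_label)
    return target
```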


When a person aims the laser pointer at a virtual object and selects the audio location control, the VR system plays a short impulse response tone at the location of the controller. The sound is then played a few more times as it quickly progresses toward the location of the virtual object. Because all audio is processed using the Google VR Spatial Audio plugin, each tone provides enough information to understand the distance and relative location of the object in the virtual space.
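
As a rough illustration of that cue, the sketch below steps a spatialized impulse tone from the controller toward the selected object. The play_tone_at function is a hypothetical stand-in for a call into a spatial audio engine such as the Google VR Spatial Audio plugin, and the step count and timing are assumptions, not values from the prototype.

```python
# Rough sketch of the "audio location" cue: an impulse tone starts at the
# controller and is repeated a few times as its source position steps toward
# the selected object. play_tone_at() is a hypothetical stand-in for a
# spatialised audio engine call; step count and timing are assumptions.
import time

def play_tone_at(position) -> None:
    """Stand-in for spawning a spatialised impulse tone at a 3-D position."""
    print(f"impulse tone at {tuple(round(c, 2) for c in position)}")

def lerp(a, b, t: float):
    """Linear interpolation between two 3-D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def play_location_cue(controller_pos, object_pos,
                      steps: int = 4, interval_s: float = 0.15) -> None:
    """Play the impulse tone at positions progressing from the controller to
    the last object selected with the laser pointer."""
    for i in range(steps + 1):
        play_tone_at(lerp(controller_pos, object_pos, i / steps))
        time.sleep(interval_s)

# Example: cue from the controller toward a window two metres ahead and to the left.
play_location_cue((0.0, 1.2, 0.0), (-1.0, 1.5, 2.0))
```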

To test our prototype, we challenged participants to find and pick up a toy laser gun within the virtual room, navigate to the window, and finally shoot at a duck moving outside the window. We ran six non-visually-impaired people through the prototype, and all of them completed the challenge successfully. After completing the task, four of them went through the experience again, this time with the room fully visible. Because they had navigated the room by sound, we found that they were already familiar with their surroundings.


It’s a small step, but this experiment demonstrated that it’s possible to navigate and interact with a room in VR using only auditory cues. We hope others will also continue to explore ways to make VR accessible for everyone. There’s much more to do in this area!

You can find more details in our published technical disclosure.
