Translating visual information into tactile and auditory information that a blind person can use to navigate is a challenge in the physical world; it’s even more of a puzzle in the virtual world. But researchers at Microsoft are tackling the challenge of creating accessible VR. Their navigation tool, the Canetroller, is a virtual white cane that enables navigation in a VR environment. It mimics the feedback people get from touching, sweeping, and tapping items in the physical environment while navigating with a long cane.

Canetroller takes O&M skills into immersive environments

O&M—Orientation and Mobility—skills teach a blind person to understand where they are in a physical space, figure out where they want to go, and plan and implement a path to get there. Physical objects might be destinations (your office building or your desk) or obstacles (a curb or a trash can you could trip over). Some obstacles also serve as navigational markers.

A blind person using a white cane taps on different surfaces; the differences in sound and texture convey information about the size and type of object, as well as its location. Many blind people use a form of echolocation to aid their navigation, bouncing sound off of large and small objects to gauge distance and location. Guide dogs generally steer their human partners around obstacles, but both dog and human use physical landmarks to navigate.

In a virtual environment, none of these aids is available: no audio or physical cues, no cane, no guide dog. Microsoft is seeking to change that with the Canetroller, which provides both audio and tactile cues to blind VR users.

This haptic device simulates the interactions a blind person has with the environment when using a navigation cane, providing haptic and auditory feedback that allows the user to navigate a virtual space. Tactile feedback from the physical controller helps users understand the shape, position, and even texture of a virtual object; audio feedback provides information on size and location.

“Current VR solutions rely mainly on realistic visual feedback to provide an immersive experience to sighted people,” researchers Yuhang Zhao et al. wrote. “Since the white cane is the most common tool for blind people to navigate the real world, our controller simulates many white cane interactions, which allows users to transfer their white cane skill from the real world into the virtual world.”

Increasing access to VR training and entertainment

Potential uses range from enabling O&M instructors to provide students with a broader range of practice environments to enabling blind users to engage with entertainment or educational VR content, navigating those spaces with the Canetroller.

Participants use a VR headset, which provides the 3D audio of the virtual environment, and they hold the Canetroller to feel the haptic feedback. They also wear a belt that includes a brake mechanism, a “voice coil,” and a tracker.

Users receive three types of feedback via the Canetroller:

  • Braking: The brake mechanism generates physical resistance to stop movement whenever the virtual cane hits a virtual object to the user’s left or right side. The initial implementation does not include braking for vertical items or when the cane hits an obstacle in front of or behind the user.
  • Audio feedback and cues: The VR system generates 3D audio as part of the immersive environment. In addition, the Canetroller beeps when the user approaches an obstacle, when the cane penetrates a virtual object, or when the user moves outside of the virtual space. Additional audio feedback includes realistic tapping and other sounds that mimic those heard in the physical world when the user taps, sweeps, or otherwise touches a surface or object with the cane. As in the physical world, these sounds help the user learn about the size, shape, and physical makeup of the environment. The sounds rendered vary with surface type and collision speed: Different feedback is provided when a user quickly sweeps the Canetroller than when slowly exploring a surface or object, for example. This makes it possible for a cane user to tell not only that they’ve found, say, a trash can, but also the approximate size and shape of the item, much as they could in a physical environment.
  • Vibrotactile feedback: When the user touches an item or sweeps a surface in the VR environment, Canetroller’s vibrotactile feedback mimics the way a physical cane feels when it encounters different textures, such as a carpet vs. a tile floor.
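The three feedback channels above can be thought of as a dispatcher that reacts to each cane-object collision. Here is a minimal sketch of that idea in Python; the event fields, surface names, and the speed threshold are illustrative assumptions, not details of Microsoft’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class CaneCollision:
    surface: str        # e.g. "carpet", "tile", "trashcan" (hypothetical labels)
    direction: str      # "left", "right", "front", or "back" relative to the user
    sweep_speed: float  # speed of the cane tip at contact, in m/s (assumed units)

def dispatch_feedback(event: CaneCollision) -> list:
    """Return the feedback channels to activate for one collision."""
    actions = []
    # Braking: the initial prototype resists only lateral (left/right) hits,
    # not vertical, front, or back collisions.
    if event.direction in ("left", "right"):
        actions.append("brake")
    # Audio: tap vs. sweep sounds depend on contact speed and surface type.
    # The 0.5 m/s cutoff is an assumed threshold for illustration.
    if event.sweep_speed > 0.5:
        actions.append("audio:sweep:" + event.surface)
    else:
        actions.append("audio:tap:" + event.surface)
    # Vibrotactile: a texture cue accompanies every contact.
    actions.append("vibrate:" + event.surface)
    return actions

print(dispatch_feedback(CaneCollision("tile", "left", 0.8)))
# -> ['brake', 'audio:sweep:tile', 'vibrate:tile']
```

The key design point this sketch captures is that braking is directional while audio and vibration fire on every contact, which matches the limitation noted for the initial implementation.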

The actual Canetroller device is compact, but the length of the virtual cane it simulates is adjustable, based on each user’s height, the length of the cane they use in the physical world, and how they hold their cane.
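Because the simulated cane is longer than the physical controller, the system must work out where the virtual cane tip lies relative to the grip. A hedged sketch of that geometry, with parameter names and the simple trigonometric model chosen here purely for illustration:

```python
import math

def virtual_tip_offset(virtual_length_m: float, pitch_deg: float):
    """Horizontal reach and vertical drop of the virtual cane tip,
    measured from the grip point, for a cane of virtual_length_m meters
    held pitch_deg degrees below horizontal (illustrative model)."""
    pitch = math.radians(pitch_deg)
    reach = virtual_length_m * math.cos(pitch)  # how far ahead the tip lands
    drop = virtual_length_m * math.sin(pitch)   # how far below the grip it sits
    return reach, drop

# A 1.3 m virtual cane held 40 degrees below horizontal (assumed values
# standing in for a particular user's cane length and grip style).
reach, drop = virtual_tip_offset(1.3, 40.0)
print(round(reach, 2), round(drop, 2))
```

This shows why the calibration described above matters: a taller user or a steeper grip angle moves the simulated tip, so the same physical controller motion must map to different virtual contact points.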

Testers, who included people with a range of visual impairments as well as a group of sighted and visually impaired O&M instructors, were able to successfully navigate virtual indoor and outdoor test environments. They reported being able to perceive texture, discover the size and location of objects like a table or a wastebasket, and find a virtual door in a virtual wall. The researchers plan to evaluate additional interaction techniques and explore expanding the braking mechanism to up-down and forward-backward movements.

Microsoft recently released a set of navigational tools for low-vision and blind VR users, SeeingVR. While these tools have not yet been integrated with the Canetroller, it may be possible to use them simultaneously in compatible apps.

Explore VR

Virtual and augmented reality are increasingly relevant to L&D teams. Explore these and other emerging technologies at The eLearning Guild’s Emerging Technologies Online Conference, July 17–18, 2019. Register today—or purchase an Online Conference subscription and get a full year of online events with eLearning thought leaders.