XR towards tele-guidance: mixing realities in assistive technologies for blind and visually impaired people

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

6 Citations (Scopus)

Abstract

This work proposes the use of immersive asymmetric collaboration as an assistive technology for people who are blind or visually impaired (BVI). As in tele-guidance (TG) systems, it is feasible to transmit in-situ spatial information from the perspective of a BVI person wearing an Augmented Reality (AR) headset to a remote sighted person wearing a Virtual Reality (VR) headset. Through this asymmetric collaborative guidance scenario, we expect remote guides to gain a better spatial understanding of the environment by being immersed in its digital twin, while BVI users may find guidance in their current environment less time-consuming than with existing TG systems, which rely solely on video. In the current experiment, TG is performed in a Wizard-of-Oz fashion, while users (N = 18) are blindfolded and follow its audio cues in search of targets in a room. Results showed that different audio interfaces affect the system's usability, with one of our tested methods providing significantly better results across multiple usability metrics. As a proof of concept, we showcase how cross-reality asymmetric collaboration can be used in day-to-day manual tasks, such as finding objects in a pantry or a closet.
Original language: English
Title of host publication: 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023
Publisher: IEEE
ISBN (Electronic): 979-8-3503-4839-2
ISBN (Print): 979-8-3503-4840-8
DOIs
Publication status: Published - 2023
Externally published: Yes
Publication type: A4 Article in conference proceedings
