Augmenting VR/XR experiences using directional vibrotactile feedback and temperature variation using wearable devices

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


Abstract

As virtual and mixed reality hardware systems become more mainstream, users are spending substantial amounts of time in simulated environments. Unlike the transition from desktop to mobile devices, VR/XR utilizes a 360-degree wrap-around space that can be challenging to master even for experienced users. Tasks and tools commonly used in 2D environments on mobile and personal computing devices may not always be intuitive in VR space. For that reason, it is important to study and evaluate which common graphical user interface (GUI) techniques can be extended to VR/XR and how the efficiency of common 2D tools needs to be improved within a 360-degree space. In this study, the authors explore six commonly used GUI tools and evaluate them in a VR environment. The research examines how participants deconstruct 360-degree GUI tasks by identifying the location of the controls, navigating through the VR space to the relevant area, and finally adjusting the GUI controls as instructed. The study augments the interaction by providing vibrotactile navigation cues along with kinaesthetic and temperature-based feedback to complete the GUI tasks. Compared to the conventional visual-only techniques currently used in VR environments, vibrotactile, kinaesthetic, and temperature feedback provided faster task completion times and a more pleasant user experience. Participants also rated the additional feedback channels as more informative and less distracting within the virtual environment. Overall, the results show that participants preferred the novel use of haptic feedback for most of the GUI controls assessed in the study. Moreover, the results also show that some more complex GUI controls (i.e., dials, menus, and lists) may not be well suited to 360-degree VR interaction using visual-only information channels, especially with non-robust inside-out hand tracking. Additional research is needed to validate these results across different VR/XR hardware and simulated environments; however, the current results point towards using multimodal, multi-technology interaction tools to create more immersive and intuitive 360-degree virtual spaces across a wide range of VR/XR devices.
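
The abstract does not specify how the directional vibrotactile cues are computed. As a purely illustrative, hypothetical sketch (not taken from the paper), one common approach is to map the angular offset between the user's head yaw and the target GUI control's position in the 360-degree space onto left/right wrist actuator intensities, so that the stronger pulse indicates the turn direction; the function names, deadzone value, and intensity mapping below are assumptions.

```python
import math

# Hypothetical sketch: derive left/right wrist vibration intensities from the
# angular offset between head orientation and a GUI control in 360-degree space.

def yaw_to_target(head_yaw_deg: float, target_yaw_deg: float) -> float:
    """Signed angular offset in degrees, wrapped to (-180, 180]."""
    return (target_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

def directional_cue(offset_deg: float, deadzone_deg: float = 10.0):
    """Return (left_intensity, right_intensity) in [0, 1].

    Inside the deadzone the control is treated as 'in view' and no
    directional vibration is emitted (assumed behaviour).
    """
    if abs(offset_deg) <= deadzone_deg:
        return 0.0, 0.0
    strength = min(abs(offset_deg) / 180.0, 1.0)
    # Positive offset -> target lies to the right, so pulse the right wrist.
    return (0.0, strength) if offset_deg > 0 else (strength, 0.0)

if __name__ == "__main__":
    # Example: the user faces 0 deg and a slider control sits at 135 deg to the right.
    left, right = directional_cue(yaw_to_target(0.0, 135.0))
    print(f"left={left:.2f}, right={right:.2f}")  # left=0.00, right=0.75
```

In such a scheme, kinaesthetic and temperature feedback could be layered on top (e.g., a warming cue once the control is reached), but how the study actually implemented those channels is described in the full paper, not here.
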
Original language: English
Title of host publication: Human Factors and Wearable Technologies
Subtitle of host publication: Proceedings of the 14th International Conference on Applied Human Factors and Ergonomics and the Affiliated Conferences
Pages: 81-91
DOIs
Publication status: Published - 2023
Publication type: A4 Article in conference proceedings
Event: International Conference on Applied Human Factors and Ergonomics and the Affiliated Conferences, United States
Duration: 20 Jul 2023 - 24 Jul 2023

Publication series

Name: AHFE International
Volume: 85
ISSN (Electronic): 2771-0718

Conference

Conference: International Conference on Applied Human Factors and Ergonomics and the Affiliated Conferences
Country/Territory: United States
Period: 20/07/23 - 24/07/23

Keywords

  • Human Computer Interaction
  • Wearable Technologies
  • Haptics
  • Vibrotactile Interaction
  • Augmented Reality
  • Virtual Reality

Publication forum classification

  • Publication forum level 1
