TY - GEN
T1 - Augmenting VR/XR experiences using directional vibrotactile feedback and temperature variation using wearable devices
AU - Farooq, Ahmed
AU - Rantala, Jussi
AU - Li, Zhenxing
AU - Raisamo, Roope
PY - 2023
Y1 - 2023
N2 - As virtual and mixed reality hardware systems become more mainstream, users are spending substantial amounts of time in simulated environments. Unlike the transition from desktop to mobile devices, VR/XR utilizes a 360-degree wrap-around space, which can be challenging to master even for experienced users. Tasks and tools commonly utilized in 2D environments within mobile and personal computing devices may not always be intuitive in VR space. For that reason, it is important to study and evaluate which common graphical user interface (GUI) techniques can be extended to VR/XR and how the efficiency of common 2D tools needs to be improved within a 360-degree space. In this study, the authors explore six commonly used GUI tools and evaluate them in a VR environment. The research looks at how participants deconstruct 360-degree GUI tasks by identifying the location of the controls, navigating through the VR space to the relevant area, and finally adjusting the GUI controls as instructed. The study looks at augmenting the interaction by providing vibrotactile navigation cues along with kinaesthetic and temperature-based feedback to complete the GUI tasks. Compared to conventional visual-only techniques that are currently being used in VR environments, vibrotactile, kinaesthetic, and temperature feedback provided faster task completion times and a more pleasant user experience. Participants also rated the additional feedback channels as more informative and less distracting within the virtual environment. Overall, results show that participants preferred the novel use of haptic feedback for most of the GUI controls assessed within the study. Moreover, results also show that some more complex GUI controls (i.e., dials, menus, and lists) may not be best suited for VR 360-degree interaction using visual-only information channels, especially with non-robust inside-out hand tracking techniques. Additional research is needed to validate these results across different VR/XR hardware and simulated environments; however, current results point towards utilizing multi-modal and multi-technology interaction tools to create more immersive and intuitive 360-degree virtual spaces across a wide range of VR/XR devices.
AB - As virtual and mixed reality hardware systems become more mainstream, users are spending substantial amounts of time in simulated environments. Unlike the transition from desktop to mobile devices, VR/XR utilizes a 360-degree wrap-around space, which can be challenging to master even for experienced users. Tasks and tools commonly utilized in 2D environments within mobile and personal computing devices may not always be intuitive in VR space. For that reason, it is important to study and evaluate which common graphical user interface (GUI) techniques can be extended to VR/XR and how the efficiency of common 2D tools needs to be improved within a 360-degree space. In this study, the authors explore six commonly used GUI tools and evaluate them in a VR environment. The research looks at how participants deconstruct 360-degree GUI tasks by identifying the location of the controls, navigating through the VR space to the relevant area, and finally adjusting the GUI controls as instructed. The study looks at augmenting the interaction by providing vibrotactile navigation cues along with kinaesthetic and temperature-based feedback to complete the GUI tasks. Compared to conventional visual-only techniques that are currently being used in VR environments, vibrotactile, kinaesthetic, and temperature feedback provided faster task completion times and a more pleasant user experience. Participants also rated the additional feedback channels as more informative and less distracting within the virtual environment. Overall, results show that participants preferred the novel use of haptic feedback for most of the GUI controls assessed within the study. Moreover, results also show that some more complex GUI controls (i.e., dials, menus, and lists) may not be best suited for VR 360-degree interaction using visual-only information channels, especially with non-robust inside-out hand tracking techniques. Additional research is needed to validate these results across different VR/XR hardware and simulated environments; however, current results point towards utilizing multi-modal and multi-technology interaction tools to create more immersive and intuitive 360-degree virtual spaces across a wide range of VR/XR devices.
KW - Human Computer Interaction
KW - Wearable Technologies
KW - Haptics
KW - Vibrotactile Interaction
KW - Augmented Reality
KW - Virtual Reality
U2 - 10.54941/ahfe1003629
DO - 10.54941/ahfe1003629
M3 - Conference contribution
T3 - AHFE International
SP - 81
EP - 91
BT - Human Factors and Wearable Technologies
T2 - International Conference on Applied Human Factors and Ergonomics and the Affiliated Conferences
Y2 - 20 July 2023 through 24 July 2023
ER -