Virtual and Augmented Reality for Environmental Sustainability: A Systematic Review

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


Abstract

In recent years, extended reality (XR) technology has seen increasing use in environmental contexts, e.g., climate change or biodiversity loss, as a potential tool to inform and engage the public on current and future environmental issues. However, research on the potential of XR technology for environmental sustainability is still at an early stage, and there is no clear synthesis of the methods studied in this field. To provide a clearer view of existing approaches and research objectives, we systematically reviewed the current literature on XR use in environmental topics. Although the results indicate that the volume of literature exploring XR in environmental applications is growing, empirical evidence of its impact remains limited, which currently prevents drawing firm conclusions about its potential benefits. Based on our analyses, we identify thematic, theoretical, and methodological knowledge gaps and provide guidelines to aid future research in the field.
Original language: English
Title of host publication: CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
Publisher: ACM
Number of pages: 23
ISBN (Electronic): 978-1-4503-9421-5
DOIs
Publication status: Published - 19 Apr 2023
Publication type: A4 Article in conference proceedings
Event: ACM SIGCHI annual conference on human factors in computing systems - Hamburg, Germany
Duration: 23 Apr 2023 - 28 Apr 2023

Conference

Conference: ACM SIGCHI annual conference on human factors in computing systems
Country/Territory: Germany
City: Hamburg
Period: 23/04/23 - 28/04/23

Publication forum classification

  • Publication forum level 3
