A Research Agenda for Mixed Reality in Automated Vehicles

Research output: Chapter in Book/Report/Conference proceedings › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

As mixed reality (MR) technology matures and becomes more widely available, its range of applications in vehicles grows. Mixed reality may help to increase road safety, support more immersive (non-)driving-related activities, and ultimately enhance the driving and passenger experience. MR may also be an enabling technology for increasing trust in and acceptance of automated vehicles, thereby supporting the transition towards automated driving. Conversely, automated driving extends the use cases of virtual reality and other immersive technologies. However, a number of challenges remain when mixed reality is applied in vehicles, and several human-factors issues still need to be solved. This paper presents a research agenda for using mixed reality technology in automotive user interfaces (UIs) by identifying opportunities and challenges.

Original language: English
Title of host publication: MUM 2020 - 19th International Conference on Mobile and Ubiquitous Multimedia, Proceedings
Editors: Jessica Cauchard, Markus Lochtefeld
Publisher: Association for Computing Machinery
Pages: 119-131
Number of pages: 13
ISBN (Electronic): 9781450388702
DOIs
Publication status: Published - 22 Nov 2020
Event: 19th International Conference on Mobile and Ubiquitous Multimedia, MUM 2020 - Virtual, Online, Germany
Duration: 22 Nov 2020 - 25 Nov 2020

Publication series

NameACM International Conference Proceeding Series

Conference

Conference: 19th International Conference on Mobile and Ubiquitous Multimedia, MUM 2020
Country/Territory: Germany
City: Virtual, Online
Period: 22.11.2020 - 25.11.2020

Keywords

  • automated driving
  • automotive user interfaces
  • mixed reality
  • research agenda
