Abstract

Emergency responders and task forces can benefit from outdoor Mixed Reality (MR) training, as it allows more realistic and affordable simulations of real-world emergencies. Utilizing MR devices outdoors requires knowledge of real-world objects in the training area, enabling the realistic immersion of both the real and the virtual world based on visual occlusions. Because state-of-the-art MR devices are spatially limited in recognizing distant real-world objects, we present an approach for sharing geo-referenced 3D geometries across multiple devices using the CityJSON format for occlusion purposes in the context of geospatial MR visualization. Our results show that the presented methodology allows accurate conversion of occlusion models to geo-referenced representations: a quantitative evaluation yields an average vertex-position error between 1.30E-06 and 2.79E-04 (sub-millimeter error) using a normalized sum of squared errors metric. In the future, we plan to also incorporate 3D reconstructions from smartphones and drones to increase the number of supported devices for creating geo-referenced occlusion models.
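The evaluation metric named in the abstract, a normalized sum of squared errors over vertex positions, can be sketched as follows. This is an illustrative implementation only: the paper does not state its exact normalization, so dividing by the sum of squared reference coordinates is an assumption here, and the function name `normalized_sse` is hypothetical.

```python
import numpy as np

def normalized_sse(reference, converted):
    """Normalized sum of squared errors between corresponding vertex sets.

    reference, converted: (N, 3) arrays of matching vertex positions,
    e.g. the original occlusion-model vertices and their geo-referenced
    counterparts after conversion.

    NOTE: normalizing by the squared magnitude of the reference vertices
    is an assumption for illustration; the paper's exact normalization
    may differ.
    """
    reference = np.asarray(reference, dtype=float)
    converted = np.asarray(converted, dtype=float)
    sse = np.sum((reference - converted) ** 2)
    return sse / np.sum(reference ** 2)
```

An error of zero indicates a lossless conversion; small perturbations of the converted vertices produce correspondingly small values of the metric.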

Original language: English
Title: Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications
Pages: 113-122
Number of pages: 10
Volume: 3
DOIs
Publication status: Published - 7 Feb 2022

Publication series

Name: Proceedings of the International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications
ISSN (Print): 2184-5921

Fingerprint

Explore the research topics of "Geo-Referenced Occlusion Models for Mixed Reality Applications using the Microsoft HoloLens". Together they form a unique fingerprint.
