Spatial orientation is highly relevant for applications in augmented reality. Alongside position in three-dimensional space, it is one of the most important factors for representing virtual objects in space precisely and correctly. For so-called head-mounted displays, such as the Microsoft HoloLens, orientation corresponds to the viewing direction, and this data determines which information is displayed in the user's field of view and which is not. In the current state of the art, orientation is determined primarily marker-based, using visual cues in the environment and/or sensor data. Using the augmented reality frameworks Vuforia and Kudan AR, this paper discusses two common approaches to marker-based determination of one's own orientation and describes the central problem of this feature-based approach: applications depend either on a single fixed position or on an enormous number of reference images for each virtual object. This lack of practicality motivates an alternative approach, from which the aim of this work is derived: to find a feature-independent, image-based method that determines orientation from the calculated shift between input images. The approach rests on the hypothesis that rotation around one's own axis can be determined, in simplified form, from the translation of the input images. An implementation of this hypothesis is evaluated in several test scenarios. Based on these results, the research question of this paper, "Is it possible to determine the precise outdoor orientation in space from a fixed starting point using image-based methods?", is answered in the affirmative, supported by an evaluated median difference of 0.00015° between the real and the calculated rotation.
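The hypothesized principle, rotation around one's own axis approximated by a translation of the input images, can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it estimates the shift between two equally sized grayscale frames via phase correlation and converts the horizontal pixel shift to a yaw angle under a pinhole-camera assumption. The function names, image size, and field-of-view value are illustrative assumptions.

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the (dy, dx) translation between two equally sized
    grayscale images via phase correlation.  Returns the shift that,
    applied to b with np.roll, best re-aligns it with a."""
    f = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-9))   # normalized cross-power spectrum
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    shift = []
    for s, n in zip(peak, a.shape):               # wrap peaks past the half-size point
        shift.append(s - n if s > n // 2 else s)
    return tuple(shift)

def shift_to_yaw_deg(dx_px, image_width_px, hfov_deg):
    """Convert a horizontal pixel shift to a rotation angle in degrees,
    assuming a pinhole model: f_px = w / (2 * tan(hfov / 2))."""
    f_px = image_width_px / (2.0 * np.tan(np.radians(hfov_deg) / 2.0))
    return np.degrees(np.arctan2(dx_px, f_px))
```

For example, an image rolled 5 pixels to the right yields an estimated shift of (0, -5), and for an assumed 1920-pixel-wide frame with a 90° horizontal field of view, a 960-pixel shift corresponds to a 45° yaw rotation.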
Translated title of the contribution: Image-based Outdoor Orientation
Number of pages: 47
Publication status: Published - 2017