Today, car navigation systems are increasingly penetrating the automotive market. However, the need for location-based information systems is no longer limited to cars: mobile outdoor navigation systems for pedestrians and electronic tourist guides are already available on PDAs, and new indoor positioning technologies further extend the range of applications for location-based systems. Unfortunately, current navigation systems are bound to a specific tracking technology (e.g., a car navigation system works exclusively with GPS) and therefore cannot be used in areas equipped only with alternative tracking infrastructure. Furthermore, the information is presented through an abstract metaphor that the user must first interpret and translate into action. This paper presents a new augmented-reality-based paradigm and a framework for mobile navigation systems that transparently extracts position and orientation information from arbitrary sensor sources and strengthens the association with the real world by combining video techniques and 3D graphics in an augmented reality view.
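The sensor-agnostic idea described above can be illustrated with a small sketch: a common tracker interface hides the concrete positioning technology, so the navigation system works unchanged whether poses come from GPS outdoors or from indoor beacons. All class names and the stubbed pose values below are hypothetical illustrations, not part of the paper's actual framework:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Pose:
    """User position (x, y, z in metres) and heading (degrees)."""
    x: float
    y: float
    z: float
    heading: float


class Tracker(ABC):
    """Common interface that hides the concrete tracking technology."""

    @abstractmethod
    def current_pose(self) -> Pose: ...


class GPSTracker(Tracker):
    """Outdoor tracking; a real implementation would read a GPS receiver."""

    def current_pose(self) -> Pose:
        return Pose(x=48.1, y=11.5, z=0.0, heading=90.0)  # stubbed fix


class IndoorBeaconTracker(Tracker):
    """Indoor tracking; stands in for e.g. infrared or WLAN beacons."""

    def current_pose(self) -> Pose:
        return Pose(x=3.2, y=7.9, z=1.5, heading=270.0)  # stubbed fix


class NavigationSystem:
    """Consumes poses from any Tracker, so moving between environments
    only means swapping the tracker instance, not the system."""

    def __init__(self, tracker: Tracker) -> None:
        self.tracker = tracker

    def pose(self) -> Pose:
        return self.tracker.current_pose()


outdoor = NavigationSystem(GPSTracker())
indoor = NavigationSystem(IndoorBeaconTracker())
print(outdoor.pose().heading)  # 90.0
print(indoor.pose().heading)   # 270.0
```

The key design choice is that position and orientation arrive through one uniform `Pose` abstraction, which is what lets an augmented reality renderer stay independent of the underlying sensor.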