Optimierung von Video-Okulographie-Augenbewegungsmessungen bei schnellen Kopfimpulsen (Optimization of video-oculography eye movement measurements during rapid head impulses)

Michael Platz, Thomas Haslwanter, Josef Scharinger

Research output: Chapter in Book/Report/Conference proceedings › Conference contribution › peer-review


Video-based eye movement recording, often called "Video-Oculography" (VOG), is replacing other techniques as the standard tool for measuring eye movements, for clinical applications as well as for research. Since new types of video cameras now regularly offer sampling rates above 100 Hz, the biggest remaining problem is the artifacts caused by camera movement with respect to the head: an uncompensated camera movement of only 1 mm causes measurement errors of approximately 5 deg. This is particularly important for VOG recordings during "Rapid Head Impulses", a standardized procedure for testing the functional status of the vestibular system, during which camera slippage of a few millimeters is almost impossible to prevent. The most successful approach to compensating camera slippage with respect to the head has been to use the locations of reflections of external illumination sources on the corneal surface. This technique works well when illumination and camera are at a distance that is much larger than the focal length of the camera. When camera and illumination are head-mounted, though, the suitability of this approach has yet to be shown. Using biomechanical simulations of the pupil center as well as of specular reflections of illumination points on the human cornea, we investigate the suitability and the limitations of this approach. In particular, we compare the extent to which consideration of the actual corneal shape and slippage path can improve the elimination of camera slippage during VOG recordings, compared to more established approaches such as linear or polynomial curves fit to the distance between pupil center and corneal reflections. For practical applications, our compensation scheme is applied to VOG recordings during rapid head impulses.
For VOG we use the "EyeSeeCam" system, which can provide 3D eye movements at up to 500 Hz; the actual camera slippage is determined by measurements of 3D head and camera movements with a "Lukotronic" system; and the results are compared with our compensation scheme, which tries to eliminate the camera slippage using only the information from the video images.
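The baseline approach mentioned above, fitting a curve to the distance between pupil center and corneal reflection (the "P-CR vector"), can be illustrated with a toy simulation. The sketch below is not the authors' method: the camera model is one-dimensional, the gains (mm of feature motion per degree of gaze) are invented for illustration, and slippage is modeled as a pure translation that shifts pupil and reflection equally, which is the idealized case in which the P-CR difference cancels slippage exactly.

```python
import numpy as np

# Toy 1D simulation of pupil-center / corneal-reflection (P-CR) slippage
# compensation. All gains below are illustrative assumptions, not values
# from the abstract.

rng = np.random.default_rng(0)
n_frames = 200
true_gaze_deg = np.linspace(-20, 20, n_frames)        # horizontal gaze angle

# Assumed toy camera model: pupil image moves ~0.15 mm per degree of gaze,
# the corneal reflection about half as fast.
pupil_x = 0.15 * true_gaze_deg
cr_x = 0.075 * true_gaze_deg

# Simulated camera slippage (up to 1 mm): a pure translation adds the same
# offset to both features, so it cancels in their difference.
slip = 1.0 * np.sin(np.linspace(0, 2 * np.pi, n_frames))
pupil_obs = pupil_x + slip
cr_obs = cr_x + slip

p_cr = pupil_obs - cr_obs          # slippage-compensated feature

# Polynomial calibration of P-CR -> gaze angle (degree 2, as an example of
# the "polynomial curves" baseline).
coeffs = np.polyfit(p_cr, true_gaze_deg, deg=2)
gaze_est = np.polyval(coeffs, p_cr)
print(f"P-CR max abs error:  {np.max(np.abs(gaze_est - true_gaze_deg)):.3f} deg")

# Uncompensated comparison: mapping the raw pupil position to gaze with the
# toy gain turns the 1 mm slippage directly into a gaze error of several deg.
naive_est = pupil_obs / 0.15
print(f"naive max abs error: {np.max(np.abs(naive_est - true_gaze_deg)):.3f} deg")
```

With these assumed gains the uncompensated error reaches several degrees per millimeter of slippage, in line with the order of magnitude quoted in the abstract, while the P-CR estimate is unaffected; real head-mounted geometries break the pure-translation assumption, which is exactly the limitation the study investigates.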
Original language: English
Title of host publication: NeuroScience 2008
Number of pages: 1
Publication status: Published - 2008
Event: 38th annual meeting of the Society for Neuroscience - Washington D.C., United States
Duration: 15 Nov 2008 – 19 Nov 2008


Conference: 38th annual meeting of the Society for Neuroscience
Country/Territory: United States
City: Washington D.C.
