TY - JOUR
T1 - Orientation Sensing for Gesture-Based Interaction with Smart Artifacts
AU - Ferscha, Alois
AU - Resmerita, Stefan
AU - Holzmann, Clemens
AU - Reichör, Stefan
PY - 2005/8/2
Y1 - 2005/8/2
N2 - Orientation sensing is considered an important means of implementing embedded, technology-enhanced artifacts (often referred to as "smart artifacts") that exhibit embodied means of interaction based on their position, orientation and the respective dynamics. Considering artifacts subject to manual (or "by-hand") manipulation by the user, we identify hand-worn, hand-carried and (hand-)graspable real-world objects as exhibiting different artifact orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as "gestures" in an abstract sense, and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification methods, and elaborates an application-independent set of gestures. It enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library that contains gestures from three categories: hand gestures, gestures of artifacts held permanently in the hand, and gestures of artifacts that are detached from the hand and manipulated only occasionally. An inertial orientation-sensing based gesture detection and recognition system is developed and integrated into a gesture-based interaction development framework. The use of this framework is demonstrated through the development of tangible remote controls for a media player, in both hardware and software.
AB - Orientation sensing is considered an important means of implementing embedded, technology-enhanced artifacts (often referred to as "smart artifacts") that exhibit embodied means of interaction based on their position, orientation and the respective dynamics. Considering artifacts subject to manual (or "by-hand") manipulation by the user, we identify hand-worn, hand-carried and (hand-)graspable real-world objects as exhibiting different artifact orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as "gestures" in an abstract sense, and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification methods, and elaborates an application-independent set of gestures. It enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library that contains gestures from three categories: hand gestures, gestures of artifacts held permanently in the hand, and gestures of artifacts that are detached from the hand and manipulated only occasionally. An inertial orientation-sensing based gesture detection and recognition system is developed and integrated into a gesture-based interaction development framework. The use of this framework is demonstrated through the development of tangible remote controls for a media player, in both hardware and software.
KW - Embodied interaction
KW - Gesture recognition
KW - Inertial sensors
KW - Orientation tracking
KW - Tangible user interface
UR - http://www.scopus.com/inward/record.url?scp=23844515535&partnerID=8YFLogxK
U2 - 10.1016/j.comcom.2004.12.046
DO - 10.1016/j.comcom.2004.12.046
M3 - Article
SN - 0140-3664
VL - 28
SP - 1552
EP - 1563
JO - Computer Communications
JF - Computer Communications
IS - 13 SPEC. ISS.
ER -