Abstract
Orientation sensing is considered an important means of implementing embedded-technology-enhanced artifacts (often referred to as "smart artifacts") that exhibit embodied interaction based on their position, orientation, and the respective dynamics. Considering artifacts subject to manual ("by-hand") manipulation by the user, we identify hand-worn, hand-carried, and (hand-)graspable real-world objects as exhibiting different orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as "gestures" in an abstract sense and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification methods, and elaborates an application-independent set of gestures; it enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library containing gestures from three categories: hand gestures, gestures of artifacts held permanently in the hand, and gestures of artifacts that are detached from the hand and manipulated occasionally. A gesture detection and recognition system based on inertial orientation sensing is developed and integrated into a framework for developing gesture-based interaction. The use of this framework is demonstrated through the development of tangible remote controls for a media player, in both hardware and software.
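The abstract does not spell out the framework's actual API, but its key design points (a gesture library organized by the three artifact categories, and recognition that is independent of sensor technology and classification method) can be illustrated. The following is a minimal Python sketch under those assumptions; all names (`GestureLibrary`, `GestureRecognizer`, the `mean_distance` matcher, the threshold value) are hypothetical placeholders, not the paper's implementation.

```python
import math
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Sequence

Sample = tuple[float, float, float]  # (roll, pitch, yaw) in degrees

class GestureCategory(Enum):
    HAND = "hand-worn"        # gestures of the hand itself
    HELD = "hand-carried"     # artifacts held permanently in the hand
    DETACHED = "graspable"    # detached artifacts, manipulated occasionally

@dataclass
class Gesture:
    name: str
    category: GestureCategory
    template: Sequence[Sample]  # reference orientation trajectory

class GestureLibrary:
    """Application-independent store of gesture templates."""
    def __init__(self) -> None:
        self._gestures: list[Gesture] = []

    def add(self, gesture: Gesture) -> None:
        self._gestures.append(gesture)

    def all(self) -> list[Gesture]:
        return list(self._gestures)

def mean_distance(a: Sequence[Sample], b: Sequence[Sample]) -> float:
    """Placeholder matcher: mean Euclidean distance over the overlapping
    portion of two orientation trajectories."""
    n = min(len(a), len(b))
    return sum(math.dist(a[i], b[i]) for i in range(n)) / n

class GestureRecognizer:
    """Independent of sensor technology (orientation samples may come from
    any source) and of the classification method (the matcher is injected)."""
    def __init__(self, library: GestureLibrary,
                 matcher: Callable[[Sequence[Sample], Sequence[Sample]], float] = mean_distance,
                 threshold: float = 30.0) -> None:
        self.library = library
        self.matcher = matcher
        self.threshold = threshold

    def recognize(self, trajectory: Sequence[Sample]) -> Gesture | None:
        best = min(self.library.all(),
                   key=lambda g: self.matcher(trajectory, g.template),
                   default=None)
        if best and self.matcher(trajectory, best.template) <= self.threshold:
            return best
        return None

# Usage in the spirit of the paper's demo: a "tilt forward" gesture
# on a hand-held tangible remote control.
lib = GestureLibrary()
lib.add(Gesture("tilt-forward", GestureCategory.HELD,
                [(0.0, 10.0 * i, 0.0) for i in range(6)]))
rec = GestureRecognizer(lib)
print(rec.recognize([(1.0, 10.0 * i + 2.0, 0.0) for i in range(6)]))
```

Because the matcher is injected, the naive template distance above could be swapped for any classification method without touching the library or the sensor layer, which is the interoperability property the abstract emphasizes.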
| Original language | English |
| --- | --- |
| Pages (from-to) | 1552-1563 |
| Number of pages | 12 |
| Journal | Computer Communications |
| Volume | 28 |
| Issue number | 13 SPEC. ISS. |
| DOIs | |
| Publication status | Published - 2 Aug 2005 |
Keywords
- Embodied interaction
- Gesture recognition
- Inertial sensors
- Orientation tracking
- Tangible user interface