Abstract
We aim at activity and context recognition in opportunistic sensor setups. The system ought to make use of whatever sensor modalities happen to be available, rather than relying on a specific sensor deployment. To assess opportunistic activity recognition methods, we collected a large-scale dataset of complex activities in a highly sensor-rich environment, with 72 sensors of 10 modalities placed in the environment, in objects, and on the body. The dataset contains composite and atomic activities in large numbers (>28000 hand interactions). We present the activity scenario and the sensor setup, and show the user's activities and the corresponding sensor signals side by side. We argue that such a visualization may be an efficient form of dataset documentation, especially when the dataset is shared, as it gives insight into the complexity of the activities and the richness of the sensor setup.
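As an illustration of the side-by-side visualization described above, the following minimal sketch (not part of the original paper) plots a few synthetic sensor channels aligned in time with hypothetical activity annotations. All signals, labels, and the sampling rate are placeholder assumptions, not data from the actual recording.

```python
# Minimal sketch: sensor signals plotted side by side with activity annotations.
# All data below are synthetic placeholders, not from the described dataset.
import numpy as np
import matplotlib.pyplot as plt

fs = 30.0                                 # assumed sampling rate in Hz
t = np.arange(0, 60, 1.0 / fs)            # one minute of data

# Synthetic stand-ins for three sensor channels of different modalities.
accel = np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.random.randn(t.size)
gyro = np.cos(2 * np.pi * 0.8 * t) + 0.2 * np.random.randn(t.size)
switch = (np.sin(2 * np.pi * 0.05 * t) > 0).astype(float)  # binary object switch

# Hypothetical activity annotations as (start_s, end_s, label).
annotations = [(5, 18, "open door"), (22, 40, "prepare drink"), (45, 58, "drink")]

fig, axes = plt.subplots(4, 1, sharex=True, figsize=(10, 6))
for ax, sig, name in zip(axes[:3], (accel, gyro, switch),
                         ("acceleration", "rate of turn", "object switch")):
    ax.plot(t, sig, linewidth=0.8)
    ax.set_ylabel(name)

# Bottom axis: activity labels drawn as time-aligned spans below the signals.
for start, end, label in annotations:
    axes[3].axvspan(start, end, alpha=0.3)
    axes[3].text((start + end) / 2, 0.5, label, ha="center", va="center")
axes[3].set_yticks([])
axes[3].set_ylabel("activity")
axes[3].set_xlabel("time [s]")

plt.tight_layout()
plt.show()
```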
Original language | English
---|---
Title | Adjunct Proceedings of the 8th International Conference on Pervasive Computing (Pervasive 2010), Video Paper
Publication status | Published - 2010
Event | 8th International Conference on Pervasive Computing (Pervasive 2010) - Helsinki, Finland. Duration: 17 May 2010 → 20 May 2010
Conference
Conference | 8th International Conference on Pervasive Computing (Pervasive 2010)
---|---
Country/Territory | Finland
City | Helsinki
Period | 17.05.2010 → 20.05.2010