AITentive: AI-supported Attentive User Interfaces

  • Wintersberger, Philipp (PI)
  • Wang, Yu (CoI)

Project Details

Description

Imagine you are in the process of writing an important document when, suddenly, your smartphone notifies you about a message from one of your friends. You interrupt your work to quickly answer before returning to the document. Because you were interrupted while drafting a longer sentence, you must reorient yourself in the document to determine how the sentence should be completed. What if the notification had arrived after you completed the sentence instead of while you were in the middle of it?

Notifications and interruptions have become an integral part of our “multitasking” lives, although we all know they disturb our work patterns. Research has shown that interruptions negatively affect our productivity and well-being. In safety-critical settings – for example, while driving a car – notifications are not only time-costly, they can become a severe safety risk, since many people communicate even while driving, although this is typically prohibited by law. To counter these adverse effects, computer scientists and psychologists have proposed so-called “attentive user interfaces”: computer systems that time notifications and interruptions so that negative side effects are avoided. This is, however, a highly complex goal: such systems must be aware not only of the users and their surroundings but also of the activities those users are pursuing. Since humans and activities are highly complex and diverse, a completely domain- and task-independent attentive user interface has not been built so far.

The proposed project “AITentive” (a portmanteau of artificial intelligence, AI, and attentive) aims to solve this issue with the help of AI algorithms. Within the scope of the project, a system will be developed that learns by itself when notifications and interruptions are most suitable, so that safety and productivity can be increased. This system should work independently of particular humans, situations, or tasks/activities; for example, it should automatically adapt both to document writing and to car driving, as discussed above. Successful implementation of such an interface has the potential to improve humans’ interactions with computerized systems. Ultimately, the “Attentive User Interface” developed within the scope of the project may improve safety and productivity while maintaining human well-being in a wide range of scenarios.
Short title: AITentive
Status: Active
Effective start/end date: 01.01.2023 – 31.12.2025

Funding agency

  • FWF - Stand-Alone Projects

UN Sustainable Development Goals

In 2015, UN member states agreed to 17 global Sustainable Development Goals (SDGs) to end poverty, protect the planet and ensure prosperity for all. This project contributes towards the following SDG(s):

  • SDG 3 - Good Health and Well-being

Research output
  • Self-Balancing Bicycles: Qualitative Assessment and Gaze Behavior Evaluation

    Wintersberger, P., Shahu, A., Reisinger, J., Alizadeh, F. & Michahelles, F., 27 Nov 2022, Proceedings of MUM 2022, the 21st International Conference on Mobile and Ubiquitous Multimedia. Döring, T., Boll, S., Colley, A., Esteves, A. & Guerreiro, J. (eds.). Association for Computing Machinery, pp. 189–199 (ACM International Conference Proceeding Series).

    Research output: Chapter in Book/Report/Conference proceedings › Conference contribution › peer-review

    Open Access · 5 Citations (Scopus)