Imagine you are writing an important document when your smartphone notifies you of a message from a friend. You interrupt your work to answer quickly before returning to the document. Because you were interrupted in the middle of drafting a longer sentence, you now have to reorient yourself to figure out how that sentence should be completed. What if the notification had arrived after you finished the sentence instead of in the middle of it? Notifications and interruptions have become an integral part of our “multitasking” lives, even though we all know they disturb our work patterns. Research has shown that interruptions negatively affect both productivity and well-being. In safety-critical settings, for example while driving a car, notifications are not only time-costly but can become a severe safety risk, since many people communicate on their phones even while driving, although this is typically prohibited by law.

To counter these adverse effects, computer scientists and psychologists have proposed so-called “attentive user interfaces”: computer systems that time notifications and interruptions more carefully so that their negative side effects are minimized. This is a highly complex goal: such systems must not only be aware of users and their surroundings, they may also need to be aware of the activities users are pursuing. Because humans and their activities are highly complex and diverse, a completely domain- and task-independent attentive user interface has not been built so far.

The proposed project “AITentive” (a portmanteau of artificial intelligence, AI, and attentive) aims to solve this problem with the help of AI algorithms. Within the scope of the project, a system will be developed that learns by itself when notifications and interruptions are most suitable, so that safety and productivity can be increased. The system should work independently of particular users, situations, or tasks and activities; for example, it should adapt automatically to the document-writing and car-driving situations described above. A successful implementation of such an interface has the potential to improve how humans interact with computerized systems. Ultimately, the “Attentive User Interface” developed within the project may improve safety and productivity while maintaining human well-being in a wide range of scenarios.
Short title | AITentive
Status | Active
Effective start/end date | 01.01.2023 → 31.12.2025