In our daily lives we encounter increasingly complex systems, ranging from vacuum-cleaning robots to smart environments in which whole buildings are controlled automatically. Such systems require interfaces that provide intuitive interaction protocols and affordances.
Smooth interaction requires the system to be aware of the user’s actions and intentions. We therefore focus our research on multi-modal perception to gain insights into the user’s intentions and states, leading to models of users and situations. For example, intentions can be inferred by interpreting the user’s attentional behavior or socio-emotional signals as cues to her evaluation of the current interactive situation.
These user models are an important basis for modeling interactions. More specifically, they allow the system to react continuously to user behavior with non-verbal signals that communicate its attention and interpretation, thus enabling a more transparent and smoother interaction.
This can only be achieved through continuous cycles of evaluation and development. In our evaluations we address fundamental questions regarding the influence of embodiment on user behavior and motivation, user behavior in smart environments, and, more generally, detailed evaluations of specific interaction protocols in various settings. Our evaluation and development approach is supported by systematic systems engineering and integration, which fosters the reliability of our interactive systems. In more detail, research on system analysis provides not only tools for monitoring system states but also more general insights into the behavior of complex interactive systems running 24/7.