Applied Informatics Group

Interactional Coordination and Incrementality in HRI – A Museum Guide Robot

In this project, we address the question of how to enable a robot to engage in fine-grained interactional coordination. To do so, the robot has to actively structure its interactional environment in a manageable way, based on continuous monitoring of the human's multimodal conduct through its own internal perception capabilities. The project addresses two focus areas: (1) actively orienting visitors to an exhibit, and (2) the different natures and functions of user utterances (e.g. signaling trouble) and their intended addressees. The robot's control architecture will be extended with mechanisms that enable it to autonomously engage in fine-grained interaction with human users. Using only its internal capabilities, the system will monitor the user's behavior and estimate their level of interest; these cues will serve as the basis for repair strategies, enabling an appropriate reaction to confused users.
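The monitor-estimate-react loop described above can be made concrete with a small sketch. The following Python snippet is purely illustrative: the cue set, the weighting in estimate_interest, and action labels such as "repair_reorient" are hypothetical placeholders chosen for this example, not components of the actual robot architecture.

# Minimal sketch of the monitoring/repair loop described above.
# All names (MultimodalCues, estimate_interest, select_action, action
# labels) are hypothetical illustrations, not the project's real API.

from dataclasses import dataclass

@dataclass
class MultimodalCues:
    gaze_on_exhibit: float  # fraction of recent frames with gaze on the exhibit, 0..1
    gaze_on_robot: float    # fraction of recent frames with gaze on the robot, 0..1
    lip_movement: bool      # whether the visitor currently appears to be speaking

def estimate_interest(cues: MultimodalCues) -> float:
    """Combine gaze cues into a single interest score in [0, 1] (toy weighting)."""
    score = 0.6 * cues.gaze_on_exhibit + 0.4 * cues.gaze_on_robot
    return max(0.0, min(1.0, score))

def select_action(cues: MultimodalCues, interest_threshold: float = 0.4) -> str:
    """Pick a coarse interactional move from the current cue estimate."""
    if cues.lip_movement and cues.gaze_on_robot > 0.5:
        return "yield_turn"           # the visitor is likely addressing the robot
    if estimate_interest(cues) < interest_threshold:
        return "repair_reorient"      # e.g. a self-interruption to re-attract attention
    return "continue_presentation"

if __name__ == "__main__":
    cues = MultimodalCues(gaze_on_exhibit=0.1, gaze_on_robot=0.2, lip_movement=False)
    print(select_action(cues))        # prints "repair_reorient"

In a deployed system such a decision rule would run continuously on the perception stream; the threshold and weights here stand in for whatever learned or hand-tuned models the architecture actually uses.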

Recent Best Paper/Poster Awards

Goal Babbling of Acoustic-Articulatory Models with Adaptive Exploration Noise
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob) 

Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)

"Look at Me!": Self-Interruptions as Attention Booster?
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)