In this scenario, a robot interacts with a human through communicative gestures in order to give directions on a map.
The humanoid robot iCub faces a human interaction partner with a map on a desk between them. A guest asks the robot what certain objects or areas in their common vicinity are and where they are located. The robot is equipped with a spoken dialog system and deictic gesture skills, so that objects or areas can be referenced in a multimodal interaction. For deictic references, the guest points at regions on the map, which constitutes the interaction space between the guest and the robot. In turn, iCub uses the same modalities (speech and deictic gestures) to refer to the real-world location corresponding to the map location.
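The core of this grounding step is translating a point on the map into the corresponding real-world location. As a minimal sketch, assuming the map is aligned with the room and calibrated by a known origin and scale (all names and calibration values below are illustrative assumptions, not taken from the actual system), the mapping can be expressed as a simple 2D similarity transform:

```python
# Hedged sketch: resolving a deictic reference on the map to a real-world
# location via a 2D transform (uniform scale + offset). The function name,
# coordinate conventions, and calibration values are hypothetical.

def make_map_to_world(map_origin, world_origin, scale):
    """Return a function mapping map coordinates (e.g. pixels on the
    printed map) to world coordinates (e.g. metres in the room)."""
    mx0, my0 = map_origin
    wx0, wy0 = world_origin

    def map_to_world(mx, my):
        # Shift into the map's reference frame, then scale into metres.
        return (wx0 + (mx - mx0) * scale,
                wy0 + (my - my0) * scale)

    return map_to_world


# Example: a pointing gesture lands on map point (120, 80) px;
# with 1 px = 0.5 m and aligned origins, the robot should refer
# to the world location (60.0, 40.0) m.
to_world = make_map_to_world(map_origin=(0, 0), world_origin=(0, 0), scale=0.5)
print(to_world(120, 80))  # (60.0, 40.0)
```

A real deployment would likely use a full homography estimated from several calibration points rather than a single scale factor, since the map may be rotated or viewed at an angle.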
Recent Best Paper/Poster Awards
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob)
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)