Applied Informatics Group

Speech-Based Dialog

We see speech-based dialog as key to successful interaction between virtual or robotic agents and human users, and we investigate dialog models that support such interaction.

Users interacting with the NAO robot.

Research Questions

One of the key aspects in cognitive social interaction for agents is the capability to successfully conduct dialog using speech and other modalities. Inspired by Clark's notion of grounding, we see dialog as a process that achieves mutual understanding among all dialog participants.
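To make the grounding notion concrete, the following toy sketch treats a contribution as "presented" until the partner gives positive evidence of understanding, with negative evidence triggering repair. All names and states here are illustrative, not taken from an actual dialog system.

```python
# Toy state machine in the spirit of Clark's grounding: a contribution only
# becomes common ground once the partner signals understanding; a
# clarification request triggers repair. States and evidence labels are
# hypothetical simplifications.
PRESENTED, GROUNDED, REPAIR = "presented", "grounded", "repair"

def grounding_step(state, evidence):
    if state == PRESENTED and evidence == "ack":      # nod, "okay", relevant next turn
        return GROUNDED
    if state == PRESENTED and evidence == "clarify":  # "what do you mean?"
        return REPAIR
    if state == REPAIR and evidence == "ack":         # repair succeeded
        return GROUNDED
    return state

# A contribution that is first misunderstood, then repaired:
state = PRESENTED
for ev in ["clarify", "ack"]:
    state = grounding_step(state, ev)
# state == "grounded"
```

In a real system the evidence would come from multimodal cues (speech, gaze, task actions) rather than discrete labels, but the bookkeeping per contribution is the same.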


Another key aspect is incremental processing for modeling asynchronous human-agent interactions, which allows for closed feedback loops in HAI [1]. We focus on giving feedback by closely monitoring the user's behavior, such as attention or task-related actions [2][3].
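The idea of a closed feedback loop can be sketched as an agent that consumes partial user events asynchronously and reacts immediately, instead of waiting for the user's turn to end. Event names and responses below are hypothetical placeholders for what a real incremental architecture would produce.

```python
import queue
import threading

# Hypothetical incremental event types; real architectures use far richer
# incremental units than these two labels.
ATTENTION_LOST = "attention_lost"
PARTIAL_SPEECH = "partial_speech"

def feedback_loop(events, send):
    """Consume incremental user events and emit feedback right away,
    rather than after the user's turn is complete."""
    while True:
        kind, payload = events.get()
        if kind == ATTENTION_LOST:
            send("Look at me!")   # try to re-acquire the user's attention
        elif kind == PARTIAL_SPEECH:
            send("mhm")           # backchannel on partial input
        elif kind == "done":
            break

# Drive the loop from a second thread, as sensor input would:
outputs = []
q = queue.Queue()
t = threading.Thread(target=feedback_loop, args=(q, outputs.append))
t.start()
q.put((PARTIAL_SPEECH, "could you"))
q.put((ATTENTION_LOST, None))
q.put(("done", None))
t.join()
# outputs == ["mhm", "Look at me!"]
```

The point of the sketch is the asynchrony: perception events and agent feedback interleave within a single user turn.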

The situated nature of human-agent interaction imposes special requirements on dialog modeling [4]. Assessing interaction quality through measures of user engagement, and establishing behavior that increases this quality, is another important aim for an agent's abilities [5]. Mixed-initiative interaction enables the agent to actively request information in order to better interpret the situation in task-oriented scenarios [6].
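A minimal sketch of such mixed-initiative behavior: when the agent's interpretation of the scene is ambiguous, it takes the initiative and asks instead of guessing. The function name, candidate format, and confidence threshold are all assumptions for illustration.

```python
# Illustrative mixed-initiative rule: act on a confident interpretation,
# otherwise take the initiative and request information from the user.
def interpret_or_ask(candidates):
    """candidates: list of (hypothesis, confidence) pairs (hypothetical format)."""
    best = max(candidates, key=lambda c: c[1])
    if best[1] < 0.7:  # assumed confidence threshold
        return ("ask", "Do you mean the %s?" % best[0])
    return ("act", best[0])

print(interpret_or_ask([("red cup", 0.55), ("blue cup", 0.45)]))
# → ('ask', 'Do you mean the red cup?')
print(interpret_or_ask([("red cup", 0.90)]))
# → ('act', 'red cup')
```

The clarification question itself then becomes a dialog contribution that must be grounded like any other.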


Britta Wrede

Related Projects

  • CSRA: The Cognitive Service Robotics Apartment
  • BMBF project KogniHome
  • Incremental dialog management for human agent interaction (Birte Carlmeyer)
  • EU project HUMAVIPS - Humanoids with Auditory and Visual Abilities In Populated Spaces
  • Context-aware speech recognition (Christian Munier)
  • Engagement in robot-to-group dialog (David Klotz)
  • PaMini: Modeling Human-Robot-Interaction Based on Generic Interaction Patterns (Julia Peltason)
  • Multi-modal interaction management for a robot companion (Shuyin Li)

Related Publications

Recent Best Paper/Poster Awards

Goal Babbling of Acoustic-Articulatory Models with Adaptive Exploration Noise
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob) 


Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-agent Interaction (HAI) 


"Look at Me!": Self-Interruptions as Attention Booster?
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human Agent Interaction (HAI)

