Applied Informatics Group

Speech-Based Dialog

We see speech-based dialog as a key capability for virtual and robotic agents to engage in successful interaction with human users. Besides investigating dialog models, we study incremental processing, feedback strategies, and engagement in human-agent interaction.

Users interacting with the NAO robot.

Research Questions

One of the key aspects of cognitive social interaction for agents is the capability to successfully conduct dialog using speech and other modalities. Inspired by Clark's notion of grounding, we see dialog as a process that achieves mutual understanding between all dialog participants.
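In a minimal computational reading of this grounding view, a contribution only enters common ground once another participant gives positive evidence of understanding (an acknowledgement or a relevant next turn). The sketch below uses illustrative act labels of our own choosing, not code from any group system:

```python
def grounding_status(contributions):
    """Clark-style grounding sketch: a presented contribution counts as
    mutually understood only after a *different* participant acknowledges
    it. Input: (speaker, act, content) triples; act labels are illustrative.
    Returns the list of grounded contents."""
    grounded, pending = [], None
    for speaker, act, content in contributions:
        if act == "present":
            # New contribution awaiting positive evidence.
            pending = (speaker, content)
        elif act == "acknowledge" and pending and speaker != pending[0]:
            # Positive evidence from the partner grounds the contribution.
            grounded.append(pending[1])
            pending = None
    return grounded
```

For example, a robot's request becomes grounded only after the user's "ok"; without that turn it stays pending.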


Another key aspect is incremental processing for modeling asynchronous human-agent interactions, which allows for closed feedback loops in HAI [1]. We thereby focus on giving feedback by closely monitoring the user's behavior, such as attention or task-related actions [2][3].
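Such a closed feedback loop can be sketched as follows: information is presented in increments, and before each increment the agent consults an attention estimate, self-interrupting to regain a distracted user's attention before continuing (cf. the self-interruption strategy in [2]). The attention stream and the interrupt message are illustrative stand-ins for real perception and generation components:

```python
def present_incrementally(chunks, attention_stream):
    """Incremental presentation with a closed feedback loop.

    chunks: information increments to present.
    attention_stream: boolean attention observations (stand-in for a
    real attention classifier fusing gaze, lip movement, etc.).
    Before each chunk, the agent checks attention; while the user is
    distracted it self-interrupts instead of presenting content.
    Returns the resulting action log."""
    obs = iter(attention_stream)
    log = []
    for chunk in chunks:
        # Consume attention observations; default to attending when the
        # stream is exhausted so presentation can finish.
        while not next(obs, True):
            log.append("SELF-INTERRUPT: re-attract attention")
        log.append(f"SAY: {chunk}")
    return log
```

In a running system the loop would of course be driven by live sensor events rather than a precomputed list; the point is that output is gated increment by increment on user state.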

The situated nature of human-agent interaction imposes special requirements on dialog modeling [4]. Assessing interaction quality through measures of user engagement, and establishing behavior that increases this quality, is another important aim for an agent's abilities [5]. Mixed-initiative interaction enables the agent to actively request information in order to better interpret the situation in task-oriented scenarios [6].
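The mixed-initiative idea can be illustrated with a minimal slot-filling sketch: when required task information is missing, the agent takes the initiative and requests it instead of waiting for the user. Slot names and message formats here are hypothetical, not taken from any group system:

```python
def mixed_initiative_turn(required_slots, known):
    """Decide the agent's next dialog act in a task-oriented scenario.

    required_slots: slot names the task needs (illustrative).
    known: dict of slot name -> value gathered so far.
    If information is missing, the agent actively requests the first
    missing slot (mixed initiative); otherwise it proceeds with the task."""
    missing = [s for s in required_slots if s not in known]
    if missing:
        return f"REQUEST: please specify {missing[0]}"
    return "ACT: executing task"
```

A pattern-based system such as PaMini organizes such request/answer exchanges as generic interaction patterns; this sketch shows only the initiative decision itself.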


Britta Wrede

Related Projects

  • CSRA: The Cognitive Service Robotics Apartment
  • BMBF project KogniHome
  • Incremental dialog management for human agent interaction (Birte Carlmeyer)
  • EU project HUMAVIPS - Humanoids with Auditory and Visual Abilities In Populated Spaces
  • Context-aware speech recognition (Christian Munier)
  • Engagement in robot-to-group dialog (David Klotz)
  • PaMini: Modeling Human-Robot-Interaction Based on Generic Interaction Patterns (Julia Peltason)
  • Multi-modal interaction management for a robot companion (Shuyin Li)

Related Publications

  1. 2014 | Conference Paper | PUB-ID: 2906917
    Towards Closed Feedback Loops in HRI
    Carlmeyer B, Schlangen D, Wrede B (2014)
    In: Proceedings of the 2014 Workshop on Multimodal, Multi-Party, Real-World Human-Robot Interaction - MMRWHRI '14. Association for Computing Machinery (ACM).
    PUB | PDF | DOI
  2. 2016 | Conference Paper | PUB-ID: 2907320
    Exploring self-interruptions as a strategy for regaining the attention of distracted users
    Carlmeyer B, Schlangen D, Wrede B (2016)
    In: Proceedings of the 1st Workshop on Embodied Interaction with Smart Environments - EISE '16. Association for Computing Machinery (ACM).
    PUB | PDF | DOI
  3. 2017 | Conference Paper | PUB-ID: 2909295
    Ready for the Next Step?: Investigating the Effect of Incremental Information Presentation in an Object Fetching Task
    Chromik M, Carlmeyer B, Wrede B (2017)
    In: Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction - HRI '17. Association for Computing Machinery (ACM).
    PUB | PDF | DOI
  4. 2010 | Conference Paper | PUB-ID: 2034737
    Modeling Human-Robot Interaction Based on Generic Interaction Patterns
    Peltason J, Wrede B (2010)
    In: AAAI Fall Symposium: Dialog with Robots. Arlington, VA, USA: AAAI Press.
    PUB | PDF
  5. 2011 | Conference Paper | PUB-ID: 2278880
    Engagement-based Multi-party Dialog with a Humanoid Robot
    Klotz D, Wienke J, Peltason J, Wrede B, Wrede S, Khalidov V, Odobez J-M (2011)
    In: Proceedings of the SIGDIAL 2011 Conference. Association for Computational Linguistics: 341-343.
  6. 2009 | Conference Paper | PUB-ID: 1890379
    The Curious Robot - Structuring Interactive Robot Learning
    Lütkebohle I, Peltason J, Schillingmann L, Elbrechter C, Wrede B, Wachsmuth S, Haschke R (2009)
    In: International Conference on Robotics and Automation. IEEE: 4156-4162.
    PUB | PDF | DOI

Recent Best Paper/Poster Awards

Goal Babbling of Acoustic-Articulatory Models with Adaptive Exploration Noise
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob) 


Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-agent Interaction (HAI) 


"Look at Me!": Self-Interruptions as Attention Booster?
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human Agent Interaction (HAI)

