Applied Informatics Group

Perception of Humans

Multi-modal perception of an agent's environment is crucial for natural interaction with human partners in non-laboratory settings. This applies to robots as well as to intelligent smart homes and interactive virtual agents. We therefore perceive human activities multi-modally, both on a global level and on a local, human-centered level.

Research Questions

Global activity analysis

Global activity analysis aims at detecting and recognizing interaction partners. We detect humans in the environment using multi-modal cues such as speech, rhythm, timing, attention, positioning, and posture, and anchor detected persons over time. Basic 3D information is represented in an articulated scene model to keep track of interaction partners and serves as a prior for identifying manipulable objects [1].
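The general idea of fusing several cues into a person-anchoring decision can be illustrated with a minimal sketch. The cue names, weights, and threshold below are hypothetical and only stand in for whatever detectors and fusion scheme an actual system uses:

```python
# Minimal illustration of multi-modal person anchoring.
# Cue names, weights, and the threshold are hypothetical,
# not the actual system's model.

CUE_WEIGHTS = {"speech": 0.4, "posture": 0.3, "position": 0.2, "attention": 0.1}

def fuse_cues(cues):
    """Combine per-cue confidences (each in 0..1) into one anchoring score."""
    return sum(CUE_WEIGHTS[name] * conf for name, conf in cues.items())

def anchor_persons(hypotheses, threshold=0.5):
    """Anchor only those person hypotheses whose fused score passes the threshold."""
    return {pid: fuse_cues(cues)
            for pid, cues in hypotheses.items()
            if fuse_cues(cues) > threshold}

hypotheses = {
    "person_1": {"speech": 0.9, "posture": 0.8, "position": 0.7, "attention": 0.6},
    "person_2": {"speech": 0.1, "posture": 0.2, "position": 0.9, "attention": 0.1},
}
anchored = anchor_persons(hypotheses)  # only person_1 is anchored
```

A real system would additionally maintain these hypotheses over time, so that a person briefly leaving the field of view stays anchored.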

Identification of interaction relevant cues

To interpret interaction situations, we focus on recognizing semantic events produced by human interaction partners. Using these human cues, we analyze the joint space between humans and robots to study spatial behavior and to enable a smooth transition between distant and face-to-face human-robot interaction [2].
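As a simple illustration of such a distance-based transition, the sketch below classifies a human-robot distance into proxemic zones. The zone boundaries follow Hall's commonly cited interpersonal distances; the function and its use as a transition trigger are illustrative, not the system described in [2]:

```python
# Illustrative proxemics-based zone classification (hypothetical helper;
# boundaries approximate Hall's personal/social/public distances).

def interaction_zone(distance_m):
    """Map a human-robot distance in metres to a proxemic zone."""
    if distance_m < 1.2:
        return "personal"   # close enough for face-to-face interaction
    if distance_m < 3.6:
        return "social"     # conversational engagement is possible
    return "public"         # distant observation only

# A robot could, e.g., start greeting when a person enters the social zone
# and switch to face-to-face dialogue once they enter the personal zone.
```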

We also examine human behavior in specific situations, such as at a receptionist's desk or during tooth brushing [3]. Within these situations, we detect, for example, deviations from standard behavior in order to support people with cognitive impairments.
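The core of such deviation detection can be sketched as comparing observed actions against an expected task sequence. The step names and the simple sequential task model below are hypothetical illustrations, not the structured task analysis used in [3]:

```python
# Hypothetical sketch of deviation detection against a task model:
# step names and the linear sequence are illustrative only.

EXPECTED_STEPS = ["wet_brush", "apply_paste", "brush_teeth", "rinse_mouth"]

def detect_deviation(observed):
    """Return the first expected step the user skipped, or None if on track."""
    position = 0  # index of the next expected step
    for step in observed:
        if position < len(EXPECTED_STEPS) and step == EXPECTED_STEPS[position]:
            position += 1
        elif step in EXPECTED_STEPS[position + 1:]:
            # The user jumped ahead: the step at 'position' was skipped.
            return EXPECTED_STEPS[position]
    return None

# The user forgot the toothpaste before brushing, so a prompting
# system could remind them of the skipped step:
prompt_for = detect_deviation(["wet_brush", "brush_teeth"])
```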

Contact

Sven Wachsmuth

Related Projects

Related Publications

  1.  
  2. 2011 | Journal Article | PUB-ID: 2481769
    How Can I Help? - Spatial Attention Strategies for a Receptionist Robot
    Holthaus P, Pitsch K, Wachsmuth S (2011)
    International Journal of Social Robotics 3(4): 383-393.
  3. 2012 | Conference Paper | PUB-ID: 2423452
    User Behavior Recognition For An Automatic Prompting System - A Structured Approach based on Task Analysis
    Peters C, Hermann T, Wachsmuth S (2012)
In: Proceedings of the 1st Int. Conf. on Pattern Recognition Applications and Methods (ICPRAM), Vol. 2. SciTePress: 162-171.

Recent Best Paper/Poster Awards

Goal Babbling of Acoustic-Articulatory Models with Adaptive Exploration Noise
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob) 

Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)

 

"Look at Me!": Self-Interruptions as Attention Booster?
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)
