Applied Informatics Group

AR-intercepted Communication

One approach to investigating communication is to monitor as many of the channels that transmit communicative signals as possible. Apart from speech, this also includes nonverbal channels such as facial expressions, gestures, head gestures, eye gaze, and body posture.

Monitoring is usually done with scene cameras, which, because of their external perspective, do not participate in the interaction. It is, however, also possible to use sensors that allow monitoring from the participants' own perspective. This can be achieved with multimodal augmented reality devices such as AR goggles and headphones. Although this technique occludes some natural communication channels, several remain available for investigation: speech, gestures, head gestures, body posture, etc. At the same time, it enables us to monitor exactly those visual and auditory cues that are perceived by the users. Furthermore, it offers additional possibilities for handling the transmitted interaction signals (a sketch of such a processing chain follows the list below):

  • monitoring
  • recording
  • analysis (semi-automatic)
  • modification
  • enhancement
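
The list above can be read as a processing chain that sits between the AR device's sensors and its displays and headphones: every intercepted frame is recorded and can be analysed, modified, or enhanced before it is rendered back to the user. The following minimal Python sketch only illustrates this idea; the Frame, Stage, and InterceptionPipeline names, as well as the mark_audio stage, are illustrative assumptions and do not refer to existing software of the group.

    from dataclasses import dataclass, field
    from typing import Callable, List

    # Hypothetical frame type: in a real system this would wrap the audio/video
    # data captured by the AR device's sensors (microphone, scene camera, ...).
    @dataclass
    class Frame:
        timestamp: float   # capture time in seconds
        channel: str       # e.g. "audio", "video", "gaze"
        data: bytes        # raw sensor payload

    # A stage is any callable that maps a frame to a (possibly modified) frame.
    Stage = Callable[[Frame], Frame]

    @dataclass
    class InterceptionPipeline:
        """Chains the operations listed above: record, analyse, modify, enhance."""
        stages: List[Stage] = field(default_factory=list)
        log: List[Frame] = field(default_factory=list)

        def record(self, frame: Frame) -> Frame:
            # Recording: keep a copy of every intercepted frame for later analysis.
            self.log.append(frame)
            return frame

        def process(self, frame: Frame) -> Frame:
            # Run the frame through all configured stages before it is rendered
            # back to the user via the AR goggles or headphones.
            frame = self.record(frame)
            for stage in self.stages:
                frame = stage(frame)
            return frame

    # Illustrative modification stage: tag audio frames so that a downstream
    # renderer could, for example, attenuate or enhance them.
    def mark_audio(frame: Frame) -> Frame:
        if frame.channel == "audio":
            return Frame(frame.timestamp, "audio-enhanced", frame.data)
        return frame

    if __name__ == "__main__":
        pipeline = InterceptionPipeline(stages=[mark_audio])
        out = pipeline.process(Frame(timestamp=0.0, channel="audio", data=b"\x00"))
        print(out.channel, len(pipeline.log))   # -> audio-enhanced 1

In a real setup the frames would come from the device's sensor streams and the modified frames would be fed back into its rendering pipeline; the sketch merely shows how the listed operations can be composed.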

Recent Best Paper/Poster Awards

Goal Babbling of Acoustic-Articulatory Models with Adaptive Exploration Noise
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob) 

Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)

 

"Look at Me!": Self-Interruptions as Attention Booster?
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)
