Applied Informatics Group

Dr.-Ing. Angelika Dierker


Name: Dr.-Ing. Angelika Dierker
Email: adierker [AT] TechFak.Uni-Bielefeld.DE

Brief CV

Angelika Dierker studied Naturwissenschaftliche Informatik (natural science informatics) with a focus on neurobiology and psychology. She received her diploma in computer science from the Neuroinformatics Group of Bielefeld University in 2006. The title of her diploma thesis was Sonifikationsbasierte Korrelationsanalyse gekoppelter FitzHugh-Nagumo-Systeme (sonification-based correlation analysis of coupled FitzHugh-Nagumo systems; supervised by Thomas Hermann and Elena Carbone).
 

Angelika joined the Applied Computer Science Group in April 2007 and is now working in the C5 project of the Collaborative Research Centre (SFB 673) Alignment in Communication. The goal of the project is to investigate alignment in communication by means of augmented reality. She is supervised by Marc Hanheide and Thomas Hermann. Furthermore, Angelika is interested in cognitive psychology, sonification, and neurobiology.

Involved in advising Bachelor's and Master's theses:
Teaching involvements:
  • Tutorial for the course Bildverarbeitung (image processing), WS 2007/08
  • Student project "The influence of head-mounted displays on head and eye movements during visual search" within the seminar "Analyse und Modellierung von Blickbewegungen" (analysis and modelling of eye movements; supervised by Hendrik Koesling), SS 2010
  • Praktikum Intelligente Systeme (intelligent systems lab course): "AR-based Interaction with Companions and Data in a Dynamic Virtual Room", WS 2009/2010
  • Praktikum Intelligente Systeme: "AR-based Applications", SS 2010

Research Topic: Computer-aided investigation of interaction mediated by an Augmented-Reality-enabled wearable interface


[Figure: Two subjects interacting with virtual objects that are augmented on top of glass cubes; the displays show a possible view for the subjects.]

[Figure: Screenshot of the image a user of the system might see; glass cubes with attached markers and augmented virtual objects are used for interaction.]

[Figure: Hardware setup with audiovisual interception and head-gesture sensing; the interface enables modifications as well as monitoring and recording of the interaction.]

The overall goal is to facilitate the recording and analysis of multimodal corpora of human-human interaction with the help of appropriate sensors and software. Special attention is paid to nonverbal behavior such as head gestures and the visual focus of attention. The work includes:

  • approaches for facilitating the recording and analysis of experiments
  • approaches for facilitating closed-loop experiments
  • design of scenarios that can benefit from closed-loop experiments
  • evaluation of the capabilities of the system for interaction analysis
  • analysis of AR-based interaction: evaluation of the effects of the sensors (and AR goggles) on the interaction (in terms of head gestures, eye movements, speech).

Facilitating recording and analysis of experiments

  • recording: record the communication from the participants' points of view instead of only from the scene camera's point of view; use body-mounted sensors to measure behavior instantaneously, at the place where it is produced; monitor and record the communication signals during the interaction
  • tagging: integrate machine-learning methods to automatically analyse and tag the sensor data for patterns (gestures, speech times, objects in the field of view)
  • transformation: automatically synchronize and transform the multimodal data (video, audio, time series, tag hypotheses) into ELAN-loadable file bundles (see the sketch below)
  • analysis: develop a MATLAB toolkit for the detailed analysis of complex relationships in multimodal data

(Dierker et al. 2009, Hanheide et al. 2010, Woehler et al. 2010)
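
To make the transformation step concrete, here is a minimal sketch that writes time-aligned tag hypotheses into an ELAN-loadable .eaf file. The tier name, media path, and (start_ms, end_ms, label) tuple format are illustrative assumptions, and the EAF subset is reduced to what a single alignable tier needs; this is not the project's actual tooling:

    # Minimal sketch: export (start_ms, end_ms, label) tag hypotheses as an
    # ELAN .eaf file. Tier name, media path and tuple format are made up.
    import xml.etree.ElementTree as ET

    def write_eaf(annotations, media_url, out_path, tier_id="head_gestures"):
        root = ET.Element("ANNOTATION_DOCUMENT", AUTHOR="", FORMAT="2.7",
                          VERSION="2.7", DATE="2010-01-01T00:00:00+01:00")
        header = ET.SubElement(root, "HEADER", TIME_UNITS="milliseconds")
        ET.SubElement(header, "MEDIA_DESCRIPTOR", MEDIA_URL=media_url,
                      MIME_TYPE="video/mpeg")
        time_order = ET.SubElement(root, "TIME_ORDER")
        tier = ET.SubElement(root, "TIER", TIER_ID=tier_id,
                             LINGUISTIC_TYPE_REF="default-lt")
        for i, (start_ms, end_ms, label) in enumerate(annotations):
            ts1, ts2 = f"ts{2 * i + 1}", f"ts{2 * i + 2}"
            ET.SubElement(time_order, "TIME_SLOT", TIME_SLOT_ID=ts1,
                          TIME_VALUE=str(start_ms))
            ET.SubElement(time_order, "TIME_SLOT", TIME_SLOT_ID=ts2,
                          TIME_VALUE=str(end_ms))
            ann = ET.SubElement(ET.SubElement(tier, "ANNOTATION"),
                                "ALIGNABLE_ANNOTATION", ANNOTATION_ID=f"a{i + 1}",
                                TIME_SLOT_REF1=ts1, TIME_SLOT_REF2=ts2)
            ET.SubElement(ann, "ANNOTATION_VALUE").text = label
        ET.SubElement(root, "LINGUISTIC_TYPE", LINGUISTIC_TYPE_ID="default-lt",
                      TIME_ALIGNABLE="true")
        ET.ElementTree(root).write(out_path, encoding="UTF-8", xml_declaration=True)

    # Example: two automatically tagged head gestures
    write_eaf([(1200, 1700, "nod"), (3400, 4100, "shake")],
              "file:///data/session01.mpg", "session01.eaf")

ELAN can then load the resulting file together with the referenced media, so automatically produced tags can be inspected and corrected alongside the video.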

Facilitating closed-loop experiments

Basic idea: we equip two interlocutors with head-mounted AR and encourage them to interact cooperatively with a set of physical objects. Using AR, we can augment virtual objects on top of the physical objects.
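
A minimal sketch of the resulting per-frame closed loop is given below; the marker detector and renderer are stubbed placeholders rather than the actual system's API, and the policy hook stands for experimenter-defined manipulations:

    # Hypothetical closed-loop skeleton for one participant's AR unit; the
    # detector and renderer are stubs, not the real system's interfaces.

    def detect_markers(frame):
        """Stub: return fiducial-marker poses found in the camera frame."""
        return {7: (0.1, 0.0, 0.5)}  # marker id -> toy 3-D position

    def render_overlay(frame, pose, appearance):
        """Stub: draw the virtual object at the marker pose into the frame."""
        print("render", appearance, "at", pose)

    def closed_loop_step(frame, scene, policy):
        """One iteration: sense markers, apply the experiment policy, re-render."""
        for marker_id, pose in detect_markers(frame).items():
            if marker_id in scene:
                # The policy can change what each interlocutor sees, e.g.
                # highlight, swap or hide objects to induce misunderstandings.
                render_overlay(frame, pose, policy(marker_id, scene[marker_id]))
        return frame

    # Example: an identity policy that shows every object unchanged
    closed_loop_step(frame=None, scene={7: "red cup"},
                     policy=lambda mid, obj: obj)

Because what is rendered can depend on the participants' own behaviour, the same skeleton covers the feedback channels listed below.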

Benefits:

  • Control Experiment: use buttons, field of view or speech signals to gain feedback from the participants; use displays, sound and device vibrations to give feedback to the participants
  • Support Interaction: provide artificial communication channels that would otherwise not be available, e.g. coloured highlighting of objects according to their position in the interaction partner's field of view (see the sketch below)
  • Induce Misunderstandings: how do interlocutors compensate for mistakes in communication?

(Mertes 2008, Mertes et al. 2009, Dierker et al. 2009)
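
As an illustration of the highlighting channel mentioned above, here is one way to decide whether an object lies in the partner's field of view, based on the angle between the partner's viewing direction and the direction towards the object. Head direction stands in for gaze here, and all vectors and thresholds are made-up values:

    # Sketch: choose a highlight colour from how central an object is in the
    # partner's field of view. Head direction approximates gaze; the vectors
    # and thresholds are made-up values.
    import numpy as np

    def fov_colour(head_pos, head_dir, obj_pos, fov_deg=40.0):
        to_obj = np.asarray(obj_pos, float) - np.asarray(head_pos, float)
        to_obj /= np.linalg.norm(to_obj)
        view = np.asarray(head_dir, float) / np.linalg.norm(head_dir)
        angle = np.degrees(np.arccos(np.clip(np.dot(view, to_obj), -1.0, 1.0)))
        if angle < fov_deg / 2:
            return "green"   # central in the partner's view
        if angle < fov_deg:
            return "yellow"  # peripheral
        return "grey"        # out of view

    # Object slightly off the partner's viewing axis
    print(fov_colour(head_pos=(0, 0, 0), head_dir=(0, 0, 1), obj_pos=(0.1, 0, 1)))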

Analysing AR-based interaction

What effects does Augmented Reality have on:

  • the users' eye movements
  • the users' head movements
  • the way users solve tasks

(Kollenberg et al. 2010, Dierker et al. 2011, Schnier et al. 2011)
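
As a toy example of this kind of evaluation, the sketch below quantifies head movement as the summed frame-to-frame yaw change and compares an AR condition with a control condition via a two-sample t-test. The synthetic data and the particular measure are illustrative only, not the analyses of the cited papers:

    # Toy comparison of head-movement amount between an AR and a control
    # condition. The trajectories are random placeholders, not real data.
    import numpy as np
    from scipy import stats

    def movement_amount(yaw_deg):
        """Sum of absolute frame-to-frame yaw changes (degrees)."""
        return np.sum(np.abs(np.diff(yaw_deg)))

    rng = np.random.default_rng(0)
    # One synthetic yaw trajectory (600 frames) per participant and condition
    ar = [movement_amount(np.cumsum(rng.normal(0, 1.5, 600))) for _ in range(10)]
    ctrl = [movement_amount(np.cumsum(rng.normal(0, 2.0, 600))) for _ in range(10)]

    t, p = stats.ttest_ind(ar, ctrl)
    print(f"AR mean={np.mean(ar):.0f} deg, control mean={np.mean(ctrl):.0f} deg, "
          f"t={t:.2f}, p={p:.3f}")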

 

Position of the project and comments

The PhD project is associated with the Collaborative Research Centre SFB 673, Project C5 "Alignment in Augmented Reality-based Cooperation".

 

Alignment is a term from communication research in linguistics; it is explained on the homepage of the SFB 673.

Augmented Reality (AR) is a methodology for adding virtual information to the perceptible reality. This information is mostly provided visually, but auditory or other modalities are possible as well. We propose AR as a novel methodology to investigate human-human interaction in collaborative tasks.

 

