Evaluation of Human-System Interaction
To enable intelligent systems (e.g. smart homes, social robots, and virtual agents) to interact with human users, we need to understand human multimodal communication and the sequential structures of authentic interactions. Inspired by human-human interaction (HHI), we develop adapted interactional models for robots and implement them in autonomous systems. In HRI studies, we evaluate the users' perception and appraisal of the system as well as its technical performance. Based on our findings, we further investigate the effects that emerge in long-term evaluations.
A grand challenge for intelligent systems that assist and interact with humans on a daily basis is to engage users in repeated interaction. Several factors can enrich such interactions and establish long-term companionship. In our projects we explore the different system requirements for long-term interaction and evaluate the necessary interactive capabilities in user studies.
Among others, we investigate the following factors influencing Human-System Interaction:
- Social cues (distance, orientation, attention, engagement)
- Episodic memory (training progression, game statistics, user preferences)
- Contextual knowledge (game rules, social norms)
- Type of feedback (motivational, instructional, quantitative, qualitative)
- Embodiment (e.g. virtual agent vs. ambient surroundings vs. anthropomorphic robot)
- Social roles (e.g. trainer vs. companion)
In general, our approach is to design social scenarios in which naive users can explore our systems and platforms interactively. Using a user-centered design in our studies, we can test the influence of the factors listed above by manipulating the experimental conditions. From our measurements, we draw conclusions about the factors that shape the user experience with our systems.
The measurements we use across our experiments include human cues such as gaze, (head) pose, orientation, voice pitch, and facial expressions. Furthermore, we use questionnaires to assess the users' subjective impressions of, e.g., interaction enjoyment, perceived social intelligence, intuitiveness, and human-likeness. Finally, we also use objective measurements such as task success, heart rate, or exercise repetitions.
- Cognitive Service Robotics Apartment (CSRA)
- CITEC-IP 18: A Robot as Museum Guide. Interactional Coordination and Incrementality in HRI
- KOMPASS - Socially cooperative, virtual assistants as companions for people in need of cognitive support
- VASA/Verstanden: Virtual Assistants and their Social Acceptability
- Interaction & Space. From Conversation Analysis to Dynamic Interaction Models (Personal Dilthey fellowship Karola Pitsch)
- SoziRob: Social Interaction for Performance Motivation between Robots and Humans in Closed Habitats
- SFB 673: Alignment in Communication
2012 | Conference Paper | PUB-ID: 2617032
Süssenbach L, Pitsch K, Berger I, Riether N, Kummert F (2012). "Can you answer questions, Flobi?": Interactionally defining a robot's competence as a fitness instructor. In: Proceedings of the 21st IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2012).
2011 | Journal Article | PUB-ID: 2481769
Holthaus P, Pitsch K, Wachsmuth S (2011). How Can I Help? Spatial Attention Strategies for a Receptionist Robot. International Journal of Social Robotics 3(4): 383-393.
2013 | Conference Paper | PUB-ID: 2605000
Kramer M, Yaghoubzadeh R, Kopp S, Pitsch K (2013). A conversational virtual human as autonomous assistant for elderly and cognitively impaired users? Social acceptability and design considerations. In: Lecture Notes in Informatics (LNI), Proceedings Series of the Gesellschaft für Informatik (GI), 251. Bonn: Köllen Druck + Verlag GmbH: 1105-1119.
2013 | Conference Paper | PUB-ID: 2662159
Pitsch K, Gehle R, Wrede S (2013). Addressing Multiple Participants: A Museum Robot's Gaze Shapes Visitor Participation. Presented at ICSR 2013.
Recent Best Paper/Poster Awards
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob)
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)