Completed Projects

Virtual Assistants and their Social Acceptability (VASA)
Robust spoken-dialogue interaction for people with age-related cognitive impairments (Verstanden)

Funding: CITEC, 2012 - 2013 & BMBF, 2013 - 2014, Principal Investigators K. Pitsch & S. Kopp
Researchers: Marcel Kramer, Ramin Yaghoubzadeh
Cooperation: v. Bodelschwinghsche Stiftungen Bethel
More information ...

  • M. Kramer, R. Yaghoubzadeh, S. Kopp & K. Pitsch (2013): A conversational virtual human as autonomous assistant for elderly and cognitively impaired users? Social acceptability and design considerations. Lecture Notes in Informatics (LNI), P-220, 1105-1119. [pre-print] [doi]
  • R. Yaghoubzadeh, M. Kramer, K. Pitsch & S. Kopp (2013): Virtual Agents as Daily Assistants for Elderly or Cognitively Impaired People. In: R. Aylett, B. Krenn, C. Pelachaud & H. Shimodaira (Eds.): Proceedings of the 13th International Conference on Intelligent Virtual Agents (IVA 2013), LNCS (LNAI): Vol. 8108. 79-91. [pre-print] [doi]
  • M. Henne, S. Kopp & K. Pitsch (2014): Virtuelle Assistenten als verbindende Schnittstelle zu verschiedenen Unterstützungssystemen. In: 7. Deutscher AAL-Kongress, Berlin, 5 pages. [pre-print]
Alignment in Augmented Reality based Cooperation (SFB 673, C5)

The project investigates interactional coordination and alignment with regard to the practices participants use in order to establish and sustain mutual orientation to an object or referent. Using an Augmented Reality-based Interception and Manipulation System (ARbInI) for co-presently interacting participants, we are able to remove, manipulate or add communicatively relevant multimodal information in real time. This methodological tool for linguistic research makes it possible to modify specific factors usually regarded as crucial interactional resources (e.g. timing, mutual monitoring) in order to investigate their impact on the establishment and maintenance of co-orientation/joint attention. In addition, the system provides novel facilities for recording sensor-rich data sets which can be accessed in parallel with qualitative/manual and quantitative/computational methods.

Funding: DFG-SFB 673, 2010 - 2014, Principal Investigators K. Pitsch & T. Hermann
Researchers: Christian Schnier, Alex Neumann
More information ...

  • K. Pitsch, A. Neumann, C. Schnier & T. Hermann (2013): Augmented Reality as a tool for linguistic research: Intercepting and manipulating multimodal interaction. In: Proceedings of the Workshop “Multimodal Corpora: Beyond audio and video” at IVA 2013, 5 pages. [pre-print]
  • A. Neumann, C. Schnier, T. Hermann & K. Pitsch (2013): Interaction Analysis and Joint Attention Tracking in Augmented Reality. Proceedings of the 15th ACM International Conference on Multimodal Interaction (ICMI 2013), 165-172.
  • C. Schnier, K. Pitsch, A. Dierker & T. Hermann (2011): Collaboration in Augmented Reality: How to establish coordination and joint attention? In: ECSCW 2011, Aarhus, Denmark, 405-416. [pre-print]
  • C. Schnier, K. Pitsch, A. Dierker & T. Hermann (2011): Adaptability of Communicative Resources in AR-based Cooperation. In: GESPIN 2011, Bielefeld. [pre-print]
A robot as fitness companion: Motivating in long-term human-robot interaction

The project explores ways in which a robot system (in our case NAO and Flobi) could provide support and motivation for people undertaking daily physical exercise. Such an application is particularly interesting for astronauts, for whom it is vital during any long-term mission in zero or low gravity to maintain physical fitness and psychological well-being. In our scenario, the robot is supposed to autonomously assume the role of a fitness instructor and support a structured bike training inspired by indoor cycling (‘spinning’). To do so, the robot not only needs to announce the next training actions but, importantly, also needs to observe and evaluate the trainee’s actions and provide positive or corrective feedback. Within this interdisciplinary realm, our project part investigates - on the basis of video data acquired in fitness studios - the interactional practices of micro-coordination and ‘motivation’ between human instructors and trainees in spinning classes. Results of the empirical analysis are used to build interactional models for the robot system, which are, in turn, evaluated in both short- and long-term studies of HRI.

Funding: Part of the DLR project SoziRob headed by F. Kummert & B. Wrede, 2010 - 2013
Researchers: Luise Süssenbach
More information ...

  • L. Süssenbach, N. Riether, I. Berger, S. Schneider, F. Kummert, I. Lütkebohle & K. Pitsch (in press): A robot as fitness companion: Towards an interactive action-based motivation model. In: Ro-Man 2014. [pre-print]
  • L. Süssenbach, K. Pitsch, I. Berger, N. Riether & F. Kummert (2012): “Can you answer questions, Flobi?” Interactionally defining a robot’s competence as a fitness instructor. In: Ro-Man 2012. [pre-print] [doi]
  • L. Süssenbach & K. Pitsch (2011): Interactional Coordination and Alignment: Gestures in Indoor Cycling Courses. In: GESPIN 2011, Bielefeld. [pre-print]
iTalk. Integration and Transfer of Action and Language Knowledge in Robots (2008 - 2012)
Research staff. - Project headed by Katharina Rohlfing, Britta Wrede & Gerhard Sagerer (Applied Informatics & Emergentist Semantics, Bielefeld University)
Funded by the European Union (FP7)

The EU project “iTalk” aims at developing artificial embodied agents that are able to acquire complex behavioral, cognitive, and linguistic skills through individual and social learning. Within this frame, the Bielefeld team investigates social aspects of learning and focuses on a scenario in which a human tutor presents and explains some task to a learner, who/which observes the action and, in turn, attempts to understand its structure in view of eventually imitating it. We use ‘tutoring’ in parent-infant interaction as an empirical model to explore the topics of (i) action structuring / acoustic packaging (-> Lars Schillingmann), (ii) tutoring spotter (-> Katrin Lohan), and (iii) variability (-> Karola Pitsch).

Karola’s research focuses on the interplay between the tutor’s and learner’s actions. She approaches the topic of ‘variability of tutors’ actions’ from an interactional perspective, detailing the sources and consequences of observable actions in the unfolding course of action. This leads to an interactional account of “motionese” behavior and the investigation of feedback strategies in both parent-infant and human-robot tutoring. Methodologically, Karola uses Conversation Analysis to reveal interactional patterns and attempts - in interdisciplinary collaboration - to link qualitative research with quantitative approaches and formalization. Results of her empirical research serve as the basis for the design of human-robot experiments, which she evaluates with regard to the systems’ functioning and in order to understand how the users’ reactions and expectations are shaped in situ.

Student Assistant: Raphaela Gehle, Lukas Rix

PaperWorks. Interweaving Paper and Digital Documents (2005 - 2008)
Research staff. - Project headed by Christian Heath & Paul Luff (Work, Interaction & Technology, King’s College London)
Funded by the European Union (FP6)

The EU project “PaperWorks” is concerned with developing distinctive technologies for interleaving paper documents with digital materials. The project aims to provide people with new forms of functionality in everyday environments through seemingly mundane artefacts. This involves the production of novel hardware (paper substrates, inks, and reading devices), software, and information infrastructure, informed by empirical studies of people using paper documents and associated tools as part of their everyday activities, and by studies of individuals and organizations that produce digital and paper content. The project examines how publishers, professionals and other ‘users’ currently design material using particular media. Drawing on the findings of these studies, we undertake technical interventions in which we provide people with various augmented technologies and authoring tools to support the production of content and the interlinking of materials.

Within this framework, Karola is particularly interested in the collaborative practices of exhibition/museum designers when envisioning exhibitions. She undertakes semi-experimental user studies of participants who use novel swiping devices to control a PowerPoint presentation from paper-based slides.

Multimodality in Immersive Classroom Interaction (2002 - 2006)
PhD project
Funded by the DFG, Graduate Program ‘Task-Oriented Communication’ (Bielefeld University)

Karola’s PhD work is concerned with multimodal aspects of classroom interaction. Based on video recordings from immersive history classes (in which German and Argentinian students carry out their history courses in a foreign language), three issues are investigated on the level of micro-practices: (1) multimodal procedures for dealing with locally occurring language problems; (2) the interactional production of blackboard inscriptions and their role as material resources (“intermediary objects”) in learning interactions; (3) practices of interactional coordination between the ‘official’ classroom discourse and the students’ ‘private’ note-taking. On the basis of reconstructing these detailed practices and patterns and their interactional ‘risks and side effects’, the study offers a set of conceptual and methodological implications for a multimodal take on Conversation Analysis (parallel activities, participants’ on-line analysis, intermediary objects).