Two people lean over a map on a table. Their movements are highlighted with circles.

2010 - 2014: Alignment in Cooperation under the Conditions of Augmented Reality

The project investigates interactional coordination and alignment with regard to the practices participants use to establish and sustain mutual orientation to an object or referent. Using an Augmented Reality-based Interception and Manipulation System (ARbInI) for co-presently interacting participants, we are able to remove, manipulate, or add communicatively relevant multimodal information in real time. This methodological tool for linguistic research makes it possible to modify specific factors commonly regarded as crucial interactional resources (e.g. timing, mutual monitoring) in order to investigate their impact on the establishment and maintenance of co-orientation/joint attention. In addition, the system provides novel facilities for recording sensor-rich datasets that can be analyzed in parallel with qualitative/manual and quantitative/computational methods.
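
To give a concrete sense of the quantitative/computational side, the following is a minimal sketch (in Python) of how episodes of joint attention could be detected from logged gaze data. The data layout, the function names (angle_to_target, joint_attention_episodes), and the thresholds are illustrative assumptions, not the project's actual pipeline or API.

```python
# Hypothetical sketch: detecting joint-attention episodes from logged gaze
# rays. All names, thresholds, and the data layout are assumptions made for
# illustration; they do not describe the ARbInI system itself.
import numpy as np

ANGLE_THRESHOLD_DEG = 10.0   # assumed max deviation between gaze ray and object
MIN_DURATION_S = 0.5         # assumed minimum dwell time to count as an episode

def angle_to_target(origin, direction, target):
    """Angle (degrees) between a gaze ray and the direction to a target point."""
    to_target = target - origin
    cos = np.dot(direction, to_target) / (
        np.linalg.norm(direction) * np.linalg.norm(to_target))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def joint_attention_episodes(frames, target, fps=30):
    """Return (start_s, end_s) spans where both participants' gaze rays point
    at `target` within the angular threshold.

    `frames` is a sequence of ((origin_a, dir_a), (origin_b, dir_b)) tuples,
    one per video frame (an assumed layout for the sensor log)."""
    episodes, start = [], None
    for i, ((oa, da), (ob, db)) in enumerate(frames):
        both = (angle_to_target(oa, da, target) < ANGLE_THRESHOLD_DEG and
                angle_to_target(ob, db, target) < ANGLE_THRESHOLD_DEG)
        if both and start is None:
            start = i                       # a candidate episode begins
        elif not both and start is not None:
            if (i - start) / fps >= MIN_DURATION_S:
                episodes.append((start / fps, i / fps))
            start = None                    # episode ended (or was too short)
    if start is not None and (len(frames) - start) / fps >= MIN_DURATION_S:
        episodes.append((start / fps, len(frames) / fps))
    return episodes

# Toy usage: both participants look at a point on the map for 1 s at 30 fps.
target = np.array([0.0, 0.0, 0.0])
gaze_a = (np.array([-1.0, 0.0, 1.0]), np.array([1.0, 0.0, -1.0]))
gaze_b = (np.array([1.0, 0.0, 1.0]), np.array([-1.0, 0.0, -1.0]))
frames = [(gaze_a, gaze_b)] * 30
print(joint_attention_episodes(frames, target))  # [(0.0, 1.0)]
```

A real analysis would operate on the system's actual sensor streams and would likely add gaze smoothing and fixation detection; the sketch only shows the basic idea of thresholding angular deviation and minimum dwell time.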

In summary, the project (i) developed a novel research tool that pairs a high degree of experimental control with simultaneous recording of the users’ perceptions, and (ii) contributed scientific findings on the multimodal organization of co-orientation, obtained through both conversation analysis and data mining.

Publications

Conversation Analysis & Multi-sensory Corpora

  • Pitsch, K., Neumann, A., Schnier, C., & Hermann, T. (2013). Augmented Reality as a Tool for Linguistic Research: Intercepting and Manipulating Multimodal Interaction. In: Multimodal Corpora: Beyond Audio and Video (IVA 2013 Workshop), Edinburgh, UK, 7 pages. [Online full text]

  • Brüning, B., Schnier, C., Pitsch, K., & Wachsmuth, S. (2012). PAMOCAT: Automatic retrieval of specified postures. In: LREC 2012, pp. 4143-4148. [Online full text]

  • Brüning, B., Schnier, C., Pitsch, K., & Wachsmuth, S. (2011). Automatic detection of motion sequences for motion analysis. In: ICMI 2011, 6 pages. [Online full text]

  • Pitsch, K., Brüning, B., Schnier, C., Dierker, A., & Wachsmuth, S. (2010). Linking Conversation Analysis and Motion Capturing. How to robustly track multiple participants? In: Workshop on Multimodal Corpora: Advances in Capturing, Coding and Analyzing Multimodality (LREC 2010), pp. 63-69. [Online full text]

Interaction in Augmented Reality

  • Neumann, A., Schnier, C., Hermann, T., & Pitsch, K. (2013). Interaction Analysis and Joint Attention Tracking in Augmented Reality. In: 15th ACM International Conference on Multimodal Interaction, Melbourne, pp. 165-172. [Online full text]

  • Hermann, T., Neumann, A., Schnier, C., & Pitsch, K. (2013). Sonification for Supporting Joint Attention in Dyadic Augmented Reality-based Cooperations. In: AudioMostly 2013, 6 pages. [Online full text]

  • Schnier, C., Pitsch, K., Dierker, A., & Hermann, T. (2011). Collaboration in Augmented Reality. How to establish coordination and joint attention? In: S. Bødker, N. O. Bouvin, W. Lutters, & V. Wulf (Eds.), ECSCW 2011: Proceedings of the 12th European Conference on Computer Supported Cooperative Work, 24-28 September 2011, Aarhus, Denmark (pp. 405-416). [DOI] [Online full text]

  • Schnier, C., Pitsch, K., Dierker, A., & Hermann, T. (2011). Adaptability of Communicative Resources in AR-based Cooperation. In: Gespin 2011, Bielefeld, 6 pages. [Online full text]

A project of:

Project C5 of the Collaborative Research Center "Alignment in Communication" (SFB 673), Universität Bielefeld.

Project page at Universität Bielefeld

Project team: