Conference Program

An overview of all sessions of this conference.
Please select a location or a date to display only the sessions concerned. Select a session to open its detail view.

Session Overview
Session
MCI-WS07: Workshop on Virtual and Augmented Reality in Everyday Context (VARECo)
Time:
Sunday, 08.09.2019:
9:00 - 17:30

Session Chair: Benjamin Weyers
Session Chair: Daniel Zielasko
Session Chair: Alexander Kulik
Session Chair: Eike Langbehn
Location: Main Building, Lecture Hall K
Main Building, Lecture Hall K (fixed seating), capacity 80

Session Abstract

The ongoing commercialization of consumer VR/AR hardware opens up the field of potential scenarios and applications for VR and AR in everyday contexts, but simultaneously raises new challenges that need to be addressed in current and future research. These challenges include, for instance, the need for long-term use, the integration of hardware and software into existing workspaces, and the need for flexible and inexpensive creation of content (as in education). To support research and development in this area of interest, we organize this one-day workshop on VR and AR in Everyday Context (VARECo), bringing together interested researchers and practitioners to discuss current and future work in this research domain.

If you are interested in User-Embodied Interaction in Virtual Reality, please refer to the UIVR workshop:

https://sites.google.com/view/uivrworkshop/


External resource: http://sites.google.com/view/vareco/home
Presentations

Overview of Collaborative Virtual Environments using Augmented Reality

Nico Feld, Benjamin Weyers

Universität Trier, Germany

Using a collaborative virtual environment (CVE) can reduce the barriers of remote communication, which is why CVEs are increasingly used to support collaborative work between spatially dispersed collaborators. Augmented reality (AR) builds a bridge between working in real and virtual environments, which makes AR a candidate technology for implementing CVEs. To structure this research field and to identify possible research gaps, this paper proposes a design space of CVEs using AR. To this end, we mapped solutions described in the research literature onto the definition of a CVE, which considers user roles as well as the benefits gained by AR. Additionally, we examine how consistently specific keywords are used in the field. Based on the resulting design space, we identified certain gaps in research and present potential next research topics in the field.
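The mapping the abstract describes can be pictured as filling cells of a multidimensional grid, where empty cells surface as research gaps. A minimal sketch of this idea; the dimensions and entries below are invented for illustration and are not the paper's actual taxonomy:

```python
from dataclasses import dataclass
from itertools import product

# Hypothetical design-space dimensions (NOT the authors' actual categories).
USER_ROLES = ("symmetric", "asymmetric")
AR_BENEFITS = ("shared annotations", "remote expert view", "spatial referencing")

@dataclass(frozen=True)
class CVEEntry:
    """One surveyed paper mapped onto the design space."""
    paper: str
    user_roles: str
    ar_benefit: str

def research_gaps(entries):
    """Return design-space cells not covered by any surveyed paper."""
    covered = {(e.user_roles, e.ar_benefit) for e in entries}
    return [cell for cell in product(USER_ROLES, AR_BENEFITS) if cell not in covered]

surveyed = [
    CVEEntry("Example entry A", "asymmetric", "remote expert view"),
    CVEEntry("Example entry B", "symmetric", "shared annotations"),
]
gaps = research_gaps(surveyed)  # the uncovered cells, i.e. candidate research gaps
```

The point of such a structure is purely organizational: once the literature is mapped onto discrete dimensions, enumerating uncovered combinations is mechanical.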



VIGITIA: Supporting Everyday Activities at Tables through Projected AR

Raphael Wimmer1, Florian Echtler2

1Universität Regensburg, Germany; 2Bauhaus-Universität Weimar, Germany

In the BMBF project VIGITIA, we want to find out how projected AR content can support and augment physical actions and interactions at tables. To this end, we investigate how tables are used in everyday life and in creative domains. Building on this, we develop interaction techniques and digital tools to support these activities. In particular, we examine how personal digital devices can be integrated and how several spatially separated table surfaces can be virtually connected in a generic way. Special attention is also given to developing everyday-ready technical solutions for projecting content and for camera-based object recognition. This position paper presents our motivations, goals, and methods. A scenario illustrates the intended usage possibilities.



Does Teleporting Make You Lazy? Strategies for Increasing Natural Locomotion in VR

Timo Mantei, Eike Langbehn

Universität Hamburg, Germany

Teleportation is one of the most popular locomotion techniques in virtual reality (VR), as it is easy to use, efficient, and induces hardly any cybersickness. However, VR users often also have a small area in the real world available in which they can move through the virtual world by natural walking. Previous research has shown that real walking increases the sense of presence and improves spatial orientation. There are indications, however, that users become comfortable over time, stop moving naturally, and rely solely on teleportation. As a result, the potential of VR is no longer fully exploited. In this article, we investigate strategies that increase the willingness to use natural locomotion and reduce the use of teleportation. In a user study, we compared three different strategies with conventional teleportation. The effect of increasing comfort-seeking among teleportation users was confirmed. Moreover, our results show that with the tested strategies, participants teleported significantly less and walked more.
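For readers unfamiliar with the technique: teleportation is typically implemented by tracing a parabolic arc from the user's controller and moving the user to the point where the arc hits the floor. A minimal sketch of that arc computation; all names and parameter values are illustrative and not taken from the paper:

```python
import math

def teleport_target(origin, direction, speed=8.0, g=9.81, dt=0.02, max_steps=500):
    """Trace a parabolic arc from `origin` along `direction` and return the
    point where it crosses the floor plane (y == 0), or None if it never does.
    All parameters (launch speed, step size) are illustrative defaults."""
    x, y, z = origin
    dx, dy, dz = direction
    n = math.sqrt(dx * dx + dy * dy + dz * dz)  # normalize the aim direction
    vx, vy, vz = dx / n * speed, dy / n * speed, dz / n * speed
    for _ in range(max_steps):
        nx, ny, nz = x + vx * dt, y + vy * dt, z + vz * dt
        vy -= g * dt  # gravity bends the arc downward
        if ny <= 0.0:
            # Crossed the floor this step: interpolate the exact hit point.
            t = y / (y - ny)
            return (x + (nx - x) * t, 0.0, z + (nz - z) * t)
        x, y, z = nx, ny, nz
    return None

# E.g. aiming slightly upward from controller height lands a few meters ahead:
target = teleport_target((0.0, 1.5, 0.0), (0.0, 0.1, 1.0))
```

In an actual VR application the arc would additionally be tested against scene geometry and the user's view faded out and in during the jump; the sketch only covers the target computation.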



Evaluating the Social Acceptance of Different Interaction Methods for Augmented Reality Smart Glasses

Nils Adrian Mack, Ludger Schmidt

Universität Kassel, Germany

Social acceptance is, alongside practical acceptance, an important component of a system's overall acceptance by its users. It is possible that, despite high practical acceptance, a system is not used because it is socially unacceptable. For augmented reality smart glasses (AR glasses), various factors that can influence social acceptance have already been determined in order to prevent these devices from being rejected by users. One important factor that can influence the social acceptance of smart glasses is the interaction method. It can be assumed that using different interaction methods in the same social context with the same overall system does not result in the same social acceptance. In the following, a study with 10 participants and 6 interaction methods currently used on AR glasses is conducted to comparatively evaluate their social acceptance in different location and audience contexts. The study shows that the ratings of the interaction methods, in particular various kinds of voice interaction, degrade with increasing social distance to the spectator, and that the location of the intended use of an interaction method must be taken into account.



Software Engineering for AR-Systems considering User Centered Design Approaches

Thomas Schweiß1, Lisa Thomaschewski2, Annette Kluge2, Benjamin Weyers1

1Universität Trier, Germany; 2Ruhr-Universität Bochum, Germany

Technologies like augmented reality have the potential to support teams in their everyday working environment. In this paper, we present a user-centered design approach for defining requirements based on a taxonomy for augmented reality systems. We therefore first describe the taxonomy in detail. Afterwards, we present requirements engineering based on information about the context, user, and task of an AR system. According to this information, we gather new requirements by inducing them into the taxonomy and describe how they can be used in a user-centered design process. Finally, we present a use case based on a water treatment simulation and map the previously derived requirements to the system. Additionally, we describe two user studies to evaluate an ambient awareness tool generated from those requirements. Our work shows that the gathered requirements can be used in an early stage of user-centered design as well as after the stage of usability testing, serving as comparative variables for further usability analysis. Additionally, in the course of the user-centered design process, we developed a first prototype of the ambient awareness tool, which will be evaluated in future work.



Functional Workspace for One-Handed Tap and Swipe Microgestures

Bastian Dewitz1, Frank Steinicke2, Christian Geiger1

1Hochschule Düsseldorf, Germany; 2Universität Hamburg, Germany

Single-hand microgestures are a promising interaction concept for ubiquitous and mobile interaction. Due to the technical difficulty of accurately tracking the small finger movements exploited in this type of interface, most research in this field currently aims at providing a good foundation for future application in the real world. One microgesture interaction concept is one-handed tap and swipe interaction, which resembles one-handed interaction with handheld devices like smartphones. In this paper, we present a small study that explores the possible functional workspace of one-handed interaction, i.e., the area on the palmar surface where tap and swipe interaction is possible. In addition to thumb-to-finger interaction, which has been investigated more often, we also considered other fingers. The results show that thumb interaction with the index, ring, and middle finger is the most appropriate form of input, but other input combinations are worth considering under certain circumstances. However, there is high variation in which locations can be reached, depending on the individual hand anatomy.



Supporting Musical Practice Sessions Through HMD-Based Augmented Reality

Karola Marky1, Andreas Weiß2, Thomas Kosch3

1Technische Universität Darmstadt, Germany; 2Musikschule Schallkultur Kaiserslautern, Germany; 3LMU München, Germany

Learning a musical instrument requires a lot of practice, which, ideally, should be done every day. During practice sessions, students are on their own for the overwhelming majority of the time, as access to experts who can support students "just-in-time" is limited. Therefore, students commonly do not receive any feedback during their practice sessions. Adequate feedback, especially for beginners, is highly important for three particular reasons: (1) preventing the acquisition of wrong motions, (2) avoiding frustration due to a steep learning curve, and (3) preventing potential health problems that arise from harmfully straining muscles or joints. In this paper, we envision the use of head-mounted displays as an assistance modality to support musical instrument learning. We propose a modular concept for several assistance modes to help students during their practice sessions. Finally, we discuss hardware requirements and implementations to realize the proposed concepts.



Adjusting AR-Workflows of Care Tasks: Experiences from an Initial Study

Marc Janssen, Michael Prilla

TU Clausthal, Germany

Professional caregivers need to adhere to standards when treating their patients in order to ensure a certain level of quality and hygiene. Whenever standards are refined or changed, caregivers must keep pace with them. However, these standards are interpreted differently by care providers and also offer degrees of freedom which enable caregivers to adapt them in certain situations and according to their own experience and practice.

Workflows are a useful tool to define, share, and execute standards correctly. In this paper, we investigate the possibility of adjusting workflows with our Care Lenses, an augmented-reality-based tool which can be used by caregivers during the execution of care tasks and which supports them with guidance regarding standards. We show how care practice influences the development of technical support for workflows and what advantages the possibility of adjustments grants to workflows and their integration into practice.



Designing an Interactive Visualization for Coordinating Road Construction Sites in Virtual Reality

Manuela Uhr, Sina Haselmann, Lea Steep, Joschka Eikhoff, Frank Steinicke

Universität Hamburg, Germany

Road works highly affect traffic in major cities; coordination is therefore key to avoiding congestion on urban streets and highways. Software tools and interactive visualizations that give insight into complex road works data, as well as into preexisting spatial and temporal dependencies between sites, are important for the coordination process. In existing 2D visualizations, spatio-temporal dependencies are shown in multiple views, resulting in high cognitive load.

In this article we describe the design and evaluation of a visualization using Virtual Reality for exploring multi-dimensional data of road works. The relevance for expert use was reviewed in an interview with local traffic engineers. In addition, a user study was conducted to evaluate the general usability of the prototype. The results reflect an overall positive response and acceptance and show directions for further development.



The AR-Marker in the Urban Space

Simon Nestler1, Sebastian Pranz2, Klaus Neuburg3

1Technische Hochschule Ingolstadt; 2Hochschule Macromedia; 3Hochschule Hamm-Lippstadt

When considering the role of augmented reality (AR) in urban space, most previous work focuses on touristic and everyday-life use cases. However, the project “Archäologie der Gegenwart”, which we present in this paper, illustrates different aspects of change in the city of Hamm during the last 50 years. Thus, our approach opens up a deeper understanding of urban cultural change processes by means of AR. Our considerations lead to adding an AR layer as a fifth social dimension in the urban space. Technically, we robustly link this fifth layer with the existing topography through marker-based tracking with six degrees of freedom (6 DOF).

When building AR applications for the urban space, a deeper understanding of the marker paradigm is crucial. During our workshops, we identified and analyzed seven requirements for the utilization of markers in public urban space. Additionally, we analyzed the general AR marker paradigm from the human-computer interaction (HCI) perspective by considering the affordances and signifiers of the marker objects themselves, analyzing the tracking technology, and summarizing the marker's role for past, present, and future AR applications.

Thus, the role of the AR marker is twofold: on the one hand, the marker is part of the 6-DOF tracking technology; on the other hand, it makes AR layers perceivable in the urban space. We expect that the importance of these markings for guiding citizens through AR experiences in urban spaces will increase, whereas the role of markers for technical tracking purposes will decrease.



User acceptance of augmented reality glasses in comparison to other interaction methods for controlling a hand exoskeleton

Tobias Ableitner1, Surjo Soekadar2, Andreas Schilling3, Christophe Strobbe1, Gottfried Zimmermann1

1Responsive Media Experience Research Group, Stuttgart Media University; 2Universitätsklinik Tübingen, Germany; 3WSI/GRIS, University of Tübingen

Every year, several hundred thousand people suffer a stroke, often leading to long-term motor disabilities that impair their quality of life. In this context, hemiplegia, including paralysis of the hand and fingers, plays a key role, leaving stroke survivors unable to perform tasks that require both hands. In case of lesions at the level of the brain stem or the spinal cord, paralysis can also affect both sides, resulting in very severe constraints on performing most activities of daily living.

A neurally guided hand exoskeleton can restore motor hand function after a stroke or spinal cord injury. However, controlling such a hand exoskeleton raises several challenges related to human-machine interaction. While it should be operable without the user's hands and impose as little physical and cognitive strain on them as possible, it should also be as inconspicuous as possible to avoid stigmatization of the users. To tackle these challenges, we conducted a survey among 62 healthy test persons to shed more light on aspects of user acceptance regarding 12 input and 14 output methods, as well as 3 different application contexts.

We found that user acceptance of the various input and output methods differs between public contexts on the one hand and home and rehabilitation contexts on the other. In general, inconspicuous, handy, and widely used devices are preferred in public. We also found that spectacle wearers are slightly more open to using AR glasses than non-spectacle wearers.