Attention redirection trials were carried out using a wearable interface incorporating auditory and visual cues. Visual cues were delivered via the screen on the Recon Jet – a wearable computer resembling a pair of glasses – while auditory cues were delivered over a bone conduction headset. Cueing conditions included auditory and visual cues delivered individually and in combination with each other. Results indicate that the use of an auditory cue drastically decreases target acquisition times, especially for targets that fall outside the visual field of view. While auditory cues showed no difference when paired with any of the visual cueing conditions for targets within the user's field of view, a significant improvement in performance was observed for targets outside the field of view. The static visual cue paired with the binaurally spatialised, dynamic auditory cue appeared to provide the best performance of all the cueing conditions. In the absence of a visual cue, the binaurally spatialised, dynamic auditory cue performed best.
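To illustrate what a binaurally spatialised, dynamic cue of this kind involves (a minimal sketch under assumed parameters, not the interface's actual implementation), the snippet below steers a mono cue towards a target azimuth using the Woodworth interaural time difference and a crude level difference; a real system would use measured HRTFs and update the azimuth continuously as the user's head moves.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, average head radius assumed by the Woodworth model

def spatialise(mono, azimuth_deg, fs=44100):
    """Very rough binaural panning of a mono cue towards `azimuth_deg`.

    Positive azimuth = target to the listener's right. Uses the Woodworth
    interaural time difference and a simple sine-law level difference; a
    real display would use measured HRTFs instead.
    """
    az = np.radians(azimuth_deg)
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (az + np.sin(az))  # seconds
    delay = int(round(abs(itd) * fs))                          # samples

    # Crude interaural level difference: attenuate the far ear.
    gain_near, gain_far = 1.0, 1.0 - 0.5 * abs(np.sin(az))

    near = mono * gain_near
    far = np.concatenate([np.zeros(delay), mono * gain_far])[: len(mono)]

    # The right ear is nearer for positive azimuths, the left ear otherwise.
    left, right = (far, near) if azimuth_deg > 0 else (near, far)
    return np.stack([left, right], axis=1)

# Example: a 0.5 s, 1 kHz cue steered 60 degrees to the right.
fs = 44100
t = np.arange(int(0.5 * fs)) / fs
cue = 0.3 * np.sin(2 * np.pi * 1000 * t)
stereo = spatialise(cue, azimuth_deg=60, fs=fs)
```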

A BONE CONDUCTION BASED SPATIAL AUDITORY DISPLAY AS PART OF A WEARABLE HYBRID INTERFACE

Preliminary results from an on-going experiment exploring the localisation accuracy of a binaurally processed source displayed via a bone conduction headset are described. These results appear to point to decreased localisation accuracy in the horizontal plane when the vertical component is introduced. There also appears to be a significant compression of perceived elevation in the area directly in front of the observer, within ±15° of the horizontal plane. This suggests that participants tended to localise stimuli presented at elevations beyond ±30° within a 30° ‘window’ extending 15° above and 15° below the horizontal plane at 0° elevation. The results gathered so far suggest that binaural spatialisation over a bone conduction headset can also reproduce the perception of an elevated source to an acceptable degree of accuracy.

Interaction for Handheld Augmented Reality (HAR) is a challenging research topic because of the small screen display and limited input options. Although 2D touch screen input is widely used, 3D gesture interaction has been suggested as an alternative input method. Recent 3D gesture interaction research mainly focuses on using RGB-Depth cameras to detect the spatial position and pose of fingers, and on using this data for virtual object manipulation in the AR scene. In this paper we review previous 3D gesture research on handheld interaction metaphors for HAR. We present their novelties as well as limitations, and discuss future research directions for 3D gesture interaction in HAR. Our results indicate that 3D gesture input for HAR is a promising interaction method for assisting users in many tasks, such as education, urban simulation and 3D games.
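As a generic illustration of the pipeline these systems share (a hedged sketch with assumed camera intrinsics, not any specific system reviewed here), a fingertip detected in an RGB-Depth frame is typically lifted to a 3D position by back-projecting its pixel coordinates through the depth camera's pinhole model, and that position can then drive a virtual object in the AR scene:

```python
import numpy as np

def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Lift a pixel (u, v) with depth `depth_m` (metres) to a 3D point in
    the depth camera's coordinate frame using the pinhole intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Illustrative intrinsics (fx, fy, cx, cy) for a 640x480 depth sensor.
fx, fy, cx, cy = 570.0, 570.0, 320.0, 240.0

# Suppose a hand tracker reports the index fingertip at pixel (402, 215)
# with a depth of 0.48 m; the virtual object simply follows that point.
fingertip = backproject(402, 215, 0.48, fx, fy, cx, cy)
virtual_object_position = fingertip  # direct manipulation: object snaps to the fingertip
```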

Technology is fundamentally changing the reading experience and book design. While the invention of industry-scale printing transformed books into a mass product, interactive technology enables new types of engagement during reading. Books can have multifarious form factors; their visual representation can change in accordance with the environment and user needs. The aim of the workshop is to discuss emerging interactive book-related technologies (e.g. Augmented Reality or Tangible Interfaces) and elaborate on methodologies that can be used to evaluate content and the interplay between form and content. The workshop will investigate how novel technologies can inspire, support and enrich the reading experience.

Science fiction media has had a long-lasting influence on the progression of interactive technology; recently, however, contradictions have emerged in the development of the two disciplines. Therefore, in this exploratory position paper we report on the insights attained through a day-long workshop amongst scientists and researchers on how the collaboration between science fiction and Human Computer Interaction (HCI) can be advanced. Discussions in the workshop focused on detailing the relationship between HCI and science fiction. In conclusion, as our main contribution, an action plan and agenda are presented for facilitating deeper influence between the two disciplines.

This paper presents the development of a mobile Augmented Reality (AR) heart murmur simulator that can be used for clinical teaching of medical trainees. Traditional medical training often requires trainees to have hands-on experience with real patients. However, patients presenting certain types of heart murmurs are not often available for training. To overcome this limitation, we have developed a wearable clothing system using mobile audible AR that provides heart murmur simulation to facilitate the medical learning experience. In this paper we describe the proposed system, a user evaluation study and directions for future work.

This paper explores different visual interfaces for sharing comments on social live video streaming platforms. So far, comments are displayed separately from the video, making it hard to relate the comments to events in the video. In this work we investigate an Augmented Reality (AR) interface that displays comments directly on the streamed live video. Our prototype allows remote spectators to view the streamed live video with different interfaces for displaying the comments. We conducted a user study to compare different ways of visualising comments and found that users prefer having comments in the AR view rather than in a separate list. We discuss the implications of this research and directions for future work.

We present a Mixed Reality system for remote collaboration using Virtual Reality (VR) headsets with external depth cameras attached. By wirelessly sharing 3D point-cloud data of a local worker's workspace with a remote helper, and sharing the remote helper's hand gestures back to the local worker, the remote helper is able to assist the worker in performing manual tasks. Displaying the point-cloud video in a conventional way, such as a static front view in VR headsets, does not provide helpers with sufficient understanding of the spatial relationships between their hands and the remote surroundings. In contrast, we propose a Mixed Reality (MR) system that shares with the remote helper not only 3D captured environment data but also real-time orientation information about the worker's viewpoint. We conducted a pilot study to evaluate the usability of the system, and we found that the extra synchronized orientation data can make collaborators feel more connected spatially and mentally.
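As a rough sketch of how the shared orientation data might be used on the helper's side (function names and the data layout are assumptions, not the paper's implementation), each frame could pair the captured point cloud with the worker's head pose so the helper's client can re-express the cloud relative to the worker's current viewpoint:

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def frame_to_worker_view(points, head_quat, head_pos):
    """Express the shared point cloud in the worker's head frame so the
    helper can see the workspace from the worker's current viewpoint.

    points    : (N, 3) array captured by the worker-side depth camera (world frame)
    head_quat : worker head orientation as a unit quaternion (w, x, y, z)
    head_pos  : worker head position in the same world frame
    """
    R = quat_to_matrix(head_quat)          # head frame -> world frame
    return (points - head_pos) @ R          # world -> head: applies R^T to each point

# Example frame: a toy cloud plus a worker looking 90 degrees to the left (yaw about +y).
cloud = np.random.rand(1000, 3)
head_quat = np.array([np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0])
head_pos = np.array([0.0, 1.6, 0.0])
cloud_in_worker_view = frame_to_worker_view(cloud, head_quat, head_pos)
```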

This demonstration presents the development of a mobile augmented reality (AR) murmur simulator that can be used for clinical teaching of medical trainees. Medical training often requires educators and trainees to work with a vast amount of experience-based knowledge; hearing and recognizing murmurs is part of the fundamental training for medical students. In this study, we propose a wearable clothing system, developed to work with mobile audible AR, that provides heart murmur simulation to facilitate the medical learning experience.

In this paper we report on a user study in a simulated environment that compares three types of Augmented Reality (AR) displays for assisting with car navigation: Heads Up Display (HUD), Head Mounted Display (HMD) and Heads Down Display (HDD). The virtual cues shown on each of the interfaces were the same, but there were significant differences in driver behaviour and preference between interfaces. Overall, users performed better with and preferred the HUD over the HDD, while the HMD was ranked lowest. These results have implications for anyone wanting to use AR cues for car navigation.