Arindam Dey

Postdoctoral Fellow

I am a Research Fellow at the Empathic Computing Laboratory. I completed my PhD at the University of South Australia in 2013, supervised by Prof. Christian Sandor and Prof. Bruce Thomas. Between completing my PhD and rejoining UniSA in 2016, I held three postdoctoral positions at James Cook University (Australia), Worcester Polytechnic Institute (USA), and the University of Tasmania (Australia). I also visited the Technical University of Munich (Germany) for a research internship and the Indian Institute of Technology Kharagpur (India) for a summer internship during my B.Tech.
My primary research areas are Mixed Reality and Human-Computer Interaction. I have more than 20 publications (Google Scholar). I am a peer reviewer for many journals and conferences and have served on the organizing committees of the following conferences:

OzCHI (2016): Research Demonstration Chair

IEEE Symposium on 3D User Interfaces (2016): Poster Chair

Australasian Conference on Artificial Life and Computational Intelligence (2015): International Program Committee Member

Empathic Computing is a research field that aims to use technology to create deeper shared understanding, or empathy, between people. At the same time, Mixed Reality (MR) technology provides an immersive experience that can make an ideal interface for collaboration. In this paper, we present some of our research into how MR technology can be applied to creating Empathic Computing experiences. This includes exploring how to share gaze in remote collaboration between Augmented Reality (AR) and Virtual Reality (VR) environments, using physiological signals to enhance collaborative VR, and supporting interaction through eye-gaze in VR. Early outcomes indicate that designing collaborative interfaces to enhance empathy between people can also benefit the personal experience of the individual interacting with the interface.
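
As a rough illustration of the gaze-sharing idea mentioned above (a minimal sketch under assumed details, not the paper's actual implementation), a remote collaborator's gaze can be streamed as a timestamped ray, an origin plus a view direction in a shared world frame, from one client to the other. The names, message format, and transport below are hypothetical:

```python
# Hypothetical sketch: streaming one user's gaze ray to a remote
# collaborator over UDP. Names and message format are illustrative
# assumptions, not the system described in the paper.
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class GazeRay:
    origin: tuple      # (x, y, z) eye/head position in shared world space
    direction: tuple   # unit vector along the line of sight
    timestamp: float   # seconds, so the receiver can order/interpolate

def send_gaze(sock, peer, ray):
    """Serialize a gaze sample and send it to the remote client."""
    sock.sendto(json.dumps(asdict(ray)).encode("utf-8"), peer)

def receive_gaze(sock):
    """Receive one gaze sample for rendering in the shared scene."""
    data, _ = sock.recvfrom(1024)
    msg = json.loads(data.decode("utf-8"))
    return GazeRay(tuple(msg["origin"]), tuple(msg["direction"]),
                   msg["timestamp"])

if __name__ == "__main__":
    # Loopback demo: send a sample to ourselves and read it back.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", 0))
    send_gaze(sock, sock.getsockname(),
              GazeRay((0.0, 1.6, 0.0), (0.0, 0.0, -1.0), 0.0))
    print(receive_gaze(sock))
```

In practice, the receiving AR or VR client would intersect the incoming ray with the shared scene to highlight what the partner is looking at.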

Driving a car is a high cognitive-load task requiring full attention behind the wheel. Intelligent navigation, transportation, and in-vehicle interfaces have made the driving experience safer and less demanding. However, existing interaction systems still fall short of the requirements of the actual user experience. Hand gestures, as an interaction medium, are natural and less visually demanding while driving. This paper presents a user study with 79 participants that validates mid-air gestures for 18 major in-vehicle secondary tasks. We provide a detailed analysis of 900 mid-air gestures, investigating gesture preferences for in-vehicle tasks, their physical affordances, and driving errors. The outcomes demonstrate that using mid-air gestures reduces driving errors by up to 50% compared to traditional air-conditioning controls. These results can inform the development of vision-based in-vehicle gestural interfaces.
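
The abstract does not say how gesture preferences were quantified, but in gesture-elicitation studies of this kind a common summary is Vatavu and Wobbrock's agreement rate: the chance that two randomly chosen participants proposed the same gesture for a task. A minimal sketch of that metric (hypothetical here, not necessarily the paper's analysis):

```python
# Hypothetical sketch: agreement rate AR(r) for one referent
# (one in-vehicle task), after Vatavu & Wobbrock.
from collections import Counter

def agreement_rate(proposals):
    """Probability that two randomly chosen participants proposed the
    same gesture: 1.0 means unanimous, 0.0 means all proposals differ."""
    n = len(proposals)
    if n < 2:
        return 0.0
    counts = Counter(proposals)
    return sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))

# Example: six made-up proposals for a "volume up" task.
print(agreement_rate(["swipe_up", "swipe_up", "swipe_up",
                      "rotate_cw", "rotate_cw", "point_up"]))  # ~0.27
```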

Interfaces for collaborative tasks, such as multiplayer games, can enable more effective and enjoyable collaboration. However, in these systems, the emotional states of the users are often not communicated properly because of their remoteness from one another. In this paper, we investigate the effects of showing the emotional state of one collaborator to the other during an immersive Virtual Reality (VR) gameplay experience. We created two collaborative immersive VR games that display the real-time heart rate of one player to the other. The two games elicited different emotions, one joyous and the other scary. We tested the effects of visualizing heart-rate feedback in comparison with conditions where such feedback was absent. The games had significant main effects on the overall emotional experience.
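
As an illustration of the kind of real-time feedback described above (a minimal hypothetical sketch, not the games' actual code), a received heart rate in beats per minute can drive a pulsing visual cue whose frequency and colour follow the remote player's arousal:

```python
# Hypothetical sketch: mapping a collaborator's live heart rate to a
# pulsing visual cue. Constants and mappings are illustrative assumptions.
import math

REST_BPM, MAX_BPM = 60.0, 180.0  # assumed display range

def pulse_intensity(bpm, t):
    """0..1 brightness at time t (seconds): a sine pulse beating at the
    sender's heart rate."""
    beats_per_sec = min(max(bpm, REST_BPM), MAX_BPM) / 60.0
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * beats_per_sec * t))

def arousal_colour(bpm):
    """Blend from calm blue toward red as heart rate rises."""
    a = (min(max(bpm, REST_BPM), MAX_BPM) - REST_BPM) / (MAX_BPM - REST_BPM)
    return (a, 0.2, 1.0 - a)  # (r, g, b)

if __name__ == "__main__":
    for t in (0.0, 0.25, 0.5):
        print(f"t={t:.2f}s  intensity={pulse_intensity(90.0, t):.2f}  "
              f"colour={arousal_colour(90.0)}")
```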

Teaching English to children who do not come from an English-speaking background is an interesting challenge for educators. In this paper, we present an Augmented Reality (AR) tool, TeachAR, for teaching basic English words (colors, shapes, and prepositions) to children for whom English is not a native language. In a pilot study, we compared our AR system to a traditional non-AR system. The results indicate a potentially better learning outcome with the TeachAR system than with the traditional one, and show that children enjoyed using the AR-based method. However, the study also revealed a few usability issues with the TeachAR interface, which we will improve in the future.

Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review most AR papers published between 2005 and 2014 that include user studies. A total of 291 papers were reviewed and classified based on their application areas. The primary contribution of the review is to present the broad landscape of user-based AR research and to provide a high-level view of how that landscape has changed. We also identify areas where there have been few user studies, and opportunities for future research. This poster describes the methodology of the review and the classifications of AR research that have emerged.