Location-based services are a common application scenario in mobile and ubiquitous computing. A typical issue with cartographic applications in this domain is the limited size of the displayed map, which makes interaction and visualization difficult problems to solve. With the increasing popularity of head-mounted displays for VR and AR systems, an opportunity arises for map-based applications to overcome the limitation of small display size, as the user's information visualization space can extend to their entire surroundings. In this paper we present a preliminary investigation into how interaction with such very large map displays can take place, using a virtual reality headset as the sole input and interaction method.

I presented a short lecture on usability evaluation methods at the Aristotle University of Thessaloniki (School of Informatics).

Obviously it's impossible to cover everything in a short lecture, so the talk focused on the origins of usability evaluation methods (UEM), the concept of UX, the need for multidisciplinary and combinatory approaches to applying UEM, and an example from my latest (as yet unpublished) work.

Text entry has been shown to achieve reasonable performance on smartwatches, but space is very tight, and editing has been shown to be a major factor slowing down text entry on many devices. In this position paper we propose the use of mid-air gestures to control editing functions, so that screen real estate and touch gestures can be devoted to text entry.