Engaging User Experience with Wearable and Pervasive Computing

We push the envelope in Human-Computer Interaction with wearables and other forms of pervasive computing. New modalities demand new aesthetics and ergonomics. For example, hands-free navigation of tangible reality with smart glasses requires new forms of interaction.

Wearable technologies – such as smart watches, smart glasses, smart objects, smart earbuds, or smart garments – are just beginning to bring immersive user experience into formal education and workplace learning. These devices are body-worn, equipped with sensors, and integrate conveniently into leisure and work-related activities, including the physical movements of their users.

Wearable Enhanced Learning (WELL) is emerging as a new discipline in technology-enhanced learning, in combination with other relevant trends such as the transformation of classrooms, new mobility concepts, multi-modal learning analytics, and cyber-physical systems. Wearable devices play an integral role in the digital transformation of industrial and logistics processes in Industry 4.0, and thus demand new learning and training concepts such as experience capturing, re-enactment, and smart human-computer interaction.

This special-track proposal is an offspring of the SIG WELL (http://ea-tel.eu/special-interest-groups/well/) within the European Association for Technology Enhanced Learning (EATEL). It follows up on the sessions we held at iLRN 2015 in Prague and iLRN 2017 in Coimbra.

In the meantime, the SIG has successfully organised a number of similar events at major research conferences and business-oriented fairs, including EC-TEL, I-KNOW, and Online Educa Berlin (OEB). Moreover, the SIG has been involved in securing substantial research funds through the H2020 project WEKIT (www.wekit.eu). The SIG would like to use this opportunity to present itself as a platform for scientific and industrial knowledge exchange; it is supported by EATEL and by major EU research projects and networks in the field. Moreover, we will seek to attach an IEEE Standards Association community meeting of the working group on Augmented Reality Learning Experience Models (IEEE ARLEM).

On entering Audiomotion Studios, visitors encounter a vast motion-capture (MoCap) space with over 160 Vicon cameras mounted on the rigs (see picture). The Audiomotion Studio is the largest performance-capture stage in Europe. The movements of multiple actors and animals can be recorded simultaneously to produce accurate animations. Behind the MoCap space are several rooms with green screens and a motion-control crane for filming. PAL’s PhD candidate Yu Huang, Dr Fridolin Wild, and John Twycross met Brian Mitchell, the Managing Director, to explore possibilities for further collaboration on volumetric video capture.

Just in time for Halloween, we have finalised work on a major release of WEKIT.one, our next-generation app for wearable experiences in knowledge-intensive training.

The development of the experience-capturing software has been led by members of the Performance Augmentation Lab. It is one of the first tools that allows content to be authored entirely within AR. Using a HoloLens and other wearable sensors, the software guides experts through recording immersive training procedures using all available AR content. Blending 2D and 3D instruction into the workplace creates a far richer and more interactive training experience.

The expert works through the procedure, capturing their actions, thoughts, and guiding instruction step by step. We are able to capture their movement in and around the workplace, their hand positions, and even additional biophysical signals such as heart rate variability or galvanic skin resistance. With just the technology at hand, trainees can then visualise the expert, listen to live guidance, and access on-demand knowledge about the task at hand.
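As an illustration of what such a recording might contain – this is a hypothetical sketch, not the actual WEKIT data format, and all names here are invented for the example – a captured training step could be represented as a timestamped record bundling pose and biosignal samples:

```python
from dataclasses import dataclass, field

@dataclass
class BioSample:
    """A single biophysical reading, e.g. heart rate variability or skin resistance."""
    signal: str   # e.g. "hrv" or "gsr"
    value: float

@dataclass
class CaptureFrame:
    """One timestamped snapshot of the expert during recording (hypothetical schema)."""
    timestamp_ms: int
    head_position: tuple            # (x, y, z) in metres, from the headset
    hand_positions: dict            # e.g. {"left": (x, y, z), "right": (x, y, z)}
    bio: list = field(default_factory=list)  # list of BioSample readings
    annotation: str = ""            # the expert's guiding instruction for this step

# A recording is an ordered list of such frames that can later be
# replayed to the trainee as a holographic re-enactment of the expert.
frame = CaptureFrame(
    timestamp_ms=1200,
    head_position=(0.0, 1.7, 0.3),
    hand_positions={"right": (0.2, 1.1, 0.5)},
    bio=[BioSample("hrv", 48.0)],
    annotation="Loosen the locking nut before removing the panel.",
)
```

The key design point the passage implies is that pose, biosignals, and instruction are kept together per timestamp, so re-enactment can play them back in sync.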

To date, we have seen experts in aircraft maintenance, radiology, and astronaut training use this software, and in 2019 we aim to establish new collaborations within the university, within Oxford, and abroad – most imminently with the European Space Agency.

At the Augmented World Expo in Munich last week, we exhibited the WEKIT project, showcasing the breakthrough achievements of our augmented reality and wearables solution in the space industry, aviation, and medicine. On stand #217, we exhibited the different versions of the e-textile garment (and its underlying sensor harness) as well as the WEKIT.one software solution. The director of our lab, Dr Fridolin Wild, gave a keynote presentation in the enterprise track on AR experience capturing and sharing for Training 4.0, explaining the technologies of the project and the findings of the pilot trials reported so far in a series of articles and papers.

The Warwick Business School has a Knowledge Innovation Network, and PAL’s Dr Fridolin Wild gave the opening keynote at this year’s autumn workshop, speaking about holographic training and other little wonders of Industry 4.0. In the talk, Dr Wild explored how smart glasses and wearable technologies can be used for knowledge-intensive training and as job performance aids, sharing the floor with Jeremy Dalton, head of VR/AR at PwC. The workshop included several case studies and demos, including from Severn Trent Water and Kazendi, as well as a guided tour of the Warwick Manufacturing Group Innovation Labs.

Dr Fridolin Wild, director of PAL, was invited to attend a Showcase and Networking meeting at the Microsoft Hololens Lounge in London on May 10, 2018, as one of twelve universities invited. Microsoft shared details of its Mixed Reality strategy and observations on the importance of academia as an enabler to industry, including an announcement of two new mixed reality apps (Remote Assist and Layout, both now in the store). The universities shared their research, and the demos took place in the stylish Hololens Lounge.

While presenting at the Future Tech Now Show in London on April 5, Dr Fridolin Wild was able to slip into the new Tesla suit, experiencing the effects of electro-muscular stimulation (EMS) on his own body. The suit embeds muscle-stimulation pads and motion sensors, to be applied in anything from rehabilitation to gaming. “When you activate the six-pack pads, you can literally make people feel a subtle kick in the guts”, says Dr Wild. “You still feel a strange electric tingling on your skin, but once immersed in a simulation or game, this quickly fades into the background”, he continues. With the technology, it is also possible to make people move; see an earlier reflection on how people feel about their bodies being remote-controlled. The latest version of the suit also carries sensors for galvanic skin resistance and heat pads, promising new approaches to personalisation and adaptation.

Also exhibited at the show: electronic cocktails, which use the same principle of electrical stimulation, applied to the taste buds on the tongue, to turn soda water combined with fragrances into a virtual cocktail.

Dr Fridolin Wild gave a TEDx talk on ‘reality as a medium’, speaking about truth, reality, and perception, and how we can hack into perception to actually ‘make’ reality. The talk will be available online soon.

The WEKIT.one prototype is a platform for immersive procedural training with wearable sensors and Augmented Reality. Focusing on the capture and re-enactment of human expertise, this work examines the unique affordances of suitable hardware and software technologies. The practical challenges of interpreting expertise, using suitable sensors to capture it, and specifying the means to describe and display it to the novice are of central significance here. We link affordances with hardware devices and discuss their alternatives, including the Microsoft Hololens, Thalmic Labs MYO, Alex posture sensor, MyndPlay EEG headband, and a heart rate sensor. Following the selection of sensors, we describe the integration and communication requirements for the prototype. We close with thoughts on the wider possibilities for implementation and next steps.

Via the Horizon 2020-funded Vertigo project, we will receive artists in residence to work with our WEKIT project. Vertigo aims to catalyse new synergies between artists, cultural institutions, R&D projects in ICT, companies, incubators, and funds. From December 2017 to October 2018 we will host two artists, Yann Deval and Marie-Ghislaine Losseau, who will work with us on exploring and investigating the new aesthetics, design, and interaction principles for ‘reality 2.0’ made possible by the advent of smart AR glasses. Yann and Marie-Ghislaine will use AR glasses as a medium of expression, creating a holographic exhibit, ‘ATLAS’.

ATLAS is a work between digital and visual arts, in the form of an interactive, scenographic exhibition mixing real and virtual worlds. Situated in an archipelago of poetical islands, spectators are invited to build a city. Using a ‘seed launcher’, the user grows houses following urbanistic rules, with smart homes adapting to the environment created: cities in the cloud, uprooted cities, cities on stilts, flying cities.

Yann Deval
Interactive designer, motion designer, and musical composer. After studying the history of cinema at La Sorbonne (Paris) and editing and audio-visual post-production in Cannes, he settled in Brussels in 2006, where he developed his activities as a motion designer and VFX artist. He has worked in the film industry (Mood Indigo by Michel Gondry, The Brand New Testament by Jaco Van Dormael), on music videos (Puggy, Sacha Toorop), on documentaries for Arte, and on TV shows for France Télévisions. He occasionally trains professionals and students in digital-creation workshops (School Arts2 Mons, EMMD Motion Design Brussels). Between 2012 and 2017, he co-directed the virtual reality performance IMMERSIO, a mix of live music and digital arts played at a rich set of venues (SAT Montreal, ADAF Athens, SignalOFF Prague, Wisp Festival Leipzig, Bozar and Halles de Schaerbeek Brussels).

Marie-Ghislaine Losseau
Scenographer and visual art designer. She studied scenography at La Cambre in Brussels and visual arts at ISPG Brussels. Her work centres on scenography, visual installations, and the organisation of workshops with children and adults.

About PAL

The Performance Augmentation Lab (PAL), within the Department of Computing and Communications Technologies (CCT), seeks to close the dissociative gap between abstract knowledge and its practical application. We research radically new forms of linking directly from knowing something ‘in principle’ to applying that knowledge ‘in practice’, speeding its refinement and integration into polished performance.