I’m writing up my PhD thesis at the moment and analysing a huge amount of data from over 70 surveys and 8 hours of focus group audio transcripts. Without giving away too much about the data, as I’m saving it for my thesis, here’s a little preview of my ThinkerBelle EEG Amplifying Dress. I created the dress in response to a subsection of feedback data from my field trials and focus groups, which investigated the functionality, aesthetics and user experience of wearables, and in particular wearer and observer feedback on experiences with my EEG Visualising Pendant. My motivation for creating the dress was engagement in social situations where the wearer might find themselves in a noisy or crowded area, where it is not possible to hear others and communicate easily, and where forms of non-verbal communication may therefore be useful. The dress broadcasts the meditation and attention data of the wearer for observers to make their own interpretations. It is up to the wearer whether they want to divulge information regarding the physiological source of the data being visualised.

The dress was constructed from satin fabric and fibre optic filament woven into organza. Using a NeuroSky MindWave Mobile EEG headset, signals in the form of two separate streams, ‘attention’ and ‘meditation’, are sent via Bluetooth to the dress, which amplifies and visualises the data via the fibre optic filament. Attention data is shown as red light and meditation data as green light. The dress is constructed so the two streams of data light overlap and interweave. The fibre optic filament is repositionable, allowing the wearer to make their own lighting arrangements and dress design. The red and green light fades in and out as the wearer’s levels of attention and meditation data heighten or decline.
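For anyone curious, the mapping from headset values to light is conceptually something like this minimal Python sketch (the function name and the simple linear scaling are my illustrative assumptions here, not the dress’s actual firmware):

```python
def esense_to_rgb(attention, meditation, max_brightness=255):
    """Map NeuroSky eSense readings (0-100) to red/green light levels.

    Attention drives the red channel and meditation the green, so the
    two data streams overlap and interweave as they rise and fall.
    """
    # Clamp readings to the 0-100 eSense range before scaling.
    attention = max(0, min(100, attention))
    meditation = max(0, min(100, meditation))
    red = attention * max_brightness // 100
    green = meditation * max_brightness // 100
    return red, green
```

A fully ‘attentive’ reading lights the red channel at full brightness, while a relaxed one pushes the green channel up instead.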

The dress’s hardware has a choice of modes, making it possible to record and play back the data. This means the wearer can appear to be concentrating or relaxed if they wish to influence a social situation, or can use their recorded EEG data to create a particular mix of colour and light on the dress. It is also possible to set playback mode and take off the EEG headset if the wearer does not wish to wear it.
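The record-and-playback idea can be sketched in Python like this (the class name, buffer size and looping behaviour are illustrative assumptions, not the dress’s actual hardware or code):

```python
from collections import deque
from itertools import cycle


class EEGRecorder:
    """Record a stream of (attention, meditation) samples for later playback."""

    def __init__(self, capacity=600):
        # Keep only the most recent `capacity` samples.
        self.buffer = deque(maxlen=capacity)

    def record(self, attention, meditation):
        self.buffer.append((attention, meditation))

    def playback(self):
        # Loop the recorded session indefinitely, so the wearer can take
        # off the headset and keep a chosen light pattern running.
        return cycle(self.buffer)
```

In playback mode the live headset stream is simply swapped for the looping recorded one, so the light show carries on without the headset.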

Red = attention / green = meditation

As you can see I’ve included a few initial photos of the dress in action showing the EEG data as it is received from the headset. I have not made a successful video of the dress yet, as it’s difficult to light the dress for photos and filming. I will add a video when I’ve worked around this!

I have also been experimenting with changing the form factor of the headset for aesthetics and comfort, using various materials.

Feeling relaxed = very green dress!

A bit of extra info, in case you were wondering… During my PhD research, I’ve been investigating the possibility that wearable technology can be used with physiological data to create new forms of non-verbal communication. Since 2008 I’ve been experimenting with wearables, sensors and social situations, which led me to focus on wearables that amplify, visualise and broadcast data from the body. As mentioned in previous blog posts, the field of wearable technology has blossomed and grown rapidly in recent years into a huge and mainly undefined set of devices, platforms, uses and practices. It was therefore necessary for me (a couple of years ago now) to create my own nomenclature to define the area I was creating and researching in. The first subset area was ‘responsive wearables’: wearables that respond to various physiological, environmental and other user-related data and give an output. This worked for a short while but still wasn’t definitive, so I went on to drill down and make a new subset of this area to better define the emerging field I was working in, which I named ‘emotive wearables’. This area focuses on wearable technology that gleans physiological data from the body, then processes and broadcasts it in some way from the wearer. The output could be sound, movement, light, etc.

My research with sensors, social situations, ambient and physiological data has led me to work with sound signal input (decibels), temperature (Celsius), pressure (Pascals), altitude (metres), ECG (electrocardiography), GSR (galvanic skin response), EMG (electromyography) and EEG (electroencephalography), but my main focus for my PhD has been on the development and research of emotive wearables with EEG data.

At the end of April I spent a very enjoyable day at Bournemouth University attending Transmission Symposium: Strategies for Brainwave Interpretation in the Arts. There were some very interesting presentations, exchanges of ideas and discussion on the intersection between art, cognition and technology. Links to the event, artists and scientists taking part can be found here. Thank you to Oliver Gingrich for inviting me to participate and to all the attendees, especially those who visited my emotive wearable exhibits, asked questions and/or tried a device and filled in a feedback survey.

At Transmission Symposium I debuted my AnemoneStarHeart, an ambient handheld device (a smaller wearable version is being tweaked!) that I have developed for broadcasting, amplifying and visualising EEG and ECG data. I have been developing this device as part of the iteration process of the EEG Visualising Pendant. It brings together technology and elements from my aforementioned EEG Visualising Pendant and my Flutter ECG pendant hack.

AnemoneStarHeart being used as an ambient device to observe relaxation whilst watching ‘Canal Trip’ slow TV programme, BBC4, May 15.

It can be used, for example, as an aid for meditation, relaxation and concentration, as well as for personal viewing or sharing physiological data in social situations with others. Data is sent to the AnemoneStarHeart via Bluetooth and it is a battery-operated, standalone device. It can either be viewed in the palm of the hand or placed in a convenient area of a room, illuminating the space with coloured light. Whilst sensors are transmitting data to the device, it constantly visualises it, changing colour and brightness based on the data it receives. The smaller, wearable version hangs from a chain as a necklace or in the style of a pocket watch, so it can be brought out, looked at, then put away again. As I am interested in the commercial possibilities of bespoke couture wearables and small editions of emotive devices, at some point I aspire to crowdfund this project.

As part of my PhD research, I have spent the best part of a year organising and running focus groups with potential users of emotive wearables and the EEG Visualising Pendant in London and Amsterdam. I have also conducted field trials in various social and work situations across London and Brighton, plus collected feedback from observers of the pendant. Since the beginning of 2015 I have been analysing the resulting data, to discover the preferences and feedback of potential wearers of emotive wearables as well as of the EEG Visualising Pendant. Out of the resulting data, so far, has evolved the AnemoneStarHeart device, for which I devised a new configuration of electronic components and code. I created a new enclosure for the electronics in the 3D modelling package Rhino, with help from skills learned at Francis Bitonti’s computational design workshop. It was selective laser sintered (SLS) in nylon, in one of D2W’s EOS machines in London.

At the moment I am mostly out of general circulation as I’m collecting and analysing data which is feeding into the new emotive wearable devices I am building, whilst simultaneously endeavouring to write up / finish my PhD thesis to deadline.

The workshops consisted of alternating tutorials on techniques for creating 3D textile meshes in Autodesk Maya and Rhino 3D software, and also writing Processing sketches for 3D graphics. The workshops were taught by Francis Bitonti and Arthur Azoulai.

Tom’s work on wrapping mesh to make a shirt around a body.

Our first week started off by creating meshes for the body in skirt- and shirt-like forms in Rhino 3D. We then experimented with various mesh techniques to apply varied distributions of extruded geometric shapes onto a mesh. Concentrating next on interlocking circles, we created a repeated template that could be used to make chainmail in Rhino 3D.

Magdalena making chain mail for textiles in Rhino 3D.

In Maya we played with primitive polygon shapes and then experimented with them in the animation timeline, flipping and tweening between shapes, which we could then start to turn into mesh textiles by joining them together.

L-R work by Victoria, Ezmeralda, Tom and Ioana.

After discussing our ideas and designs for what we would like to individually create, we spent a couple of days building our own meshes. Every .STL file was checked over and fixed in Materialise’s very useful app, Magics (which I wish I could afford for future work!), before sending to the SLS (Selective Laser Sintering) machine to be turned into real objects via the magic of a laser zapping powdered nylon.

L-R work by Nada, Magdalena, me and Carmen.

Whilst the objects were being sintered in the machine, which takes several hours as the cubicle inside the machine stacks up several files / containers of work to be processed at a time, we worked through some example Processing sketches to create 3D graphics. We also learned about other software packages such as ZBrush, which is a powerful 3D sculpting tool for manipulating 3D objects and looks like amazing fun to play with.

My design shaping up in Rhino 3D.

The container with my heart halves inside, just opened by Jonathan!

Of course the most exciting part of the two weeks was receiving the containers from the SLS machine, with the fruits of our creativity neatly concealed inside! I created a heart-shaped shell enclosure / pendant with a repeated star mesh to create an anemone-like effect. This was created to house the electronics and act as a diffuser of data in the form of coloured light for the next iteration of my EEG Visualising Pendant. The pendant amplifies and visualises attention and meditation EEG data from the wearer via a NeuroSky EEG headset.

Many thanks to Francis, Arthur, Jonathan and the staff at D2W for a great two weeks of fun and excellent hospitality, plus not forgetting the lovely attendees of the workshop who were fab to hang out with.

Francis Bitonti’s New Skins Workshop 2015 at Digits2Widgets, London

Posted on November 30, 2014 | Baroesque Barometric Skirt in New Scientist & on show at Microsoft Research, Redmond, USA

As we trundle into the dark winter days of 2014, I will be locking myself away to write, so I won’t be travelling to show my work in any exciting cities for a while.

So, just a couple of nuggets of recent news on my Baroesque Barometric Skirt – I was delighted to hear that it had been featured in the ‘One Per Cent’ column in New Scientist Magazine, September 27th issue, which reported on it being shown at the ISWC (International Symposium on Wearable Computing) Design Exhibition at the EMP Museum in Seattle last September.

The Baroesque Barometric Skirt featured in New Scientist

Me being chuffed in Smiths with a copy of New Scientist

The Baroesque Barometric Skirt was also on display at the Microsoft Research Gallery during September and October, which was organised by Asta Roseway of Microsoft Research and Troy Nachtigall, chair of the ISWC Design Exhibition. The skirt, which is part of my PhD practice, should be winging its way back to me soon and I’m looking forward to being reunited with it.

Baroesque Barometric Skirt exhibited at the Microsoft Research Gallery in Redmond, WA, USA. Image by kind permission of James Hallam of Georgia Tech, whose Ballet Hero garment is also featured in this photo.

Some of the other exhibits on show at Microsoft Research Gallery. Images by kind permission of James Hallam.

Whilst in Seattle at ISWC, I took advantage of the interesting decor of the Motif Hotel to make a new video of the skirt. Many thanks to Johnny Farringdon for being my cameraman :-)


I had heard it would be big, but I wasn’t prepared for the hugeness of it, or that it would mainly be an outside event! It comprised several fields of stands and presentation stages, plus the entirety of the New York Hall of Science, which isn’t a small building. Because I had a big list of places I wanted to visit in Manhattan, I had intended to spend half a day each on Saturday and Sunday at Maker Faire, but due to the vastness of World Maker Faire I spent two whole days there till closing each day, and I still didn’t see everything or find all the friends I had intended to say hello to.

Map of hugeness of World Maker Faire!

On day two (Sunday), on the Electronics Stage, I gave a presentation on my own work, primarily my Baroesque Barometric Skirt and EEG Visualising Pendant, which I wore around World Maker Faire and which incited much curiosity and feedback – a fun way to meet people! It was lovely that friends were in the audience, and afterwards we had much fun wandering about and catching up. The talk slot was a bit short for me as I usually have a lot to say, so I had to wind up before my slides ran out, but I enjoyed the opportunity immensely.

Me, presenting my wearable technology work at the Electronics Stage

Was fabulous to catch up with and hang out with Ivaylo, Mandy and Ran, plus thank you for coming to my talk :-)

In terms of what was on show, it wasn’t very different from what I’d been used to seeing at UK Maker Faires, i.e. lots of electronics, crafts and technology stalls from individual makers, hackspaces and organisations, but there were loads more large stalls from the big players such as Atmel, Intel and Arduino.

Just one of the signposts around World Maker Faire!

It was great that there were many presentation stages and a multitude of talks to choose from. My favourite talk of the weekend was by one of my favourite inspirational wearables creators and thinkers, Kate Hartman, who spoke about the work her students have been up to at OCAD University in Toronto. I went up to Kate at the end to say hello, which was lovely. Check out her conceptual wearables, they’re very cool, and have a look at the Social Body Lab and the projects she runs at OCAD.

I really enjoyed Kate Hartman’s presentation on wearables

There were too many great stands and projects to document, but one of my favourites was the glorious Sashimi Tabernacle Choir, consisting of a car covered with over two hundred and fifty computer controlled lobsters, bass, trout, catfish and sharks. The Choir performs a choreographed repertoire of songs from pop songs to classical opera. It’s fabulous – enjoy the videos and info on the website!

The wonderful Sashimi Tabernacle Choir

A highlight of World Maker Faire was finally finding the OpenBCI stand. I had been conversing with Conor via email about their modular sensing kits, which they had recently funded through a successful Kickstarter campaign. To emphasise the vastness of World Maker Faire, it took me two days to find them. After asking at multiple help points, studying the map and wandering around and around the fields, I finally found the OpenBCI stand on the last day by grabbing a kindly information stand helper, who, on hearing my plight, wandered around with me to find it! I’m really glad I persevered, as it was lovely to meet Conor and Joel and fascinating to chat about and view their OpenBCI wares being demonstrated. Plus they had a special discount offer for that weekend, which I took advantage of, and I can’t wait to get my own OpenBCI kit soon!

Great to finally meet Conor from OpenBCI

Another highlight of World Maker Faire was bumping into inspirational electronics engineer and entrepreneur Limor Fried AKA Ladyada, and Phil of Adafruit. I have been following Limor’s work since I got my first LilyPad Arduino back in 2008, which I bent her ear about, and I also showed her my EEG Visualising Pendant. When I got back to the UK I sent details of the pendant to Adafruit, and fab fellow wearables creator Becky Stern (whose work I’ve also followed for years) put up a page about it on the Adafruit Wearable Wednesday blog – thanks Limor and Becky!

For me, the highlight of the ISWC / UbiComp conference was exhibiting my Baroesque Barometric Skirt in the ISWC Design Exhibition and conference reception. This year the ISWC Design Exhibition was held at the Experience Music Project Museum (EMP) in Seattle, which is an amazing venue with a three-storey screen on which videos of our work were shown, and which also houses a permanent exhibition dedicated to pop culture and music. Because I took so many photos (and made a video), I’m giving the event its own page so that it doesn’t take over my main ISWC blog post! This year I didn’t meet all the other exhibitors during the Design Exhibition set-up, so I can’t do a full report on all the exhibits, but a full list of the Functional and Aesthetic wearables can be found on the ISWC program (Tues: EMP Reception/Design Exhibition link).

Experience Music Project Museum (EMP), Seattle, USA.

ISWC 2014 is my third year of being honoured to have my responsive and emotive wearable tech work accepted by the Design Exhibition jury: in 2012 I had three wearables accepted for ISWC held at Newcastle University, UK, and last year in 2013, my EEG Visualising Pendant was accepted for exhibiting at ISWC at ETH Zurich, Switzerland.

This year I was extremely happy to take my Baroesque Barometric Skirt to ISWC Seattle to exhibit. The skirt visualises data from four sensors on four independent RGB LED strips; three of the sensors are environmental (ambient temperature, pressure and altitude), and the fourth is a temperature sensor that sits on the inside of the skirt and picks up the wearer’s body temperature. My motivation for creating the skirt is that I am interested in how we can display our physiological data alongside that of the environment, or the ‘bigger picture’ of elements that surround us. The skirt changes visually as the wearer moves around environments and also as the body reacts to its present situation. This garment-device starts a conversation around the connections between the environmental and physiological data of the wearer. The Baroesque Barometric Skirt contributes a new way of sensing and presenting environmental and physiological data together. My paper on the skirt can be found in the conference proceedings and is available here or via ACM, but if you have any problems you can get a copy from me.
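Conceptually, each strip displays a level scaled from its sensor’s expected range, along these lines (a minimal Python sketch; the function name, ranges and readings are illustrative assumptions, not the skirt’s actual code):

```python
def sensor_to_level(value, low, high):
    """Scale one sensor reading to a 0.0-1.0 level for an RGB LED strip.

    `low` and `high` bound the expected range for that sensor
    (e.g. roughly 950-1050 hPa for pressure); out-of-range
    readings are clamped so the strip never over- or under-drives.
    """
    if high <= low:
        raise ValueError("high must exceed low")
    level = (value - low) / (high - low)
    return max(0.0, min(1.0, level))


# One level per strip: three environmental sensors plus the wearer's
# body temperature (all ranges and readings here are illustrative).
readings = {
    "ambient_temp": sensor_to_level(21.0, -5.0, 40.0),
    "pressure_hpa": sensor_to_level(1013.0, 950.0, 1050.0),
    "altitude_m": sensor_to_level(35.0, 0.0, 500.0),
    "body_temp": sensor_to_level(36.8, 30.0, 42.0),
}
```

As the wearer moves between environments, or their body temperature shifts, the levels (and so the strip colours) drift independently of each other.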

Troy welcomes attendees to the Design Exhibition at the EMP.

Many thanks to Design Exhibition Chair Troy Nachtigall for heroic work on organising the whole shebang, from submissions to the show at the amazing EMP Museum, which looked stunning; also to the jury: Maggie Orth, Asta Roseway, Zoe Romano and Meg Grant, and not forgetting the ISWC volunteers.

ISWC + UbiComp is my favourite international twinning of conferences: ISWC showcases some of the most exciting developments in wearable computing, and because the papers are reviewed by great academics, the quality of the papers selected is, in my opinion, excellent. UbiComp is great too, because it also has a high standard of accepted papers, which cover many topics across pervasive and ubiquitous computing that cross over with wearable tech interests. The conference took place in the rather nice conference areas of the Motif Hotel in Seattle, USA, in September 2014.

For me, the most compelling presentation of the conference was the keynote given by Amy Ross of NASA, which gave us a fascinating insight into the history and evolution, up to the present, of what goes into the design and creation of space suits. I really enjoyed all the details of what worked and what didn’t, plus the fab examples she brought along, such as wrist mirrors for looking at spacesuit components, gloves (which I tried on) and even an emergency handbook of advice for astronauts!

An interesting addition to this year’s ISWC/UbiComp was the experimental use of a number of telepresence robots for those who wished to attend but could not physically get to Seattle. I found the robots really intriguing to watch as they weaved around the conference rooms and people stopped to chat to their controllers. These were a good addition to the conference in my opinion, and I was pleased to see at least one robot personalised with a scarf. As I won’t be able to afford to attend next year’s conference in Osaka, Japan, I will definitely be applying for one of the robots if they’re used again!

During the conference there was a Seattle Quantified Self + ISWC + UbiComp meet-up, which was great as I got to show my EEG Visualising Pendant to a new audience and meet some lovely and interesting people, including David Cooper, who organises the Seattle QS meet-ups. David had coincidentally brought his Muse EEG headset along, which was nicely fortuitous as I was waiting for my Muse to be delivered at home and was eager to chat about the device. He also pointed me towards some interesting GitHub repositories to investigate.