Tag: Digital Health

If knowledge is power, then from the 7th century BC to the 4th century AD, the most powerful women of the classical world were undoubtedly the Oracles of Delphi.

Supplicants would travel far to seek the Pythia’s wisdom, delivered in ecstatic frenzy after inhaling the spirit of Apollo. Such was the cryptic nature of their utterances that prophets would be employed to help travelers make sense of their revelations.

Important decisions in matters of war, trade, marriage, and business were all made with reference to the divine knowledge imparted by the Oracle.

In matters of medicine, doctors have long been the gateway to knowledge, and with that gateway comes significant power in the war against disease.

On the afternoon of 22nd February I went to Digital Health.London’s ‘Collaborate’ event. It was a celebration of the programme’s first year’s work, an awards ceremony, and an innovation event all rolled into one. They were kind enough to let me speak about Virtual Reality in patient care as well.

Molly Watt & Dr Keith Grimes

There was a fantastic, diverse audience in attendance too, from the cutting-edge 360 live-streaming surgeon Shafi Ahmed, to my colleague Sunil Bhudia (with whom I’m working on the PREVENT-ICU-Delirium project), and the inspirational Molly Watt, who took part in a panel discussion about accessibility and digital participation. It was also brilliant to meet people I know from Twitter face-to-face, like Victoria Betton (who led a session on hacking STPs) and Dr Robert Lloyd, who skillfully MC’d the proceedings.

So why the Delphi reference? Of all the talks, the most fascinating for me was the panel discussion on Artificial Intelligence.

Patients should use Artificial Intelligence to reduce the amount of time they need to spend with healthcare professionals

The panel comprised three heavyweights in the Digital Health field:

Professor Nicholas Peters – Professor of Cardiology & Electrophysiologist at Imperial College London

Entertainingly and insightfully chaired by Dr Jordan Schlain, a fellow GP based in the US and Founder of HealthLoop, the debate fell nicely into those for, against, and balanced on the fence.

The AI Panel @ DH:L Collaborate – notice the #PinkSocks!

Ali Parsa began with a blistering defense of clinical AI, delivering an impassioned argument for how it can meet the yawning gap in healthcare provision around the globe. His team at Babylon have seen an 80% reduction in conversion of clinical inquiries to video consultations since the introduction of their triage AI. As far as he is concerned, there is a moral duty to implement AI to meet the care divide, and augment the diagnostic capabilities of clinicians.

It’s difficult to come back against that kind of rhetoric, although I would posit that Babylon’s UK business deals with a small proportion of the range of presentations seen in General Practice, and with a significantly healthier and wealthier cohort. It might be difficult to extrapolate that 80% reduction in demand. With an ongoing trial front-ending general practice in North London though, I guess time will tell.

The progress in the developing world is also laudable, but I wonder whether it can ever cover the totality of care needs. Perhaps 10% of something is better than 100% of nothing.

The issue of AI treating humans well and not turning them into batteries comes up time and again, and from my own point of view I swing between optimism and existential dread. It’s not unique to patient care, but it does have an interesting twist here: should AI succeed in helping the patient, it may harm the doctor.

As for empathy, this is stronger ground. I certainly believe that humans need contact with other humans, particularly when it comes to the ritual of the consultation. Who’s to say that empathy is a uniquely human property though? Any pet owner will attest to the ability of their loved companions to deliver comfort without words. In time, empathy may well be better delivered by machine, especially to those raised in the post-millennial world – Homo Digitalis.

Professor Peters was left to adopt the middle ground, adding nuance to the preceding statements. I was particularly taken by his observation that the addiction doctors have to treating patients, felt as a need deeper than simply a way of paying the bills, coloured their opinion. AI, and the fear of being replaced and made worthless, affects us all.

The framing of the debate made this last point ironic – there were no patients on the panel discussing the proposition. I raised this point, which was countered with an anecdote about Henry Ford, who supposedly said that if he’d given the public what they wanted, he’d have made a faster horse. Professor Peters also referenced Molly Watt’s support of Apple’s commitment to user interface excellence & accessibility, which comes despite their famous lack of user input. All good points, but I’d still want to hear a patient make them.

The concept of Artificial Intelligence has been around since the 1950s, and we’ve seen this level of enthusiasm before. We’ve also had our hopes dashed before. I believe we’re entering a time when some of that promise will be realised, especially in narrow specialist areas. For this reason, I see the generalist physician outliving the specialist when it comes to head-to-head performance against AI in terms of patient care. Before my GP colleagues get too comfortable, even that advantage will pass in time, leaving the nurses as the most valuable humans in the healthcare system.

It may be that we will co-exist as providers of healthcare, but surely there will come a time when AI will be superior to humans in a number of areas. When this happens, if we truly respect the tradition of medicine, and believe we must first do no harm, maybe doctors should stop diagnosing patients?

And with that, is the best that doctors can hope for to become gatekeepers of knowledge – prophets at Delphi helping the patient understand the superhuman wisdom of the AI oracle? Or will this knowledge be available to patients directly, leaving doctors shorn of their power?

“The one thing that the NHS cannot afford to do is to remain a largely non-digital system – it is time to get on with IT”

Professor Robert Wachter, author of NHS IT review

The Health & Care Innovation Expo, now in its third year, showcases the very best of innovation in the NHS. Hosted by NHS England and held in the steampunk Victorian grandeur of the Manchester Central Conference Centre, the 2016 event gave me 2 packed days of talks, workshops, demonstrations and general flights of wild innovative fancy with a wide range of attendees. The importance of the event was underlined by the prestige and range of speakers, from Professor Sir Bruce Keogh opening the event and chairing numerous panels, to Professor Bob Wachter MD talking about his review into digital usage in the NHS. We even had a hirsute Simon Stevens delivering a keynote and a full hour of Jeremy Hunt’s time, during which he launched the next phase of the Digital NHS roadmap.

In truth there was a little too much to wrap my head around. The show floor was packed with exhibitors large and small, and an interesting range of stands exploring the ‘feature zones’ of New Care Models, NHS Right Care, Digital Health, and Personalised Medicine. Given the long queues for some of the talks, not to mention the numerous pop-up events and side meets, the one innovation we were all in need of was more time.

The announcements were, as tradition dictates, presented in the morning papers and we heard about the coming year’s targets on the journey to a digitised NHS in 2020. Primary care is in a good place here – in fact, Jeremy Hunt commended GPs for ignoring the government advice and ploughing their own furrow when faced with Connecting for Health. Without this, he said, we would be significantly further behind. Interesting advice on avoiding governmental advice there.

The news broke down as follows:

Patients will be able to book appointments, order medications, and download records, US ‘Blue Button’ style, on a revamped www.nhs.uk to be launched at Expo 2017.

Anyone will be able to access detailed stats on performance in key areas such as dementia, diabetes, and learning disabilities.

There will be online access to 111, which can lead to direct appointment, signposting, or callbacks.

By March 2017 there will be a directory of approved apps, with subsequent support for wearables.

The creation of a second round of ‘national’ excellence centres, with more detail to follow.

The creation of an NHS Digital Academy to teach Informatics skills to NHS staff and create the next generation of Clinical Chief Information Officers and Digital Health Leaders.

Response to these announcements was mixed, both at the expo and in the press. On the one hand, when you combine this with the Tech Tariff (on which there was little news), it’s yet more evidence that the NHS is making good on the promise to step into the 21st century. Entrepreneurs and startups might complain that it doesn’t go far enough, and that the route to approval is still too long-winded and narrow. There was also the usual chorus of disapproval for any non-evidenced interventions in the NHS, and possible willful misinterpretation of what was being offered as simply a way of fobbing patients off with an app instead of a doctor. Those of us with a role in innovation have a responsibility to ensure that expectations are managed appropriately: Digital Health is NOT a panacea, but is instead another weapon in our fight against illness and social problems. We also need to ensure that evidence is generated and shared whilst trying to balance the pace of technological change against that of traditional research.

My presence at the expo was as innovations lead for my CCG (Eastbourne Hailsham & Seaford, and Hastings & Rother), and so it was exciting to be able to share the stage with Professor Sir Bruce Keogh, Dr Mahiben Maruthappu (@M_Maruthappu), Mr Ashish Pradhan & Maria Slater. Our panel, ‘Achieving Innovation at scale in the NHS’, hoped to inform the debate about how we can turn small-scale innovation (which the NHS is brilliant at) into widely adopted, large-scale change (not so good). The vehicle of the NHS Innovation Accelerator, which I have spoken of previously, is beginning to deliver, and I was one of three speakers talking about current NIA products.

Mr Pradhan is a Consultant Subspecialist Uro-Gynaecologist at University Hospitals NHS Foundation Trust, Cambridge. Episcissors-60 are fixed-angle episiotomy scissors, used to make incisions during difficult births at an angle that avoids damage to the anal sphincter and subsequent problems with continence. Undeniably a brilliant idea, but the point was made that a business case was hard to make because this cheap intervention actually reduces hospital income down the line! The NHS is littered with such perverse incentives not to innovate, all of which need addressing.

Doing a ‘Simon Stevens’ with an AliveCor in my hand

When it came to me, my story was simple – having an excellent product is NOT enough. AliveCor is, undoubtedly, a great product which works very well at identifying asymptomatic Atrial Fibrillation (AF) as well as other rhythm disturbances, but from pilot work and a wider scale roll out in my CCGs, uptake has been slow. This reinforces the need to carefully consider how to manage change when introducing innovation, as well as considering the practical aspects and the need for education and support.

Even so, with lower uptake than expected, we detected 61 new cases of AF which, if treated appropriately, would have significantly reduced the risk of stroke in the target population. In effect, we may have avoided up to 3 strokes per year even in this small group. Numbers like that surely warrant support!
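Those headline figures can be sanity-checked with some back-of-envelope arithmetic. The inputs below are my own illustrative assumptions, not the CCG’s analysis: an untreated stroke risk in AF of roughly 5% per year, and anticoagulation reducing that risk by about two-thirds.

```python
# Rough check of the stroke figures quoted above. The risk numbers
# are illustrative assumptions, not data from the CCG roll-out.
new_af_cases = 61
annual_stroke_risk_untreated = 0.05  # ~5% per year (assumed)
relative_risk_reduction = 2 / 3      # anticoagulation effect (assumed)

expected_strokes_per_year = new_af_cases * annual_stroke_risk_untreated
strokes_potentially_avoided = expected_strokes_per_year * relative_risk_reduction

print(f"Expected strokes/year if untreated: {expected_strokes_per_year:.1f}")
print(f"Potentially avoided with treatment: {strokes_potentially_avoided:.1f}")
```

On these assumptions, 61 untreated cases would be expected to produce roughly 3 strokes per year, around 2 of which could be avoided with treatment – the same ballpark as the figure above.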

It was also great to be able to celebrate East Sussex Better Together and our progress towards a single Accountable Care Organisation. By working together with acute trusts, community trusts, and social care, we are moving towards a world where the “perverse incentives” mentioned in the Episcissors story are a thing of the past. Costs are no longer saved in someone else’s budget.

You could have spread the event over a week and still not had the opportunity to catch the majority of the content. I attended talks about the GP Forward View, Urgent & Emergency Care Innovation, and even learning from high performance and marginal gains theory in a talk called “Black Box Thinking” from Matthew Syed(@matthewsyed). Innovation is more than just technology, and sometimes the change in mental perspective towards one of continual marginal improvement is the most difficult of all.

As ever, effortlessly cool in new tech

My personal favourite technology, Virtual Reality (VR), was a little thin on the ground. We had VR for treating Obsessive Compulsive Disorder from a company called Mindwave Ventures (@mindwave_). They are using VR to create what must be the most disgusting bathroom since Trainspotting to help patients gradually address their fears of contamination. Augmented Reality was showcased by AMA (@AMAapplications), whose Xpert Eye platform will soon be used in my area to allow doctors to remotely visit care home patients. I also have to confess that my day (and probably whole week) was made when I discovered that the MSD team had brought the Microsoft HoloLens (@hololens). I can only apologise to everyone that had to experience my excited swearing as I strolled around an alternate reality populated with tigers, sharks, and a ghostly Vitruvian Man with a glowing nervous system.

Having spoken at TEDxNHS(@TEDxNHS), it was lovely to meet Dr Jon Holley (@jonnyholley), Dr Manpreet Bains(@manpreetbains_1) and the team again at their stand. The video footage from the event is in the edit and I’m assured will be available soon. It even led to one of the more surreal moments of the event where I got pulled out of a talk on Urgent Care to demonstrate VR to Ruby Wax ahead of her talk on Mindfulness and Mental Health.

Ruby Wax tries VR Mindfulness

I’ve made no secret of my love for the US way of approaching innovation, and how they celebrate the possibilities whilst including patients, especially in the Stanford Medicine X conferences. Thanks to speakers like Roy Lilley(@RoyLilley) who talked energetically about the importance of innovation from the front line, challenged pretty much everyone he spoke to to think differently, and who then danced off to ‘Always look on the bright side of life’ after his talk, I think I can now see the British version of this optimism, and the contagion is spreading.

Innovation now has fewer barriers than ever in the NHS, although those that remain are substantial. It’s over to us to make sure that next year for Expo 2017 we have some real success stories to share, alongside the courage to share and learn from our failures.

DECLARATION OF INTERESTS

I attended in my role of CCG Innovation Lead & Governing Body Member of EHS/HR CCG. As a speaker, all travel and accommodation fees were met by the event organisers. I received no speaker fee.

Oh, I also wore #PinkSocks throughout, in the spirit of #JFDI and #GSD. These were a gift from Eugene Borukhovic (@healtheugene)

My hopes were high for some earth-shattering VR announcements from Google yesterday. As I watched the keynote on YouTube and sat through a series of announcements that, on the face of it, were rather underwhelming, I started to feel a little less hopeful.

Daydream – a hardware and software VR ecosystem

When the VR came, it was in the form of an announcement about a new name (‘Daydream’), reference specs for VR-ready handsets and headsets (‘coming this fall’), and a peek at the user interface, which looked interesting but somewhere south of what I can already get from Samsung Gear VR. The inclusion of a Wiimote-style controller was interesting though, and my mind went immediately to the possibility of using this in physical therapy and stroke rehabilitation.

Headset and Controller – Reference model for ‘Daydream’ system

While the future of VR is, in my opinion, very bright and nausea-free, it was the remainder of the keynote that got my neurons firing with possibilities.

Earlier in the event, Google presented a rebranded messaging and video call pairing with Allo and Duo. Allo I could take or leave – its AI-enhanced auto-reply seems well set to address all my dog-breed-identifying woes. It does, however, highlight the path to a more conversational, human-orientated AI interface, all of which strengthens Google’s core offering of a smart, intuitive natural language interface (more on this later).

So THAT’S what breed it is *wipes brow*

Duo introduced something called ‘Knock Knock’ to video calling though, and this is where I saw some potential benefit. Essentially, it is a live video preview of the caller before you answer. This allows the recipient to know a little more about the person calling, presented as a way to gauge the mood of the caller.

Me? I saw a way of pre-triaging a patient before the call begins, without the patient needing to speak or enter data. We’ve seen video footage being used to determine pulse, respiratory rate, and even emotional state in other AI systems – could ‘Knock Knock’ even screen for facial weakness in acute stroke? Perhaps you could even analyse changes to speech against previous calls to determine subtle early changes to voice that can happen during ischaemic events. Whether this is unique to Google Duo or whether you could integrate this into existing WebRTC clients is another matter – the ‘smooth transition’ much emphasised by the presenter is less important in this situation.
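To make the pulse-from-video idea concrete, here is a toy sketch of the signal-processing core of remote photoplethysmography: the heartbeat causes tiny periodic colour changes in skin, so the dominant frequency of the mean green-channel brightness approximates the pulse. Synthetic data stands in for real frames, and the frame rate, signal strength, and noise level are all assumptions; this is emphatically not how Duo or any real product is implemented.

```python
import numpy as np

fps = 30.0                     # assumed video frame rate
t = np.arange(0, 10, 1 / fps)  # 10 seconds of 'frames'
true_pulse_hz = 72 / 60.0      # simulate a 72 bpm heartbeat

# Mean green-channel brightness per frame: a faint pulse signal
# buried in sensor noise.
rng = np.random.default_rng(0)
green = 0.5 + 0.01 * np.sin(2 * np.pi * true_pulse_hz * t)
green += 0.005 * rng.standard_normal(t.size)

# Take the spectrum, keep only the physiologically plausible band
# (40-180 bpm), and read off the dominant frequency.
spectrum = np.abs(np.fft.rfft(green - green.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fps)
band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
estimated_bpm = 60 * freqs[band][np.argmax(spectrum[band])]
print(f"Estimated pulse: {estimated_bpm:.0f} bpm")
```

With a 10-second window the frequency resolution is about 6 bpm, which is why real systems need longer windows or cleverer tracking.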

Changes to Android OS (Version N – my money is on ‘Nougat’) were also discussed, but in truth the most interesting announcement of the whole event was Google Home. Having been nicely set up with a display of the advances in the natural language interface of Google Assistant, this device appears to be the voice activated front-end of home automation. There has been a slowly growing roster of Internet-of-Things (IoT) devices in the consumer market in recent years, such as Nest and Hive thermostats, Philips Hue lights, and the like. Google Home will provide voice search, internet services, and voice control of IoT home devices. Of course, Amazon got there first with Echo, but Google does have 17 years of weapons-grade search engine experience behind it.

The demo video shows the range of ways in which it could get the model family with 2.4 kids and a science project ready for their day – lovely, but much less interesting than the massive potential for health and social care.

A natural language interface could be enormously helpful in helping to meet the health & care needs of the older population:

It could be used to easily coordinate carers and update estimated time of arrival, reducing anxiety.

Food could be ordered from online grocers, reducing the need to employ carers for the simplest of care tasks (and thereby reducing cost and easing demand).

Medication reminders could be given in a friendly and simple to follow way, which could then feed back to the patient’s electronic record.

Voice control of home devices would be a gateway to increased use of IoT enabled lamps, heaters, and cooking equipment which would improve accessibility and safety, especially when it comes to family supporting their more dependent members.

In the event of an emergency, Google Home could be used to summon help if the user was unable to get up after a fall, and act as a speakerphone to emergency services.
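As a thought experiment, the medication-reminder idea above might be structured something like this. Everything here is hypothetical – the class, the spoken phrases, and the log standing in for a write-back to the electronic record are my own illustration, not any real Google Home or Amazon Echo API.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MedicationReminder:
    patient: str
    medication: str
    due_time: str                          # e.g. "08:00"
    taken_log: list = field(default_factory=list)

    def announce(self) -> str:
        # What the assistant would speak at the due time.
        return (f"Good morning {self.patient}. It's time to take "
                f"your {self.medication}. Shall I mark it as taken?")

    def confirm_taken(self) -> str:
        # On a 'yes' reply, log the dose; in a real system this is
        # where an update could flow back to the patient's record.
        self.taken_log.append(datetime.now().isoformat())
        return f"Thanks, I've recorded your {self.medication} as taken."

reminder = MedicationReminder("Mrs Jones", "ramipril", "08:00")
print(reminder.announce())
print(reminder.confirm_taken())
```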

The list of possibilities goes on and on, and multiplies with every connected service – something Google is very good at. Having been at the NHS Hack Day this last weekend, I’ve been struck by how little hacking appears to take place in the Social Care arena – perhaps later this year we could see events where Google Home (and Amazon Echo) are used to provide novel services at low cost?

Google Home can potentially help with something much more human though – the need for company. So many of my older patients live alone, with no-one visiting between carer appointments. This device opens the door to an easy, natural way of communicating with others, playing music, and listening to audiobooks and the radio, but also of interacting with someone that is always there, ready to talk, 24/7. It may start with traffic and weather reports, pre-canned jokes, and facts about astronomy from Wikipedia, but the ambition of Google and Amazon, powered by the exponential growth in the field of Artificial Intelligence, means that before long the AI in your home will not just be your butler and assistant, but also a friend and companion.

It’s early and I’m wide awake. This is partly down to the fact I’m sleeping on a camp bed at my brother’s home with the sunrise peeking through the curtain, but mostly because my brain has already started fizzing with ideas and excitement ahead of my second ever NHS Hack Day.

I first went to NHS Hack Day in January 2015, when it was held in Cardiff. I’d been introduced to it through tweets with AnneMarie Cunningham (@amcunningham), GP and Primary Care Director at the Aneurin Bevan Health Board, who was organising the event. Sold as an opportunity to meet like-minded hackers and geeks, I spent a whirlwind 36 hours working on GWYB – a notification system for patients which triggered communication cascades in the event of their admission to hospital. We even won the Patient Prize for our efforts.

NHS Hack Day is a free to attend event that has been running across the country at weekends since 2012. In ‘Meeting the challenge’, they ask:

How can we build an environment where world-class NHS digital services flourish?

Through leadership that understands technology and is bold enough to modernise the delivery of digital services, including embracing openness.

To this end they sent out the call to all geeks that love the NHS, bringing them together in the spirit of adventure, openness, and addiction to coffee.

I arrive at 8:30 at King’s College London, and pitch in immediately with laying out the bottled water, coffee, tea and bin-bags. Extension cables are daisy-chained together and taped to the floor. I pop my Ricoh Theta S camera onto its tripod and start up Tweetbot in readiness.

By 9 it’s getting busy, and 15 minutes later we’re off at pace. It’s a speed that doesn’t really drop for the subsequent 33 hours. Everyone who has contributed to the Google Document of Pitches through the week is given 60 seconds to pitch to the assembled masses. Here’s my attempt:

Yes, that’s right: I’ve just asked a room of strangers to build a customised 360 video viewing app for Google Cardboard by the next afternoon. I’m nothing if not ambitious.

The pitches range widely, from medical dictionary and haematology data visualiser, to hospital bed finder, bleep replacement, and even personal pollution monitoring. I’m suddenly aware that there are lots of other teams I’d like to join.

10:30 am and I feel like a wall-flower at a speed-dating event.

Once you’ve pitched, you stand around the side of the room with a sheet of A0 paper with the project’s name on it. It starts slowly, but gradually the fact that I have a VR headset and I’m willing to share it attracts people. Several question the scale of what I’m trying to achieve, and as a result I realise that the project needs to change. With the help of some of the people who subsequently become the team, we decide to focus on using the tools at hand and the skills we share to explore using VR and 360 video to help treat Phantom Limb Pain (PLP).

At this point it’s probably important to give you a little more information. PLP is a common and distressing complication of amputation. Up to 70% of people who have had an amputation can experience pain, itching, burning or distortion of their missing limb. It’s difficult to treat with medication, and as such a number of psychological and alternative therapies have been developed.

One such treatment method is MIRROR THERAPY. First described by Ramachandran in 1995, this uses mirrors to let patients see their injured limb made whole again, via the reflection of the intact limb. This has been shown to help reduce pain and distress, both during the treatment and on an ongoing basis.

I have two patients with phantom limb pain, and even before coming to NHSHD I’d been wondering about using VR to help treat them. This weekend started to look like I might be able to make good on that.

More coffee and a time check. 11:30. We have 6.5 hours left of the day, then a further 6 hours tomorrow, to try and deliver something that will genuinely improve patient care.

My team comprises people with a huge range of different skills and backgrounds. Becky is a coding and digital media student from Brighton. Helen is a registered community nurse with a passion for tech and digital health. Musaddiq is a Java dev with geographic information system skills. Ali is a quantitative analyst. We also have Daniel and Charlotte, both software engineers. Some of the team stay for just day 1, and we’re joined on day 2 by Reno, who’s switched codes from the dark side of finance to join Team Digital Healthcare. It’s an eclectic and excellent bunch – you can meet them all on our site.

(L to R) Becky, Reno, Ali, Keith, Helen.

Given our target group, the plan is to explore using VR, 360 video, and the Gear VR headset to simulate mirror therapy in a low cost digital way. My hope is that we can develop practical methods of deploying this in a clinical setting and share our findings with the community at large. It also means we get to have fun playing with all the toys, whilst everyone gets a chance to contribute and learn something.

The team splits into three streams:

Charlotte and Daniel start on the website, which we will use to contain our work from the weekend.

Becky, Ali and Musaddiq immediately set to work on the hard coding challenge – looking at Virtual Reality and whether we can mirror a live 360 video stream from the Theta S camera.

Helen & I begin collating the research evidence and constructing a ‘treatment protocol’ from which we could create some simple 360 video footage to test with the team.

Such is the focus of a Hack Day that many of us didn’t notice that the excellent lunch had been served until the back of the queue bumped into our table. This was despite the food being served right next to us. I guess this was the first proof of the distractive powers of Virtual Reality.

For the remainder of the day each stream worked away on their particular tasks. The website came together quickly and beautifully, built on a WordPress framework. Becky and Musaddiq heroically tackled 2 things at once:

3D modelling in Unity and then 3D Studio Max, developing some great point-of-view animations of leg therapy.

Tests of live-streamed 360 video using OBS and YouTube – this was sadly too slow, and there did not appear to be any open-source mirror plugins.
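The mirror transform itself is conceptually simple: flip each frame left-to-right, which for equirectangular 360 footage mirrors the whole scene. A minimal sketch, with small NumPy arrays standing in for real video frames (an actual pipeline would pull frames from the camera stream):

```python
import numpy as np

def mirror_frame(frame: np.ndarray) -> np.ndarray:
    """Return the frame flipped left-to-right (horizontal mirror)."""
    return frame[:, ::-1, :]

# A tiny 1x4 RGB 'frame' to show the pixel order reversing.
frame = np.array([[[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]])
mirrored = mirror_frame(frame)
print(mirrored[0, :, 0])  # pixel order reversed: [4 3 2 1]
```

Applying the flip twice recovers the original frame, which is a handy sanity check when wiring this into a video pipeline.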

Helen introduced me to Slack, a team collaboration tool that I dared to call yet another social network until I was sternly corrected. Using a technique shamelessly borrowed from the adult entertainment industry, I duct-taped the 360 camera and Gorillapod to my chest to record 5 short series of basic mirror therapy clips. You can see them all here, and watch them yourselves using any VR headset. What was immediately apparent was that by watching and copying the movements you could experience an eerie sensation that the hands you were seeing were, in fact, your own (which in my case they were).

By 6 o’clock the pub and Eurovision were calling, so we all departed.

Day 2. 5:30 am this time. Ukraine won.

Another glorious day, so with coffee in hand I took a few photos of Embankment and set off to rejoin my slightly smaller team. The loss was offset by the fact that overnight we had been contacted by Reno, who asked to join us. Expanding the reach of the Hack Day using social media is fantastic, and something I hope they facilitate in future. As it was, our hashtag started trending shortly into day one, which was announced by the unwelcome hijacking of our thread by a Russian dating agency.

By 9:30 everyone was up-to-date and the plan for the remaining 6 hours was in place. To up the pace and demonstrate the power of what we were doing to the team, we decided to utilise the ‘Cold Pressor’ test to see whether any of the content we had created could offset the pain of holding your hand in iced water.

The Cold Pressor test can be thought of as the bespectacled, serious cousin of the ice bucket challenge. It is used in research to help provide a controlled and safe painful stimulus. It has already been used, successfully, in demonstrating the efficacy of VR in reducing pain, so I felt it was justifiable to subject Becky, Reno and myself to a bit of light Sunday torture in the name of science.

Despite our rather crude efforts, what we found was quite startling. Becky & I recorded some point-of-view footage of ourselves with both hands inverted, our left arm in an empty bucket. The bucket was duly filled, and we were timed as to how long we could keep our left hand in the iced water.

*the sound of high-pitched screaming*

Becky bowed out at 1 minute 30 seconds. I lasted even less, at 1 minute 10.

We were given a while to recover and then tried again using our personalised 360 video. What we found was that Becky increased the time she tolerated the pain to over 3 and a half minutes. I tried again and stopped at much the same time, with the feeling that I could have gone on if I wished. The sensory confusion of seeing both hands in the air versus the sensation of the left arm in water at near freezing clearly disrupted my perception of pain.

Reno stepped in next to experience the power of VR to distract patients from painful stimuli. Watching ‘Kurios’ by Cirque du Soleil, he breezed through nearly 10 minutes of laboratory-standard agony, smiling much of the time. Having checked his biography, I now see that he is an ultra-runner. This doesn’t diminish his achievement, but explains the smile.

Always keep the team happy. Or in pain.

Next came the crunch. I’d cunningly ensured that 3 of the team had frozen typing hands, so we awkwardly wrote up our findings, with Becky and Ali also finding the time to crack the problem of mirroring 360 footage in a simple and effective manner. It was this last development that will really help clinicians create effective personalised 360 mirror content for patients, and it will form the basis of the next steps I take with my own patients.

3:30 arrived, and the final presentations in front of the judges began. With a brutally marshalled 3 minutes, each team spoke of what they had achieved in the last day and a half, before being grilled by the panel and audience.

We saw a great variety of differing presentations, but what tied them together was the incredible progress everyone had made, and the amazing creativity and skill that had been used in producing extremely polished applications that were, in many cases, ready to use. I was particularly impressed by ‘Outbreak’, a disease-outbreak management system in a box that used Raspberry Pis and tablets to create a pop-up field network. I wasn’t the only one: they took home the star prize. Very well deserved.

So what about Virtual Analgesia? Well, I’m delighted to report that we won a ‘Highly Commended’ prize from panel judge Alan Thomas (@alanroygbiv) for our work on Patient Inclusion. Having had the idea come from patient needs, it was high praise indeed to have this recognised.

6 pm and it was all over, bar the wrestling over the goodies and dividing up the remaining bottled water. I’d been part of 36 hours of intense team work and creativity, and joined a group of new friends and colleagues. Most importantly we had a new tool that clinicians can consider using in managing Phantom Limb Pain. In the coming weeks I hope to share this work with my two patients and see whether they’d like to try this approach. Using VR in this way means that when they wake at 3 in the morning they’ll have something new to try to control the burning pain in the foot that’s no longer there.

This post is not a speech, so I won’t go into detail about how thankful I am for the help I had from my team – I’m banking on the fact that they know this already.

What this post must be is a loud celebration of the amazing work of the NHS Hack Day group, and most of all about the incredible reservoir of passion and talent in the developers, students, clinicians and patients of this country. The challenge of rising demand and shrinking funding of healthcare is not unique to the UK, but we have a National Health Service – free at the point of delivery, with care provided based on need, not the ability to pay. The NHS Hack Days demonstrate that it isn’t just the nurses and doctors that are committed to supporting this unique and precious institution, and that we don’t go into the fight unarmed – there’s an army of geeks out there, and they have some incredible tech to share.

To find out more about the next NHS Hack Day, visit their website www.nhshackday.com or follow them on twitter @nhshackday – they really are amazing events, and welcome everyone with a passion for healthcare.

All the notes from my team ‘Virtual Analgesia’ are available on www.virtualanalgesia.net. We’d love to hear from you with any feedback or comments. You can join the discussion on Facebook in ‘VR Doctors’ – just apply to join.

Declaration of Interests

I attended this event in my own time and at my own expense. The hardware and software used was all either open source or owned and operated by the participating team members.