In April, the ONC released an update on health information exchange. It reported that about 75 percent of US non-federal acute care hospitals were exchanging data with outside providers (for example, non-affiliated practices or other hospitals). That’s a sea change compared to 2008, when only about 40 percent of hospitals were exchanging data with outside providers.

That growth — an 85 percent increase — isn’t confined to any particular region of the country. The ONC notes that in 2008, only 10 states had a clear majority (60 percent) of hospitals electronically exchanging key clinical data with outside providers. In 2014, 47 states (all but Idaho, Nevada and Mississippi) and the District of Columbia reported that at least 60 percent of their hospitals were exchanging key clinical data.

If, as network theory asserts, the value of a system is enhanced as more nodes are added, this bodes well for the nation’s healthcare system – and indeed, for patients. Because let’s face it, the point of federal incentives for data exchange (and penalties for failing) should never have been about simple data exchange, but rather the value that exchange can provide for physicians and patients alike. That’s why it’s called “Meaningful Use” and not “Mere Use.”
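The network-theory claim above is usually formalized as Metcalfe's-law-style reasoning: a network's potential value scales with the number of possible connections between its nodes, which grows roughly with the square of the node count. A toy sketch (the node counts here are illustrative stand-ins, not ONC figures):

```python
def potential_links(n: int) -> int:
    """Number of distinct pairwise connections in a network of n nodes."""
    return n * (n - 1) // 2

# Nearly doubling the number of participating hospitals more than
# triples the number of potential exchange pathways.
print(potential_links(40))  # 780
print(potential_links(75))  # 2775
```

This is why adding nodes to an exchange network matters more than linearly: each new participant can, in principle, connect with every existing one.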

“What’s the value and the use of exchanging data?” asks Leslie Krigstein, interim Vice President of Public Policy of the College of Healthcare Information Management Executives (CHIME). “Just because you are exchanging data, you are not necessarily improving care. We need to talk more about the value to healthcare delivery.”

One response to Krigstein’s question might be found in two of the more contentious requirements proposed for Stage 3 of Meaningful Use. According to the MU Stage 3 rules proposed this spring, hospitals and practices hoping to qualify for incentives will need to provide EHR access to 80 percent of their patients, and access to patient-specific educational resources to 35 percent of their patients.

Furthermore, there are three measures of active patient engagement: 25 percent of patients must access their records through specified electronic means; 35 percent of patients must receive a clinically relevant secure message; and providers must incorporate information from patients or "non-clinical" settings (e.g., home health, physical therapy or perhaps wearable devices) for 15 percent of patients.

These are contentious requirements for many reasons – should patients be able to enter data into an EHR? Even if they should, will they? And if the federal government wants patients to actively engage with electronic health records, shouldn’t incentives and penalties be aimed at them and not the providers?

Let me flip network theory for a moment: Within an increasingly interconnected healthcare system, the value of patient engagement is significantly enhanced. It’s a blessing, not a curse.

As Krigstein told me, one driver of needed change is “a new level of understanding among patients that their data should be fluid.” And new payment models mean that it’s not just hospitals that should be exchanging data, but it should occur “up and down the continuum of care.”

In other words, MU Stage 3 doesn’t envision a world where patients routinely enter data to populate their records, or routinely pull down a copy of their EHR to print out and study. Instead, it contemplates the capacity for devices, labs and a host of providers to continuously update a fuller data representation of the patient’s current health.

That representation, in detailed form, might enable physicians to make more accurate diagnoses, for example, or to identify a public health threat in its earliest stages. Meanwhile, in dashboard form, the representation could help patients make better choices about their daily diet, exercise and wellness activities. After all, “better-informed” doesn’t mean “crushed by data.”

Most promising of all, such a representation — however presented — should enhance the patient-physician relationship, providing a new level of transparency. In medicine, ignorance is not bliss; better-informed physicians and patients will be close collaborators in the quest for healthier lives.

Here at Intel, we recently sponsored the NextGenSTEMM Women of the Future conference at the John Innes Centre in Norwich, UK, which brought together over 250 Year 10 girls aged 14-15 to inspire, educate and encourage them to consider STEMM subjects as a career. We want to help change the perceptions of what a career in the STEMM sector can be by showcasing some of the great work being done by colleagues here at Intel and educating students on the opportunities and career pathways available today.

Developing 21st Century Skills

There is a real need for these girls to follow their passion in STEMM subjects as the Government makes large investments in stimulating the biotech, life sciences and genomics medicine sectors to help the UK compete on the global stage and prepare the NHS for precision and translational medicine. In the life sciences, where the numbers of male and female graduates have been even for over 30 years, women remain under-represented in leadership positions.

The morning session featured an impressive line-up of female presenters including Professor Jackie Hunter CBE, FMedSci, Chief Executive of the Biotechnology and Biological Sciences Research Council (BBSRC) and Space Scientist Dr Maggie Aderin-Pocock MBE, who is a familiar face on TV and is currently presenting BBC's The Sky at Night. And to reinforce some of the key messages from the speakers, a fantastic afternoon of workshops allowed the girls to talk one-to-one with organisations such as Intel.

Showcasing RealSense™ and IoT

We really wanted to capture the girls' imagination by showcasing what we do in the Health and Life Sciences sector in a relevant but fun way, from an interactive facial recognition app built on Intel® RealSense™ technology through to an Internet of Things bubble machine that responded to the girls tweeting live from the event. It allowed us to open up a conversation about how Intel technology is helping in a real and meaningful way, such as our IoT work with Mimocare to help elderly people manage their illnesses better and stay at home for longer.

The demonstrations triggered lots of discussion and questions, from 'Will Moore's Law become obsolete?' to 'If the planet Mercury was put on water, would it float or sink?' - we'll leave that one for you to answer in the comments below, but it really highlighted how much we were opening these young girls' minds to what is possible in the sector. We only wish we could have spent more time with the girls, but were pleased to see that we were being invited to many schools for further discussion.

Looking to the Future

An inspirational day finished with Intel awarding prizes, including Intel NUC kits, for the best questions, while each of the 20 schools also received an Intel Galileo board. We're looking forward to following up on the innovations created by the prize-winners, and hope to see some of the girls who attended playing a big role in the future of our sector soon.

Now, about that Mercury question: if the planet was put on water, would it float or sink?

As populations age around the world, home healthcare will become a more vital part of caring for senior patients. To learn more about this growing trend, and how technology can play a role, we sat down with Tracey Moorhead, president and CEO of the Visiting Nurse Associations of America (VNAA), which represents non-profit providers of home health, hospice, and palliative care services and has more than 150 agency members in communities across the country.

Intel: How has technology impacted the visiting nurse profession?

Moorhead: Technology has impacted the profession of home care providers, particularly, by expanding the reach of our various agencies. It allows our agencies to cover greater territories. I have a member in Iowa who covers 24,000 square miles and they utilize a variety of technologies to provide services to patients in communities that are located quite distantly from the agencies themselves. It has also impacted the individual providers by helping them communicate more quickly back to the home office and to the nurses making decisions about the course of care for the individual patients.

The devices that our members and their nurses are utilizing are increasingly tablet-based. We do have some agencies who are utilizing smartphones, but for the most part the applications, the forms and checklists that our nurses utilize in home based care are better suited for a tablet-based app.

Intel: What is the biggest challenge your members face?

Moorhead: One of the biggest challenges that we have in terms of better utilizing technology in the home based care industry is interoperability; not only of devices but also of platforms on the devices. An example is interoperability of electronic health records. Our individual agencies may be collaborating with two or more hospital systems, who may have two or more electronic health records in utilization. Combine that with different physician groups or practice models with different applications within each of those groups and you have a recipe for chaos in terms of interoperability and the rapid sharing and care coordination for these various patients out in the field. The challenges of interoperability are quite significant: they prevent effective handoffs, they cause great challenges in effective and rapid care coordination among providers, and they really continue to maintain this fragmentation of healthcare that we’ve seen.

Intel: What value are patients seeing with the integration of technology in care?

Moorhead: Patients and family caregivers have responded so positively to the integration of these new technologies and apps. Not only does technology allow for our nurses to communicate with family members and caregivers to help them understand how to best care for and support their loved ones, but it also allows the patients to have regular communication with their nurse care providers when they’re not in the home. Our patients are able to contact the home health agency or their nurse on days when there may not be a scheduled visit.

I visited a family in New Jersey with one of our agencies and they were so excited that it was visit day. When the nurse arrived, not only was the wife there, but the two daughters, the daughter-in-law and the son were all there to greet the nurse and talk at length about the father's progress and the challenges they were having caring for him. That experience really brought home for me the person-centered, patient-centered, family-centered care our members provide. The technologies being utilized in that home, not only while the nurse was present but also those the nurse had provided to the family, including a tablet with an app for contacting the home health agency, really made the family feel they had the support they needed to best care for their father and husband.

Intel: How are the next generation of home care providers adapting to technology?

Moorhead: The next generation of nurses, the younger nurses who are just entering the field and deciding to devote themselves to the home based care delivery system, are very accustomed to utilizing technologies, whether on their tablets or their mobile phones, and have integrated this quite rapidly into their care delivery models and processes. Many of them report to us that they feel it provides them a significant degree of freedom and support for the care delivery to their patients in the home.

Intel: Where will the home care profession be in five years from now?

Moorhead: I see significant change coming in our industry in the next five years. We are, right now, in the midst of a cataclysm of evolution for the home based care provider industry and I see only significant opportunities going forward. It’s certainly true that we have significant challenges, particularly on the regulatory and administrative burden side, but the opportunities in new care delivery models are particularly exciting for us. We see the quality improvement goals, the patient-centered goals and the cost reduction goals of care delivery models such as accountable care organizations and patient-centered medical homes as requiring the integration of home based care providers. Those organizations simply will not be able to achieve the outcomes or the quality improvement goals without moving care into the community and into the home. And so, I see a rapid expansion and increased valuation of home based care providers.

The technologies that we see implemented today will only continue to enhance the ability to care for these patients, to coordinate care and to communicate back to those nascent health delivery models, such as ACOs and PCMHs.

For years, the term “Big Data” has been thrown around the Healthcare and Life Science research fields as if it were a new fashion that was trendy to talk about. On some level, everyone knew the day was coming when the amount of data being generated would outpace our ability to process it unless major steps were taken immediately to stave off that eventuality. But many IT organizations chose to treat the warnings of impending overload much like Y2K in its aftermath: a false threat, with no real issue to prepare for in advance. That was five years ago, and the time for big data has come.

The pace at which life science-related data can be produced has increased at a rate that far exceeds Moore’s Law, and it has never been cheaper or easier for scientists and clinical researchers to acquire data in vast quantities. Many research computing environments have found themselves in the middle of a data storm, in which researchers and healthcare professionals need enormous amounts of storage and need to analyze the stored data with alacrity so that discoveries can be made and cures for disease become possible. Owing to their organizations’ lack of preparedness, researchers have found themselves in the middle of a research computing desert with nowhere to go, the weight of that data threatening to collapse onto them.

Storage and Compute

The net result of IT calling the scientists’ assumed bluff is that they are unprepared to provide the sheer amount of storage the research requires, and, even when they can provide that storage, they don’t have enough compute power to get through the data (so that it can be archived), causing a backlog of data storage that compounds as more and more data pours into the infrastructure. To make matters worse, scientists are left with the option of moving the data elsewhere for processing and analysis. Sometimes well-funded laboratories purchase their own HPC equipment; sometimes cloud-based compute and storage is purchased; sometimes researchers find a collaborator with access to an HPC system they can use to help chunk through the backlog. Unfortunately, these solutions create another barrier: how to move that much data from one point to another. Most organizations don’t have Internet connections much above 1 Gbps for the entire organization, while many of these datasets are tens or hundreds of terabytes (TB) in size and would take days or weeks to move over those connections at saturation (which would effectively shut down the Internet connection for the organization). So, being the resourceful folks they are, scientists then take to physically shipping hard drives to their collaborators, which has its own complex set of issues to contend with.
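The transfer-time problem above is easy to quantify with a back-of-envelope calculation. The sketch below assumes a fully saturated link and decimal terabytes; real-world throughput would be lower, so these figures are best-case:

```python
def transfer_days(dataset_tb: float, link_gbps: float = 1.0) -> float:
    """Best-case transfer time in days over a fully saturated link."""
    bits = dataset_tb * 1e12 * 8          # decimal terabytes -> bits
    seconds = bits / (link_gbps * 1e9)    # link speed in bits per second
    return seconds / 86_400               # seconds per day

print(round(transfer_days(10), 1))    # ~0.9 days for 10 TB
print(round(transfer_days(200), 1))   # ~18.5 days for 200 TB
```

Even in this idealized case, a 200 TB dataset monopolizes a 1 Gbps organizational link for weeks, which is why shipping physical drives often wins despite its logistical headaches.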

The depth of the issues that have arisen out of this lack of preparedness is so profound that many research- and healthcare-based organizations are finding it difficult to attract and hire the talent they need to accomplish their missions. New researchers, and those on the forefront of laboratory technologies, largely understand their computational requirements. If a hiring organization isn’t going to be able to provide that, they look elsewhere.

Today and Tomorrow

As such, these organizations have finally started to make the proper investments in research computing infrastructure, and the problem is slowly starting to get better. But many of them are taking the approach of funding only what they have to today to get today’s jobs done. This approach is a bit like expanding a highway in a busy city to meet the current population’s needs rather than building it for 10 years from now; by the time the highway is completed, the population will already have exceeded its capacity. Building this infrastructure the correct way, for an unpredictable point in the future, is scary and quite expensive, but the alternative is the likely failure of the organization to meet its mission. Research computing is now a reality in life science and healthcare research, and not investing will only slow things down and cost organizations much more in the future.

So, if this situation describes your organization, encourage them to invest now in technologies for the 5-years-from-now timeframe. Ask them to think big, to think strategically, instead of putting tactical bandages on the problems at hand. If we can get most organizations to invest in the needed technologies, scientists will be able to stop worrying about where their data goes, and will be able to get back to work, which will result in an overall improvement in our health-span as a society.

Here at Intel in France, we recently announced a collaboration with the European-based Teratec consortium to help unlock new insights into sustainable cities, precision agriculture and personalized medicine. These three themes are closely interlinked because each of them requires significant high performance computing power and big data analysis.

Providing Technology and Knowledge

The Teratec campus, located south of Paris, is comprised of more than 80 organisations from the world of commerce and academia. It's a fantastic opportunity for us at Intel to provide our expertise not only in the form of servers, networking solutions and big data analytics software but also by utilising the skills and knowledge of our data scientists who will work closely with other scientists on the vast science and technology park.

The big data lab will be our principal lab for Europe and will initially focus on proof-of-concept work, with our first project in the area of precision agriculture. As the techniques mature, we will bring those learnings into the personalized medicine arena, where one of our big focuses is the analysis of merged clinical and genomic data that are currently stored in silos, as we seek to advance the processing of unstructured data.

Additionally, we will focus on the analysis of merged clinical data and open data, such as weather, traffic and other publicly available data, in order to help healthcare organizations enhance resource allocation and help health insurers and payers build sustainable healthcare systems.

Lab Makes Global Impact

You may be asking why Intel is opening a big data lab in France. Well, the work we will be undertaking at Teratec will benefit not only colleagues and partners in France and Europe, but those around the globe too. The challenges we collectively face around an ageing population and the movement of people towards big cities present unique problems, with healthcare very much towards the top of that list. And France presents a great environment for innovation, especially in the three focus areas, as the Government here is in the process of promulgating a set of laws that will really help build a data society.

So far, we have seen research into solutions curtailed from both a technical and knowledge aspect but we look forward to overcoming these challenges with partners at Teratec in the coming years. We know there are significant breakthroughs to be made as we push towards providing personalized medicine at the bedside. Only then can we truly say we are forging ahead to build a future for healthcare that matches the future demands of our cities.

I recently had the privilege of interviewing Daniel Dura, CTO of Graphium Health, on the subject of security on the frontlines of healthcare, and a few key themes emerged that I want to highlight and elaborate on below.

Regulatory compliance is necessary but not sufficient for effective security and breach risk mitigation. To effectively secure healthcare organizations against breaches and other security risks, one needs to start by understanding the sensitive healthcare data at risk. Where is it at rest (inventory), how is it moving over the network (inventory), and how sensitive is it (classification)? These seem like simple questions, but in practice they are difficult to answer, especially with BYOD, apps, social media, consumer health, wearables, the Internet of Things and more driving increased variety, volume and velocity of (near real-time) sensitive healthcare data into healthcare organizations.

There are different types of breaches. Cybercrime type breaches have hit the news recently. Many other breaches are caused by loss or theft of mobile devices or media, insider risks such as accidents or workarounds, breaches caused by business associates or sub-contracted data processors, or malicious insiders either snooping records or committing fraud. Effective security requires avoiding distraction from the latest media, understanding the various types of breaches holistically, which ones are the greatest risks for your organization, and how to direct limited budget and resources available for security to do the most good in mitigating the most likely and impactful risks.

Usability is key. Healthcare workers have many more information technology tools now than 10 years ago, and if usability is lacking in healthcare solutions or security, it can directly drive workarounds, non-compliance with policy, and additional risks that can lead to breaches. The challenge is to provide security together with improved usability. Examples include software encryption with hardware acceleration, SSDs with built-in encryption, and multi-factor authentication that improves the usability of both solutions and security.

Security is everyone’s job. Healthcare workers are increasingly targeted in spear phishing attacks. Effective mitigation of this type of risk requires a cultural shift so that security is not only the job of the security team but everyone’s job. Security awareness training needs to be on the job, gamified, continuous, and meaningful.

I’m curious what types of security concerns and risks are top of mind in your organization, what challenges you are seeing in addressing them, and your thoughts on how best to mitigate them.

Telehealth is often touted as a potential cure for much of what ails healthcare today. At Indiana’s Franciscan Visiting Nurse Service (FVNS), a division of Franciscan Alliance, the technology is proving that it really is all that. Since implementing a telehealth program in 2013, FVNS has seen noteworthy improvements in both readmission rates and efficiency.

I recently sat down with Fred Cantor, Manager of Telehealth and Patient Health Coaching at Franciscan, to talk about challenges and opportunities. A former paramedic, emergency room nurse and nursing supervisor, Fred transitioned to his current role in 2015. His interest in technology made involvement in the telehealth program a natural fit.

At any one time, Fred’s staff of three critical care-trained monitoring nurses, three installation technicians and one scheduler is providing care for approximately 1,000 patients. Many live in rural areas with no cell coverage – often up to 90 minutes away from FVNS headquarters in Indianapolis.

Patients who choose to participate in the telehealth program receive tablet computers that run Honeywell LifeStream Manager* remote patient monitoring software. In 30-40 minute training sessions, FVNS equipment installers teach patients to measure their own blood pressure, oxygen, weight and pulse rate. The data is automatically transmitted to LifeStream and, from there, flows seamlessly into Franciscan’s Allscripts™* electronic health record (EHR). Using individual diagnoses and data trends recorded during the first three days of program participation, staff set specific limits for each patient’s data. If transmitted data exceeds these pre-set limits, a monitoring nurse contacts the patient and performs a thorough assessment by phone. When further assistance is needed, the nurse may request a home visit by a field clinician or further orders from the patient’s doctor. These interventions can reduce the need for in-person visits requiring long-distance travel.
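The per-patient limit check described above can be sketched roughly as follows. The vital names, limit values, and function are hypothetical illustrations only; FVNS's actual LifeStream and Allscripts integration is proprietary and not shown here.

```python
# Hypothetical per-patient limits, set from the diagnosis and data trends
# recorded during the first three days of program participation.
PATIENT_LIMITS = {
    "systolic_bp": (90, 150),   # mmHg
    "spo2": (92, 100),          # percent oxygen saturation
    "weight_lb": (150, 160),    # pounds
}

def out_of_range(reading: dict) -> list:
    """Return the vitals in a transmitted reading that exceed pre-set limits."""
    flags = []
    for vital, value in reading.items():
        low, high = PATIENT_LIMITS[vital]
        if not (low <= value <= high):
            flags.append(vital)
    return flags

# A reading like this would prompt a monitoring nurse's phone assessment:
print(out_of_range({"systolic_bp": 162, "spo2": 95, "weight_lb": 158}))
```

Any non-empty result stands in for the trigger that routes the patient to a monitoring nurse for a thorough assessment by phone, before escalating to a home visit or physician orders.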

FVNS’ telehealth program also provides patient education via LifeStream. For example, a chronic heart failure (CHF) patient experiencing swelling in the lower extremities might receive content on diet changes that could be helpful.

Since the program was implemented, overall readmission rates have been well below national averages. In 2014, the CHF readmission rate was 4.4 percent, compared to a national average of 23 percent. The COPD rate was 5.47 percent, compared to a national average of 17.6 percent, and the CAD/CABG/AMI rate was 2.96 percent, compared to a national average of 18.3 percent.

Despite positive feedback, convincing providers and even some FVNS field staff that, with proper training, patients can collect reliable data has taken some time. The telehealth team is making a concerted effort to engage with patients and staff to encourage increased participation.

After evaluating what type of device would best meet the program’s needs, Franciscan decided on powerful, lightweight tablets. The touch screen devices with video capabilities are easily customizable and can facilitate continued program growth and improvement.

In the evolving FVNS telehealth program, Fred Cantor sees a significant growth opportunity. With knowledge gained from providing the service free to their own patients, FVNS could offer a private-pay package version of the program to hospital systems and accountable care organizations (ACOs).

Is telehealth a panacea? No. Should it be a central component of any plan to reduce readmission rates and improve workflow? Just ask the patients and healthcare professionals at Franciscan VNS.

I'm often reminded that within the health IT sector we overlook some of the more simple opportunities to provide a better healthcare experience for both clinical staff and patients. A great example of this was the news that the NHS is investigating the feasibility of providing free Wi-Fi across its estate which it estimates will 'help reduce the administrative burden currently estimated to take up to 70 percent of a junior doctor's day'. I'll cover the often-talked about benefits to clinicians in a later blog but here I want to focus on how access to free Wi-Fi could impact the patient in a myriad of positive ways.

Today many of us see access to the internet via Wi-Fi just like any other utility. It's not something we think of too deeply but we expect it to be there, all day, every day. But access to Wi-Fi in an NHS hospital can either come at a price or is not available at all. The vision put forward by Tim Kelsey, NHS England’s National Director for Patients and Information, could truly revolutionise the continuum of care experience and fundamentally change the relationship between patient/family and hospital. I've highlighted five of the main benefits below:

1. Enhances Education

Clinicians will say that a better-informed patient is more likely to buy into their treatment plan. Traditionally an inpatient will be delivered updates on their condition verbally by a doctor 'doing the rounds' once or twice per day at the bedside. With the availability of free Wi-Fi in hospitals and the much-anticipated electronic patient access to all NHS funded services by 2020, I anticipate a patient being able to simply log in to see real-time updates about their condition at any time of the day via their electronic health record. And Wi-Fi may offer opportunities to provide access to online educational material approved by the NHS too. I would add a cautionary note here though around the differing levels of interpretation of medical data by clinicians and patients.

2. Connecting Families

A prolonged stay in hospital affects not just the patient but the wider family too. Free Wi-Fi changes what can sometimes be a lonely and isolated period for the patient by bringing the family 'to the bedside' outside of traditional visiting hours through technologies such as Skype or email. And those conversations may well include patient progress updates thus reducing the strain on nurses who, at times, provide updates over the telephone. Additionally, family will be able to spend more time visiting patients while still being able to work remotely using free Wi-Fi.

3. Boosting Morale

Talk to patients (young or old) who have spent an extended time in hospital and they will more often than not tell you that at times they felt a drop in morale from having their regular routine significantly disrupted. By offering free Wi-Fi, patients can use their own mobile devices to continue to enjoy some of those everyday activities that go a long way to making all of us happy. That might include watching a favourite TV programme, reading a daily newspaper or simply playing an online game. Being connected brings a sense of normality to what is undoubtedly a period of worry and concern, resulting in happier patients.

5. Reducing Readmissions

When we look at the team of people providing care for patients it’s easy to forget just how important family and friends are, albeit in a less formal way than clinicians. When it comes to reducing readmission my mind is drawn to the patient setting immediately after discharge from hospital where it’s likely that family and close friends will be primary carers when the patient returns home. I’m seeing a scenario whereby the patient and caregiver in a hospital connect to family members, using Skype via Wi-Fi for example, to talk through recovery and medication to help ease and increase the effectiveness of that transition from hospital to home. I believe this could have a significant impact on readmission rates in a very positive way.

Meeting Security Needs

Wi-Fi networks in a hospital setting will, of course, bring concerns around security, especially when we talk of accessing sensitive healthcare data. This should not stop progress though as there are innovative security safeguards created by Intel Security Group that can mitigate the risks associated with data transiting across both public and private cloud-based networks. And I envisage healthcare workers and patients will access separate Wi-Fi networks which offer enhanced levels of security to clinicians.

Vision to Reality

Currently there are more than 100 NHS hospitals providing Wi-Fi to patients, in some cases free and in others on a paid-for basis. What really needs to happen though to turn this vision of free Wi-Fi for all into a reality? There are obvious financial implications but I think there are great arguments for investment too, especially when you look at the clinical benefits and potential cost-savings. A robust and clear strategy for implementation and ongoing support will be vital to delivery and may well form part of the NHS feasibility study. I look forward to seeing the report and, hopefully, roll-out of free Wi-Fi across the NHS to provide an improved patient experience.


Chris Gough is a lead solutions architect in the Intel Health & Life Sciences Group and a frequent blog contributor.

How sustainable is your health IT environment? With all the demands you’re putting on your healthcare databases, is your infrastructure as reliable and affordable as it needs to be so you can stay ahead of the rising demand for services?

In Louisiana, IT leaders at one of the health systems we’ve been working with ran the numbers. Then, they migrated their InterSystems Caché database from their previous RISC platforms onto Dell servers based on the Intel® Xeon® processor E7. They tell us they couldn’t be happier—and they’re expecting the move to help them reduce TCO for their Epic EHR and Caché environment by more than 40 percent.

“Using Intel® and Dell hardware with Linux and VMware, you can provide a level of reliability that’s better than or equal to anything out there,” says Gregory Blanchard, executive director of IT at Shreveport-based University Health (UH) System. “You can do it more easily and at much lower cost. It’s going to make your life a lot easier. The benefits are so clear-cut, I would question how you could make the decision any differently.”

We recently completed a case study describing UH’s decision to migrate its Caché infrastructure. We talked with UH’s IT leaders about their previous pain points, the benefits they’re seeing from the move, and any advice they can share with their health IT peers. If your health system is focused on improving services while controlling costs, I think you’ll find it well worth a read. You’ll also learn about the Dell, Red Hat, Intel, and VMware for Epic (DRIVE) Center of Excellence—a great resource for UH and other organizations that want a smooth migration for their Epic and Caché deployments.

UH is a great reminder that health IT innovation doesn't just happen at the Cleveland Clinics and Kaiser Permanentes of the world. Louisiana has some of the poorest health statistics in the nation, and the leaders at UH know they need to think big if they're going to change that picture. As a major medical resource for northwest Louisiana and the teaching hospital for the Louisiana State University Shreveport School of Medicine, UH is on the forefront of the state's efforts to improve the health of its citizens. Its new infrastructure—with Intel Inside®—gives UH a scalable, affordable, and sustainable foundation. I'll be excited to watch their progress.

The transition to value-based care is not an easy one. Organizations will face numerous challenges on their journey towards population health management.

We believe there are five key elements and best practices to consider when transitioning from volume to value-based care: managing multiple quality programs; supporting both employed and affiliated physicians and effectively managing your network and referrals; managing organizational risk and utilization patterns; implementing care management programs; and ensuring success with value-based reimbursement.

When considering the best way to proactively and concurrently manage multiple quality programs, such as pay for performance, accountable care and/ or patient-centered medical home initiatives, you must rally your organization around a wide variety of outcomes-based programs. This requires a solution that supports quality program automation. Your platform must aggregate data from disparate sources, analyze that data through the lens of a program’s specific measures, and effectively enable the actions required to make improvements. Although this is a highly technical and complicated process, when done well it enables care teams to utilize real-time dashboards to monitor progress and identify focus areas for improving outcomes.

In order to provide support to both employed and affiliated physicians, and effectively manage your network and referrals, an organization must demonstrate its value to healthcare providers. Organizations that do this successfully are best positioned to engage and align with their healthcare providers. This means providing community-wide solutions for value-based care delivery. This must include technology and innovation, transformation services and support, care coordination processes, referral management, and savvy representation with employers and payers based on experience and accurate insight into population health management as well as risk.

To effectively manage organization risk and utilization patterns, it is imperative to optimize episodic and longitudinal risk, which requires the application of vetted algorithms to your patient populations using a high quality data set. In order to understand the difference in risk and utilization patterns you need to aggregate and normalize data from various clinical and administrative sources, and then ensure that the data quality is as high as possible. You must own your data and processes to be successful. And importantly, do not rely entirely on data received from payers.

It is also important to consider the implementation of care management programs to improve individual patient outcomes. More and more organizations are creating care management initiatives for improving outcomes during transitions of care and for complicated, chronically ill patients. These initiatives can be very effective. It is important to leverage technology, innovation and processes across the continuum of care, while encompassing both primary and specialty care providers and care teams in the workflows. Accurate insight into your risk helps define your areas of focus. A scheduled, trended outcomes report can effectively identify what’s working and where areas of improvement remain.

Finally, your organization can ensure success with value-based reimbursement when the transition is navigated correctly. The shift to value-based reimbursement is a critical and complicated transformation—oftentimes a reinvention—of an organization. Ultimately, it boils down to leadership, experience, technology and commitment. The key to success is working with team members, consultants and vendor partners who understand the myriad details and programs, and who thrive in a culture of communication, collaboration, execution and accountability.

Whether it’s PCMH or PCMH-N, PQRS or GPRO, CIN or ACO, PFP or DSRIP, TCM or CCM, HEDIS or NQF, ACG’s or HCC’s, care management or provider engagement, governance or network tiering, or payer or employer contracting, you can find partners with the right experience to match your organization’s unique needs. Because much is at stake, it is necessary to ensure that you partner with the very best to help navigate your transition to value-based care.

Justin Barnes is a corporate, board and policy advisor who regularly appears in journals, magazines and broadcast media outlets relating to national leadership of healthcare and health IT. Barnes is also host of the weekly syndicated radio show, “This Just In.”

Mason Beard is Co-Founder and Chief Product Officer for Wellcentive. Wellcentive delivers population health solutions that enable healthcare organizations to focus on high quality care, while maximizing revenue and transforming to support value-based models.

The Internet of Things (IoT) is one of those subjects that tends to invite a lot of future-gazing about what may be possible in five, 10 or even 20 years’ time, but we’re very fortunate in the healthcare sector to be able to point to real examples where IoT is having a positive impact for both patient and provider today.

Meaningful Use Today

Here at Intel in the UK we’re working with a fantastic company in the Internet of Things space that is having a real and meaningful impact for patient and provider. MimoCare’s mission is ‘to support independent living for the elderly and vulnerable’ using pioneering sensor-powered systems. And with an ageing population across Europe and the associated rise in healthcare costs, MimoCare is already helping to ‘shift healthcare from cure to prevention’ today.

I think it’s important to highlight that MimoCare’s work focuses on measuring the patient’s environment, rather than the patient. For example, sensors can be placed to record the frequency of bathroom visits; a sudden variation from the normal pattern may indicate a urinary infection or dehydration.

Medication Box

The phrase ‘changing lives’ is sometimes overused, but when you read feedback from an elderly patient benefiting from MimoCare’s work I think you’d agree that it is more than appropriate. MimoCare talked me through a fantastic example of an 89-year-old male who is the primary carer for his 86-year-old wife and is benefiting greatly from IoT in healthcare. The elderly gentleman has a pacemaker fitted and is required to take warfarin, but with his primary focus on caring for his wife there is a risk that he may miss his own medication.

Using MimoCare sensors on the patient’s pill box enables close family to be alerted by SMS if medication is missed. The advantage to the patient is that both the sensors in the home and, importantly, the alert triggers are unobtrusive, meaning that the patient remains free from anxiety. If medication is missed, a gentle reminder via a phone call from a family member is all that is needed to ensure the patient takes it. And for the healthcare provider, the cost of providing care for the patient is significantly reduced too.
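MimoCare’s actual trigger logic isn’t public, but the pattern described above can be sketched in a few lines: record timestamped pill-box ‘open’ events, and if none lands inside a grace window around the scheduled dose, raise an alert for family. The function names, grace window and message wording here are all illustrative assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical sketch - MimoCare's real trigger logic is not public.
# We assume timestamped pill-box "open" events and a scheduled dose
# time; if no event falls inside the grace window, an alert is raised.
GRACE = timedelta(minutes=45)

def dose_missed(scheduled, open_events, grace=GRACE):
    """True if no pill-box opening occurred within the grace window."""
    return not any(scheduled <= t <= scheduled + grace for t in open_events)

def alert_message(patient, scheduled):
    # In a real deployment this text would go out via an SMS gateway.
    return (f"{patient} may have missed the "
            f"{scheduled:%H:%M} medication - please check in.")

events = [datetime(2015, 6, 1, 8, 10)]    # box opened at 08:10
morning = datetime(2015, 6, 1, 8, 0)      # dose due at 08:00
evening = datetime(2015, 6, 1, 20, 0)     # dose due at 20:00

assert not dose_missed(morning, events)   # taken on time
assert dose_missed(evening, events)       # evening dose missed
print(alert_message("Mr. H", evening))
```

The key design choice, reflected in the sketch, is that the trigger is passive for the patient: nothing in the home beeps or flashes, and the nudge arrives as an ordinary phone call from family.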

Big Data, Big Possibilities

I’m really excited about the possibilities of building up an archive of patient behaviour in their own home that will enable cloud analytics to produce probability curves to predict usual and unusual behaviour. It’s a fantastic example of the more data we have, the more accurate we can be in predicting unusual behaviour and being able to trigger alerts to patients, family and carers. And that can only be a positive when it comes to helping elderly patients stay out of hospital (and thus significantly reduce the cost of hospital admissions).
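The ‘probability curves’ idea can be illustrated with a deliberately simple baseline model: fit a Gaussian to a patient’s historical daily counts of some behaviour (say, bathroom visits) and flag days that deviate sharply from the norm. This is a toy sketch of the concept, not MimoCare’s actual cloud analytics.

```python
import statistics

# Illustrative sketch only - a real system would use far richer models.
# Fit a simple Gaussian baseline to a patient's historical daily
# bathroom-visit counts and flag days more than 3 standard deviations
# from the mean as "unusual behaviour" worth an alert.
def is_unusual(history, today, z_threshold=3.0):
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0   # guard against zero spread
    z = abs(today - mean) / stdev
    return z > z_threshold

history = [6, 7, 5, 6, 8, 7, 6, 5, 7, 6]   # typical days
assert not is_unusual(history, 7)           # within the normal pattern
assert is_unusual(history, 14)              # sudden spike -> trigger alert
```

As the archive grows, the baseline sharpens, which is exactly the ‘more data, more accuracy’ effect described above.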

Intel has played a pivotal role in porting both software and hardware to improve the performance of the IoT gateway, and has provided, through Wind River Linux, enhanced data and network security, including down-the-wire device management for software updates and configuration changes.

Sensing the Future

But where will the Internet of Things take healthcare in the next five to 10 years? What I can say is that sensors will become more cost-effective, smaller and more power-efficient, meaning that they can be attached to a multitude of locations around the home. Combining this sensor data with that recorded by future wearable technology will give clinicians a 360-degree view of a patient at home, truly enabling the focus to shift from cure to prevention.

I asked MimoCare’s Gerry Hodgson for his thoughts on the future too and he told me, “IoT and big data analytics will revolutionise the way care and support services are integrated. Today we have silos of information which hold vital information for coordinating emergency services, designing care plans, scheduling transport and providing family and community support networks. The projected growth in the elderly population means that it is imperative we find new ways of connecting local communities, families and healthcare professionals and integrating services.”

“Our cascade 3-D big data analytics provides a secure and globally scalable ecosystem that will totally revolutionise the way services are coordinated. End to end, IoT sensors stream valuable data to powerful server platforms such as Hadoop which today provides an insight into what would otherwise be unobtainable.”

“I'm very excited about the future where sensors and analytics change the way we coordinate and deliver services on a huge scale.”

Discussions around security in the healthcare IT space usually center on external threats to our healthcare IT infrastructure. Sure, this is a big area of concern and one that should not be taken lightly. Software needs to use encryption properly, it needs to protect against and monitor for known threats, and it needs to implement best practices in infrastructure architecture as we design cloud-based systems that are more accessible via the public internet.

But while these are definitely critical items, some of the biggest threats are not technical. Many times we have to deal with threats inside our organizations. This may include screening and monitoring employees for nefarious behavior, but the more likely situation is good, law-abiding and well-intentioned employees putting data at risk. Employees often have easy access to large swaths of PHI data, which is critical to performing their jobs appropriately. This access is not inherently bad, but if the software is not designed to take it into account, it can actually encourage a user to do things that lead to inadvertent data disclosure and put patient data at risk.

So the question is, is our software designed to promote good security? Here are a few of the most important techniques and guidelines that we use when designing software that help promote good security practices:

Understand your user, and understand their workflow

Good software considers how a user is going to access data and how they are going to move within the system. Provide users with the quickest and most efficient path to the data they need. Also understand how users use their mobile devices in your environment. Create clear and concise use cases that define how the mobile application will accomplish specific goals. When software is not designed around the specific workflow of a user, that user will usually figure out a workaround, which sometimes involves putting patient data at risk. By providing the user a better overall experience you aren’t only protecting the data in the system; you are also likely to increase their satisfaction with it.

Ensure that users only see the data they absolutely need

This will only happen when you understand your user and their processes. Showing information just because you have it is not good practice, yet it is common in healthcare software. Curate the data and survey users to understand exactly how they use your system. Provide justification for every field, and don’t be afraid to be conservative in what you provide. You can always give the user a way to customize what they see to help them in their specific job, but err on the side of less. Also, on mobile devices we have limited real estate to display that data, so by removing unneeded data you are ensuring a great user experience targeted to the mobile use case.
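One minimal way to enforce ‘only the data they absolutely need’ is a per-role field whitelist applied server-side before a record ever leaves the API. The roles and field names below are hypothetical; a real EHR would drive this from configuration and audit every access.

```python
# Sketch of per-role field whitelisting. Roles and field names are
# illustrative examples only, not taken from any real EHR.
FIELD_WHITELIST = {
    "scheduler": {"name", "appointment_time"},
    "nurse":     {"name", "appointment_time", "allergies", "vitals"},
}

def filter_record(record, role):
    """Return only the fields this role is allowed to see."""
    allowed = FIELD_WHITELIST.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "A. Patient", "appointment_time": "09:30",
          "allergies": ["penicillin"], "vitals": {"bp": "120/80"},
          "ssn": "***"}

assert set(filter_record(record, "scheduler")) == {"name", "appointment_time"}
assert "ssn" not in filter_record(record, "nurse")   # never exposed
```

Defaulting unknown roles to an empty set means a misconfigured client sees nothing rather than everything, which is the conservative ‘err on the side of less’ stance described above.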

Limit the need to export data

This may depend on the software system, but in many EHRs we find it too easy to export data. Export is usually a release valve of sorts, compensating for unimplemented functionality. Understand why users are exporting data out of your system and provide that functionality natively if it is prudent. Any time a user exports data out of a system, it is more likely to end up in the wrong hands.

Use safe password practices, but explain them to the user

Passwords are hard, but we can make managing passwords easier for the user. Make the path to resetting their password easy, and explain to them in clear, concise terms what the password requirements are. If the user has to attempt a password reset multiple times because they don’t understand the precise rules you have, they are more likely to use a common password they have used on other systems, or to change their passwords less often. Use real-time updates in the UI to show how they are complying with the rules as they enter a new password, and provide clear feedback. Also, use two-factor authentication and PINs appropriately on mobile devices. And if you are on iPhones or iPads, make use of features such as TouchID for biometric authentication. Not only will it make the software more secure, but your users will appreciate it.
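The ‘explain the rules as they type’ advice amounts to a checker that returns per-rule pass/fail results the UI can render live. The specific rules below are illustrative assumptions, not a policy recommendation; the point is the structure of the feedback.

```python
import re

# Illustrative password-rule checker. The rules are examples only;
# what matters is returning a per-rule result so the UI can show
# live pass/fail feedback as the user types.
RULES = [
    ("at least 10 characters",       lambda p: len(p) >= 10),
    ("contains an uppercase letter", lambda p: re.search(r"[A-Z]", p) is not None),
    ("contains a digit",             lambda p: re.search(r"\d", p) is not None),
    ("contains a symbol",            lambda p: re.search(r"[^\w\s]", p) is not None),
]

def check_password(candidate):
    """Map each human-readable rule to whether the candidate satisfies it."""
    return {label: test(candidate) for label, test in RULES}

results = check_password("correct-Horse7")
assert all(results.values())                  # every rule satisfied
assert not check_password("short")["at least 10 characters"]
```

Because each rule carries its own human-readable label, the UI never has to say ‘invalid password’; it can tick off exactly which requirements are met as each character is typed.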

Be careful on alerts and other notifications

Our mobile devices are wonderful at surfacing valuable information to us using system-specific notifications and alerts, but not all of them are necessarily appropriate vehicles for sharing PHI. Avoid using patient data in notifications that can show up on device ‘unlock’ screens or in other places where it can be seen without entering appropriate authentication credentials (for example, before a PIN is entered, or outside of your software). If you are using notifications, use them to provide calls-to-action that let a user know the software may need their attention, or that cue them to something they may have already seen in your app.
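One common way to keep PHI off the lock screen is to push only an opaque call-to-action, and have the app fetch patient details after the user authenticates. The payload shape below is a hypothetical sketch, not any vendor’s actual notification format.

```python
# Sketch of a PHI-free push payload. Field names are hypothetical.
# The lock-screen notification carries only an opaque event reference;
# the app resolves it to patient details after authentication
# (PIN, TouchID, etc.).
PHI_FIELDS = {"patient_name", "diagnosis", "mrn", "dob"}

def build_notification(event_id):
    # Generic call-to-action: no patient identifiers on the lock screen.
    return {"title": "New result available",
            "body": "Open the app to review.",
            "event_id": event_id}           # opaque reference only

note = build_notification("evt-1234")
assert PHI_FIELDS.isdisjoint(note)          # no PHI keys in the payload
assert "evt-1234" not in note["body"]       # body text stays generic
```

The notification still does its job, telling the user the software needs their attention, while everything sensitive stays behind the authentication boundary.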

Some of these guidelines may be obvious, but they have to be constantly evaluated and improved, not only as our technologies evolve but as new devices become available for us to use. When software is designed properly, not only is it more secure, but you are creating applications that will be a joy to use and may actually save lives.

Health and Safety in the Workplace

The first group of projects relates to the important issue of health and safety in the workplace.

Figure 1. Circadian Glasses

Christina Petersen’s ‘circadian glasses’ considered the dangers of habitual strains and stressors at work, particularly for individuals in careers with prolonged evening hours or excessive time in light-poor conditions, which may have a cumulative effect on health over time. Although modern technologies allow for the convenience of working at will regardless of external environmental factors, what is the effect on the body’s natural systems? In particular, how does artificial lighting affect the circadian rhythm?

Her prototyped glasses use two LED screens that can adjust the type of light to help users better adjust their circadian rhythms and sleep patterns. The concept also suggests a potentially valuable intersection of personal wearable and personal energy usage (lighting) in the future workplace. Unlike sunglasses, the glasses are also a personal, portable source of light – an interesting concept in workplace sustainability, given the majority of energy expenditure is in heating/cooling systems and lighting.

While there is room to make the user context and motivation more plausible, the prototype literally helps shed light on meaningful, and specific, design interventions for vulnerable populations such as nurses or night shift workers for personal and workplace sustainability over time.

Figure 2. Smart Workplace Urinal

As we often see within our work, a city’s hubs for healthcare resources and information are often informally ubiquitous and present within the community before one reaches the hospital. Jon Rasche’s smart urinal was created to decrease the queue and waiting time at the doctor’s office before you even arrive, by enabling more personal, preventative care via lab testing at the workplace.

The ‘Smart Urinal’ created an integrated service with a urinal-based sensor and a display unit, QR codes, and a mobile application (Figure 2). The system also considered concerns around patient privacy by intentionally preventing private patient information from entering the cloud. Instead, each of the possible results links to a QR code leading to a static web page with the urinalysis information.

While the system might be perceived as too public for comfort, it connects to the technological trend toward more personalised and accessible testing (Scanadu’s iPhone-ready urinalysis strip is a good example). It also raises the consideration of how to design for the connected ecosystem of responsibility, accountability and care: how can different environments influence, impact and support an individual’s wellbeing? How can personalised, connected care be anticipatory, preventative and immediate, yet private?

Pollutants Awareness

The dynamic life of a city often means it’s in a state of constant use and regeneration – but many of the resulting pollutants are invisible to the naked eye. How do we know when the microscopic accumulation of pollutants will be physically harmful? How can we make the invisible visible in a way that better engages us with our environment?

Figure 3. Air Pollution Disk

Maria’s Noh’s ‘Air Pollution Disc’ (Figure 3) considers how we can design for information to be more physical, visible and intuitive by creating a mechanical, physical filter on our immediate environment driven by local air quality data using polarised lenses.

It’s a very simple mechanism with an elegant design that ties to some of our earlier cities research into perceptual bias around air quality, substituting physical feedback for numeric data (although pollutants may not always be visible, we equate pollution with visual cues). Noh suggested two use scenarios: one was to affix the device to a window of a home to understand pollution at potential destinations, such as a school; another was to influence driver behaviour by providing feedback on the relationship between driving style and pollution.

While there are some nuances and challenges to either case, the immediacy of the visualisation for both adults and children makes it interesting to imagine the Air Pollution Disc as a play-based, large-scale urban installation that physicalises the hidden environment of the city.

Figure 4. Ghost 7.0

The pollutants category also includes the prototype for ‘Ghost 7.0’ by student Andre McQueen, a smart clothing system that addresses how weather and air quality affect health. The idea tries to tackle breathing problems, e.g. those due to allergies, associated with weather changes. The device (Figure 4), embedded in the running clothing, is designed to communicate with satellites to receive updates on weather conditions and signal warnings under certain circumstances.

When a significant meteorological change is signalled, the fabric would change colour and release negative ions (meant to help breathing under certain conditions). The student also investigated oxidisation to fight pollutants, but could not overcome the problem of releasing small amounts of CO2.

What we found interesting in this project was the idea that a wearable device would do something to help against poor air quality, rather than just passively detecting the problem. Too many devices currently are focusing on the latter task, leaving the user wondering about the actionability of the information they receive.

Figure 5. Dumpster diving 'smart glove'

The last selected project in this section, by student Yuri Klebanov, focuses on dumpster diving. Yuri built a system to make dumpster diving safer (a ‘smart glove’ that reacts to chemicals) and more effective (a real-time monitoring system that uploads snapshots of what is thrown away to a website for users to monitor).

The latter idea is interesting but presents several challenges (e.g. privacy around taking pictures of people throwing things away); what we liked most about the project was the ‘smart glove’ idea. The solution was to boil fabric gloves with cabbage, making them capable of changing colour on contact with acids, liquids, fats and so on (Figure 5). This frugal technology solution made us reflect on how smart ‘smart’ needs to be. Technology overkill is not always the best solution to a problem, and something simple is always preferable to something more complex that provides the same (or only incrementally better) results.

In the third and final blog of our Future Cities, Future Health blog series we will look at the final theme around Mapping Cities (Creatively) which will showcase creative ideas of allocating healthcare resources and using sound to produce insights into complex health data.