Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I like to look at questions other people in the #HIT world wonder about, and see whether I have a different way of looking at the subject, or something to contribute to the discussion. This time I was provoked by one asked by Chad Johnson (@OchoTex), editor of HealthStandards.com and senior marketing manager with Corepoint Health.

In a recent HealthStandards.com article, Chad asks: “What do CIOs need to know about the future of data exchange?” I thought it was an interesting question; after all, everyone in HIT, including CIOs, would like to know the answer!

In his discussion, Chad argues that #FHIR could create significant change in healthcare infrastructure. He notes that if vendors like Cerner or Epic publish a capabilities-based API, providers’ technical, clinical and workflow teams will be able to develop custom solutions that connect to those systems.

As he rightfully points out, today IT departments have to invest a lot of time doing rework. Without an interface like FHIR in place, IT staffers need to develop workflows for one application at a time, rather than creating them once and moving on. That’s just nuts. It’s hard to argue with the idea that if FHIR APIs offer uniform data access, everyone wins.
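To make the “write once” idea concrete: under FHIR, a Patient resource has the same JSON shape no matter which vendor’s system serves it, so parsing code only has to be written once. Here’s a minimal Python sketch; the resource values below are illustrative examples in the FHIR R4 format, not output from any real system:

```python
import json

# A minimal FHIR R4 Patient resource, shaped the way any FHIR-capable
# system would return it from GET [base]/Patient/{id}. The values are
# invented for illustration.
sample_response = json.dumps({
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
    "birthDate": "1974-12-25",
})

def display_name(patient: dict) -> str:
    """Pull a human-readable name out of a FHIR Patient resource."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

patient = json.loads(sample_response)
assert patient["resourceType"] == "Patient"
print(display_name(patient))   # Peter James Chalmers
print(patient["birthDate"])    # 1974-12-25
```

Because the structure is the same whether the server is Cerner, Epic or anything else, this parsing logic doesn’t need a per-vendor rewrite, which is exactly the rework FHIR promises to eliminate.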

Far be it from me to argue with a good man like @OchoTex. He makes a good point about FHIR, one which can’t be emphasized enough – that FHIR has the potential to make vendor-specific workflow rewrites a thing of the past. Without a doubt, healthcare CIOs need to keep that in mind.

As for me, I have a couple of responses to bring to the table, and some additional questions of my own.

Since I’m an HIT trend analyst rather than an actual tech pro, I can’t say whether FHIR APIs can or can’t do what Chad is describing, though I have little doubt that Chad is right about their potential uses.

Still, I’d point out that since none other than FHIR project director Grahame Grieve has cautioned us about its current limitations, we probably want to temper our enthusiasm a bit. (I know I’ve made this point a few times here, perhaps ad nauseam, but I still think it bears repeating.)

So, given that FHIR hasn’t reached its full potential, it may be that health IT leaders should invest added time on solving other important interoperability problems.

One example that leaps to mind immediately is solving patient matching problems. This is a big deal: after all, if you can’t match patient records accurately across providers, the mismatches are likely to lead to wrong-patient medical errors.

In fact, according to a study released by AHIMA last year, 72 percent of the HIM professionals who responded work on mitigating possible patient record duplicates every week. I have no reason to think things have gotten better. We must find an approach that will scale if we want interoperable data to be worth using.
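For a sense of why matching is hard to do well at scale, here’s a deliberately simplistic Python sketch of the kind of fuzzy comparison a record-matching tool performs. Real master patient index software uses far more sophisticated probabilistic scoring; the records and threshold below are invented purely for illustration:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_patient(rec_a: dict, rec_b: dict,
                        threshold: float = 0.85) -> bool:
    """Toy matching rule: birth dates must agree exactly,
    and full names must be sufficiently similar."""
    if rec_a["dob"] != rec_b["dob"]:
        return False
    return name_similarity(rec_a["name"], rec_b["name"]) >= threshold

# Two records for the same person, entered differently by two providers,
# plus a lookalike with a different birth date.
a = {"name": "Katherine O'Neil", "dob": "1980-03-02"}
b = {"name": "Katherine ONeil",  "dob": "1980-03-02"}
c = {"name": "Katherine O'Neil", "dob": "1981-03-02"}

print(likely_same_patient(a, b))  # True  (minor spelling variation)
print(likely_same_patient(a, c))  # False (different DOB)
```

The hard part, and the reason this is only a sketch, is everything the toy rule gets wrong: nicknames, transposed birth dates, name changes and data entry errors, all of which real matching systems must weigh probabilistically across millions of records.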

And patient data matching is just one item on a long list of health data interoperability concerns. I’m sure you’re aware of other pressing problems which could undercut the value of sharing patient records. The question is, are we going to address those problems before we begin full-scale health data exchange? Or does it make more sense to pave the road to data exchange and address bumps in the road later?

Once every year or two, some technical development leads the HIT buzzword list, and at least at first it’s very hard to tell whether it will stick. But over time, the technologies that actually work well are subsumed into the industry as it exists, lose their buzzworthy quality and just do their job.

Once in a while, the hot new thing sparks real change — such as the use of mobile health applications — but more often the ideas are mined for whatever value they offer and discarded. That’s because in many cases, the “new thing” isn’t actually novel, but rather a slightly different take on existing technology.

I’d argue that this is particularly true when it comes to hospital IT, given the exceptionally high cost of making large shifts and the industry’s conservative bent. In fact, other than the (admittedly huge) changes fostered by the adoption of EMRs, hospital technology deployments are much the same as they were ten years ago.

Of course, I’d be undercutting my thesis dramatically if I didn’t stipulate that EMR adoption has been a very big deal. Things have certainly changed dramatically since 2007, when an American Hospital Association study reported that 32% of hospitals had no EMR in place and 57% had only partially implemented their EMR, with only the remaining 11% having implemented the platform fully.

Today, as we know, virtually every hospital has implemented an EMR and integrated it with ancillary systems (some more tightly than others). Not only that, some hospitals with more mature deployments in place have used EMRs and connected tools to make major changes in how they deliver care.

That being said, the industry is still struggling with many of the same problems it did a decade ago.

The most obvious example of this is the extent to which health data interoperability efforts have stagnated. While hospitals within a health system typically share data with their sister facilities, I’d argue that efforts to share data with outside organizations have made little material progress.

Another major stagnation point is data analytics. Even organizations that spent hundreds of millions of dollars on their EMR are still struggling to squeeze the full value of this data out of their systems. I’m not suggesting that we’ve made no progress on this issue (certainly, many of the best-funded, most innovative systems are getting there), but such successes are still far from common.

Over the longer term, I suspect the shifts in consciousness fostered by EMRs and digital health will gradually reshape the industry. But don’t expect those technology lightning bolts to speed up the evolution of hospital IT. It’s going to take some time for that giant ship to turn.

Blockchain technology is gradually becoming part of how we think about healthcare data. Even government entities like the ONC and FDA – typically not early adopters – are throwing their hat into the blockchain ring.

In fact, according to recent research by Deloitte, healthcare and life sciences companies are planning the most aggressive blockchain deployments of any industry. Thirty-five percent of Deloitte’s respondents told the consulting firm that they expected to put blockchain into production this year.

Many companies are tackling the practical uses of blockchain tech in healthcare. But to me, few are more interesting than Google’s DeepMind, a UK-based AI firm Google acquired a few years ago.

DeepMind has already signed an agreement with an arm of Britain’s National Health Service, under which it will access patient data while developing a healthcare app named Streams. Now, it’s launching a new project in partnership with the NHS, in which it will use a new technology based on the concepts behind bitcoin to let hospitals, the NHS and, over time, patients track what happens to personal health data.

The new technology, known as “Verifiable Data Audit,” will create a specialized digital ledger which automatically records every time someone touches patient data, according to British newspaper The Guardian.

In a blog entry, DeepMind co-founder Mustafa Suleyman notes that the system will track not only that the data was used, but also why. In addition, the ledger supporting the audit will be set to append-only, so once the system records an activity, that record can’t be erased.

The technology differs from existing blockchain models in some important ways, however. For one thing, unlike other blockchain models, Verifiable Data Audit won’t rely on decentralized ledger verification by a broad set of participants. The developers have assumed that trusted institutions like hospitals can be relied on to verify ledger records.

Another way in which the new technology is different is that it doesn’t use a chain infrastructure. Instead, it uses a cryptographic structure known as a Merkle tree. Every time the system adds an entry to the ledger, it generates a cryptographic hash summarizing not only that latest ledger entry, but also the previous ledger values.
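The append-only property is easy to illustrate. The toy Python sketch below uses a simple hash chain rather than a true Merkle tree, so it’s a simplification of what DeepMind describes, but it shows the key idea: each new hash covers both the new record and everything before it, so any tampering with history invalidates every later hash. The record fields are invented for illustration:

```python
import hashlib
import json

class AuditLedger:
    """Toy append-only audit ledger. Each entry's hash covers the new
    record *and* the hash of everything before it, so past entries
    can't be altered without breaking all subsequent hashes."""

    def __init__(self):
        self.entries = []          # list of (record, running_hash_hex)
        self._head = b"genesis"    # hash state of the (empty) ledger

    def append(self, record: dict) -> str:
        payload = self._head + json.dumps(record, sort_keys=True).encode()
        self._head = hashlib.sha256(payload).digest()
        self.entries.append((record, self._head.hex()))
        return self._head.hex()

    def verify(self) -> bool:
        """Replay the chain and confirm no entry was tampered with."""
        head = b"genesis"
        for record, stored in self.entries:
            payload = head + json.dumps(record, sort_keys=True).encode()
            head = hashlib.sha256(payload).digest()
            if head.hex() != stored:
                return False
        return True

ledger = AuditLedger()
ledger.append({"who": "renal-ward", "what": "read", "why": "direct care"})
ledger.append({"who": "audit-team", "what": "read", "why": "audit"})
print(ledger.verify())                       # True
ledger.entries[0][0]["why"] = "marketing"    # tamper with history
print(ledger.verify())                       # False
```

A real Merkle tree improves on this chain by allowing efficient proofs that a single entry is included without replaying the whole log, but the tamper-evidence principle is the same.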

DeepMind is also providing a dedicated online interface which participating hospitals can use to review the audit trail compiled by the system, in real time. In the future, the company hopes to add automated queries which would “sound the alarm” if data appeared to be compromised.

Though DeepMind does expect to give patients direct oversight over how, where and why their data has been used, it doesn’t expect that to happen for some time, as it’s not yet clear how to secure such access. In the meantime, participating hospitals are getting a taste of the future, one in which patients will ultimately control access to their health data assets.

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Back in 2013 I argued that we needed a lot less talk and a lot more action when it came to interoperability in healthcare. It seemed very clear to me then and even now that sharing health data was the right thing to do for the patient. I have yet to meet someone who thinks that sharing a person’s health data with their providers is not the right thing to do for the patient. No doubt we shouldn’t be reckless with how we share the data, but patient care would improve if we shared data more than we do today.

While the case for sharing health data seems clear from the patient perspective, there were obvious business reasons why many organizations didn’t want to share their patients’ health data. From a business perspective, sharing was often seen as an expense they’d incur which could actually make them lose money.

The tension between these two perspectives is what makes healthcare interoperability so challenging. We all know it’s the right thing to do, but there are business reasons why it doesn’t make sense to invest in it.

While I understand both sides of the argument, I wondered if we could make the financial case for why a hospital or healthcare organization should invest in interoperability.

The easy argument is that value-based care is going to require you to share data to be successful. The repeat X-ray that was once seen as a great revenue source becomes a cost center in a value-based reimbursement world. At least that’s the idea, and healthcare organizations should prepare for it. That’s all well and good, but the value-based reimbursement stats show that we’re not there yet.

What are the other cases we can make for interoperability actually saving hospitals money?

I recently saw a stat that 70% of accidental deaths and injuries in hospitals are caused by communication issues. Accidental deaths and injuries are very expensive for a hospital. How many lives could be saved, hospital readmissions avoided, or accidental injuries prevented if providers had the right health data at the right place and the right time?

My guess is that not having the right healthcare data to treat a patient correctly is a big problem that causes a lot of patients to suffer needlessly. I wonder how many malpractice lawsuits could be avoided if providers had the patient’s full health record available to them. Should malpractice insurance companies start offering healthcare organizations and doctors a discount if they have high-quality interoperability solutions in their organization?

Obviously, I’m just exploring this idea. I’d love to hear your thoughts on it. Can interoperability solutions help a hospital save money? Are there financial reasons why interoperability should be implemented now?

While I still think we should make health data interoperability a reality because it’s the right thing to do for patients, it seems like we need to dive deeper into the financial reasons why we should be sharing patients’ health data. Otherwise, we’ll likely never see the needle move when it comes to health data sharing.

According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data, given that they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient’s aggregated prescription history. Given the ugliness of this workflow, it’s no surprise that clinicians aren’t seeking out PDMP data, especially if they don’t regard a patient as being at high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch at the school of medicine’s emergency medicine department. Eventually, the PDMP database will be available in five EDs which have a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually: first evaluating their baseline use, then giving clinicians raw data, then delivering data through a risk-stratification tool, and eventually requiring that they use the tool.

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare. Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints. While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual. Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care. The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests. For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare’s Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there are certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.

One might assume that by this point, virtually every provider with a shred of IT in place is doing some form of patient data exchange. After all, many studies tout the number of healthcare data send and receive transactions a given vendor network or HIE has seen, and it sure sounds like a lot. But if a new survey is any indication, such assumptions are wrong.

According to a study by Black Book Research, which surveyed 3,391 current hospital EMR users, 41% of responding medical record administrators find it hard to exchange patient health records with other providers, especially if the physicians involved aren’t on their EMR platform. Worse, 25% said they still can’t use any patient information that comes in from outside sources.

The problem isn’t a lack of interest in data sharing. In fact, Black Book found that 81% of network physicians hoped that their key health system partners’ EMR would provide interoperability among the providers in the system. Moreover, the respondents say they’re looking forward to working on initiatives that depend on shared patient data, such as value-based payment, population health and precision medicine.

The problem, as we all know, is that most hospitals are at an impasse and can’t find ways to make interoperability happen. According to the survey, 70% of hospitals that responded weren’t using information outside of their EMR. Respondents told Black Book that they aren’t connecting clinicians because external provider data won’t integrate with their EMR’s workflow.

Even if the data flows are connected, that may not be enough. Researchers found that 22% of surveyed medical record administrators felt that transferred patient information wasn’t presented in a useful format. Meanwhile, 21% of hospital-based physicians contended that shared data couldn’t be trusted as accurate when it was transmitted between different systems.

Meanwhile, the survey found, technology issues may be a key breaking point for independent physicians, many of whom fear that they can’t make it on their own anymore. Black Book found that 63% of independent docs are now mulling a merger with a big healthcare delivery system to both boost their tech capabilities and improve their revenue cycle results. Once they have the funds from an acquisition, they’re cleaning house; the survey found that EMR replacement activities climbed 52% in 2017 for acquired physician practices.

Time for a comment here. I wish I agreed with medical practice leaders that being acquired by a major health system would solve all of their technical problems. But I don’t, really. While being acquired may give them an early leg up, allowing them to dump their arguably flawed EMR, I’d wager that they won’t have the attention of senior IT people for long.

My sense is that hospital and health system leaders are focused externally rather than internally. Most of the big threats and opportunities – like ACO integration – are coming at leaders from the outside.

True, if a practice is a valuable ally but remains independent of the health system, CIOs and VPs may spend lots of time and money to link arms with it technically. But once it’s in house, it’s more of a “get in line” situation from what I’ve seen. Readers, what is your experience?

Recently two of the bigger players working on health data interoperability – Carequality and the CommonWell Health Alliance – agreed to share data with each other. The two, which were fierce competitors, agreed that CommonWell would share data with any Carequality participant, and that Carequality users would be able to use the CommonWell record locator service.

That is all well and good, but at first I wasn’t sure if it would pan out. Being the cranky skeptic that I am, I assumed it would take quite a while for the two to get their act together, and that we’d hear little more of their agreement for a year or two.

But apparently, I was wrong. In fact, a story by Scott Mace of HealthLeaders suggests that Boston Children’s Hospital and its physicians are likely to benefit right away. According to the story, the hospital and its affiliated Pediatric Physicians Organization at Children’s Hospital (PPOC) will be able to swap data nicely despite their using different EMRs.

According to Mace, Boston Children’s runs a Cerner EMR, as well as an Epic installation to manage its revenue cycle. Meanwhile, PPOC is going live with Epic across its 80 practices and 400 providers. On the surface, the mix doesn’t sound too promising.

To add even more challenges to the mix, Boston Children’s also expects an exponential jump in the number of patients it will be caring for via its Medicaid ACO, the article notes.

Without some form of data sharing compatibility, the hospital and practice would have faced huge challenges, but now they have an option. Boston Children’s is joining CommonWell, and PPOC is joining Carequality, solving a problem the two have struggled with for a long time, Mace writes.

Previously, the story notes, the hospital tried unsuccessfully to work with a local HIE, the Mass Health Information HIway. According to hospital CIO Dan Nigrin, MD, who spoke with Mace, providers using Mass Health were usually asked to push patient data to their peers via Direct protocol, rather than pull data from other providers when they needed it.

Under the new regime, however, providers will have much more extensive access to data. Also, the two entities will face fewer data-sharing hassles, such as establishing point-to-point or bilateral exchange agreements with other providers, PPOC CIO Nael Hafez told HealthLeaders.

Even this step upwards does not perfect interoperability make. According to Micky Tripathi, president and CEO of the Massachusetts eHealth Collaborative, providers leveraging the CommonWell/Carequality data will probably customize their experience. He contends that even those who are big fans of the joint network may add, for example, additional record locator services such as one provided by Surescripts. But it does seem that Boston Children’s and PPOC are, well, pretty psyched to get started with data sharing as is.

Now, back to me as Queen Grump again. I have to admit that Mace paints a pretty attractive picture here, and I wish Boston Children’s and PPOC much success. But my guess is that there will still be plenty of difficult issues to work out before they have even the basic interoperability they’re after. Regardless, some hope of data sharing is better than none at all. Let’s just hope this new data sharing agreement between CommonWell and Carequality lives up to its billing.

A couple of months ago, HIMSS released some statistics from its survey on US hospitals’ plans for IT investment over the next 12 months. The results contain a couple of data points that I found particularly interesting:

While I had expected the most common type of planned spending to be focused on population health or related solutions, HIMSS found that pharmacy was the most active category. In fact, 51% of hospitals were planning to invest in at least one pharmacy technology, largely to improve tracking of medication dispensing in additional patient care environments. Researchers also found that 6% of hospitals were planning to add carousels or packagers in their pharmacies.

Eight percent of hospitals said that they plan to invest in EMR components, which I hadn’t anticipated (though it makes sense in retrospect). HIMSS reported that 14% of hospitals at Stage 1-4 of its Electronic Medical Record Adoption Model are investing in pharmacy tech for closed loop med administration, and 17% in auto ID tech. Four percent of Stage 6 hospitals plan to support or expand information exchange capabilities. Meanwhile, 60% of Stage 7 hospitals are investing in hardware infrastructure “for the post-EMR world.”

Other data from the HIMSS report included news of new analytics and telecom plans:

Researchers say that recent mergers and acquisitions are triggering new investments in telephony. They found that 12% of hospitals with inpatient revenues between $25 million and $125 million — and 6% of hospitals with more than $500 million in inpatient revenues — are investing in VoIP and telemedicine. FWIW, I’m not sure how mergers and acquisitions would trigger telemedicine rollouts, as they’re already well underway at many hospitals — maybe these deals foster new thinking and innovation?

As readers know, hospitals are increasingly spending on analytics solutions to improve care and make use of big data. However (and this surprised me), only 8% of hospitals reported plans to buy at least one analytics technology. My guess is that this number is small because a) hospitals may not have collected their big data assets in easily-analyzed form yet and b) they’re still hoping to make better use of their legacy analytics tools.

Looking at these stats as a whole, I get the sense that the hospitals surveyed are expecting to play catch-up and shore up their infrastructure next year, rather than sink big dollars into future-looking solutions.

Without a doubt, hospital leaders are likely to invest soon in game-changing technologies, such as cutting-edge patient engagement and population health platforms, to prepare for the shift to value-based care. It’s inevitable.

But in the meantime it probably makes sense for them to focus on internal cost drivers like pharmacy departments, whose average annual inpatient drug spending shot up by more than 23% between 2013 and 2015. Without stanching that kind of bleeding, hospitals are unlikely to get as much value as they’d like from big-idea investments in the future.


This week I got a look at a story appearing in a recent issue of Harvard Business Review which offers a description of Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers and even correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Like many other health systems, Geisinger has struggled to aggregate and make use of data. That’s particularly the case because, as with other systems, the legacy analytics systems Geisinger still has in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure which could bring this data together. It implemented a Unified Data Architecture allowing it to integrate big data into its existing data analytics and management. According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and consumer health tools (like smartphone apps) but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program: Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects problems unrelated to the initial issue (such as incidental injuries from a car crash) which could themselves cause serious harm. The program has already saved patient lives.

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all sepsis-patient information into one place as patients travel through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, and also tracks when antibiotics are prescribed and administered. Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t.

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data along with clinical data, offering a comprehensive view of performance by provider and surgery type. In addition to offering performance insight, this approach has also generated insights about supply use patterns which allow the health system to negotiate better vendor deals.

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits from them. My guess is that its early successes are due more to smart planning — which included worthwhile goals from day one of the rollout — than to the technology per se. Regardless, let’s hope other hospital big data projects fare as well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)