Quality professionals have been told for years, regularly and with great enthusiasm, that they should use “big data” to radically improve quality and outcomes, but many found that doing so was a challenge and didn’t live up to expectations. Now, however, it might finally be possible to truly use the wealth of data available to healthcare leaders for meaningful change.

The difference now is that the amount of data has greatly increased and it is much more accessible, say several experts. The widescale adoption of the electronic health record (EHR) resulted in a rapid accumulation of more data and created an infrastructure that made it much easier to access in useful ways, says Mark Wolff, PhD, chief health analytics strategist with SAS, a data analytics firm based in Cary, NC.

“Over the years there have been many pronouncements that not only technology, but also the amount of information available to that computational power, will finally create some dramatic paradigm shift,” Wolff says. “And that once that occurs, it will be truly transformative.”

Big data is different from the type of data routinely used in healthcare because it involves extremely large data sets analyzed to reveal patterns, trends, and associations. The idea of using big data for significant change goes back as far as 1959, when the first paper on the use of big data was published, Wolff notes. Another prominent paper with the same declaration appears every 10 or 15 years, he says.

“So here we are in 2017, effectively saying the same thing. Computational power is not a limitation now so perhaps now the data are going to drive this revolution,” Wolff says. “There are reasons now that this time we probably will see a dramatic shift in outcomes analysis, quality, and the standardization of care as a prerequisite to delivering higher-quality, lower-cost care.”

Digital Data Makes a Difference

The first reason is the availability of the data in a digitized form, he explains. Healthcare providers have always had a great deal of information they could use, but that data usually was in the basement in manila folders with colored tags. Now that data is digitized, even medical images can be deconstructed into data that can be analyzed. Representing imaging data as mathematical matrices is leading to potential advancements such as the ability to automate the diagnosis of lung cancer.

Combining that volume and type of data with the technological improvements for analyzing massive amounts of information creates a significant opportunity for healthcare quality improvement, Wolff says.

“We’ve never seen anything like this,” he says. “It really is quite dramatic.”

Vast amounts of healthcare data can transform population health analytics, Wolff says. Statistical sampling was developed because the technology did not exist to look at large amounts of data, he notes. The data had to be sampled to bring the volume down to a manageable size. Today, sampling is rarely necessary. It is possible to look at the data from hundreds of millions of patients all at once in an analytic platform. (For examples of how big data helped improve population health and reduce drug errors, see the stories later in this issue.)

“Without sampling, we have the power to identify small groups, outliers, unique events,” Wolff says. “Medicine is about looking for something that is different, a similarity among individuals, a genetic combination that is meaningful, behavior information. With massive data, I can go to the hospital and they can do patient matching, comparing my condition to hundreds of millions of individuals over time. You can identify the people who look most like me and then identify what was done and what the outcome was.”

That allows personalization of medicine that does not rely solely on genome sequencing or other complicated applications, he says. Some cancer researchers have even suggested that oncology is so complex that it is unethical to treat patients without computer algorithms.
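The patient matching Wolff describes can be thought of as a nearest-neighbor lookup: find the historical patients whose profiles most closely resemble the current one, then examine their treatments and outcomes. The following is a minimal sketch of that idea; the patient identifiers, features (age, systolic blood pressure, HbA1c), and values are invented for illustration.

```python
import math

# Hypothetical cohort: each patient is a feature vector of
# (age, systolic BP, HbA1c). All names and numbers are invented.
patients = {
    "p1": (54, 130, 6.1),
    "p2": (55, 128, 6.2),
    "p3": (23, 110, 5.0),
    "p4": (61, 145, 7.2),
}

def nearest_matches(target, cohort, k=2):
    """Return the k patients whose features are closest to `target`
    by Euclidean distance -- a toy stand-in for patient matching."""
    dists = [(math.dist(target, feats), pid) for pid, feats in cohort.items()]
    return [pid for _, pid in sorted(dists)[:k]]

# Find the cohort members most similar to a 55-year-old
# with systolic BP 129 and HbA1c 6.2.
print(nearest_matches((55, 129, 6.2), patients))  # ['p2', 'p1']
```

A production system would use far richer features (diagnoses, genetics, medications), a clinically validated similarity measure, and indexing structures that scale to hundreds of millions of records, but the core lookup is the same.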

“We’ve approached and surpassed the limit of human cognitive abilities to understand not only the volume of information available, but what is relevant and needs to be addressed,” Wolff says. “The complexity of the disease and the treatments, with different combinations producing different results for different cancers in different people, means that we have to use technology to deal with the information overload.”

Industry More Interested

Healthcare providers are looking more at how to use the data available to them, says Anne McGeorge, U.S. and global national managing partner of healthcare with the Chicago-based consulting firm Grant Thornton. The movement to a value-based payment system puts more pressure on healthcare organizations to use available data to make good clinical decisions for population health, she says. In effect, she says, hospital leaders are learning how to make money by keeping their patients out of the hospital rather than by bringing them into the hospital.

“A big piece of that is the strategic use of data,” she says. “By being able to capture the data at least in their electronic record systems, they feel they can more effectively manage the health of their patients. Healthcare leaders are realizing this is an opportunity and they are looking for ways to make it happen and move forward.”

McGeorge worked with one hospital that had 250,000 patients in its EHR database and knew that within that group there were 50,000 smokers. The hospital used the data to reach out to each individual and offer smoking cessation assistance and other information encouraging a healthy lifestyle. Many diseases and other issues can be tracked in the same way and individuals identified for intervention, she notes.

In comparison to other industries, the healthcare industry is lagging in making the most of big data, she says. The retail industry, for example, uses customer data to target individual customers with good results. She recalls one department store that identified customers buying pregnancy tests and offered those customers information on pregnancy care, diapers, and other baby items. The same rationale could be used for identifying patients who are likely to follow a pattern and respond to them at appropriate stages, she says.

“The healthcare industry has focused on providing good clinical care to their patients, and they typically achieve that with high marks,” McGeorge says. “But sometimes the organization doesn’t see the analysis of big data as a clinical function or even a core function. By moving more toward analyzing big data, hospitals can evaluate how they’re doing in treating sick patients and in keeping well patients out of the hospital.”

Start with Understanding EHR

Hospital leaders seeking to make more use of big data should start with a deep dive into their EHR system, McGeorge says. Hospitals with an enterprise resource planning (ERP) system will find it a good source of data as well.

“The ones with a robust ERP system can actually do a deep dive into the analytics covering, for example, the cost of certain procedures, even to the cost of caring for that patient from the moment he or she walks in the door to the moment the patient is discharged. Knowing the fully loaded cost of taking care of a patient is a huge step toward being able to analyze profitable departments, procedures, and pharmaceutical protocols, which can change behaviors and create more efficiencies in how some of the clinical care is delivered.”

Hospitals also should look at the capabilities of their existing systems, she says. Many hospital leaders do not fully understand what their existing technology can do, or whether a big data effort would require a significant investment in more technology, she says. System interoperability also is important, she says.

Objective, Unbiased Analysis

New technology can make use of data that researchers never could analyze in large volumes, notes Josh Bach, managing director of the enterprise improvement group at Van Conway & Partners, a management consulting firm based in Birmingham, AL. He recently was involved with a project that used IBM Watson to study video recordings of interviews with Parkinson’s patients.

“Utilizing and leveraging big data allowed us to feed into Watson all of the interviews and writing samples to find facial expressions, verbal expressions, hints, or degradation in the handwriting, to predict the disease earlier,” Bach says. “The analysis also enabled us to monitor those with the potential for the disease and follow their progress for any similarity to the data from people who had Parkinson’s.”

Bach worked on a similar project with a team of researchers and clinicians who were using big data to review all oncology pre- and post-marketing trials to see if Watson could glean better treatment algorithms that would be free of the researcher bias that has plagued many studies in the past. The clear majority of cancer trials are on single drugs, but most treatment is approached with combination therapy.

Oncology, with its regimented dosing schedules and comparatively immense data tracking, is a more attractive target for cognitive computing, Bach notes. Unlike other areas of medicine, oncology is the most likely to have accurate inputs on dosing and compliance due to the life-threatening nature of the disease and high cost of the medication, he says.

The big data analysis helped overcome biases that may have affected other assessments, he says.

“You’re not doing double-blind, placebo-controlled trials on cancer patients because it’s unethical. And you have cases in which the researcher has some sort of bias or a certain hypothesis about treatment or dosing, and thus, many times a well-intentioned and well-qualified group of people will come up with a conclusion that supports their preconceived bias,” Bach says. “With the big data and the Watson computer, there is no preconceived bias. We’re able to look at all research that has been published and digest in such a way that you’re going to get outputs suggesting certain treatment algorithms are going to be favorable, without the bias of a few key opinion leaders or research institutions that may have had a vested interest in one conclusion over another.”

Big data also can be influential with monitoring compliance and key performance indicators, says David Kaufman, JD, a partner in the Healthcare Practice Group with the law firm of Freeborn & Peters in Chicago.

“Quality through the lens of efficiency, standardization, and compliance can be enabled pro-actively to ensure efficiencies and make immediate improvements,” he says. “For example, quality KPIs [key performance indicators] can be launched via business rules or by a triggered alert to acquire data based on outliers or thresholds. If those specified KPIs are not met, it’s immediately reported, signaling where, when, and, in some cases, why the threshold was violated.”
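The threshold-driven alerting Kaufman describes can be sketched as a small set of business rules checked against incoming metrics. This is a hypothetical illustration; the KPI names, thresholds, and readings are invented.

```python
# Hypothetical business rules: each KPI has a min and/or max
# threshold. Rule names and values are invented for illustration.
KPI_RULES = {
    "door_to_doc_minutes": {"max": 30},
    "hand_hygiene_rate": {"min": 0.90},
}

def check_kpis(readings, rules=KPI_RULES):
    """Return (kpi, value, reason) triples for every threshold
    violation -- the 'triggered alert' Kaufman describes."""
    alerts = []
    for kpi, value in readings.items():
        rule = rules.get(kpi, {})
        if "max" in rule and value > rule["max"]:
            alerts.append((kpi, value, f"above max {rule['max']}"))
        if "min" in rule and value < rule["min"]:
            alerts.append((kpi, value, f"below min {rule['min']}"))
    return alerts

# A 42-minute door-to-doctor time violates the 30-minute maximum.
print(check_kpis({"door_to_doc_minutes": 42, "hand_hygiene_rate": 0.95}))
# [('door_to_doc_minutes', 42, 'above max 30')]
```

In a real deployment the rules would be maintained by quality staff and the alerts routed to the people who can act on them, but the core check is this simple comparison run continuously against live data.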

Big Data Can Address Population Health

Population health management and care coordination can benefit from the use of big data, including not only clinical records, but also data indicating social determinants of health, says Linda Lockwood, RN, MBA, PCMH CCE, managing director of the health solutions group for Computer Task Group, an IT staffing company based in Buffalo, NY.

“Social determinants of health” is a term and concept that has been getting more attention lately, focusing on the idea that socioeconomic factors play a significant role in an individual’s health and the outcomes of healthcare.

“Big data has a connection in driving that forward,” she says. “When we look at care coordination and empowering someone to be healthy, we have to understand more than the tests that were ordered and drugs prescribed. We have to understand where they live and what those environmental factors are that influence their health.”

ZIP code data, for instance, is readily available and can provide insight into the social determinants of health, Lockwood says. That information can be helpful in identifying how many grocery stores or pharmacies are in the area, for example, or the availability of public transportation.

“In one case with a community health center and medical home, we found there were no food resources except fast food and they had a high population of diabetics,” she says. “That told us we needed to offer alternatives and educate people about the importance of healthy eating. We ended up setting up a farmer’s market at the hospital and a clinic where patients could come in and get their A1c, meet in a group setting, prepare food with a nutritionist, and have lunch.”

The same community had few parks or exercise opportunities, partly because the neighborhood was dangerous and people were not comfortable exercising outdoors or going to a gym. The solution was organizing an exercise group that would meet regularly at a mall.

“When you start to think of non-healthcare data that influences how that patient is going to take care of themselves, big data becomes very important,” Lockwood says. “Merely assigning a care coordinator and calling them to remind them to check their blood pressure isn’t going to be enough to move the needle and improve those outcomes.”

Children’s Hospital Uses Big Data to Stop Overdose Errors

Dosing errors are a concern for pediatric facilities because patient weights vary dramatically, and most medication doses are based on weight. A dose that is appropriate for an adolescent would be an overdose for an infant, and, conversely, an infant dose would likely provide no benefit to an adolescent.

In addition, with premature babies and other small patients, even a slight deviation from the correct dose could be life-threatening.

Vinay Vaidya, MD, chief medical information officer at Phoenix Children’s Hospital, led a team that developed a dose range checking system intended to eliminate 100% of overdosing errors at the prescribing stage, before the error could affect the patient.

Big data was essential to finding the solution, according to a summary provided by the hospital. The team analyzed more than 750,000 prescription orders from Phoenix Children’s across an eight-year period, combining that information with drug dosage reference information. The analysis also factored in feedback from pharmacists and physicians to define thresholds for “very high” and “dangerously high” doses.

The extensive analysis prompted Phoenix Children’s to develop a two-tiered alert system within the hospital’s electronic health record (EHR) that uses the big data information to instantly assess the accuracy of a prescription to determine if it is within the acceptable range for that patient. A drug order that deviates by a relatively small amount returns a “soft stop” that suggests double checking the order. An order that deviates significantly — such as when an adolescent dose is prescribed for an infant — will produce a “hard stop” that requires the prescriber to consult a pharmacist about amending the dosage before the order can be fulfilled.
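The two-tiered check described above can be sketched as a simple classification of each order against weight-based dose limits. The drug name, the safe dose, and the soft/hard multipliers below are invented for illustration; Phoenix Children’s derived its actual thresholds from hundreds of thousands of historical orders plus pharmacist and physician review.

```python
# Hypothetical upper safe dose in mg per kg of body weight.
# The drug name and value are invented for illustration.
SAFE_MG_PER_KG = {"drug_x": 10.0}

def check_order(drug, dose_mg, weight_kg,
                soft_factor=1.0, hard_factor=2.0):
    """Classify a prescription as 'ok', 'soft stop' (double-check
    prompt), or 'hard stop' (pharmacist consult required)."""
    limit = SAFE_MG_PER_KG[drug] * weight_kg
    if dose_mg > limit * hard_factor:
        return "hard stop"   # dangerously high: block until amended
    if dose_mg > limit * soft_factor:
        return "soft stop"   # very high: suggest double-checking
    return "ok"

print(check_order("drug_x", 150, 6))   # adolescent-size dose, 6 kg infant
print(check_order("drug_x", 80, 6))    # modest deviation above the limit
print(check_order("drug_x", 70, 60))   # within range for a 60 kg patient
```

The real system sits inside the EHR and evaluates every order as it is entered, but the decision at its core is this comparison of the ordered dose against patient-specific thresholds.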

The hospital has not had a dosing error reach the patient since the pediatric dose range checking system was introduced in 2011. In 90% of hard stops, the physician either reduces the dose to a safe level or determines that the order was erroneous and cancels the prescription.

In the first four months after launch, the system generated 11 hard stops for intravenous potassium, which can be lethal in overdoses.
