Health Catalyst (https://www.healthcatalyst.com)

The Best Solution for Declining Medicare Reimbursements

Tue, 09 Jul 2013

Editor's Note: Bobbi has updated this piece for March 2018.

Given the continued trends for declining Medicare reimbursements, I am one of the brave souls who takes the time to read the report issued each spring by the Medicare Payment Advisory Commission (MedPAC). MedPAC is a nonpartisan agency that provides Congress with analysis about the Medicare program, and many of its reports are used to set policy for the program. The push for quality is strong, and MedPAC is working to provide the right incentives for quality.

Though the reading can be a bit dry, the information in the report is important—particularly because the commission's recommendations are influential in shaping policy, and Medicare payment policies tend to set a precedent for other payers.

I had the chance to read a variety of the documents published by MedPAC and reviewed two of them:

The Report to Congress covers 10 areas, from stand-alone EDs to provider payment. I focus on just a couple of those areas below and highlight a few key findings from the data.

Spending

The reports show the spending trends are not affordable long-term. We have been able to lower the per-beneficiary spend, but we need to sustain that trend.

Medicare per-beneficiary spend grew at an annual rate of just 1 percent for the period 2010 to 2015.

Projections for 2016-2025 show a higher annual growth rate of 4 percent in the per beneficiary spend.

During the same period, the aging of the baby-boom generation will increase enrollment by 3 percent annually. By 2030, there will be 81 million beneficiaries in the Medicare program.

Overall Medicare spending will increase 6-7 percent each year, a higher projection than for the gross domestic product (GDP), which is projected to increase 5 percent annually.
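The relationship between these growth rates can be checked with a quick back-of-the-envelope calculation. The sketch below uses only the percentages quoted above; treating them as exact, uniform compounding rates is a simplifying assumption.

```python
# Back-of-the-envelope check: 4 percent per-beneficiary growth compounded
# with 3 percent enrollment growth yields roughly 7 percent total spending
# growth, above the 5 percent GDP projection.
per_beneficiary_growth = 0.04  # projected annual per-beneficiary spend growth
enrollment_growth = 0.03       # projected annual enrollment growth
gdp_growth = 0.05              # projected annual GDP growth

# Total spending grows by the compounded product of the two rates.
total_growth = (1 + per_beneficiary_growth) * (1 + enrollment_growth) - 1
print(f"Projected total spending growth: {total_growth:.1%}")  # about 7.1%
print(f"Exceeds projected GDP growth: {total_growth > gdp_growth}")
```

The compounded result lands squarely in the report's 6-7 percent range, which is why spending outpaces GDP even though per-beneficiary growth alone does not.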

Utilization trends

From 2006 to 2015 in the Medicare program, outpatient visits per beneficiary increased 47 percent and inpatient admissions declined by almost 20 percent. See the chart below illustrating this shift in services.

During the same time, Medicare length of stay decreased 7.7 percent. These facts will impact the location of services for the populations served.

[Chart: outpatient visits and inpatient admissions per Medicare beneficiary, 2006-2015. Source: MedPAC analysis of CMS claims]

There have been a couple of shifts in the type of cases admitted to the hospital between 2006 and 2015.

Circulatory system cases accounted for one-fifth of all inpatient discharges in 2015, a decline of 6 percentage points from 2006.

Musculoskeletal system cases accounted for 14 percent of all inpatient cases, an increase of 2 percentage points from 2006.

Infectious and parasitic disease cases accounted for 9 percent of all inpatient discharges, an increase of 5 percentage points from 2006.

The outpatient shift will continue, and we need to plan facilities and account for consumer preference accordingly.

Readmission

In 2013, the Hospital Readmission Reduction Program (HRRP) started calculating penalties for hospitals that have above-average readmission rates for selected conditions. From 2010 to 2015 the potentially preventable readmission rates declined from 12.9 percent to 10.5 percent for all conditions. The three conditions covered under the HRRP beginning in 2013 have experienced declines in potentially preventable readmission rates. Readmissions for AMI declined 3.6 percentage points, heart failure declined 3.1 percentage points and pneumonia declined 2.5 percentage points.

CMS started a program that imposes a negative payment adjustment, and hospitals have responded to the penalty by working to lower readmissions.

Discharge disposition

In 2015, about 46 percent of all Medicare fee-for-service patients were discharged from an acute care hospital to home, without any organized post-acute care. This represents a decrease of 7 percentage points from 2006. More Medicare patients are being discharged to post-acute care services: 21 percent are discharged to skilled nursing care, and 17 percent go home with organized home-health services. MedPAC is recommending implementation of a unified payment system for post-acute care. Medicare currently pays differently for similar patients depending on the post-acute venue chosen, and the supply and use of post-acute providers varies considerably across the country. MedPAC would like payments based on patient characteristics rather than site of service.

Bundled payments would help in this area and place accountability with the organization that assumes the risk. CMS may also revamp the payment system for post-acute care by 2020.

Concentration of spending

The costliest 25 percent of beneficiaries account for 84 percent of the spend. The costliest areas of spend include dual eligibles (Medicare and Medicaid recipients), beneficiaries with multiple chronic conditions, users of inpatient services, and those in the last year of life.

Low-value care

MedPAC devoted several charts to low-value care, defined as the provision of a service that has little or no clinical benefit. The commission used a set of 31 measures developed by a team of researchers, then applied dollars to the list; the spend for 2014 was $2.4 to $6.5 billion. The list includes testing, imaging, and screening. Some high-spend examples are stress testing for stable coronary disease and imaging for nonspecific low back pain. MedPAC stated this is just a start to determining the cost of low-value care.

Providers should be evaluating the 31 measures in their institutions. We may currently get revenue for the services, but we need to shift to a value lens.

Addressing Declining Medicare Reimbursements

I am hoping you can see from these facts that those recommending new policies will continue to push providers for quality and value. More beneficiaries are entering the program and there will be constant pressure to control the cost. There is a willingness to pay for better outcomes. Using data to derive insightful analytics through platforms such as the Health Catalyst® Data Operating System (DOS), healthcare organizations can approach these changes with confidence that they can adapt and thrive in a more value-based environment.

Five Data-driven Patient Empowerment Strategies

Tue, 13 Mar 2018

Most healthcare outcome improvements are attributed to clinical, operational, or financial process changes in the acute care setting. It’s often skilled clinicians who get the credit for reducing readmissions, length of stay, or costs of care—and rightfully so. But patients, too, can play a considerable role in improving their own care. Many patients, especially those with chronic conditions, want to play this role; they just need the assets (e.g., tools, information, environments) to participate. Data and digitized health—and instruction on how to use them—are proven means for building patient empowerment strategies. This shifts the balance of care from the hospital and clinician to the community and patient, and produces meaningful, patient-centered outcomes.

If clinicians can understand how data empowers their patients, and why empowerment is important, they can realize greater efficiencies in clinical processes and achieve outstanding outcome improvements. As highlighted in this article, unique case studies around the world show the powerful impact of data-driven patient engagement.

What Empowering Patients Means

Empowering patients means giving them control over their healthcare in ways that clinicians may not typically consider. Patients want ownership over the things that matter most to them, such as measuring and monitoring their own biometrics and having access to condition-specific information when and where they want it. Clinicians can help patients understand the important role they have in managing their own health challenges.

Beyond the time they spend with doctors and nurses, patients with chronic disease spend 5,000 hours a year on other activities that directly affect their health: deciding what to eat or drink, deciding how certain exercise and activities will impact their lives, and deciding what medications to take and what medical advice to follow. These decisions take place outside the clinical setting, yet the typical healthcare model focuses on optimizing the limited time patients spend with their doctors. Empowering patients means giving them data—and the technology and education to process it—so they can more effectively use, or reduce, those 5,000 hours. Many unique programs exist to demonstrate the positive impact and time-saving potential of empowering patients with technology.

The NHS in the United Kingdom is successfully using a program that gives patients more control over their daily, at-home care and optimizes the time they spend on these activities. A team at NHS Stoke on Trent (now Stoke Clinical Commissioning Group) created the program, Florence (Flo). Flo is a simple telehealth system that uses mobile text messages to help people manage their chronic conditions and other healthcare issues. Flo sends patients reminders and helpful tips customized to their needs, whether it’s managing diabetes, living with chronic obstructive pulmonary disease, or learning how to breastfeed. It’s a perfect illustration of using data and digital health to empower patients to manage their disease.
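The pattern behind a system like Flo can be sketched in a few lines: map each patient's enrolled condition to tailored reminder content. The conditions and message text below are invented for illustration and are not Flo's actual content or protocol.

```python
# Hypothetical sketch of condition-specific text reminders in the style
# of a telehealth system like Flo; messages are illustrative placeholders.
REMINDERS = {
    "diabetes": "Good morning! Please text back your blood glucose reading.",
    "copd": "Time for your inhaler. Reply WORSE if your breathing changes.",
}

def daily_message(condition: str) -> str:
    """Return the reminder tailored to a patient's enrolled condition."""
    return REMINDERS.get(condition, "Reply HELP to reach your care team.")

print(daily_message("diabetes"))
```

The value is less in the code than in the design: the schedule and wording live in data, so clinicians can customize reminders per condition without changing the delivery system.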

Five Ways Data Builds Patient Empowerment Strategies

Empowered patients are a potent resource for generating outcomes improvement. Empowering them with data and innovative ways to manage their own care can generate impressive results. Here are five ways data empowers patients to become owners of their own healthcare journeys.

1) Data Promotes Patient Engagement

Patient engagement has been likened to a blockbuster drug that, if not used, should be considered a form of malpractice. Empowering patients with easily accessible and actionable data increases their engagement and results in better clinical outcomes, greater patient satisfaction, and lower costs.

Kaiser Permanente ran a Collaborative Cardiac Care Service pilot program that used technology and data to develop patient engagement through proactive patient outreach and education. Patients in the program had an 88 percent reduced risk of dying of a cardiac-related cause when enrolled within 90 days of a heart attack, compared to those not in the program. Clinical care teams reduced overall mortality by 76 percent and cardiac mortality by 73 percent. Patients engaged by the coordinated care program clearly fared better.

2) Data Produces Patient-Centered Outcomes

Data helps patients achieve the outcomes they desire and gives them independence rather than simply having them follow clinical plans. When clinicians ask patients what matters to them, rather than what’s the matter with them, a very different picture can emerge of what’s important in their healthcare. What’s important to patients are outcomes that define their ability to function well, feel well, and survive longer. And costs often decrease when clinicians follow patient wishes and focus on patient-centered outcomes, rather than following strict clinical protocol.

My Life, My Dialysis Choice effectively puts data and technology into patients’ hands. It’s a patient-centered app that takes patient preferences into account while they make dialysis choices. The app asks questions about a patient’s preferred lifestyle and goals while on dialysis, then tailors a program to match those preferences. How important is travel? Spending time with family? Keeping a pet? Avoiding pills? Based on the answers, the app asks for a star ranking of different treatment options, then advises on next steps.
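One way such preference-driven ranking could work is a simple weighted score: multiply each option's fit on a lifestyle factor by the patient's star rating for that factor. The options, factors, and numbers below are invented for illustration and do not reflect the app's actual logic or any clinical guidance.

```python
# Hypothetical weighted-preference ranking in the spirit of a tool like
# My Life, My Dialysis Choice; all values below are illustrative.

# How well each treatment option supports each lifestyle factor (0-5).
OPTION_SUPPORT = {
    "in-center hemodialysis":   {"travel": 1, "family_time": 2, "fewer_pills": 3},
    "home peritoneal dialysis": {"travel": 4, "family_time": 5, "fewer_pills": 2},
}

# Star ratings (1-5) the patient assigns to each lifestyle factor.
patient_priorities = {"travel": 5, "family_time": 4, "fewer_pills": 2}

def score(option: str) -> int:
    """Weight each factor's support by how much the patient values it."""
    return sum(patient_priorities[f] * OPTION_SUPPORT[option][f]
               for f in patient_priorities)

ranked = sorted(OPTION_SUPPORT, key=score, reverse=True)
print(ranked[0])  # the option whose strengths best match the priorities
```

For this hypothetical patient, who rates travel and family time highly, the home option scores far higher—exactly the kind of patient-centered result the app surfaces before a clinical conversation.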

My Life, My Dialysis Choice focuses on what matters to the patient and promotes patient engagement. It’s an example of how data empowers patients to choose appropriate care and improve their own health.

3) Data Helps Patients Practice Self-Care

Data and technology can help patients practice self-care. When patients administer their own care, the outcome improvements can be remarkable. Ryhov County Hospital in Jönköping, Sweden, evolved into a self-dialysis center because one patient asked about doing the procedure himself. He was taught to use the dialysis machine, interpret lab values, and document his care. Self-dialysis led to less variation and fewer complications. The patient and staff educated other patients and, eventually, a new wing was dedicated solely to deeply engaged patients performing self-dialysis. Today, 60 percent of all dialysis patients self-administer, and the center is heading toward a goal of 75 percent. There are fewer complications, some patients avoid subsequent care, and per-patient costs are lower by one-third to one-half.

4) Data Improves Communication Between Clinicians and Patients

Patients often misunderstand the language of healthcare. Care instruction for patients with chronic and long-term health conditions requires carefully targeted communication in terms that patients understand. Giving patients access to their own data and control over generating that data (e.g., self-dialysis, self-measuring, self-screening) significantly improves how patients comprehend and complete their care instructions.

Sharon Rising, a midwife at Boston Medical Center, created a new model for prenatal care that rethinks data and data-generating tools. When a woman learns she is pregnant, she can join the Centering Pregnancy program, a group care model that includes 10 two-hour, group visits with other moms due in the same month. Visits are conducted in a conference room rather than an exam room. Each patient measures and records her own vital signs, then shares the record with her physician and midwife, shifting the balance of care from the clinician to the patient. The group dynamic involves sharing, relationship building, and reassurance from other moms who are going through the same life-changing process.

This model of care reduced the risk of preterm birth by 33 percent, reduced racial disparities for preterm births, and nearly doubled the number of Centering Pregnancy participants who breastfed over those in a comparison study.

5) Data Leads to Faster Healing and Independence for Patients

Data helps clinicians think outside the walls of the acute care setting to develop new protocols that directly impact patients and lead to substantial outcome improvements. Sheffield Teaching Hospitals in the United Kingdom needed a better way to discharge older patients. It considered discharging these patients to their homes first, and then assessing their support needs in the environment where they live rather than inside the hospital. Early in the process, the team was unsure that the “admit to home” idea would work, but through several plan-do-study-act cycles, processes improved and a new program emerged. It eventually grew to serve 10,000 patients in one year, who got home three to four days faster, saving 30,000 to 40,000 acute-care bed days. Data and a unique patient engagement strategy generated dramatic cost, process, and patient experience improvements.

Healthcare needs to adapt to numerous changes sweeping through the industry. Patient expectations of healthcare delivery and outcomes are changing as technology permeates society and the healthcare economy. Healthcare providers can take advantage of these opportunities and adapt in ways that shift the conventional thinking of how healthcare should be delivered.

Patients want to be involved in practicing their own care, managing their own conditions, monitoring their own results, and producing their own outcomes. Asking what matters to patients, rather than what’s the matter with them, reduces the burden on clinical staff and resources, improves communication between patients and clinicians, and significantly increases healthcare’s value.

Clinicians must empower patients with data and tools that give them appropriate decision-making power and control over their own care. Shifting the balance of care from clinicians and hospitals to patients and the community can improve the patient experience, reduce costs, and improve processes to make healthcare more efficient.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

10 Motivational Interviewing Strategies for Deeper Patient Engagement in Care Management

Thu, 08 Mar 2018

How do care managers get patients to enroll in care management and keep them actively involved in their own care? With patient engagement. Patients who are engaged in their care become partners with their care team, setting goals and finding solutions that best meet their individual needs and circumstances. To effectively engage patients, care managers need the right interviewing techniques and the right technology to support and sustain patient engagement.

This article explains how motivational interviewing can help care managers more effectively engage patients and partner with them to better understand patient care needs, goals, and concerns. In care management, motivational interviewing is a collaborative approach, between the care manager and the patient, that’s focused on strengthening the patient’s motivation to adhere to the care plan and change behaviors that interfere with better health. With supportive technology, care managers can further optimize motivational interviewing and achieve the best possible care management outcomes.

In healthcare today, patients are the center of the care team, making patient engagement pivotal in care management. Historically, clinicians took a more directive approach to care, giving patients a limited role in the decision-making process. But as healthcare continues its shift to a patient-centered approach, care managers are increasingly seeking effective ways to engage patients.

Patients Make a Commitment to Better Health on Their Own Terms

The importance of patient engagement in care management starts before a patient even enters a care management program. Some patients may be hesitant to enroll in a care management program; they may feel they can manage their own care or are worried about the cost of care management and how it impacts health insurance.

The first task for care managers is to engage patients in a way that encourages them to participate in care management and communicates that care managers are partners in care, not instructors or directors. Patients will then be more likely to make better healthcare choices and identify realistic, achievable goals they can sustain—and feel empowered doing so.

To fully leverage the benefits of patient engagement, care managers need an interviewing technique, such as motivational interviewing, that helps clinicians and patients set healthcare goals that reflect the patient’s desires and circumstances. Together, care managers and patients will set weekly goals, as determined by the patient, and decide on the best ways to communicate with the patient to keep them involved. For example (depending on factors including age and personal preference), some patients will prefer to communicate via text or in an app, whereas others will respond better to a phone call or visit.

Motivational Interviewing in Healthcare Empowers Patients

Care managers can use motivational interviewing to empower patients in their own care, rather than projecting outside goals onto an individual’s situation. The motivational interviewing method of engaging patients was developed by clinical psychologist William Miller in 1983 to address substance abuse disorders. Over the years, however, research has shown that the technique is effective at reducing many potentially risky behaviors (e.g., gambling and excessive drinking) and promoting healthy behaviors (e.g., improving diet, exercising more, and adhering to a medication regimen).

The heart of motivational interviewing is the ability to sustain empathy with patients during conversations, rather than being directive. With motivational interviewing, care managers can also identify the type of talk that will best serve patients and encourage them to follow their care plans.

What’s known as change talk includes three levels:

The desire to change (“I want to take my medication as prescribed”).

The ability to change (“I can ask a family member to go to the pharmacy for me”).

The need to change (“If I don’t take my medication, I may be readmitted to the hospital”).

Patients sometimes get stuck in what motivational interviewing calls sustain talk; they express that they cannot commit to positive change (e.g., “I quit smoking two years ago, but I started again and can’t quit this time”). The care manager can reframe that statement by asking, “What did you do before that helped you quit?”

An important part of motivational interviewing is to guide the patient towards change talk and a commitment to achieve a positive goal (commitment talk). Expressed at the end of a motivational interviewing session, commitment talk seals the patient’s commitment to a care management goal (e.g., “I am going to take my medication every day, as prescribed”).

10 Motivational Interviewing Strategies

Care managers who are accustomed to teaching patients about their care needs and plan (a directive approach) risk missing a critical element: the patient perspective on and motivation around improving their overall health. By understanding a patient’s concerns (e.g., stress related to a new diagnosis, medication, care schedules, or financial concerns), care managers can better identify barriers to care and work with the patient to find the best solution.

Care managers can use 10 strategies for motivational interviewing to build trust with patients, engage them in their own care, and help them find motivation to adhere to their care plans:

Strategy #1: Ask a question that will prompt change talk as an answer. For example, “What are some things you can do to make sure you take your medication regularly?”

Strategy #2: Ask for the pros and cons of both changing and staying the same. For example, “How will taking your medication lower your risk of hospital readmission? How will another hospital readmission (i.e., continuing to miss medication doses) impact you?”

Strategy #3: Ask about the positives and negatives of the target behavior. For example, “How will taking your medication improve your condition? What are the negative impacts of taking your medication (e.g., cost, side effects)?”

Strategy #4: When the patient expresses a change talk theme, ask for more details. For example, “In what ways? Tell me more. When was the last time that happened?”

Strategy #5: Ask about a time before the patient enrolled in care management. For example, “How were things different before your care management program?”

Strategy #6: Ask what may happen if the patient makes the changes according to their care management plan. For example, “If you follow all your care management recommendations, what will be different? How do you see your health five years from now?”

Strategy #7: Ask about extreme outcomes. For example, “What are the worst things that might happen if you don’t follow your care management plan? What are the best things that might happen if you follow the plan?”

Strategy #8: Offer ways to clearly measure the impact of care management. For example, “On a scale from one to 10 (where one is not at all important and a 10 is extremely important), how important is it to improve your health? What do you think you can do to get closer to a 10?”

Strategy #9: Ask about the patient’s main health goals. For example, “Do you want to be healthy enough to travel this summer? What upcoming family events do you want to attend?”

Strategy #10: Think like the patient and reframe any barriers into a positive strategy. For example, “Taking your medication every night before bed is a hassle. How about taking it in the morning instead?”
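Strategies like these can be encoded as a reusable, tailorable conversation guide that a care manager fills in with patient-specific details. The sketch below shows one possible structure; the strategy names, template wording, and field names are illustrative and not taken from any particular care management product.

```python
# Hypothetical conversation-guide structure for motivational interviewing;
# strategy names and question wording are illustrative placeholders.
from string import Template

GUIDE = {
    "prompt_change_talk": Template(
        "What are some things you can do to make sure you $behavior regularly?"),
    "look_forward": Template(
        "If you follow your care plan, how do you see your $goal_area five years from now?"),
    "scale_importance": Template(
        "On a scale from one to 10, how important is it to improve your $goal_area?"),
}

def tailor(strategy: str, **details: str) -> str:
    """Fill a strategy's question template with patient-specific details."""
    return GUIDE[strategy].substitute(**details)

print(tailor("prompt_change_talk", behavior="take your medication"))
# → What are some things you can do to make sure you take your medication regularly?
```

Keeping the questions in a data structure, rather than hard-coded, is what lets the guide be adapted per patient as feedback comes in.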

Technology to Make the Most of Motivational Interviewing

With the 10 motivational interviewing strategies in place, care managers are prepared to help patients find their own motivation to follow their care management plans. To further strengthen motivational interviewing, care managers can use technologies specifically developed to support care management. The Health Catalyst Care Management Suite, for example, hosts applications that can support the motivational interviewing process and help care managers optimize patient engagement.

Because Health Catalyst’s care management tools are customizable, health systems can build conversation guides that lead the care manager through motivational interviewing. The care manager can continue to tailor the tool for each patient by adapting questions based on patient feedback and reframing the language to recognize the patient’s goals and commitment to those goals.

In addition, other care management tools can survey patients to capture patient-reported outcome measures and assess patient perception of their functional well-being and health. These surveys can encourage patients to think about new questions and look at their health in a different way. Throughout the motivational interview process, the care manager can use a patient channel on the care coordination application to monitor patient progression and provide positive feedback and education.

Patients also remain engaged in the motivational interviewing process with care management applications and can communicate with their care teams in real time. For example, by using the Health Catalyst Care Companion, a mobile app available on a smartphone, patients can securely message their care team and access their health-related content. With the assistance of these tools and applications, the care manager can optimize the motivational interviewing process and partner with the patient to achieve positive behavioral change.

Patient Engagement Is the Heart of Care Management

As healthcare increasingly moves toward patient-centered care, and new technologies support more patient involvement, patient engagement will continue to be a critical component of healthcare delivery. This is particularly true in care management, where long-term partnerships between care managers and patients will drive better outcomes at reduced costs.

By following a 10-step strategy for motivational interviewing and leveraging technologies that facilitate and sustain patient engagement, care managers can help patients become true partners in their own care, empowering them to make changes to achieve better health.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Health Equity: Why it Matters and How to Achieve it

Tue, 06 Mar 2018

Health inequities—defined by the World Health Organization as systematic differences in the health status of different population groups—have been in the national spotlight for years, which isn’t surprising given that the U.S. ranks last on measures of health equity compared to other industrialized countries.

Health inequity is a multiple-industry issue with significant impacts (health, social, economic, etc.) on people and communities. Racial health disparities alone are projected to cost health insurers $337 billion between 2009 and 2018.

Healthcare organizations are increasingly making health equity a strategic priority, with varying degrees of success.

How can we tell if health equity has been achieved? “When everyone has the opportunity to attain full health potential, and no one is disadvantaged from achieving this potential because of social position or any other socially defined circumstance,” according to a Robert Wood Johnson Foundation (RWJF)-commissioned Communities in Action: Pathways to Health Equity report (a year-long analysis by a 19-member committee of experts in national public health, healthcare, civil rights, social science, education, research, and business).

Many healthcare organizations, such as Allina Health, have initiated efforts to improve health equity by making it a systemwide strategic priority and investing in the right resources, infrastructure, and programs, which we’ll outline in this article. Other systems, however, are still largely unaware of the inequities and disparities within their walls.

Healthcare has a long way to go to effectively address health inequity, but there are evidence-based approaches to start tackling—or continue the battle against—health inequities. This article explores approaches, both simple and complex, health systems can implement to work toward restoring health equity.

Figure 1: Life expectancy of babies varies by neighborhood in New Orleans

We can start to answer these questions by understanding what causes health inequity, as described in RWJF’s Communities in Action report:

Intrapersonal, interpersonal, institutional, and systemic mechanisms (i.e., structural inequities) that organize the distribution of power and resources differently across lines of race, gender, class, sexual orientation, gender expression, and other dimensions of individual and group identity.

Health inequities are the result of more than individual choice or random occurrence; they are the result of poverty, structural racism, and discrimination. Health systems are just one cog in the wheel of the health inequity issue, but the role they play in the problem is a big one.

Healthcare’s Role in Disparities

Looking at race- and ethnicity-related disparities, for example, differences in access to care, receipt of needed medical care, and receipt of life-saving technologies for certain populations “may be the result of system-level factors or may be due to individual physician behavior,” according to an NCBI article. The article states that “patient race/ethnicity has been shown to influence physician interpretation of patients’ complaints and, ultimately, clinical decision making.”

The literature shows that clinicians have biases toward certain populations that impede their ability to provide effective care. Over time, these biases become institutionalized and harder to eliminate. Given that the perceived quality of healthcare (or lack thereof) can significantly impact health outcomes (e.g., adherence to medical advice, cancer screening recommendations, and medication regimens), many health systems find themselves in a self-perpetuating cycle of health inequities and poor health outcomes. Health systems exacerbate their health inequity problems when they don’t have the required data (e.g., socioeconomic) or healthcare delivery structure to discover and correct disparities.

Given that health disparities are shaped by multiple determinants of health (social, economic, environmental, structural, etc.), achieving health equity requires engagement from not just healthcare, but also education, transportation, housing, planning, public health, and many other industries and businesses. Achieving health equity is a communitywide effort.

How to Make Health Equity a Strategic Priority

IHI says “health care professionals can—and should—play a major role in seeking to improve health outcomes for disadvantaged populations.” Healthcare organizations committed to outcomes improvement must also be committed to health equity, and their first step is making it a systemwide, leadership-driven priority.

Make health equity a leader-driven priority (healthcare leaders must articulate, act on, and build the vision into all decisions).

Develop structures and processes that support equity (health systems must dedicate resources and establish a governance structure to oversee the health equity work).

Take specific actions that address the social determinants of health (health systems must identify their health disparities and the needs and assets of people who face disparities, and then act to close the gaps). Some patient populations need additional support to achieve the same health outcomes as other patient populations (e.g., they need someone to drive them to appointments, they need home visits, etc.).

Confront institutional racism within the organization (health systems must identify, address, and dismantle the structures, policies, and norms that perpetuate race-based advantage).

Partner with community organizations.

Making health equity a strategic priority is the first step. Next, healthcare organizations need to tackle the disparities with proven interventions designed for their disadvantaged populations. The RWJF outlines specific steps health systems can take to address disparities:

Adopt new vital signs to screen for the nonmedical factors influencing health.

Commit to helping low-income and non-English-speaking patients get the care they need.

Guard against the potential for bias to influence medical care.

Make sure elderly, women, and racial/ethnic minorities are adequately represented in clinical trials.

Understand the effects of adverse childhood experiences and use trauma-informed care.

Address the Socioeconomic Determinants of Health

Let’s take a closer look at how health systems can incorporate nonmedical vital signs into their health assessment processes to paint a more detailed picture of their patients’ health. RWJF’s Time to Act: Investing in the Health of Our Children and Communities report states that adding nonmedical vital signs (employment, education, food insecurity, safe housing, exposure to discrimination or violence, etc.) to existing ones (heart rate, blood pressure, weight, etc.) can help clinicians make better-informed decisions about treatment and care.

The report notes that new vital signs should be objective, readily comparable to population-level data, and actionable.

Adding nonmedical vital signs to health assessments facilitates healthcare and community collaboration by prompting patient referrals to community resources and improving clinician understanding of patients’ lives outside of the hospital or clinic. It’s this blurring of the lines between health systems and community organizations that will ultimately bridge the health inequity gap.
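As a concrete (and purely hypothetical) sketch of what screening with nonmedical vital signs could look like in software, consider a record type that pairs traditional vitals with social-determinant flags and maps them to community referrals. All field names, flags, and referral categories below are illustrative assumptions, not any specific system’s schema:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScreeningRecord:
    """A patient screening that combines traditional and nonmedical vital signs.

    All fields and thresholds are hypothetical, for illustration only.
    """
    heart_rate: int
    systolic_bp: int
    employed: bool
    food_insecure: bool
    stable_housing: bool
    exposed_to_violence: bool

    def community_referrals(self) -> List[str]:
        """Return the community resources this screening suggests."""
        referrals = []
        if self.food_insecure:
            referrals.append("food assistance program")
        if not self.stable_housing:
            referrals.append("housing support services")
        if not self.employed:
            referrals.append("job training / social services")
        if self.exposed_to_violence:
            referrals.append("trauma-informed counseling")
        return referrals

record = ScreeningRecord(heart_rate=72, systolic_bp=118,
                         employed=False, food_insecure=True,
                         stable_housing=True, exposed_to_violence=False)
print(record.community_referrals())
```

The point of the sketch is the pairing: the same record that holds clinical vitals also carries the social-determinant flags that trigger referrals to community partners.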

Health System-Community Collaboration Is Critical

A recurring theme in recommendations to improve health equity is community collaboration. One organization tackling health inequity with a community-based mindset is Health Share of Oregon, a local coordinated care organization (CCO) serving more than 240,000 Oregon Health Plan members. Its community-based approach connects its members with the services they need to be healthy:

Training and education

Support groups

Care coordination

Home improvement (i.e., home environment items, such as air conditioning or athletic shoes to improve mobility, access, hygiene, etc.)

Transportation

Community health programs (e.g., farmers markets in food deserts)

Housing supports (e.g., shelter, utilities, and critical repairs)

Resource assistance (e.g., referral to job training or social services)

Health Share’s The Power of Together: Five Years of Health Transformation, 2012-2017 report details its health equity progress and how its “local communities come together to improve the health and health outcomes of Oregon Health Plan members, while simultaneously contributing cost savings to the system.” Another health system, Allina Health, is also working to restore health equity for its underserved patients.

How One Health System Is Tackling Health Inequity—And Achieving Results

Illness, disability, and death are more prevalent and more severe for minority groups in the U.S., and Minnesota is no exception to this problematic trend:

In Minnesota, African-American and American Indian babies die in the first year of life at twice the rate of white babies.

In Minnesota, the rate of HIV/AIDS among African-born persons is nearly 16 times higher than among white, non-Hispanic persons.

In 2011, Minnesota started requiring healthcare providers to collect race, ethnicity, and language (REAL) data. The inequities revealed by this data motivated Allina Health, a not-for-profit healthcare system serving communities throughout Minnesota and western Wisconsin, to take targeted actions to reduce inequities for some of its racial/ethnic minority patient populations.

To complete the picture of its patients’ health, Allina Health conducted research and focus groups to understand the values, beliefs, and barriers impeding certain patient populations from completing recommended colorectal cancer (CRC) screenings (e.g., concerns about discomfort with the procedure, based on prior healthcare experiences in a patient’s home country where pain medication wasn’t used; a lack of familiarity with the word “screening”; and basic needs, such as food, housing, and bills, taking priority over preventive care).

With an improved understanding of its patients’ health beliefs and needs, Allina Health developed targeted interventions:

Harnesses the power of community (e.g., social media campaigns that better engage African-American and Spanish-speaking patients).

Uses analytics to monitor the effectiveness of interventions on populations at highest risk for poor screening rates.

Allina Health’s data-driven approach to reducing health inequities is beginning to make a difference: it has achieved a three percent relative improvement in CRC screening rates for targeted populations. And with REAL data embedded in its dashboards and workflow, it can identify and address additional disparities. Allina Health is one of many health systems making progress toward health equity by making it a strategic priority and implementing evidence-based, analytics-driven, community-informed, targeted interventions.

Beyond Patient Outcomes: Broadening Equity’s Scope

Healthcare organizations can broaden equity’s scope to include more than the health outcomes of the patients they serve; they can use their resources and status as employers to address equity in myriad other ways:

Build in underserved neighborhoods: Many healthcare organizations have moved out of poor neighborhoods to capture market share, increasingly building new hospitals in more affluent areas. These organizations can positively impact health inequity by building in deprived areas, making healthcare available to underserved patient populations.

Use a diverse pool of contractors and suppliers:

Supplier diversity efforts can positively improve the economic health of communities. Kaiser Permanente’s supplier diversity initiative provides minority-owned, women-owned, veteran-owned, and small businesses the opportunity to participate in contracting and subcontracting activities. With the goal of spending $1 billion on goods and services, Kaiser Permanente fuels the economic growth of its communities, providing training internally and externally, attending and promoting supplier outreach events, and researching opportunities to engage diverse suppliers.

Henry Ford Health System’s supplier diversity process includes more than 300 active minority- and women-owned businesses. Its transparent sourcing policy requires that all bids for more than $20,000 include one or more women- or minority-owned firms in the bid process (when available). The request for proposals process includes preference points for certified women- or minority-owned business and those that have supplier diversity processes in place.

Make healthcare investments beyond the required community benefit and invest back into the community:

Provide monetary support to increase the number of community spaces (e.g., parks, walkable trails, etc.).

Invest in high school education programs that prepare students for healthcare careers and provide them with high school and college credit. Higher education translates to higher wages.

Achieving Health Equity Is a Collaborative Effort and Industrywide Imperative

Success in healthcare requires organizations to improve quality and clinical effectiveness while decreasing costs. Healthcare organizations must include health equity as a strategic priority, broaden health equity’s scope, invest in the structures and processes that improve health equity, and dismantle institutionalized racism.

In pursuit of health equity, organizations must also provide culturally competent care to many different patient populations who need clinicians to understand their lives, address their population-specific healthcare needs, change practices to be inclusive, collect data in a non-judgmental way, and build trusting relationships that enable them to openly participate in care—improvement strategies that are driven by a commitment to health equity.

Although the systemic root causes of health inequities and disparities in the U.S. will take time and hard work to eliminate, health systems can start now by making health equity a strategic priority championed by C-suites. Systems can tackle their data-exposed inequities with interventions of varying degrees of complexity, from adding nonmedical vital signs (e.g., employment) to health assessments, to forging and fostering community partnerships.

The statistics speak for themselves: U.S. healthcare isn’t equitable. Health systems must act promptly and strategically to remedy this nationwide underperformance and demonstrate their commitment to not only health equity, but also healthcare quality and outcomes improvement.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Introducing Touchstone: The Next-Generation Healthcare Benchmarking and Opportunity Prioritization Tool
https://www.healthcatalyst.com/how-touchstone-transforms-healthcare-benchmarking
Tue, 27 Feb 2018 15:58:19 +0000

When it comes to healthcare benchmarking today, organizations need to be able to do four things quickly, effectively, and efficiently:

Identify where they are underperforming.

See where they are performing well.

Decide where to focus their improvement efforts.

Prioritize their improvement efforts.

Many things stand in the way of accomplishing these objectives:

The process of selecting opportunities for performance improvement is under-informed by data.

Healthcare benchmarking data, used to compare an organization’s processes to those of top-performing organizations nationwide, is notoriously difficult to explore.

Analysts must sift through hundreds of benchmarks to uncover opportunities for improvement, yet lack visibility into areas of high performance and causes of underperformance.

Most benchmarks address only inpatient performance, ignoring the comparative data from care settings outside the hospital that is essential to population health management.

Health Catalyst® Touchstone not only resolves these problems, but also expands the definition of benchmarking to include the full continuum of care. Touchstone proactively delivers prioritized, artificial intelligence (AI)-powered improvement recommendations in an intelligent, intuitive user interface. It’s the next-generation benchmarking and prioritization tool healthcare needs to accelerate outcomes improvement.

Current Approaches to Opportunity Identification and Healthcare Benchmarking Miss the Mark

Current benchmarking tools are built on antiquated technologies and focus primarily on performance monitoring rather than recommending and driving sustained improvements. Unable to drill into healthcare data to reveal the cause of benchmark failures, current tools cannot tell organizations how to make improvements. Their high cost, rudimentary user interfaces, and limited use by a small number of people hamper their ability to democratize data and accelerate improvements. Current benchmarking tools provide either inpatient or ambulatory benchmarks, but rarely both. As a result, it is difficult to monitor performance across the continuum of care, making population health management less effective.

Unlike these traditional, antiquated approaches, Touchstone proactively recommends the top improvement opportunities from across the full continuum of care.

Developed in partnership with some of the nation’s most prestigious healthcare organizations and by leveraging Health Catalyst’s proven ability to manage trillions of data points, Touchstone combs through data from EHRs, claims, cost-accounting datasets, operations, and external benchmarks to provide:

Risk-adjusted benchmarking—Reveals an organization’s performance data across the continuum of care and in context, at the point of decision-making via an intuitive user interface.

AI-powered intelligent direction—Uses AI and Elasticsearch, a distributed search engine capable of querying billions of records in less than one second, to guide users to the data and analyses that are most relevant to their work and to the organization’s goals. AI accounts for variation in care, external benchmarks, trending analysis, and outlier detection to recommend how improvements can be made.
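A search engine like the one described above answers aggregate questions (e.g., average length of stay per care process) in a single round trip. The sketch below builds an Elasticsearch-style aggregation request body in Python; the index fields (`facility_id`, `care_process`, `length_of_stay_days`) are hypothetical and are not Touchstone’s actual schema:

```python
import json

# Hypothetical Elasticsearch query body a benchmarking tool might issue:
# average length of stay (LOS) grouped by care process, filtered to one
# facility. "size": 0 requests aggregations only, with no raw documents.
query = {
    "size": 0,
    "query": {"term": {"facility_id": "HOSP-001"}},
    "aggs": {
        "by_care_process": {
            "terms": {"field": "care_process.keyword", "size": 20},
            "aggs": {
                "avg_los": {"avg": {"field": "length_of_stay_days"}}
            }
        }
    }
}
print(json.dumps(query, indent=2))
```

Sent to the `_search` endpoint of a populated index, a request of this shape returns one average-LOS bucket per care process, which is the raw material for a risk-adjusted comparison.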

Touchstone’s prioritization matrix (Figure 2) and easy-to-use interface turn average analysts, usually hindered by highly manual processes, into extremely efficient, world-class analysts. It answers analysts’ most common questions:

Where do I have the highest length of stay (LOS) compared to risk-adjusted benchmarks?

Which patient cohorts do I have the greatest opportunity to improve when considering my performance on all my cost and outcome metrics simultaneously?

How can I combine clinical variation analysis and comparisons to risk-adjusted benchmarks to find opportunities to improve or areas where I excel?

Touchstone lets analysts look at many different factors simultaneously (a global perspective that’s rare in healthcare benchmarking and performance improvement) and helps them identify the factors underlying each opportunity. In the same way Netflix fine-tunes its recommendations based on user preferences over time, Touchstone’s recommendations improve over time based on user interest (e.g., surgical services, labor and delivery, etc.).

Inpatient Module—Discovers areas of excellence within an organization compared to national benchmarks. It identifies inpatient performance opportunities against risk-adjusted benchmarks for metrics such as mortality, LOS, readmissions, and variable direct costs (e.g., supplies, labor, etc.). It also identifies care processes that offer the greatest opportunity for improvement: those with high variation in cost and poor outcomes.

Population Health Module—Evaluates organizational performance against risk-adjusted benchmarks for metrics such as cost (per member per month), utilization, mortality, LOS, and readmissions. It also gives organizations insight into the highest-performing primary care physicians in a network, patient populations based on chronic conditions (and for whom care can be improved), and health plan performance compared to similar plans nationwide or throughout a specific region.

Touchstone was created to solve common problems (and the associated headaches) health system staff (analysts, leaders, etc.) encounter daily.

From Analysts to C-Suites: Touchstone Was Built with Everyday Use Cases in Mind

Touchstone’s state-of-the-art outcomes improvement and benchmarking capabilities were designed with multiple users and everyday use cases in mind:

Population Health Analyst Who Wants to Improve ACO Performance

A population health analyst wants to understand how to improve ACO performance. Touchstone recommends an opportunity to reduce inpatient utilization in the diabetic population attributed to the organization’s Medicare MSSP contract. After drilling into Touchstone’s explore tab, the population health analyst discovers that high inpatient utilization is being driven by patients attributed to five primary care providers who have visit utilization rates 80 percent lower than the risk-adjusted benchmark. The analyst recommends scheduling the physicians’ patients for wellness visits.

Health Plan Analyst Who Wants to Improve Benefit Design

A health plan analyst looking for opportunities to improve benefit design is directed by Touchstone to an orthopedic surgeon who performs knee replacements that are 20 percent less expensive than the benchmark, with a readmission rate that is 15 percent lower. The following year, the plan augments its benefit design to drive more volume to the high-performing surgeon.

Program Manager Who Wants to Improve Outcomes and Lower Costs

A program manager in a hospital’s surgical department uses Touchstone to look for opportunities to lower costs and improve outcomes. Touchstone recommends an opportunity to improve the mortality rate for CABG procedures, currently running 90 percent higher than the risk-adjusted benchmark would predict. Drilling into the data, the manager finds that three physicians’ cases are driving most of the high mortality rates and that the variable direct cost of supplies is correlated with these higher-than-expected mortality rates. The manager deploys a clinical analyst to investigate the specific drivers of the cost differential, positing that any discrepancies in the way these three providers deliver care could be rectified to reduce overall mortality and cost.

No matter the user or the use case, Touchstone’s intelligent user experience makes it simple for everyone, even non-analysts, to accomplish their specific goals.

Touchstone Is the Industry’s Next-Generation Tool for Healthcare Benchmarking and Performance Improvement

Touchstone transforms the industry’s traditional, myopic approach to healthcare benchmarking and performance improvement. It proactively recommends prioritized opportunities using AI-powered risk models; it democratizes benchmarking for all staff with an intelligent, intuitive user interface; it includes risk-adjusted benchmarks for the full continuum of care (not just the inpatient setting); and it allows for quick, iterative root cause analyses to understand the elusive “why” behind each opportunity.

Touchstone is the next-generation healthcare benchmarking and prioritization tool healthcare organizations can rely on to accelerate outcomes improvement, enhance performance, and achieve a competitive advantage.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Success in today’s data-driven healthcare industry will be increasingly defined by leaders who understand data science. This knowledge will be critical as executives build and guide teams toward a harmonious, well-planned vision for healthcare improvement that fully harnesses data’s capabilities.

Up to 30 percent of the world’s warehoused data comes from the healthcare industry. There’s significant opportunity for healthcare improvement in this information cache, including an estimated $300 billion in annual cost savings. But the industry can only welcome these prospects if health systems fully leverage data to identify areas for improvement and promote evidence-based care. Even with this massive data potential, healthcare too often relies on outdated technology. For example, up to 75 percent of medical communication still occurs via fax machine (in an era where automotive companies use data science to add navigation capabilities to cars).

Technology has laid out the opportunities, but, to realize gains in the digital era, healthcare leaders must understand data science and the urgency of investing in data science resources (technology and people).

A Bird’s-Eye View of What Data Scientists Do

As a broad term, data science means pulling information out of data, or converting raw data into actionable insights. Data scientists are knowledgeable in their subject matter (e.g., healthcare clinical data) and statistics, and use computer programming skills to tell the computer how to leverage data to derive insights. Data scientists augment traditional data analysis by automating the process of insight delivery through code. This automation can bring efficiency gains and new depths of insight to analytics, and enables real-time predictive analytics by reducing the time it takes to go from data to prediction.

At the heart of data science are machine learning models, which are essentially statistical models used to extract patterns from data. Data science and machine learning can also be thought of as using the power of modern computing to leverage statistics. Some machine learning models, such as regularized regression and decision trees, lend themselves well to deriving insights and explaining patterns in data (e.g., which clinicians are over-utilizing costly materials). Other machine learning models, such as random forests and neural networks (deep learning), are primarily used for prediction (e.g., each patient’s likelihood of readmission after discharge).
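As a rough illustration of that insight-versus-prediction distinction, the sketch below fits both kinds of model to simulated (not clinical) data with scikit-learn: a regularized regression (Lasso) surfaces which features actually drive the outcome, while a random forest serves purely as a predictor:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor

# Simulated data: 500 "patients" with four features; only features 0 and 2
# truly influence the outcome. No real clinical data is used here.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = 3.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=500)

# Insight: Lasso shrinks irrelevant coefficients toward zero, exposing
# which inputs matter (features 0 and 2 here).
lasso = Lasso(alpha=0.1).fit(X, y)
print("coefficients:", lasso.coef_.round(2))

# Prediction: a random forest estimates the outcome for a new case
# without offering comparably transparent coefficients.
forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print("prediction:", forest.predict(X[:1]))
```

The explanatory model tells you *why* (which features carry weight); the forest simply gives a number per case, which is often what a readmission-risk workflow needs.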

Healthcare has long relied on data and data analysis to understand health-related issues and find effective treatments. For example, researchers have used double-blind, placebo-controlled studies as the foundation of evidence-based medicine. Such studies generate data about the treatment under evaluation and analyze that data to determine whether the treatment is effective, as well as to understand its side effects. As a method of generating data and insight, this study process works in a spirit similar to data science, but is costlier and more time-consuming.

Today, healthcare needs data to optimize patient outcomes with evidence-based practices more than ever; those insights are waiting to be discovered in data that has already been collected. With data science, the industry can find efficient, cost-effective ways to harness vast amounts of existing healthcare data—to maximize its potential to transform healthcare with faster, more accurate diagnosis and more effective, lower-risk treatment.

Data Science for Healthcare in Action

Researchers from Stanford University have developed a model that can diagnose irregular heart rhythms (arrhythmias) from single-lead ECG signals better than a cardiologist. Clinicians record more than 300 million ECGs annually, so the data needed for improved arrhythmia diagnosis already exists. With data science, health systems can leverage this information to make more accurate and more efficient diagnoses.

Another group of Stanford researchers has developed a diagnostic model for skin cancers that uses AI to classify images of skin lesions as benign marks or malignant skin cancers. This model, which can classify lesions as accurately as board-certified dermatologists, can potentially save health systems and patients time and cost by transforming the multistep process of diagnosing skin cancer (visual diagnosis, clinical screening, and possible dermoscopic analysis, biopsy, and tissue examination) into a single-step data analysis. While models are not designed to replace clinicians, they can provide valuable diagnostic guidance, making the care process both more efficient and more effective.

Mission Health wanted to improve the accuracy of its readmission risk assessment, so it leveraged machine learning to develop a predictive model based on its own patient population. Mission was using the LACE index to predict risk for readmission, which, while somewhat helpful, was developed using a patient population from Canada that was notably different from Mission’s demographic. With a machine learning model that used its own population, Mission improved its readmission risk prediction to outperform LACE and achieve a readmission rate 1.2 percentage points lower than its top hospital peers.
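A simplified sketch of the kind of comparison Mission Health performed can be built on simulated data: a fixed-weight LACE-style score versus a logistic regression fit to the "local" population. The components, weights, and outcome model below are hypothetical stand-ins, not Mission Health's actual model or data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Simulate a local population with LACE-like components.
rng = np.random.default_rng(42)
n = 2000
los = rng.integers(1, 15, n)          # L: length of stay (days)
acuity = rng.integers(0, 4, n)        # A: acuity of admission
comorbidity = rng.integers(0, 6, n)   # C: comorbidity score
ed_visits = rng.integers(0, 5, n)     # E: prior ED visits

# Hypothetical "truth": readmission driven mainly by comorbidity/ED visits.
logit = -3 + 0.6 * comorbidity + 0.5 * ed_visits + 0.05 * los
readmitted = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([los, acuity, comorbidity, ed_visits])
lace_score = X.sum(axis=1)  # crude fixed-weight composite, LACE-style
X_tr, X_te, y_tr, y_te, lace_tr, lace_te = train_test_split(
    X, readmitted, lace_score, test_size=0.3, random_state=0)

# Compare discrimination (AUC): fixed score vs. locally trained model.
lace_auc = roc_auc_score(y_te, lace_te)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
model_auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"LACE-style AUC: {lace_auc:.2f}, local model AUC: {model_auc:.2f}")
```

Because the fixed-weight score cannot adapt to how this population's risk actually distributes across the components, the locally fitted model discriminates better, mirroring why Mission's own-population model outperformed LACE.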

The Time to Understand and Leverage Data Science Is Now

Data will continue to be a dominant factor in healthcare delivery and outcomes improvement. For organizations to successfully navigate the complexity of a data-driven world and embrace improvement opportunities, healthcare leaders must understand data science; they must become students of data science, understanding how it’s working in other companies and its implications for their health systems. And, if they haven’t already, leaders must start developing data scientist skills on their teams.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

The Impact of FDA Digital Health Guidance on CDS, Medical Software, and Machine Learning
https://www.healthcatalyst.com/fda-digital-health-guidance-impact-on-cds-and-software
Tue, 20 Feb 2018 21:02:50 +0000

The FDA recently released guidance documents on select provisions of the 21st Century Cures Act (the Cures Act). Digital health products and services providers are most concerned about guidance on the use of clinical decision support (CDS) and medical software. For healthcare organizations working with these fast-changing technologies and constantly pushing digital innovation, federal guidance that hints at changing the regulation of their pioneering efforts may send up warning flags.

However, there is little cause for alarm. The FDA regularly issues clarification on how it is going to enforce statutes and regulations to provide useful interpretive information to the industry. Although the FDA digital health guidance documents are not regulation (they are non-binding, informational communications), digital health executives and developers can still use the FDA’s guidance to understand the current approach it is taking toward regulation, and how this approach impacts CDS and medical software development, as well as provider workflows.

FDA Guidance on CDS

Healthcare organizations use CDS to reduce errors and improve efficiency, standardization, and cost savings. The FDA guidance on CDS (the CDS Guidance) describes what types of software tools the FDA considers to be CDS, and which CDS is excluded from FDA regulation as a medical device.

The CDS Guidance states that software functions must meet the following three criteria to be considered CDS:

Not intended to acquire, process, or analyze a medical image or a signal from an in vitro diagnostic device or a pattern or signal from a signal acquisition system;

Intended for the purpose of displaying, analyzing, or printing medical information about a patient or other medical information (such as peer-reviewed clinical studies and clinical practice guidelines);

Intended for the purpose of supporting or providing recommendations to a health care professional about prevention, diagnosis, or treatment of a disease or condition.

Having met these three criteria, CDS must then meet a fourth criterion to fall outside the FDA’s definition of a medical device and be exempt from regulation:

Intended for the purpose of enabling such health care professional to independently review the basis for such recommendations that such software presents so that it is not the intent that such health care professional rely primarily on any of such recommendations to make a clinical diagnosis or treatment decision regarding an individual patient.

The CDS Guidance provides examples of CDS software functions that do and do not meet these criteria. Two key elements are provided in the guidance to further define the criteria for exemption:

The healthcare practitioner must be able to independently evaluate the basis of the CDS recommendations.

The CDS recommendations must be based on publicly available clinical guidelines (i.e., studies, published literature, clinical practice guidelines).

These two elements raise the question: what about using algorithms and machine learning as the basis for CDS recommendations?

Machine Learning as a Basis for CDS Recommendations

Healthcare machine learning models produce information that improves patient care. This information may not always be based on published literature, but its reliability is grounded in significant historical patient data and actual outcomes. Providers will exercise clinical judgment before taking any action on the output of machine learning, which is often in the form of recommendations, alerts, and risk prioritizations.

What’s concerning is that an algorithm could operate on unspecified health data within a “black box” and deliver a result, the basis of which a provider might not understand. The FDA is appropriately addressing this issue by requiring disclosure of the inputs used in order for software to be excluded from regulation. Health Catalyst agrees with the principle that any applications using algorithms should have functionality that allows users to view the input features and variables, including the clinical guidelines or findings that helped shape the predictive model.

It is also important to note that, as discussed in an earlier article, a clinical intermediary will always be part of machine learning-enabled CDS, applying judgment and connecting any recommendations (produced by the machine learning model) to relevant clinical guidelines.

Health Catalyst and other organizations will seek clarification from the FDA on the use of algorithms and machine learning as the basis for providing CDS recommendations, in lieu of, or in addition to, publicly available clinical guidelines.

The use of the term “providing recommendations” in the CDS Guidance could also be clarified. Many CDS systems, including Health Catalyst’s analytics applications, provide prioritizations and alerts, but they don’t always recommend a course of action. The FDA’s CDS Guidance only addresses recommendations, but innovative healthcare systems may need more nuance to address their advanced clinical environments. The FDA is expected to review and respond to the public comments when it issues its final guidance documents.

Enforcement Discretion

The CDS Guidance provides informative examples of CDS functionality that fall into the following three categories:

CDS functionality that does not constitute a medical device—these are CDS functions that do not meet the definition of a medical device;

CDS functionality that may meet the definition of a medical device but for which the FDA does not intend to enforce compliance; and

CDS functionality that the FDA intends to regulate as a medical device.

The phrase “does not intend to enforce compliance” means that the decision not to regulate this software rests on the FDA’s discretion rather than on the definition of a medical device. The FDA could modify its policy in the future, which could significantly affect vendors whose CDS includes machine learning functionality. While this approach introduces uncertainty, it is our opinion that, because consistent policies have been issued over a sustained period, radical shifts in enforcement policy are unlikely.

Patient Decision Support

The CDS Guidance also addresses patient decision support (PDS) software, regarding functionality for use by patients rather than clinicians. This guidance has similar criteria to CDS and indicates that PDS will not be regulated as a device. The FDA appears to be drawing the distinction because PDS likely applies to a different audience (patients). For example, PDS guidance applies to patient-accessed modules and functions within Health Catalyst’s Care Coordination application in the Care Management suite.

FDA Guidance on Medical Software

The FDA guidance on changes to existing medical software policies (the Medical Software Guidance) provides clarification on policy changes made by the Cures Act. The Medical Software Guidance builds upon several previous FDA guidance documents and clarifies that software functions classified into any of four areas are exempt from regulation as a medical device:

Software function intended for administrative support of a healthcare facility.

Software function intended for maintaining or encouraging a healthy lifestyle.

Software function intended to serve as electronic patient records, provided the records are created, stored, transferred, or reviewed by healthcare professionals.

Software function intended for transferring, storing, converting formats, or displaying clinical data and results.

The Cures Act had already exempted these software functions, so this guidance simply provides more detail on the Cures Act provisions, as well as on prior guidance covering mobile medical applications. Software like Health Catalyst’s team-based fitness and wellness application falls under this guidance, which excludes software that maintains or encourages a healthy lifestyle and is unrelated to the diagnosis, cure, mitigation, prevention, or treatment of a disease or condition.

Additional Ramifications of the FDA Digital Health Guidance

As long as software vendors comply with the FDA guidance and their software does not fall within a category the FDA treats as a medical device, any cost impact should be minimal. If vendors aren’t already providing transparency through their applications, workflow processes could be affected as they move from a black-box approach to one in which the data inputs used by algorithms are disclosed to users. Providing these disclosures could require more time and effort from providers because it could add a discretionary input-review phase with an internal clinician panel.

The CDS Guidance and Medical Software Guidance were issued in draft form to allow the FDA to consider public comments. While possible, it is not likely that the exemptions will change significantly in the final documents. Vendors might expect to see clarification on how to disclose the basis for recommendations or conclusions derived by CDS, and where machine learning fits into these disclosures. The FDA will also likely include a few more examples of what functionality it considers to be exempt from, or subject to, regulation when it issues its final guidance later in 2018.

Good News for Digital Health Innovators

There are no major surprises with the new FDA guidance documents in terms of what will be regulated or not. The CDS Guidance reassures practitioners that CDS functions that meet the non-device criteria will either be excluded from regulation or the FDA will not enforce regulation. This approach can help promote innovation by ensuring a quicker time to market—and value—for developers and CDS system users.

In its own words, the FDA’s intent is to foster, not inhibit, innovation, and to encourage the development of tools that can help people be more informed about their health. Though these documents don’t establish rules, they should be taken seriously because they contain useful information about how the FDA will enforce its rules, and how it interprets the Cures Act.

The industry response to these new guidance documents should be positive. The policies provide appropriate oversight, promote software transparency, and let digital health developers “know where they stand relative to the FDA’s regulatory framework.” Most importantly, the guidance supports the success of vendors and healthcare organizations as they continue working to improve the features and functionality of their CDS systems and medical software.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Five Lessons for Building Adaptive Healthcare Data Models that Support Innovation

Published February 15, 2018

With healthcare delivery business models actively changing throughout this decade, our industry has witnessed a flood of digital technology startups working to solve new problems that arise with these changes. Traditional investors and health systems alike have welcomed this interest: the funding of digital healthcare companies has quadrupled in the past six years. However, a number of these digital healthcare companies are failing due to a lack of access to healthcare data.

While it’s true that there is a wealth of data available within healthcare organizations, there is not a single, concise data model that supports use cases in a sustainable way. Because of this limited access to data, some of the greatest ideas for solving today’s healthcare problems may never come to fruition. The industry’s history is littered with failed attempts to model healthcare data, from comprehensive enterprise data models that try to model every healthcare concept that might ever be encountered, to home-grown data models driven by ad-hoc requests for data points in which the value of the information added to the model is not assessed.

We live in a time of great opportunity to leverage new technology to improve healthcare, but realizing this opportunity requires concise, purpose-driven data models that also adapt to each organization’s specific needs.

How to Build Adaptive Healthcare Data Models that Support Innovation: Five Lessons

At Health Catalyst, we are on our third iteration of how we approach data. We are two years into our journey of building and deploying the data models that support the Health Catalyst® Data Operating System (DOS). Based on this extensive experience, successes and failures alike, here are the five pragmatic lessons we’ve learned about building adaptive healthcare data models:

#1. Focus on the Most Relevant Content

Health Catalyst built concise yet adaptive data models using the most widely employed data elements from the products and analyses we’ve developed with our health system clients since our founding in 2008. Following the principles of late binding, we employ several strategies for including only the most relevant content in our data model:

Include data elements where there is comprehensive, persistent agreement about their definitions.

Support access to the raw source data to allow analyses that aren’t constrained to the data model.

It’s important to focus on the most relevant content and specific use cases: what people are looking for on a regular basis versus all the data points users may ever need for data analysis.

#2. Externally Validate the Model

While we have not found an industry-defined data model that’s pragmatic to use in business-critical analytics and outcomes improvement, great research and experiential knowledge have gone into several data models, such as HL7’s FHIR model. We’ve analyzed and compared our data models to FHIR specifically, as there is an increasing convergence towards FHIR for data exchange. We also benefit from alignment with FHIR as we move to add FHIR-compliant APIs to DOS. We found a great deal of overlap, yet diverged from it in a few areas where FHIR wasn’t well-suited for our analytics use cases.

No existing data model is perfect, but we can learn from what exists and leverage the great work that’s already been done in healthcare data modeling.
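Analytics work of this kind often means flattening FHIR’s nested resources into tabular form. The sketch below turns a standard FHIR R4 Patient resource into a single flat row; the target column names are hypothetical and are not Health Catalyst’s actual model.

```python
# Flatten a FHIR R4 Patient resource (nested JSON) into a flat analytics row.
# The FHIR structure is standard; the target column names are invented here.

def flatten_patient(resource: dict) -> dict:
    # Prefer the patient's "official" name entry if one exists.
    official = next(
        (n for n in resource.get("name", []) if n.get("use") == "official"),
        {},
    )
    return {
        "PatientID": resource.get("id"),
        "LastName": official.get("family"),
        "FirstName": " ".join(official.get("given", [])),
        "BirthDate": resource.get("birthDate"),
        "Gender": resource.get("gender"),
    }

patient = {
    "resourceType": "Patient",
    "id": "example-123",
    "name": [{"use": "official", "family": "Chalmers", "given": ["Peter", "James"]}],
    "gender": "male",
    "birthDate": "1974-12-25",
}

row = flatten_patient(patient)
# row["FirstName"] == "Peter James"
```

The divergence the text describes shows up exactly here: FHIR’s repeating, nested structures are excellent for exchange but must be projected into flat, analysis-friendly shapes.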

#3. Commit to Providing Vital Documentation

I’ve worked at three data-driven healthcare companies over the past two decades. I’ve trained hundreds of data analysts from dozens of healthcare organizations on a handful of different data models, and simple entity-relationship diagrams (ERDs) are the most effective tools I’ve found for quickly bringing a data analyst up to speed. In the past, my team compiled this documentation into a binder that we published, which clients and team members referred to as their “bible.”

Even experienced data analysts need a guide to the tables that exist and how they relate to each other before they can get started. Analysts want to know where data comes from, so the system we built self-documents the data pipeline, showing analysts where the data is sourced from. Once data analysts understand the high-level data model and the data’s origins, they can dive in and make use of the data. Other detailed questions about the data model will inevitably come up: “Which date/times are used to calculate LOS?” and “What are the exception criteria (e.g., transfers)?” This is where a detailed data dictionary comes in, with definitions of each table and each column, including any nuances of what data should be populated within it.

Data model documentation should both enable its users to quickly come up to speed on the model and serve as a reference manual when detailed questions come up.
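To illustrate why pinning down such definitions matters, here is a minimal sketch of one possible LOS calculation a data dictionary might document. The definition chosen (calendar-day difference, same-day stays counted as zero) is an assumption for the example, not any particular organization’s rule.

```python
from datetime import datetime

# Hypothetical data-dictionary definition: LOS is the calendar-day
# difference between admit and discharge, regardless of time of day.
# Real definitions (and exclusions such as transfers) vary by organization.

def length_of_stay_days(admit: datetime, discharge: datetime) -> int:
    if discharge < admit:
        raise ValueError("discharge precedes admit")
    return (discharge.date() - admit.date()).days

los = length_of_stay_days(
    datetime(2018, 2, 1, 23, 30),   # admitted late evening Feb 1
    datetime(2018, 2, 4, 8, 15),    # discharged morning Feb 4
)
# los == 3: the date difference, even though fewer than 60 hours elapsed
```

An analyst who assumed an hours-based definition would compute 2 days for this stay; writing the rule down in the data dictionary prevents exactly that kind of silent disagreement.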

#4. Prioritize Long-term Planning

At Health Catalyst, each data model has a product owner responsible for the long-term success of the product. Each data model’s owner and team are embedded in the groups that use the data model; they are experts in their data model’s domain. Product owners work with deployment teams to understand what’s working and what’s not, build product feature backlogs and roadmaps, and manage regular releases of the data model. These teams treat data models like products by leveraging effective development tools, such as source control and integrated development environments (IDEs).

An example of long-term planning comes from our recent evaluation of one of Health Catalyst’s larger data models, in which we discovered unused fields and duplicative fields that were causing slower data refreshes each night. We successfully trimmed the data model’s size by 24 percent, resulting in faster load times.

This focus on the long-term success of the data model, versus only fixing acute issues as they arise, is key to ensuring it stays healthy and relevant.

#5. Automate Data Profiling

In nearly every analytics project that I have participated in, testing data quality is a very manual endeavor that’s done once, up front. For example, when a data warehouse goes in, experienced data analysts are asked to evaluate the quality of the data. Expert analysts know which areas are most prone to issues, like the downstream effects of duplicates within a referring provider table on data quality, and they aren’t fooled by false positives, like “missing” orders for reflex lab results.

While an expert review is necessary, it’s terribly labor intensive and much of the work is looking for well-known patterns. We strongly advocate using automated data profiling tools to support experts in data quality reviews. Automated data profiling can identify the most common issues and patterns that affect data quality. For example, our automated profiling tools look at column fill rates and value distributions within the data—information that helps expert analysts quickly identify problems, such as a 20 percent fill rate on the primary diagnosis field for inpatient visits. Data profiling tools allow an analyst at a pediatric cancer center, for example, to use expert judgement to interpret a 19:1 annual ratio of prescription fills per patient as normal, whereas this would be a red flag for data quality at a health system with a healthier patient population.

Pair automated data profiling with an expert review to ensure the highest data quality. Validate data quality upon initial system installation and proactively monitor through automated data profiling against production systems.
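A minimal sketch of the automated profiling described above, using hypothetical field names: compute a column’s fill rate and value distribution so an expert analyst can review the results.

```python
from collections import Counter

# Profile one column across a batch of records: fill rate plus the
# distribution of non-empty values. Field names are illustrative only.

def profile(rows, column):
    values = [r.get(column) for r in rows]
    filled = [v for v in values if v not in (None, "")]
    fill_rate = len(filled) / len(values)
    return fill_rate, Counter(filled)

visits = [
    {"VisitID": 1, "PrimaryDiagnosis": "I50.9"},
    {"VisitID": 2, "PrimaryDiagnosis": None},
    {"VisitID": 3, "PrimaryDiagnosis": "I50.9"},
    {"VisitID": 4, "PrimaryDiagnosis": ""},
    {"VisitID": 5, "PrimaryDiagnosis": "J18.9"},
]

rate, dist = profile(visits, "PrimaryDiagnosis")
# rate == 0.6, i.e., a 60 percent fill rate; a 20 percent rate on this
# field for inpatient visits would be the kind of red flag described above
```

The tool only surfaces the numbers; as the text argues, it still takes an expert to decide whether a given fill rate or distribution is a data quality problem or an expected property of the population.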

Adaptive Healthcare Data Models are the Foundation of Innovation

Healthcare has made incredible advancements in developing healthcare data models—models that underlie the innovative technologies designed to solve healthcare’s problems. The five lessons outlined in this article come from Health Catalyst’s experience as we continuously evolve our standard data models, but they also reflect an industrywide data model evolution.

Healthcare continues to build and invest in innovative digital solutions. As a result, the need for concise yet adaptive healthcare data models that support this innovation (and the corresponding methodologies, tools, and best practices that go into building them) is stronger than ever.


Four Effective Opioid Interventions for Healthcare Leaders

Published August 10, 2017

At one time, pain was undertreated, even among patients dying from cancer, and opioids were reserved for only the most severe pain. This gradually changed: assessing and treating pain became so important that it was named the fifth vital sign. Pharmaceutical companies began aggressively developing and selling different configurations of opioids (long acting, different routes, etc.), deeming them safe for patients. Regulatory boards and professional organizations, such as the American Pain Society, began pushing against the undertreatment of pain. Social attitudes toward opioids changed, as did clinician prescribing patterns as clinicians attempted to relieve their patients’ pain.


Now, the U.S. is facing an opioid epidemic. Changing prescribing patterns have increased the availability of opioids, creating the potential for overuse by the patients for whom they were prescribed, as well as misuse by others who beg, borrow, or steal opioids from the intended patient (the so-called “non-medical use of opioids”). The facts from the CDC about this public health problem are sobering:

Since 1999, overdose deaths involving opioids have quadrupled, and sales of opioids have almost quadrupled.

More than six out of every ten drug overdose deaths involve an opioid.

On an average day, 3,900 people initiate nonmedical use of prescription opioids and 78 people die from an opioid-related overdose.

Figure 1: CDC map outlining variability in opioid prescriptions from state to state

Because misuse is a multifactorial problem, multiple opioid interventions are required to combat the epidemic. But there are steps healthcare organizations and prescribers can take—both process and data driven—to decrease the risk of harm from opioids.

The Risk Patients Face When Prescribed Opioids

When I first started practicing as a registered nurse, appropriate relief of pain was a high priority. Like many others, I was trained that pain was the fifth vital sign. The minimum frequency for assessment of pain was every four hours, and we established pain goals with each patient. Pain was a frequent problem for the patients under my care because all had experienced a recent trauma. Broken bones and major wounds were common, so it was appropriate to assess the patients’ pain and appropriately attempt to relieve that pain so they could participate in their care.

It has been nearly 20 years since I provided care to her, but I still remember one patient because I was taken aback by the number of opioids prescribed and the amount I needed to administer to relieve her pain. Her opioid tolerance, and significant abuse, had started with a traumatic injury. For the initial treatment of her acute pain she received intravenous opioids. Then she was prescribed OxyContin (a medication that releases the opioid over 12 hours) for the pain and oxycodone (an immediate-release opioid) for breakthrough pain. These were prescribed for her to continue at discharge, as it was anticipated that her injuries would result in continued pain.

More than a year later, when she was admitted to the hospital again, she was still taking both opioids and had added a fentanyl patch. When the provider, who had been prescribing her 220 mg of OxyContin three times daily, 60 mg of oxycodone every three hours, and a 100 mcg fentanyl patch daily, moved out of the area, she had difficulty obtaining the prescriptions required to avoid the severe discomfort and withdrawal symptoms she experienced without the opioids. Facing discomfort and withdrawal, she began using heroin because it was cheaper than oxycodone or fentanyl, and easier to get.

Her heroin abuse led to an infection of her hip joint and prosthesis, requiring multiple revisions. I still remember this patient because her opioid misuse, which started with a prescription from a prescriber with good intent who was appropriately working to alleviate her severe pain, ultimately contributed to her demise. Because of her heroin use, she developed necrotizing fasciitis. I provided care to her over five years as she underwent numerous surgeries, painful wound debridements, numerous infections, and two above-the-knee amputations. I administered many opioids, including methadone, OxyContin, oxycodone, hydromorphone, and fentanyl. I also administered ketamine, a dissociative anesthetic that produces a trance-like state and reduces pain, along with numerous medications to help ease the negative side effects of the opioids she was receiving.

Through all of this, she was aware of the risk she faced taking heroin. She had a central venous catheter, but never used that line for her heroin, telling me that it was too close to her heart and she didn’t want to die. Sadly, that is what finally happened. After years of opioid misuse and heroin abuse, she overdosed and died.

The Call for Opioid Intervention

While we healthcare providers absolutely need to do our best to alleviate pain and suffering, there is evidence that suggests non-medical opioid use is associated with heroin abuse, and that even appropriate medical use increases the risk of chronic opioid use.

A recent publication from the CDC evaluated the impact of early opioid prescribing patterns on opioid-naïve patients. The findings reveal the risks: a second opioid prescription doubles the risk of opioid use one year later, and risk increases with each additional day of opioids supplied, starting with the third day. The sharpest increases occur after the fifth and thirty-first days of therapy, after a second prescription or refill, at a cumulative dose of 700 morphine milligram equivalents, and with an initial 10- or 30-day supply.

With these risks in mind, it is important that healthcare systems find a balanced approach to relieving pain and suffering, while also combating opioid misuse and overdose. While prescribers and the public await improved abuse-deterrent medications and improved non-opioid, pain-relieving treatment options, there are data-informed interventions that healthcare systems, payers, states, and individual prescribers can start today to help combat the epidemic.

Four Approaches to Confront the Opioid Epidemic

Data, analytics, and best practices can be used to identify opportunities for improvement and drive the prevention of opioid misuse and overdose.

1. Use Data and Analytics to Inform Strategies that Reduce Opioid Availability

Healthcare systems, payers, and prescribers can use data and evidence to change practices and reduce the opioid availability within the community.

Payers have access to rich data they can use for improvement. This data can be evaluated and provider-specific data shared with individual prescribers. For example, Aetna actively analyzes its claims database and intervenes if there is evidence of abuse. It notifies physicians if patients are taking more than three opioids, or if they have multiple prescriptions. This opioid oversight program reduced opioid prescriptions by 14 percent between 2010 and 2012 among 4.3 million members. In 2016, Aetna evaluated prescribing patterns and the CMO sent personal letters to nearly 1,000 prescribers—the top one percent—who refill prescriptions at a much higher rate than their peers.

There are situations in which higher amounts of opioids are clinically appropriate and should be prescribed. Sharing data creates the opportunity for prescribers to evaluate their prescribing patterns relative to their peers and adjust where appropriate.
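The kind of claims screening described above can be sketched in a few lines. The thresholds and field names below are illustrative only, not Aetna’s actual logic: flag members with more than three distinct opioid products or with prescriptions from multiple prescribers, then leave the clinical judgment to a reviewer.

```python
from collections import defaultdict

# Illustrative claims screen: flag members whose opioid claims involve
# more than `max_products` distinct drugs or more than one prescriber.
# Thresholds and field names are hypothetical.

def flag_members(claims, max_products=3):
    by_member = defaultdict(lambda: {"drugs": set(), "prescribers": set()})
    for c in claims:
        m = by_member[c["member_id"]]
        m["drugs"].add(c["drug"])
        m["prescribers"].add(c["prescriber_id"])
    return {
        member
        for member, m in by_member.items()
        if len(m["drugs"]) > max_products or len(m["prescribers"]) > 1
    }

claims = [
    {"member_id": "A", "drug": "oxycodone", "prescriber_id": "dr1"},
    {"member_id": "A", "drug": "oxycodone", "prescriber_id": "dr2"},
    {"member_id": "B", "drug": "hydrocodone", "prescriber_id": "dr3"},
]

flagged = flag_members(claims)
# flagged == {"A"}: the same drug from two different prescribers
```

A flag here is a prompt for review, not a verdict; as the text notes, higher amounts of opioids are sometimes clinically appropriate.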

How Allina Health Reduced Opioid Prescriptions by One Million Pills in One Year

Using data from the enterprise data warehouse, Allina Health obtained data on prescribing patterns, shared it with providers, and identified several opportunities to reduce the number of opioids prescribed.

The Allina Health team evaluated prescribing patterns in relation to national guidelines and evidence, and instituted guidelines for primary care providers, including:

Avoiding long-acting opioids.

Prescribing less than 20 opioid pills per prescription.

Limiting the duration to less than five days, unless it is assessed that the injury or medical condition will last longer.

Allina Health’s efforts are already producing results, with nearly one million fewer opioid pills prescribed in the outpatient setting in just one year.
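The guideline bullets above translate naturally into an automated prescription check. The sketch below mirrors the article’s three rules, but the function, field names, and the long-acting drug list are our illustrative assumptions, not Allina Health’s implementation.

```python
# Hypothetical guideline check for an outpatient opioid prescription,
# based on the three primary care guidelines listed above.

LONG_ACTING = {"oxycontin", "ms contin", "fentanyl patch", "methadone"}

def guideline_flags(rx):
    flags = []
    if rx["drug"].lower() in LONG_ACTING:
        flags.append("long-acting opioid")
    if rx["quantity"] >= 20:
        flags.append("20 or more pills")
    if rx["days_supply"] >= 5 and not rx.get("extended_need_documented"):
        flags.append("duration of 5+ days without documented need")
    return flags

rx = {"drug": "oxycodone", "quantity": 30, "days_supply": 7}
flags = guideline_flags(rx)
# flags == ["20 or more pills", "duration of 5+ days without documented need"]
```

Surfacing flags like these at prescribing time, alongside peer-comparison data, is one way to turn retrospective reporting into prospective guidance.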

2. Adopt Prescription Drug Monitoring Programs to Prevent Misuse

States can increase the availability of Prescription Drug Monitoring Programs (PDMPs) and of data from other state PDMPs. Vendors need to integrate all available state data into EHRs, improving the workflow and ease of use for prescribers. State regulatory boards can help communicate evidence-based dosing guidelines, expand education opportunities and ongoing education requirements for prescribers, and should consider requiring additional prescriber education beyond a specific morphine milligram equivalent (MME) dose.

PDMPs help providers identify patients who might be misusing their prescription drugs. Following the implementation of statewide PDMP programs and requiring prescribers to check the PDMP prior to prescribing, New York saw a 75 percent decrease, and Tennessee a 36 percent decrease, in patients who were seeing multiple prescribers to obtain the same drugs.

3. Adopt Evidence-Based Guidelines

Washington state implemented evidence-based dosing guidelines, including a dosing threshold trigger for consultation with pain specialists, criteria to be considered a pain specialist, elements for patient evaluation, periodic review of the patient’s treatment plan, exemptions for special circumstances, and continuing education requirements. The state also obtained additional funding for the PDMP. These changes are believed to have contributed to a 27 percent reduction in opioid deaths between 2008 and 2012.

Primary care providers account for approximately 50 percent of all dispensed prescription opioids. Individual prescribers, particularly those within primary care, should be familiar with, and use, the most recent evidence when making decisions regarding the treatment of chronic pain. The CDC Guideline for Prescribing Opioids for Chronic Pain can help providers make informed decisions about pain treatment for patients 18 and older in the primary care setting.

The ED is the largest ambulatory source for opioids. Prescribers within the ED can make use of the opioid prescribing resources made available by the American College of Emergency Physicians, and should limit prescribing opioids for chronic pain to only the immediate treatment of an acute exacerbation of uncontrolled pain.

When prescribing opioids, prescribers need to check the PDMP, use the lowest possible effective dose, and start with immediate release opioids rather than long-acting opioids. The quantity prescribed should align with the expected duration of the pain.

Dosages at or above 50 morphine milligram equivalents (MME) per day are associated with an increased risk of overdose. Prescribers should consider offering a prescription for naloxone (the medication used to reverse opioid overdose) and provide education to the patient and their support system about how naloxone works.
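For reference, a daily MME total is computed by converting each opioid dose to its morphine-equivalent milligrams using published conversion factors and summing across the regimen. The sketch below uses factors the CDC has published for common oral opioids; verify against the current CDC reference before any clinical use.

```python
# Daily morphine milligram equivalent (MME) calculation for oral opioids,
# using CDC-published conversion factors (check current CDC guidance).

MME_FACTOR = {
    "morphine": 1.0,
    "hydrocodone": 1.0,
    "oxycodone": 1.5,
    "oxymorphone": 3.0,
    "hydromorphone": 4.0,
    "codeine": 0.15,
}

def daily_mme(prescriptions):
    """prescriptions: list of (drug, dose_mg, doses_per_day) tuples."""
    return sum(
        dose_mg * per_day * MME_FACTOR[drug]
        for drug, dose_mg, per_day in prescriptions
    )

total = daily_mme([("oxycodone", 10, 3)])  # 10 mg oxycodone three times daily
# total == 45.0 MME/day, just under the 50 MME/day threshold discussed above
```

Note that some agents (methadone, transdermal fentanyl) use dose-dependent or unit-specific factors and are deliberately omitted from this simple table.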

States can use data from PDMPs, Medicaid, workers’ compensation programs, and state-run health plans to identify pain clinics that may be prescribing opioids in ways that are risky to patients, so they can address inappropriate prescribing. Many states also need to increase access to substance abuse treatment services and medication-assisted treatment services.

4. Expand Access to Naloxone

Healthcare providers and states can support layperson administration of naloxone and can encourage bystanders to be good Samaritans and summon emergency response for someone who has overdosed. These good Samaritans should be encouraged to contact emergency response without fear of arrest or other negative consequences. As of June 2016, all but three states had passed legislation to increase layperson access to naloxone. Good Samaritan laws are not yet widespread.

Increases in the availability of naloxone, Good Samaritan laws, and education for the public regarding the safety and utility of naloxone can reduce the number of overdoses. Between 1996, when naloxone was first made available to laypersons, and June of 2014, more than 150,000 naloxone kits were distributed, and 26,463 overdoses were reversed.

The Opioid Epidemic is a Public Health Problem with a Treatment

It is critical that healthcare providers continue to work to alleviate pain and suffering. At the same time, it is increasingly important that each of us involved in the delivery of healthcare—healthcare systems, payers, and prescribers—use data, analytics, and evidence-based practices to inform prescribing patterns, identify potential misuse, and change practices to minimize opioid misuse and overdose. Patients, like the woman I still think about from so many years ago, are relying on us to relieve their pain without inadvertently increasing the risk that they will be harmed.


How to Apply Machine Learning in Healthcare to Reduce Heart Failure Readmissions

Published February 8, 2018

What if a technology could accurately predict the likelihood of heart failure readmissions? Or the likelihood that a patient with heart failure would not take his medications or would miss his appointments?

Heart failure readmissions are one of healthcare’s biggest obstacles to providing value-based care. Heart failure consistently ranks as one of the top five principal diagnoses causing readmissions within 30 days. And it’s expensive for hospitals, which pick up almost 70 percent of the $110,000 in lifetime costs incurred by each patient with heart failure.

Although healthcare systems across the United States already use readmission risk assessment tools, these tools can be unreliable. Some tools, such as the LACE index, require slow, manual processes that can produce inaccurate results, and critics have questioned the LACE index’s applicability to broad patient populations. These tools are also only available at a particular point in the patient journey: LACE, for example, uses data that is only available at the time of discharge. But if predictions are going to guide decisions throughout the continuum of care, they need to be readily available well before discharge. With stricter initiatives calling for reduced readmissions, many systems are pursuing more accurate prediction tools.
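For context, the LACE index combines Length of stay, Acuity of admission, Comorbidity (Charlson index), and Emergency department visits in the prior six months into a single score. The sketch below follows the commonly published scoring and is shown only to illustrate the manual, rule-based tools the text contrasts with machine learning; it is not a validated implementation.

```python
# Sketch of the published LACE index scoring (illustrative, not validated).

def lace_score(los_days, acute_admission, charlson_index, ed_visits_6mo):
    # L: points for length of stay
    if los_days < 1:
        l = 0
    elif los_days < 4:
        l = los_days          # 1, 2, or 3 points
    elif los_days < 7:
        l = 4
    elif los_days < 14:
        l = 5
    else:
        l = 7
    a = 3 if acute_admission else 0          # A: acute (emergent) admission
    c = charlson_index if charlson_index < 4 else 5  # C: comorbidity, capped
    e = min(ed_visits_6mo, 4)                # E: ED visits, capped at 4
    return l + a + c + e

score = lace_score(los_days=8, acute_admission=True, charlson_index=2, ed_visits_6mo=1)
# score == 11 (5 + 3 + 2 + 1); scores of 10 or more are commonly treated as high risk
```

The limitation the article describes is visible in the inputs: the length-of-stay term cannot be computed until discharge, which is exactly why such scores arrive too late to guide care earlier in the stay.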

MultiCare Health System, working with Health Catalyst, learned about the potential for machine learning in healthcare to more accurately make predictions. As a bonus, MultiCare could use machine learning to automate the prediction process and reduce the documentation burden on clinical staff. Through the approach outlined below, the health system began exploring machine learning’s ability to predict, and ultimately lower, heart failure readmissions.

Laying the Groundwork for a Machine Learning Program

Any improvement initiative should begin with buy-in from stakeholders across the system. It’s no different when that initiative includes machine learning tools. The collaboration should include frontline clinicians, data scientists, quality directors, and program managers. Developing a machine learning program—in MultiCare’s case, a predictive model to reduce readmissions across the entire organization—requires knowledge and expertise from multiple disciplines.

Developing the Predictive Model

When developing a predictive model, the team must gather the relevant historical data. The team also needs to identify input variables (or features) that could influence or predict the target outcome (i.e., likelihood of readmission). Broadly, the role of machine learning here is to learn the relationships between patient attributes and subsequent outcomes. While many types of events could be predicted, the aforementioned business priorities focused this project on readmission risk. Initially, the dataset will include a large number of input variables that the machine learning algorithm will analyze and pare to a smaller set of the most important outcome drivers.

For MultiCare’s predictive model, data scientists wanted to be able to predict 30-day heart failure readmissions in particular and worked with clinicians to identify 88 input variables thought to be drivers of readmissions. This data was gathered from 69,000 heart failure-related encounters over a six-year period.

Data scientists then ran the data through a variety of machine learning algorithms to evaluate the 88 input variables against the 30-day readmission outcome. Each experiment generated a predictive model, whose accuracy was measured against a held-out subset of the 69,000 input records that was not included in training. Of the original 88 input variables explored, the final model used 24. The data scientists selected the most accurate model for guiding readmission interventions; it predicted readmissions with a c-statistic (AUROC) of 0.84, an improvement over the best models in the literature, which show an accuracy of 0.78. (On this metric, a perfect model scores 1.00; random predictions would yield 0.50.)
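The c-statistic used to compare the models can be computed directly from a set of predictions: it is the probability that a randomly chosen readmitted patient received a higher risk score than a randomly chosen non-readmitted patient. A minimal Python sketch (the toy scores below are illustrative, not MultiCare data):

```python
def c_statistic(y_true, y_score):
    """C-statistic (AUROC): probability that a randomly chosen positive
    case (readmitted) is scored higher than a randomly chosen negative
    case. 1.00 is a perfect model; 0.50 is random guessing."""
    positives = [s for y, s in zip(y_true, y_score) if y == 1]
    negatives = [s for y, s in zip(y_true, y_score) if y == 0]
    concordant = ties = 0
    for p in positives:
        for n in negatives:
            if p > n:
                concordant += 1
            elif p == n:
                ties += 1
    return (concordant + 0.5 * ties) / (len(positives) * len(negatives))

# Toy holdout set: 1 = readmitted within 30 days; scores are predicted risk.
y_true = [1, 1, 1, 0, 0, 0]
y_score = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]
print(round(c_statistic(y_true, y_score), 3))  # 8 of 9 pairs concordant
```

In practice the metric would be computed on the full held-out record set with a library routine, but the pairwise definition above is what the 0.84 figure means.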

MultiCare now makes around 150 predictions every day on currently admitted patients. The system further improves the model’s usability by categorizing individual patients into five risk levels based on their individual readmission score (predicted probability). Each risk group’s overall readmission rate is reported in parentheses:

Severe: Predicted probability of >= 0.71 (100% readmission rate)

High: Predicted probability of >= 0.50 (58% readmission rate)

Medium: Predicted probability of >= 0.25 (37% readmission rate)

Low: Predicted probability of >= 0.03 (9% readmission rate)

Very Low: Predicted probability of < 0.03 (0% readmission rate)
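The tiering above is a simple thresholding step: a patient lands in the highest tier whose cutoff their predicted probability meets. A sketch using the published thresholds:

```python
def risk_level(probability):
    """Map a predicted readmission probability to one of the five
    risk tiers, using the thresholds described above."""
    if probability >= 0.71:
        return "Severe"
    if probability >= 0.50:
        return "High"
    if probability >= 0.25:
        return "Medium"
    if probability >= 0.03:
        return "Low"
    return "Very Low"

# Example: tier assignments for a handful of predicted probabilities.
for p in (0.85, 0.55, 0.30, 0.10, 0.01):
    print(f"{p:.2f} -> {risk_level(p)}")
```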

Machine Learning Can Help Allocate Resources and Guide Interventions

Machine learning automation leads to more efficient resource allocation and more appropriate interventions. Automation means the machine learning tool should run and update itself every day, or even more often. For example, within 24 hours of a patient’s admission, the model should show the predicted probability of readmission and the top three drivers behind it. The team can then allocate resources appropriately, ensuring that patients receive interventions consistent with their risk level. The algorithm runs again the next day and produces a new score, higher or lower, which tells the team whether the care the patient received the previous day decreased or increased the chance of readmission. The team can use a chart that shows the trend and drivers on any given day.
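One common way to surface such “top drivers” is to rank each input variable by its contribution to the current score. The sketch below assumes a simple linear (logistic-style) model; the feature names and weights are hypothetical, and MultiCare’s actual model is not described at this level of detail:

```python
# Hypothetical model coefficients: positive values push risk up, negative
# values (e.g., being on guideline-directed medication) push risk down.
COEFFICIENTS = {
    "prior_admissions_6mo": 0.30,
    "ejection_fraction_low": 0.25,
    "missed_appointments": 0.20,
    "age_over_75": 0.10,
    "on_ace_inhibitor": -0.15,
}

def top_drivers(patient, k=3):
    """Return the k features contributing most to today's risk score
    (contribution = coefficient * patient's current value)."""
    contributions = {f: c * patient.get(f, 0) for f, c in COEFFICIENTS.items()}
    return sorted(contributions, key=contributions.get, reverse=True)[:k]

# Today's data snapshot for one (hypothetical) admitted patient.
patient_today = {
    "prior_admissions_6mo": 2,
    "ejection_fraction_low": 1,
    "missed_appointments": 1,
    "on_ace_inhibitor": 1,
}
print(top_drivers(patient_today))
```

Rerunning this each day against the patient’s refreshed data is what produces the rising-or-falling trend the care team reviews.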

This type of machine learning-based decision support can go beyond inpatient care to also inform post-discharge interventions—especially when the team is trying to reduce readmissions. If, for example, one of the risk drivers is a socioeconomic issue, such as transportation to an appointment or help paying for medication, then the tool will suggest social worker involvement. Depending on the issue, the care management team could include a physician, nurse practitioner, social worker, care manager, pharmacist, cardiologist, or other clinical specialist. The entire system should be simple, automated, near-real time, and updated daily—qualities that are vital to care team adoption.

Lessons Learned from Implementing Machine Learning in Healthcare

MultiCare learned a few valuable lessons while developing its machine learning program:

1) Data volume and transparency build trust.

Trust in the data being used to develop the predictive model is critical to machine learning’s successful rollout. Developing trust among clinicians will generate strong buy-in and adoption of the suggested interventions. But creating data transparency can be challenging, especially if the data source is unknown. For example, when analyzing readmissions, is the data just for heart failure readmissions, or does it include all-cause readmissions? Always ask where the data comes from and whether it has been filtered in any way; these questions build trust.

There’s also safety in numbers: bigger datasets produce more reliable results, and missing data erodes trust in the predictive model. An appropriately large dataset (such as the 69,000 input records described earlier) is enough to reliably train and validate the model and gives much tighter confidence bounds on the model’s accuracy.

2) Recruit the right stakeholders.

With the right stakeholders on board (e.g., clinicians, administrators, IT, and domain managers from across the organization), the lifecycle for implementing machine learning can be relatively rapid. The right team guides model development by suggesting input features and by validating the results. Having the right stakeholders from the beginning also helps ensure the model is adopted: if clinical leadership is involved early, bringing the model into clinical workflows can be planned and implemented much sooner.

3) Data presentation and user experience matter.

Implementing a machine learning model to influence decisions requires a thoughtful user experience. The data presented to clinicians must be concise, yet convey enough context for the clinician to know how to use it to make meaningful decisions. Simply putting a risk score in front of a clinician may leave the clinician asking, “Now what?” Presenting that same risk score with additional context, like the specific factors driving that patient’s score, can significantly enhance the clinician’s ability to translate the data into meaningful action. Providing the right context is a balance: too much information can quickly overwhelm busy frontline staff. This careful balance is best navigated by data visualization specialists who understand clinical workflow and visualization techniques.

Machine Learning Shows Promise for Improving Care

Machine learning can help remedy the problematic, time-consuming, and inaccurate predictive risk models most healthcare organizations currently use. Organizations should take input from stakeholders across the entire organization, including clinicians and care managers, when creating and refining a machine learning tool. This will increase adoption and the chances that interventions suggested by the tool are carefully considered. Using historical data to produce accurate models is an iterative process of refinement. It’s important to use trusted data that, when coupled with buy-in from the right stakeholders, can help organizations see results from machine learning tools very quickly.

Daily machine learning predictions are now directly fed into MultiCare’s EMR, which helps make it an integral part of the clinician workflow. Considering the accuracy and workflow integration, this new decision support tool shows great promise toward achieving the goal of reduced heart failure readmissions. The model’s ability to accurately predict these readmissions (at 0.84 AUC) is unprecedented in the literature, and will go a long way to optimizing care for many patients.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Care management is a population health strategy that helps patients achieve their healthcare goals and overcome socioeconomic barriers to care. These achievements are guiding principles of value-based care. Though organizations deploy care management programs differently, care management is the bedrock of every successful population health approach.

A primary component of care management is patient stratification: the process of separating patient populations into high-risk, low-risk, and rising-risk groups, which is how health systems identify the patients most likely to benefit from a care management program. But patient stratification, and the entire population health strategy, is only as effective as the data and risk models it employs. Accurate and effective risk scoring depends on multiple models, multiple data sources, and custom care management algorithms capable of blending them all into a comprehensive patient stratification visualization.

Care Management Risk Modeling Under Value-Based Care

Care management practitioners and goals have changed as healthcare has moved from fee-for-service (FFS) to value-based care. Under FFS, a payer uses care management techniques to identify members of the population, target them for enrollment, and plan appropriate care services. The payer uses claims and clinical data, which are run through a black box risk model, to assign static risk scores.

Under value-based care, healthcare organizations (HCOs) have more complex risk modeling needs because their goals center on the health of multiple patient populations. These organizations’ goals involve multiple payers and multiple risk models, which tend to complicate care management. Because most organizations have limited care management resources, identifying those patients who will respond best to care management tactics is important. HCOs need to incorporate a dynamic risk scoring methodology over the top of these static risk models to meet the patient stratification demands of value-based care.

The Problem with Static Risk Model Scores

Different risk models produce unique risk scores. For example, Conifer and Milliman Advanced Risk Adjuster (MARA) are two popular models used by payers to produce static scores. A patient with a Conifer risk score greater than 30 is considered high-risk, as is a patient with a MARA risk score greater than 3.5. But comparing these scores is like comparing apples to oranges. Though they have similar meaning, they are on completely different scales. Also, these models were originally developed without access to EMR data.

Typical clinical risk models also produce unique scores, such as predictive risk, admissions risk, Charlson/Deyo, and HHS-HCC. These models were built with limited access to clinical datasets. In managing population health, care managers need to effectively compare patient risk levels, then target and coordinate care appropriately. But static, incongruent scores make it nearly impossible to compare risk levels to stratify patients and create precise lists of patients for assigning to care management programs. Normalizing risk scores is necessary for precise stratification.

HCOs need an analytics solution that can ingest the vast amounts of data that come from their patient populations. The solution must also work with disparate data sources, including claims, EMRs, and clinical applications. A solution that can nimbly handle large datasets from a broad representation of data sources establishes the process for normalizing risk scores.

To normalize scores (i.e., standardize them for comparison in a risk stratification process), care management teams must be able to build custom care management algorithms from all datasets, so they can accurately stratify patients into risk categories.
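The article does not prescribe a normalization method, but one simple approach is percentile ranking: convert each model’s raw scores to a position within that model’s own population, so a Conifer 30 and a MARA 3.5 land on the same 0–100 scale. A sketch with made-up score distributions:

```python
def percentile_rank(score, population):
    """Percent of the model's own population scoring strictly below
    the given score -- a scale-free 0-100 value."""
    below = sum(1 for s in population if s < score)
    return 100.0 * below / len(population)

# Made-up score distributions for two incompatible risk models.
conifer_scores = [5, 12, 18, 25, 30, 42]
mara_scores = [0.8, 1.2, 2.0, 2.9, 3.5, 5.1]

# A Conifer 30 and a MARA 3.5 occupy the same relative position, so
# their normalized scores match even though the raw scales differ.
print(percentile_rank(30, conifer_scores))
print(percentile_rank(3.5, mara_scores))
```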

Comprehensive stratification algorithms consider chronic conditions, risk, utilization, medication, and social determinant variables, each of which must be weight-adjustable. For example, the utilization variable could be weighted by any combination of ED visits, hospital admissions, skilled nursing facility stays, specialist visits, or ICU stays. The risk variable could be weighted by predicted risk, rising risk, readmission risk, HHS-HCC risk, and Charlson-Deyo risk. A custom algorithm must play on the strengths of multiple risk models and data sources, while removing the fragmentation imposed by individual, mismatched scores and disparate data sources.
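A weight-adjustable composite of the kind described can be sketched as a weighted sum over pre-normalized component scores. The variable weights and patient values below are illustrative only, not taken from any production algorithm:

```python
# Adjustable weights for the five stratification variables (sum to 1.0).
WEIGHTS = {
    "chronic_conditions": 0.25,
    "risk": 0.30,
    "utilization": 0.25,
    "medication": 0.10,
    "social_determinants": 0.10,
}

def composite_score(components):
    """Blend per-variable scores (each pre-normalized to 0-1) into a
    single stratification score on the same 0-1 scale."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# Example patient: high clinical risk and chronic-condition burden,
# moderate utilization, modest medication and socioeconomic factors.
patient = {
    "chronic_conditions": 0.8,
    "risk": 0.9,
    "utilization": 0.6,
    "medication": 0.4,
    "social_determinants": 0.7,
}
print(round(composite_score(patient), 2))
```

Adjusting the WEIGHTS table is how a care management team would tune the algorithm to its own program priorities, such as emphasizing utilization over medication factors.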

Effective Patient Stratification Accounts for Patient Complexity

Typical risk models don’t take into account the complexity of patients. Instead, they rely on static datasets. A clinical- or claims-based patient stratification algorithm only looks at a portion of the patient profile, which limits the care management team to only a partial view of the risk variables impacting each patient.

Custom care management algorithms reveal patient populations with complex care needs that otherwise would be missed using traditional risk scoring models. Clinical, claims, and socioeconomic data are all required for effective patient stratification. Though robust geographic-based socioeconomic datasets still need to be collected, the value of being able to ingest this data cannot be overstated. Clinical and claims data alone cannot tell the care management team if a patient can afford to pay for medication or understand discharge instructions. Patient stratification care management software should include modules for consuming all these data sources, especially socioeconomic data, to generate a list of patients that can then be reviewed and assigned to a care management program.

In a properly designed patient stratification tool, custom algorithms should also use machine learning to predict which patients will respond to care management tactics; e.g., who will be readmitted and who will be most impacted by a care program. Care management teams should be able to save algorithms and use them for generating updated lists of care management patient candidates daily. Algorithms should be developed in an open-source environment and shared between HCOs to support diverse, widespread population health initiatives.

Population Health Demands Next-Level Patient Stratification

Population health initiatives require knowing as much about patient populations as possible. More robust data enables this understanding, shows the care gaps, stratifies the sickest patients, and reveals how they are utilizing resources.

Diversity in the Workplace: A Principle-Driven Approach to Broadening the Talent Pool

The business case for diversity, equity, and inclusivity in any organization, in any industry, is compelling; it’s backed by the personal stories we hear in our everyday lives and on the news—and it’s backed by data:

More innovation, creativity, and knowledge sharing.

Increased sales revenue and greater market share.

Increased productivity and quality.

Higher return […]

Credible organizations (e.g., Gallup, Scientific American, and MIT) consistently publish studies that show how diversity in the workplace improves financial outcomes, strengthens team member commitment, and increases creativity. Yet, despite the strong business case for inclusive workplaces, a lack of diversity continues to be a problem in almost every industry, and healthcare is no exception.

Consider, for example, the lack of gender diversity in technology—one of the fastest growing sectors in the U.S.—where the facts are discouraging.

The list of reasons for this lack of gender diversity in technology is long and includes challenges on both ends of the pipeline, from limited early exposure to computing skills, to unsupportive work environments. And this women-in-technology segment is just one of many underrepresented groups in healthcare that needs our attention and our commitment to change the status quo.

Healthcare’s workplace diversity challenges, gender-related or otherwise, are significant and complex, but we can start to turn the tide with inclusivity initiatives that are grounded in the same principles that work together to create a healthy culture: respect, humility, transparency, and advocacy.

Defining Diversity

Health Catalyst’s leadership team defines and values diversity in its broadest terms, recognizing all differences as representations of diversity. This includes more traditional definitions of diversity, as measured based on differences like gender, race, ethnicity, or sexual orientation, but also recognizes a near-limitless set of additional differences in life experiences and perspectives that should also be respected and valued.

This broader definition challenges us to ensure that all our differences are appreciated, respected, and learned from, and requires us to think systemically about how we create an enabling infrastructure to tap into the power of these differences.

A Principle-Driven Approach to Harnessing the Power of our Diversity

As we considered how to systemically harness the power of all our team members’ differences, we recognized that it’s a monumental effort requiring a principle-driven mindset, and that the anchor of our ability to tap into this power is embedded in our company’s culture. We fundamentally believe that our differences are what give us the broad perspective we need to accomplish our company’s mission. We believe the most effective diversity initiatives are grounded in the same principles that shape an organization’s culture, and that initiatives with a principle-driven foundation are more likely to be adopted and embraced by team members.

Health Catalyst’s culture centers on core principles, including respect, humility, transparency, and advocacy, that motivate everything we do; every team member must embody these principles, so we can deliver on our promise to our team members and our customers. We take this principle-based approach because we want a talented, diverse group of people to feel empowered to join our team, stay on our team, and thrive on our team—to do the best work of their careers here.

As we examine the work we are doing to enable our company to be strengthened by diversity, we can tie our initiatives back to four guiding principles:

Respect

Respect is a timeless principle, and perhaps the most important one because it underlies every other principle. Respect communicates that we appreciate, value, and believe we can learn from each of our colleagues. Respect means we value one another’s unique characteristics and experiences, and assumes that we will be made stronger as we benefit from these diverse characteristics and experiences.

Humility

Humility helps us be great listeners as we interact with one another. The humility that makes us not just open but eager to learn from each team member and each interaction, including opportunities for improvement, lays the groundwork for an inclusive culture that values diversity. Everyone in an organization, from entry level data analysts to seasoned executives, must embrace the continual giving and receiving of honest, constructive feedback: what’s working and what can be improved.

Health Catalyst is motivated by and benefits from data and analytics-driven improvement. Humility allows us to actively pursue the facts—positive or negative—so we can alter our trajectory when we need to. For example, every six months, we ask our team members to take a Gallup survey—we actively solicit feedback about what’s working for our people and what could be improved. Each member of the leadership team reads every response and comes together to discuss how to reinforce what’s working and change what needs improvement. We share the results of these surveys with every team member in our all-team-member meetings.

Regarding the concrete example we discuss in this article—women in technology—we know our industry isn’t fully tapping into the female talent pool, and we’re mindful of this meaningful segment in our ongoing team member engagement surveys. We want the results, we pay attention to the results, and we transparently share the results so we can engage everyone to continuously improve.

Transparency

Transparency is effortless when there’s good news to share. Although not as easy to put into practice when there are major problems to address, this is when being transparent matters most. Transparency is inclusivity in practice; it invites every team member to be a part of the solution and, ultimately, holds leadership accountable for solving problems. Transparency is the mechanism through which we exercise humility, admit that we don’t always get it right, and, most importantly, commit to making it right.

When Health Catalyst learned that certain male team members weren’t willing to be alone with female team members while on business-related travel (e.g., traveling to and from business meetings by car, etc.), we relied on transparency and humility to understand and correct this troubling behavior. We openly discussed the problem in a meeting with all team members, clearly stating that all team members must be treated as equals and as respected colleagues. We made it clear that this behavior—and the lack of respect that allowed it to happen—was inconsistent with our core principles. And we invited every team member to talk about the problem, anonymously or openly. Transparency coupled with humility made it possible for us to get back on our intended trajectory of creating an inclusive culture driven by respect.

In another example of how Health Catalyst relies on transparency to boost inclusivity, all team members have a standing invitation to attend manager meetings; we want every team member, manager or not, to have the same opportunity to learn and develop their careers.

Advocacy

A fundamental part of Health Catalyst’s goal to be a best place to work is the hope that when team members look back on their multi-decade careers, they identify their time at Health Catalyst as the time when they did their best, most meaningful work. Enabling this engagement and productivity involves a deep level of trust between team members and management. This trust is strengthened each time a team member senses that their manager is also their advocate, committed to enabling their long-term growth and development.

Health Catalyst advocates for and empowers its team members through a variety of initiatives, including financial assistance to pursue professional development opportunities; flexible work schedules, including generous work-from-home allowances; unlimited vacation time; affinity groups that are open to everyone, can be started by any team member for any underrepresented group (women, LGBTQ, etc.), and provide a platform for collaboration, knowledge sharing, and innovation; and a mentorship program that hopes to address one of the three reasons women in technology are dissatisfied with their career prospects: the lack of inspiring role models.

We have an incredible opportunity to attract and retain a diverse group of talented people. We know our principles-driven culture is the way we enable our current team members to thrive at Health Catalyst. But we don’t wish to stop there—we also desire to contribute to broadening the talent pool of individuals who consider healthcare and/or technology careers as their first choice.

Broadening the Talent Pool for Technology Careers

As organizations increasingly acknowledge the compelling benefits of broadening the talent pool, and start building and empowering their diverse workforces, there are several powerful initiatives that can help:

Health Catalyst’s mentorship program, open to all team members, is an empowerment tool that connects people interested in sharing their knowledge and experiences (mentors) with people interested in learning from others (mentees). As with any goal-oriented program, ground rules are helpful. Our mentees are responsible for the following:

Assuming ownership of the relationship.

Committing to expanding their level of growth opportunities.

Being open and receptive to new ways of learning.

Being open and accepting of feedback; being reflective; making meaningful change as opportunities arise.

Maintaining confidentiality.

Practicing active listening.

Mentorship programs create meaningful opportunities for team members to connect with each other and grow their careers.

Implement Equitable Practices (Hiring, Compensation, etc.)

“Many women cite their company’s outdated maternity leave policies, lack of flexible work arrangements or salaries that are inadequate to cover the costs of childcare as their main reasons for exiting the tech industry,” according to a Forbes article.

Fairness and respect are synonymous. HR-related practices driven by fairness, from ethical hiring practices to flexible hours that encourage work-life integration, add up to create a supportive work environment that minimizes the personal sacrifices team members make when they come to work. Health Catalyst’s policies demonstrate our commitment to equity and being a best place to work:

Family leave, including generous and flexible maternity and paternity leave policies.

Unlimited paid time off.

Flexible work hours and the ability to work remotely.

Wellness program (onsite gym and allowance to spend on anything wellness related).

Pay equity and above-average compensation.

Underrepresented groups struggle to get hired. To tackle this issue, Health Catalyst is implementing unconscious bias training to help hiring managers (and any interested team member) understand the role subconscious bias plays in the hiring process—a training that exposes people’s weaknesses and taps into every team member’s humility and willingness to improve. Health Catalyst is also launching Diversity and Inclusion training for all team members, and working to implement equitable interview, selection, and pay practices, which include not asking candidates about their pay histories.

Partner with Organizations Committed to Improving Diversity

The best cities for women in technology also have conventions and coalitions that support women in technology. Organizations can partner with and learn from these groups that are equally committed to diversity and have the experience to share pragmatic lessons learned. Health Catalyst partners with and sponsors several organizations as part of its initiative to empower and advance women in technology:

Women Tech Council (WTC), a national organization focused on the “economic impact of women in driving high growth for the technology sector through developing programs that propel the economic pipeline from K-12 to the C-suite.” The WTC offers mentoring, visibility, and networking to more than 10,000 women and men working in technology.

Kids Code program based on the curriculum from Code.org, a “non-profit dedicated to expanding access to computer science and increasing participation by women and underrepresented minorities.”

College of Healthcare Information Management Executives (CHIME) and its Women of CHIME, a “group of CHIME women leaders who share networking and leadership skills with other women CIOs and senior healthcare IT executives.”

Health Catalyst’s involvement with these local and national organizations extends beyond financial sponsorships to serving on their boards and inviting every team member to professional development and networking events.

Create a Supportive Work Environment

Regarding the litany of things organizations can do to eliminate disparities for diverse segments of the talent pool, some are easier to influence than others. To help address the early pipeline challenges diverse groups endure to get where they are in their careers, Health Catalyst team members volunteer with kids through a program called Kids Code to help achieve Code.org’s vision that “every student in every school should have the opportunity to learn computer science.”

Many Health Catalyst team members donate their time to underrepresented groups within their own communities. One team member spends several weeks every summer mentoring young, underprivileged women who are up against the early pipeline challenges that contribute to the gender gap we see in so many industries today.

Health Catalyst’s advocacy-oriented culture led to the development of several affinity groups designed to support underrepresented team members (women, LGBTQ, etc.) by giving them time to connect, share challenges and strategies, and offer educational opportunities.

No single initiative solves the diversity problem; every HR policy, every training, every networking event adds up to have a cumulative impact on inclusivity. And the work is never done; we have a long way to go before we start shrinking the diversity divide.

Shrinking the Diversity Divide: Healthcare’s Imperative

Healthcare organizations committed to building diverse, engaged teams should start by taking a close look at their cultures: the foundation for any inclusivity initiative. Equipped with a principles-driven culture that motivates everything they do, internally and externally, and strategic initiatives that reflect an awareness of what diverse groups, such as women in technology, are up against every day, any organization in any industry can do its part to close the diversity gap.

Increasing diversity and inclusivity in healthcare is more than just the right thing to do: it’s an intelligent business decision that improves business outcomes (ROI, innovation, productivity, engagement, team member retention, etc.). As an outcomes improvement company, Health Catalyst will continue to relentlessly commit to its mission to make all team members—regardless of gender, race, experience, or perspective—feel respected, supported, and treated as equals across the entirety of their careers at Health Catalyst.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Improving Patient Safety: Machine Learning Targets an Urgent Concern

Patient harm is a persistent and urgent healthcare concern that directly impacts the patient experience and overall outcomes. According to recent estimates, one in three hospitalized patients experiences preventable harm, and over 400,000 individuals per year die from these injuries. As the third-leading cause of death, preventable harm costs health systems more than $1 billion annually.

Better Risk Prediction Tools Consider Whole-Patient Risk

Previous risk prediction models were limited because they were developed using populations different from those for which the tools were ultimately used. Another downside to these out-of-the-box models (such as LACE, which predicts readmissions) is that they were often trained on data that was 15 to 20 years old. These generalized tools were also siloed in stand-alone prediction systems; for example, rather than providing an all-cause view of potential safety events, a risk assessment tool looked at only one event (e.g., hospital readmissions for patients with chronic obstructive pulmonary disease, or COPD).

Patients tend to be at risk for a variety of negative outcomes, so with a siloed risk assessment approach, clinicians miss opportunities to manage or prevent harm. The patient safety community has devised a whole-patient measure of safety to address siloing in measuring patient harm; the same concept needs to be applied to risk prevention. A better, machine learning-powered patient safety tool uses health system data to assess whole-patient risk, giving clinicians a comprehensive view of patients who are at risk for a safety event, including identifying the particular event(s) as well as modifiable risk factors.
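To illustrate the difference, a whole-patient view can be sketched as combining per-event risk estimates into a single probability of any harm. The event names and probabilities below are hypothetical, and real models would not assume events are independent; this is only a back-of-the-envelope illustration:

```python
# Illustrative sketch: combine independent per-event risk estimates into a
# single whole-patient risk score. Event names and probabilities are
# invented for illustration, not drawn from any specific model.

def whole_patient_risk(event_risks):
    """Probability of at least one harm event, assuming independence.

    event_risks: dict mapping event name -> probability of that event.
    """
    p_no_harm = 1.0
    for p in event_risks.values():
        p_no_harm *= (1.0 - p)
    return 1.0 - p_no_harm

risks = {"readmission": 0.12, "CLABSI": 0.03, "fall": 0.05}
overall = whole_patient_risk(risks)
# The combined score is higher than any single silo's view of the patient.
```

Even this toy version shows why siloed tools understate risk: a patient who looks moderate in each silo can be high risk overall.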

Machine Learning Enables Timely Risk Identification

This new generation of machine learning-based patient safety tools will close the loop between information and action: the software not only forecasts the likelihood of harm, but also recommends the most important clinical actions to lower that patient's risk, helping the clinician make an informed intervention decision. Clinicians will be able to predict harm before it occurs, know who in their patient population is at risk, understand which of the patient's modifiable risk factors need to change, and make timely interventions. An effective tool does three things:

Identifies risk: provides concurrent daily surveillance for all-cause harm events using literature-based triggers to show how many patients in a health system population are at risk for safety events (e.g., CLABSI).

Shows modifiable risk factors: by understanding a patient’s modifiable risk factors and the degree to which they can be impacted, clinicians know where to intervene to prevent harm. Modifiable risk factors for a condition such as CLABSI include line days, number of lines, bathing rates, and compliance with bundles (interventions that, when implemented as a group, have a greater effect than individual interventions alone).

Shows impactability (Figure 1): offers clinical decision support that helps clinicians identify high-risk patients and prioritize treatment by patients who are most impactable (most likely to benefit from preventive care).

Figure 1: An effective patient safety tool shows how likely a patient is to be impacted by an intervention
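The prioritization idea behind impactability can be sketched in a few lines. The patient data and the notion of splitting total risk into a modifiable portion are invented for illustration; this is not any vendor's actual algorithm:

```python
# Hypothetical sketch of "impactability" ranking: patients are ordered not by
# raw risk, but by how much risk could be removed by acting on modifiable
# factors. All names and numbers are illustrative.

def prioritize_by_impactability(patients):
    """patients: list of dicts with 'id', 'risk', and 'modifiable_risk',
    where modifiable_risk is the portion of risk clinicians can reduce."""
    return sorted(patients, key=lambda p: p["modifiable_risk"], reverse=True)

cohort = [
    {"id": "A", "risk": 0.90, "modifiable_risk": 0.05},  # high risk, little to change
    {"id": "B", "risk": 0.40, "modifiable_risk": 0.30},  # most impactable
    {"id": "C", "risk": 0.60, "modifiable_risk": 0.10},
]
ranked = prioritize_by_impactability(cohort)
# Patient B comes first: moderate risk, but the most preventable harm.
```

Ranking by raw risk would have put patient A first, even though little of A's risk can be changed; ranking by impactability directs effort where it can actually prevent harm.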

The major differentiator between the new generation of machine learning safety surveillance and the retrospective tools they’re replacing is that the machine learning tools use data from that same health system to not only understand the risk factors leading to harm, but also to identify which patients currently receiving care are being harmed or about to be harmed. Machine learning capabilities help organizations get upstream of the risk before a patient is harmed or at risk of harm.

Eventually, predictive patient safety tools will advance the way organizations mitigate patient harm by recommending interventions for modifiable risk factors and documenting those interventions. The next goal is to integrate with cost management tools to attribute cost and recommend data-driven, cost effective interventions.

Machine Learning for Patient Safety: A Much-Needed Solution

With the current, unacceptably high rates of preventable patient harm occurring in health systems, improving patient safety is a critical healthcare mission. Machine learning will drive the solution by enabling safety surveillance tools that use health system data to identify patients at risk, identify patients’ modifiable risk factors and impactability, and, eventually, recommend the most cost-effective interventions.

Healthcare’s best chance of improving patient safety and outcomes lies in predicting harm and taking action to prevent it. As Don Berwick, MD, MPP, president emeritus and senior fellow at the IHI, explained in his keynote address at the 2017 National Patient Safety Foundation Patient Safety Congress, healthcare has a lot of work to do to improve patient safety. “There’s an illusion that we’ve worked on safety,” Berwick said, and added that healthcare hasn’t developed real insight into patient harm and ways to prevent it. With a comprehensive, concurrent data-driven approach to patient harm, machine learning promises to transform patient safety from illusion to reality.


Healthcare Analytics Platform: DOS Delivers the 7 Essential Components

Population health, value-based care, decreasing revenues, and increasing costs are just a few of the pivotal issues challenging healthcare IT leaders today. Among other requirements, these issues call for large amounts of data, real-time data access, scalable analytics platforms, and systems interoperability. Healthcare IT must address all these issues, and more, even as technology becomes obsolete at an exponentially increasing pace.

Current healthcare analytics platforms are struggling because they are inflexible and don't provide access to the right data, in the right place, at the right time to effectively support clinical, financial, and operational decision making. These platforms' shortcomings appear in many forms: process-driven EMRs, expensive data lakes, and unrefined transactional data.

Healthcare desperately needs a new IT model. Understanding the solution requires further investigation into a few specific IT challenges, followed by a close examination of the new model: a data-first, analytics and application platform called the Data Operating System (DOS).

Healthcare’s Problems Extend to the Healthcare Analytics Platform

The big issues impacting healthcare, such as rising costs and decreasing margins, revolve around healthcare IT and specifically around data.

Though data demand is everywhere, access to the data supply remains elusive, particularly through EMRs. Clinicians often complain that they constantly put data into the EMR, but never get any data or value out. This is largely because of the current siloed and monolithic analytics environment.

A Fractured Analytics Environment

The common healthcare analytics environment today looks like a series of islands (Figure 1), and navigating between them is difficult.

Figure 1: The current healthcare analytics environment

The analytics environment comprises many data sources, including EMR and claims data. Acquiring data from these sources requires custom-built connectors. Every new data source requires new understanding and another custom connector. Then the data is organized, but it still exists in silos. EMR and claims data may exist in one database or the same location, but navigation between the two data types isn’t possible (i.e., claims data isn’t accessible from the EMR record, and vice versa).

Eighty percent of the data from an EMR is unstructured, including physician notes, which hold significant value but sit completely outside the analytics system. Other data sources—social determinants of health, cost, care management, and biometric—contribute significantly to health outcomes, yet exist in silos.

In the current analytics environment, EMR and claims data is acquired and organized into reports and apps, but this is where content and logic can get confusing. For example, when determining a cohort of patients with diabetes, individual reports may use different logic to define which ICD-10 codes constitute diabetes.
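A shared value set solves this by giving every report one authoritative cohort definition to import, instead of each report hard-coding its own ICD-10 list. The snippet below is a minimal sketch, using only a small illustrative subset of the diabetes codes:

```python
# Sketch of a shared value set: one authoritative definition of the diabetes
# cohort that every report reuses. Codes shown are a small illustrative
# subset (E10 type 1, E11 type 2, E13 other specified diabetes mellitus).

DIABETES_ICD10 = {"E10", "E11", "E13"}

def in_diabetes_cohort(diagnosis_codes):
    """True if any diagnosis falls in the shared diabetes value set."""
    return any(code.split(".")[0] in DIABETES_ICD10 for code in diagnosis_codes)

in_diabetes_cohort(["E11.9", "I10"])   # True:  type 2 diabetes present
in_diabetes_cohort(["I10", "J45.40"])  # False: hypertension and asthma only
```

When every report calls the same function, two dashboards can never silently disagree about who counts as a patient with diabetes.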

Further complicating the environment is an isolated EMR, which is entirely separate from the analytics infrastructure. It is very hard to get insights generated by the analytics infrastructure in front of clinicians at the point of care.

The entire system is monolithic: users cannot customize it with components and are locked into a single configuration.

The current analytics model is a lockbox around healthcare data, and DOS is the key that unlocks it.

DOS Is the Key to Unlocking Data

DOS is a new data-first analytics and applications platform. Before we talk about the mechanics of DOS, it helps to understand the idea of “data first” and why EMRs cannot trace a data-first trajectory. Then high-level and magnified diagrams will help explain DOS structure and components, and the purpose of each piece in the system.

Process-First (EMR) vs. Data-First (DOS) Design

Processes have driven the current healthcare data environment. The requirement to digitize medical records has spurred massive EMR deployment. EMRs are process-first systems, designed to move data from person to person to follow a process. EMRs collect data within a specific workflow, but don’t allow access to, or the import of, data outside this workflow, making it very difficult to do any meaningful analytics on transactional data. Like mainframe computers, EMRs will always be necessary, but they are not the right tool for unlocking data.

To extract value from the healthcare data, we need a data-first approach. In data-first systems, the focus is on the following:

Getting as much of the available data as possible.

Understanding and mining the data for insights.

Returning those insights back into the ecosystem.

Presenting data to appropriate users.

Fueling other apps to create more insights.

The iPhone is a good example of a data-first environment. Various applications built into the iPhone receive data and enter it into the ecosystem in different ways:

Real-time (live video)

Near real-time (stock quotes)

Streaming (music)

Images (camera)

Text (notes)

Cloud (photos and videos)

Other applications consume this data, create insights, then publish those insights into the ecosystem, where they can be picked up by yet another application. In a data-first environment like DOS, data drives the process. Instead of implementing a process and moving data from place to place (like an EMR), data is analyzed and the analysis drives the process.

Five Ways EMRs and DOS Are Fundamentally Different

EMRs were designed to produce documentation for fee-for-service (FFS) billing, which is their fundamental purpose. DOS is designed for analytics and outcomes improvement. Beyond this core difference, EMRs and DOS differ in four other key areas (Figure 2).

Figure 2: The core differences between EMRs and DOS

Technology—EMRs are built on 20-year-old technology. DOS is built on the latest technology for analytics.

Purpose—EMRs are designed to take paper medical records and convert them to electronic records. DOS is designed to improve costs, quality, patient experience, and provider experience within the organization.

Workflow—EMRs automate the FFS billing workflow, a process-first model. DOS uncovers insights and promotes action and change in the organization.

Utility—EMRs capture clinical and operational data. DOS leverages this same kind of data to save thousands of lives and hundreds of millions of dollars (more on results toward the end of this report).

The Seven DOS Components

Like the current analytics environment, DOS interacts with multiple data sources, reports, apps, and EMRs. The unique and comprehensive structure, missing from the current model, lives in between these elements. This structure transforms data so reports and apps don’t have to, and it includes seven components.

1. Acquire

Healthcare analytics systems need data from many sources, so the cost of acquiring data from any data source needs to be low. DOS builds in connections to more than 200 data sources, covering most common healthcare data systems.

2. Organize

The Health Catalyst® Analytics Platform is the organizing function of DOS. It includes all the tools to rapidly consolidate data from every source, precluding the need to custom build anything. Data analysts can create data pipelines to transform data through multiple steps, then leverage data lakes, and Hadoop and Spark infrastructures to process large amounts of data through the system.
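Conceptually, such a pipeline is just an ordered sequence of transforms applied to the data. The sketch below uses invented step names to illustrate the pattern; real pipeline tooling adds scheduling, lineage, and error handling on top:

```python
# Minimal sketch of a data pipeline as an ordered list of transform steps,
# each a plain function applied to the rows in sequence. Step names and
# data are illustrative only.

def run_pipeline(rows, steps):
    for step in steps:
        rows = step(rows)
    return rows

def drop_test_patients(rows):
    # Remove synthetic records used for system testing.
    return [r for r in rows if not r["mrn"].startswith("TEST")]

def normalize_gender(rows):
    # Map source-system codes to a standard display value.
    mapping = {"M": "Male", "F": "Female"}
    return [{**r, "gender": mapping.get(r["gender"], "Unknown")} for r in rows]

raw = [
    {"mrn": "12345", "gender": "F"},
    {"mrn": "TEST01", "gender": "M"},
]
clean = run_pipeline(raw, [drop_test_patients, normalize_gender])
# clean == [{"mrn": "12345", "gender": "Female"}]
```

Because each step is an independent function, new transforms can be added or reordered without rewriting the pipeline itself.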

3. Standardize

After it’s organized, clinical data (e.g., patient populations, diagnosis classifications, medication groupings) can be efficiently standardized through pre-built shared data marts. These can be used out of the box, or users can create their own from built-in tooling.

4. Analyze

Using data and content microservices, the analyze component combines and enriches data, then makes predictions using AI models. This is the process of converting raw data to deep data.

5. Deliver

Once data is analyzed, DOS can deliver it directly to the EMR workflow and right to clinicians’ fingertips. Data can also be exported to other workflows, putting it where data analysts are, rather than making them navigate to the data. DOS learns through a closed-loop function. As analysts interact with the system, those interactions feed back into the system and improve the data over time.

6. Orchestrate

Orchestrating all this data requires a central place for storing metadata (i.e., data about data), so people can easily find and understand the data. Orchestration also requires a central place for enforcing security, and a central place to manage the various processes happening in the analytical environment. DOS provides these tools so anyone can find, manage, and secure all the organization’s data.

7. Extend

All the pieces of DOS are accessible through open APIs to ensure sustainability of the DOS environment. A marketplace allows users to choose different applications and content pieces for their own data operating system.

This is the high-level view of DOS. The following section will zoom into the operating system for greater scrutiny of each component.

A Deeper View into DOS

What does DOS offer, through each of the seven components, that really helps jumpstart a healthcare system’s analytical efforts? A deeper examination reveals DOS’s full capabilities (Figure 3).

Figure 3: A detailed DOS diagram shows the functions of each component

Acquires Data

DOS builds in connectors to more than 200 data sources (Figure 4), including all the common EMRs, most claims systems, financial systems, billing systems, HIEs, and clinical and patient satisfaction systems.

Figure 4: DOS builds in more than 200 data sources, with many more coming in the future

It takes very little effort to bring in data from these systems. DOS already understands them, can map to them appropriately, and can serve up the data for analysis.

Organizes Data

To understand the value of this DOS component, consider how data lakes and data warehouses have been developed and deployed (Figure 5).

Figure 5: The value-to-cost curve of data lakes and data warehouses

At the outset, the hope is that, over time, the business value of the data lake or warehouse will increase as the costs to maintain it decrease. What ends up happening is almost the exact opposite. Source systems change. An EMR is upgraded. The warehouse needs to pull data from a new claim source. These all incur substantial costs. After a while, staff turnover interrupts continuity, then the underlying technology changes, the data warehouse can’t keep up and, as a result, people stop using it.

DOS delivers the desired value-to-cost curve, with value increasing and costs decreasing over time (Figure 6).

Figure 6: How DOS achieves value

DOS starts with all the necessary building blocks, so there’s less programming involved. Much of the data organizing function is performed by DOS “under the hood.” As new staff comes on board, they can use DOS’s built-in tools (UI and other simple models) to manage and enhance the existing functionality rather than looking through piles of handwritten scripts.

Health Catalyst manages technology changes, so healthcare systems can focus on deriving business value from the data warehouse. The Late-Binding technology in DOS binds only the data that’s needed immediately, which generates very fast time-to-value. More data can be added over time. It’s not necessary to create a dimensional model of all the data up front, a process that usually takes years before any value is derived from it.

Standardizes Data

DOS standardizes data through shared data marts (Figure 7), but there’s a cost to doing this, so pursue standardization only when there’s agreement on the interpretation of that data and when standardization adds value. For example, standardizing the ICD-10 codes classified as diabetes likely makes sense, but creating a grouping for medications that are only ever used by one department may not be worth it.

Figure 7: DOS ships with built-in shared data marts

Standardizing data creates faster time-to-value and allows for building multiple reports and applications without having to create multiple mappings of the same data. Standardizing creates tighter data governance through controlled data access and data definitions.

DOS has built-in shared data marts around clinical, claims, patient satisfaction, cost, person, and terminology. DOS takes the 200+ data sources mentioned earlier and maps them into these shared data marts, giving access to data that’s aggregated across every data mart. This mapping effectively disintegrates silos by joining clinical and business data from multiple EMR systems and other data sources, such as claims, GL, and patient satisfaction.

Analyzes Data

The term big data describes data in terms of volume, variety, and velocity. But the IT vernacular has introduced a new term, deep data, which is about understanding data. The analyze component of DOS transforms raw data into deep data through a five-step enrichment process.

Delivers Data

DOS’s delivery component gets the right data to the right place at the right time. The EMR’s current model for clinical decision support (CDS) is interruptive. Pop-ups serve as alerts, but don’t provide any underlying data so clinicians can understand the reasons behind the alerts. And there’s no way to correct a misleading alert. DOS delivery focuses on synthesizing data, so clinicians can choose the right path, rather than conforming to an A or B approach.

The model for CDS in DOS is like the Google Maps model. Google Maps shows a recommended route, but also displays alternate routes and travel times. The user can click any route, even if it’s not the fastest or shortest one. The app color codes routes, which helps explain why one route is recommended over another. Users get the benefits of Google Maps, but can still direct it according to their preferences.

This CDS model is built into DOS, as is the technology that can show previously created data and insights directly within existing workflows. Figure 9 shows a sample record of a patient at high risk of opioid abuse. It displays the patient’s risk factors, history, and suggested interventions (actions): all relevant insights for the attending clinician. Any actions are directed back into the workflow of the EMR, keeping all documentation internal. This augments the EMR and delivers the power of deep data and insights to the clinician’s fingertips.

Figure 9: DOS delivers deep data and insights to clinicians at the point of care

Orchestrates

What’s the value of data if analysts can’t find what they are looking for or can’t understand where the data came from? Many organizations try to build their own orchestration tools but end up spending a significant amount of time on these tools instead of focusing on getting business value. DOS provides orchestration tools around creating and managing metadata, enforcing security, and managing all the analytical processes.

Extends

DOS can be extended through open APIs and a marketplace of content, apps, and AI models. This extend component allows customization for every healthcare enterprise.

APIs are important because they accelerate app development. With APIs, developers can customize existing tools and services to build only what they need. DOS ensures that everything that needs to be built is done so within the operating system and connected by an API fabric.

EMR Closed Loop Service API—Show data and insights as worklists or in the patient chart inside EMR workflows.

Machine Learning API—Allows client apps to run any R or Python models via a REST API without needing to install additional infrastructure.
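As a hypothetical example of what calling such a scoring endpoint might look like from a client, the sketch below builds a REST request with Python's standard library. The URL, route, and payload fields are invented for illustration; the real API's schema and authentication will differ:

```python
# Hypothetical client-side sketch of calling a machine-learning scoring
# endpoint over REST. The endpoint URL, route, and field names are invented;
# consult the actual API documentation for real schemas.
import json
import urllib.request

def build_scoring_request(base_url, model_name, features):
    """Construct (but do not send) a POST request to score one patient."""
    payload = json.dumps({"model": model_name, "features": features}).encode()
    return urllib.request.Request(
        url=f"{base_url}/models/{model_name}/score",  # hypothetical route
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_scoring_request(
    "https://example.invalid/api", "readmission-risk",
    {"age": 67, "prior_admits": 2},
)
# urllib.request.urlopen(req) would send it; omitted here (no real endpoint).
```

The point of such an API is that client apps only serialize features and read back a score; the R or Python model runs server-side, so no extra infrastructure is installed on the client.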

DOS Learns Within a Closed Loop

A huge value of DOS is its ability to learn from data within a closed-loop system. After clinicians receive data, there’s the possibility they won’t agree with it or that the data is non-responsive (i.e., it’s read-only data). Data should be alive (i.e., read-write) and be able to transmit insights across the care spectrum. This is what happens in a closed-loop system (Figure 10).

Figure 10: Data is improved through a closed-loop system inside DOS

In the example shown in Figure 10, data is generated when a patient registers. This data is analyzed to see what stands out in the patient’s history. As clinicians make assessments, the patient is assigned to appropriate cohorts and registries. This data feeds back into the system in real time, which continues to update insights even as additional diagnoses feed into the system in real time. Risk predictions determine additional diagnoses or interventions, which are then presented as part of the treatment plan. This process involves different clinicians who interact with the data; the closed loop keeps bringing all the information back around so the data changes and improves over time.

Real-time Data Processing in DOS

Data consumers want their data sooner than the next day, but faster turnaround has been next to impossible. Real-time data processing follows a complex pathway that must receive messages, process them asynchronously, establish data reliability, and map the data from HL7 to a database (Figure 11).

Figure 11: Real-time data processing follows a complex pathway

Real-time data processing is vital to healthcare, but too difficult for healthcare systems to develop on their own. DOS provides a four-step real-time processing solution:

Point the source (e.g., EMR) HL7 feed to the DOS endpoint.

Use an HL7 content starter set to automatically map data into database entities for quick time-to-value.

Data shows up in your database, already transformed.

Edit the HL7 content starter set to customize.

Installing an interface engine isn't necessary: asynchronous processing, high availability, and deduplication are all built in. Data is already converted to XML, so it is easy for applications to consume. Applications can also subscribe to the queue server and receive processed messages in real time.
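To give a feel for the mapping step, the toy function below pulls two PID fields from a pipe-delimited HL7 v2 message into a flat, database-style record. A real interface engine or starter set handles vastly more (encoding rules, repeating fields, escape sequences, acknowledgments):

```python
# Toy illustration of mapping a pipe-delimited HL7 v2 message into a flat
# record. Field positions follow HL7 v2 numbering: PID-3 is the patient
# identifier list, PID-5 is the patient name (family^given).

def parse_hl7_patient(message):
    """Extract a few PID fields from an HL7 v2 message string."""
    record = {}
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            record["patient_id"] = fields[3].split("^")[0]
            family, given = fields[5].split("^")[:2]
            record["name"] = f"{given} {family}"
    return record

msg = "MSH|^~\\&|LAB|HOSP|||202401150830||ADT^A01|123|P|2.5\rPID|1||44556^^^HOSP||DOE^JANE"
parse_hl7_patient(msg)  # {'patient_id': '44556', 'name': 'JANE DOE'}
```

Mapping dozens of message types and hundreds of fields this way, reliably and in real time, is exactly the work the built-in starter set is meant to spare a health system.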

DOS Attributes and Benefits

DOS is a complete analytics ecosystem that has seven key attributes:

Reusable clinical and business logic: Registries, value sets, and other data logic lies on top of the raw data and can be accessed, reused, and updated through open APIs, enabling third-party application development.

Streaming data: Near-real-time or real-time data streaming from the source all the way to its expression through DOS, supporting transaction-level data exchange or analytic processing.

Integrates structured and unstructured data: Integrates text and structured data in the same environment. Eventually, it will incorporate images, too.

Closed-loop capability: The methods for expressing the knowledge in DOS include delivering that knowledge at the point of decision making; for example, back into the workflow of source systems, such as an EMR.

Microservices architecture: In addition to abstracted data logic, open microservices APIs exist for DOS operations, such as authorization, identity management, data pipeline management, and DevOps telemetry. These microservices also enable third-party applications to be built on DOS.

Agnostic data lake: Some or all of DOS can be deployed over the top of any healthcare data lake. The reusable forms of logic must support different computation engines (e.g., SQL, Spark SQL, SQL on Hadoop, et al.).

DOS’s attributes generate many benefits that impact leaders throughout the healthcare organization:

IT leaders—Meet the increasing organizational demands for deep data and predictive analytics through the use of SQL, Big Data, and AI capabilities, without building huge infrastructures.

Clinical leaders—Get real-time insights using all of the patient’s data without leaving their workflow, enabling them to interact more with their patients.

Financial leaders—Access deep data (clinical, costing, and operational) across the entire care delivery system to thrive in a world of decreasing revenue and increasing risk contracts.

Health system leaders—Avoid having to rip and replace expensive technologies to get the deep data they need to succeed.

Independent software vendors—Eliminate the frustration of waiting for access to clinical data and the insurmountable task of getting their tools integrated into clinical workflows.

DOS enables many applications and services, which sit on top of the operating system (Figure 12). Several cross-cutting analytic applications and vertical, use-case driven applications facilitate many healthcare programs. Health Catalyst builds some of these applications, but third parties can also build their own applications. In addition, Health Catalyst provides a line of strategic consulting services to implement the technologies and ongoing operating system improvements.

Figure 12: DOS-enabled applications and services

DOS Is Built with Reliable and Proven Components

Components built over the last decade, and now available in DOS, have shown value in millions of dollars saved through process and outcomes improvement (Figure 13).

Figure 13: Components of DOS have driven outcomes improvements for many healthcare systems

Healthcare Needs DOS to Unlock Healthcare Data

Healthcare needs a new IT model to accommodate a rapidly evolving data landscape. EMRs, raw transactional data, and data lakes are useful, but no longer sufficient. The DOS approach is the only way to unlock healthcare data.

DOS easily acquires data from more than 200 data sources, organizes it with a built-in analytics platform, leverages standard and custom data models, accepts real-time data, and brings report logic into a common layer. DOS conveys data to the EMR and other workflow tools so clinicians can act on it. The closed-loop system keeps the data live and updated, and APIs extend the operating system.

Leaders across the healthcare enterprise can benefit from DOS. DOS also benefits independent software vendors, giving them access to clinical data through built-in APIs, so they can easily integrate data into clinical workflows.

Despite its huge scope and vast capabilities, DOS can get up and running quickly, delivering value within just a few months. DOS is a foundational and transformational system that has the potential to change the healthcare economy and massively improve financial, clinical, and operational outcomes.


Resolving Uncompensated Care: Artificial Intelligence Takes on One of Healthcare’s Biggest Costs

What’s one action a health system can take to significantly improve its bottom line? Collect unpaid balances from patients for healthcare services (uncompensated care); by doing this, systems stand to curb one of their highest costs.

This article will explain how artificial intelligence (AI) powers solutions that help organizations reduce their rates of uncompensated care and improve the patient experience. Known as propensity-to-pay tools, these solutions integrate with workflow tools to identify the likelihood that a patient will pay a balance, and then assign appropriate ways for finance teams to reach out to the patient.

Uncompensated Care Can Cost Health Systems Billions Annually

Uncompensated care includes bad debt (balances that can’t be recovered) and charity care (healthcare provided for free or at reduced costs to low-income patients) for self-pay patients. Self-pay patients either don’t have health insurance or have a balance due that their insurance doesn’t cover (due to coinsurance, deductibles, or services their policy doesn’t cover).

For health systems, the economic implications of uncompensated care are significant. For example, one regional health system wrote off $350 million to bad debt in 2016. In 2015, a larger health system reported bad debt of over $3 billion. Healthcare consumers are also feeling the impact of high-deductible plans and large out-of-pocket expenses: medical debt is the number one cause of bankruptcy in the U.S.

Uncompensated Care Is Not Going Away in the Era of High-Deductible Health Plans

As high-deductible health plans become increasingly common, and insurers allow people to sign up for them without determining whether they can afford the out-of-pocket costs, health systems are experiencing considerable growth in uncompensated care.

According to the Kaiser Family Foundation and Health Research & Educational Trust’s 2015 Summary of Findings, the proportion of U.S. workers with high-deductible insurance plans grew from 13 percent in 2010 to 24 percent in 2015. During that same period, the patient’s annual financial responsibility rose from $2,713 to $4,955. This trend in high-deductible plans and uncompensated care will continue, and may even become more relevant under at-risk reimbursement models, as organizations take on more financial risk.

Health Systems Need a More Effective Way to Collect on Balances

In addition to the proliferation of high-deductible health plans, another factor behind the massive unpaid balances in healthcare today is that organizations don’t have effective and efficient ways to collect on these debts. Because high-deductible plans place all payment risk on the patient and the health system, organizations must navigate each individual insurance plan, balance, and allowable charges, and then collect on the balance.

Health systems often have departments of dozens to hundreds of team members dedicated to collecting patient balances, or getting patients charity care. But, given the complexity of collecting (described above), these departments can only do so much; their resources are relatively small compared to the balances they have to collect. For example, large organizations may have 250,000 patient balances to collect at any given time and only about 50 people doing the collections work.

The collection process often involves calls and letters to anyone with an outstanding balance, so that people who are likely to pay and those unlikely or unable to pay receive the same treatment. This one-size-fits-all approach can create a bad experience for patients who intend to pay in full, as they’re unnecessarily pursued, while futilely expending resources on patients who are unlikely or unable to pay and would be better served by charity care.

There’s a lot of nuance to an unpaid bill that health systems don’t have a way of understanding (e.g., the patient has a good income, but can’t pay a huge sum out of pocket at once, versus a patient with a low income who’s a candidate for charity care). To effectively reduce uncompensated care, organizations need to be thoughtful about who they contact about collection and how they reach out.

Current EMRs and billing systems employ overly simplistic methods that use generic modes of prioritization, such as sorting collections by highest to lowest balance, oldest to newest balance, or alphabetically by patient name. This blind approach leads to lost revenue and inappropriate treatment of patient balances.
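To make the contrast concrete, here is a minimal sketch (the accounts and propensity scores are hypothetical) comparing a generic largest-balance-first worklist with one ordered by expected recovery:

```python
# Illustrative sketch (hypothetical data): a naive balance-first collections
# queue vs. one prioritized by expected recovery (balance x propensity to pay).

accounts = [
    {"patient": "A", "balance": 12000, "propensity": 0.05},  # large balance, unlikely to pay
    {"patient": "B", "balance": 900,   "propensity": 0.90},  # small balance, very likely to pay
    {"patient": "C", "balance": 4000,  "propensity": 0.60},
]

# Generic EMR approach: highest balance first, ignoring likelihood to pay.
by_balance = sorted(accounts, key=lambda a: a["balance"], reverse=True)

# Propensity-informed approach: order by expected recovery.
by_expected = sorted(accounts, key=lambda a: a["balance"] * a["propensity"], reverse=True)

print([a["patient"] for a in by_balance])   # ['A', 'C', 'B']
print([a["patient"] for a in by_expected])  # ['C', 'B', 'A']
```

The balance-sorted queue spends its first calls on patient A, who is unlikely to pay and may be a charity-care candidate, while the expected-recovery queue surfaces the accounts where outreach is actually worthwhile.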

Even experts in healthcare billing are vulnerable to bad debt. For example, a revenue cycle management consultant on contract at a large integrated health system missed a payment when a bill went to the wrong address and ended up in collections. This billing professional had the desire and means to pay, but health systems don’t have an effective way to determine propensity to pay throughout their populations. If this individual with a high degree of understanding of the billing process can fall through the cracks, it can happen to anyone.

To reduce rates of uncompensated care and improve the patient experience, health systems must have a better way of understanding patients’ likelihood to pay and effective ways to reach out to them.

Artificial Intelligence Propensity to Pay: A More Informed Approach

An AI-driven approach to propensity to pay generates a comprehensive view of a patient’s likelihood to pay by combining external (e.g., census, income, and poverty levels) and internal data (e.g., patient payment performance data that’s unique to that health system and its population). By merging millions of records, using algorithms to derive a patient’s propensity-to-pay score, and seamlessly integrating that knowledge into the organizational workflow, AI gives health systems a pragmatic approach to collecting from patients and driving down uncompensated care.

By understanding propensity to pay, health systems can determine which patients need reminders, which need financial assistance, and whether payment patterns are likely to change over time and after particular events. Organizations can then dedicate resources to the bills most likely to be paid, rather than pursuing balances that are unlikely to be resolved.

Figure 1 shows how the propensity-to-pay program uses algorithms to designate a patient’s propensity to pay and align that patient with an appropriate intervention:

Prioritize collection efforts (high likelihood to pay).

Send to charity team (low likelihood and low ability to pay).

Collect on balance or send to bad debt (low likelihood and high ability to pay).

Figure 1: The propensity-to-pay process.

Five Key Actions in the Propensity-to-Pay Process

AI-driven propensity to pay performs five key actions:

Step #1: Identifies propensity to pay.

An algorithm uses several factors to determine a patient’s propensity to pay: historical payment and demographic information, external socioeconomic data, size of balance, patient age (older individuals are more likely to pay), and many more data points. The propensity-to-pay program then designates corresponding interventions. The integrated propensity-to-pay identification step occurs in the EMR workflow to enable finance departments to better allocate staff and resources and follow intervention recommendations.
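As an illustration only, the factors above could feed a simple logistic scoring model. The weights, thresholds, and category cutoffs below are invented for this sketch and are not the vendor’s actual propensity-to-pay algorithm:

```python
# Hypothetical sketch: a logistic model over a few of the factors named above.
# The weights are made up for illustration; a real model is trained on
# historical payment data unique to the health system.
import math

def propensity_score(paid_last_year_pct, median_zip_income, balance, age):
    z = (
        -2.0
        + 3.0 * paid_last_year_pct          # historical payment behavior
        + 0.00002 * median_zip_income       # external socioeconomic data
        - 0.0001 * balance                  # size of balance
        + 0.01 * age                        # older individuals more likely to pay
    )
    return 1 / (1 + math.exp(-z))           # squash to a 0..1 score

def intervention(score):
    # Illustrative cutoffs mapping scores to the categories described above.
    if score >= 0.75:
        return "no initial intervention; follow up only if payment is missed"
    if score >= 0.40:
        return "targeted outreach (call, payment plan, email reminders)"
    return "screen for charity care or write off to bad debt"

s = propensity_score(paid_last_year_pct=0.9, median_zip_income=60000, balance=500, age=55)
print(round(s, 2), "->", intervention(s))  # 0.92 -> no initial intervention...
```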

Step #2: Designates intervention for patients with low propensity to pay.

Patients with low propensity to pay will receive automated reminders or counseling for financial assistance. Billing staff sees which accounts have the lowest propensity to pay and can write them off to bad debt (preserving resources for more promising accounts). For example, a patient with a low propensity to pay, a high balance, and a history of charity care probably won’t be able to pay. Instead of letting this patient continue to bad debt, the finance team can help immediately by shifting the patient to charity care or government funding.

Step #3: Designates intervention for patients with medium propensity to pay.

Patients with medium propensity to pay will receive interventions targeted to their situations (e.g., a reminder phone call, a recommendation for charity care, etc.). The system further divides medium propensity-to-pay patients into medium-low and medium-high categories, for which developers are currently testing interventions and outreach methods (an example of the continuous improvement that AI enables). For instance, these patients may respond best to an automated payment plan and email reminders.

Step #4: Designates intervention for patients with high propensity to pay.

Patients with high propensity to pay will receive no initial intervention. If these patients don’t pay after a set time frame, they receive a phone call or other targeted intervention (e.g., email or text reminders). If patients with high propensity to pay miss a payment, staff can reach out to help the patients avoid bad debt.

Step #5: Seamlessly integrates with the EMR.

The propensity-to-pay analytics engine works smoothly in the background of the EMR and delivers the categorized outstanding accounts directly to the billing team’s native workflows.

Understanding Propensity to Pay Is Critical in Today’s Healthcare Industry

As high-deductible health plans become increasingly common and financial risk continues to shift to patients and health systems, organizations must have a way to curb growing amounts of uncompensated care. AI-powered propensity-to-pay tools are an effective solution because they combine internal and external patient data, giving finance departments a comprehensive picture of each patient’s likelihood to pay. These predictive models also integrate directly into workflows, giving billing departments immediate access to propensity-to-pay scores across patient populations, along with recommended interventions.

With an intelligent, analytics-driven propensity-to-pay approach, health systems will lose less money on uncompensated care, patients will avoid unnecessary bad debt collection, and those in financial need will get timely help.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Clinical Data Management: 3 Improvement Strategies

They say a sign of an organized mind is a cluttered desk. Either you’re that cluttered genius or you know one in the office. Stacks of files and folders cover the desk, floor, and shelves. Reminder notes stick to any remaining available surface. How you or anyone else ever finds that one needed document remains one of life’s great mysteries.

Another great mystery is how anyone can work with the equivalent clutter in our industry: healthcare data and clinical data management. Like an office with stacks of paper, healthcare must contend with petabytes of data, spread across numerous disparate systems, with more accumulating every second of every day (Figure 1). It’s been estimated that the volume of healthcare data will grow from 153 exabytes, where it was in 2013, to 2,314 exabytes by 2020. Though we are developing the architecture to handle this volume, clinicians would be wise to remember that more data doesn’t always mean more insight. Sometimes more data is just more clutter.

Figure 1: Typical complex healthcare analytics environment
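As a quick sanity check on the projection above, growing from 153 exabytes in 2013 to 2,314 exabytes in 2020 implies a compound annual growth rate of roughly 47 percent:

```python
# Back-of-envelope check on the quoted projection:
# 153 EB (2013) -> 2,314 EB (2020), i.e., seven compounding years.
cagr = (2314 / 153) ** (1 / 7) - 1
print(f"{cagr:.1%}")  # roughly 47% per year
```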

For an industry evolving to value-based care, clinical data management is crucial. Today, to assist an organization in answering its most pressing questions, a healthcare data analyst must navigate a complicated information infrastructure.

The first step in that navigation is to identify the proper data sources. The analyst may work with an IT resource who can help determine which data would be most beneficial to start the analytic process. Collecting raw data from disparate systems is only the beginning. Distilling the information so it focuses on a specific question or strategy enables the analyst to discover more meaningful insights. Leadership, working together with one source of truth, can identify areas for improvement and take quick action to address them.

Problems with Traditional Clinical Data Management

Many hospital departments have developed DIY tools to confront the profusion of data. They use multiple spreadsheets, often containing information that is inaccurate or that conflicts with corporate data, which leads teams to focus on the wrong priorities or miss improvement opportunities altogether. Source data, once identified and captured, often turns into shadow systems, with one spreadsheet linking to another and then another. This spreadsheet mania further complicates the process and introduces unnecessary risk of error.

Changing Analytics Roles and Tools

In today’s value-based care environment, data analysts must be equipped with the right tools to identify performance gaps in the organization and to create actionable recommendations that drive improved outcomes. Providing the proper infrastructure mitigates the risk of compromising data during capture or transfer. Accurate and timely data is crucial to engaging clinicians and gaining their trust.

As Figure 2 shows, analysts spend roughly 80 percent of their time hunting and gathering data as opposed to performing high-value work, such as interpreting data and making recommendations about how to improve the outcomes.

Figure 2: The current, wasteful state of clinical data management versus one in which analysts spend most of their time on value-added activities

Most analysts would rather conduct strategic analyses and contribute to organizational decision making. Today, the approach to clinical data management focuses most analytic resources on the task of hunting and gathering. Creating a process that is efficient, effective, and reliable will enable analysts to spend more time on gleaning insight from the data.

Most leaders also dislike the current approach. The multiple pressures of improving care, reducing cost, and planning for the future require near-real-time data. Submitting a request, being placed in a queue, and waiting weeks for a response is no longer an option.

Three Strategies to Improve Clinical Data Management

So, how do we change the way data is managed within a healthcare organization to ensure the right data is found quickly and that all the analysts are using that data? Here are three strategies to improve clinical data management.

1. Identify the Analysts in the Organization

To align the analysts, a good first step is to simply identify the current analyst pool sprinkled throughout the organization. Finding all the analysts can sometimes be a challenge, but one way is by working with HR to get a list of positions with names similar to “analyst,” “specialist,” or “informaticist.”
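That HR-list approach can be sketched in a few lines; the names and titles below are hypothetical examples of what such an export might contain:

```python
# Sketch: scanning a (hypothetical) HR position export for analyst-like titles.
import re

positions = [
    ("Jane Doe", "Clinical Data Analyst"),
    ("Raj Patel", "Quality Specialist"),
    ("Ana Silva", "Nurse Informaticist"),
    ("Lee Wong", "Registrar"),
]

# Match the title keywords suggested above, case-insensitively.
pattern = re.compile(r"analyst|specialist|informaticist", re.IGNORECASE)
analyst_pool = [name for name, title in positions if pattern.search(title)]
print(analyst_pool)  # ['Jane Doe', 'Raj Patel', 'Ana Silva']
```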

2. Assess Analytic Improvement Opportunities in the Organization

Once the analyst pool is determined, elect a core analyst team responsible for assessing the risk within the organization. Some of the team’s new duties, along with the risks they are certain to uncover, include the following:

Task: Create a report inventory and find the logical “owners” of each report. It is most practical to start with recent reports, pulled during the last year. Reports that haven’t been run in over a year are candidates for archiving. Work with the owners to prioritize the work of combing through each report to document its purpose, rules, tools used, frequency, data sources, formats used, and steps taken to produce it. This process will lead to better documentation AND fewer reports that need to be touched upon system upgrade.

Risks and Challenges: Most organizations are surprised at how many reports have been created (and maintained) over the years. Any given system may have thousands of reports run out of the EMR. Folder security can sometimes force duplication (and probably modification) of the same reports with no audit trail for what is different. Additionally, the report filters and logic are often hidden from the report consumer, making it next to impossible to determine the rules and, ultimately, the “source of truth.”

Task: Gather the analysts to develop a list of core competencies and a program to provide ongoing training and mentoring.

Risks and Challenges: Analytic excellence is tough to measure, but without certain core competencies, there is no way to guarantee high-quality, consistent results.

Task: Assess the degree of silos and the political will to improve alignment.

Risks and Challenges: Analytic silos breed duplication and potential waste. Although this seems obvious, the political climate often complicates remedying this risk. However, it can help inform the type of alignment needed to reduce the risk.

Task: Determine the current method for requesting reports and analytics.

Risks and Challenges: Without formal intake, triage, prioritization, and assignment of work, the reporting and analytic environment becomes the “Wild West.” Without a disciplined process, there is increased risk that staff are not focused on the highest-priority work.

Task: Identify current data governance processes and ownership within the organization. Current data governance might be performed through numerous, disconnected committees, so dig around.

Risks and Challenges: Data governance focuses on managing data from initial capture in a transactional system, such as an EHR or laboratory system, through its aggregation into reporting structures, data stores, and enterprise data warehouses. The intent is to make sure that each step of the health data management process is controlled and the effects of processes on the data are well documented and understood.
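The report-inventory task above lends itself to simple automation. A sketch, using hypothetical report names and last-run dates, that flags archive candidates:

```python
# Sketch: flag reports not run in the past year as archive candidates
# (report names and dates are hypothetical).
from datetime import date, timedelta

today = date(2018, 3, 1)
reports = {
    "ED Throughput Daily": date(2018, 2, 27),
    "Legacy Census 2014": date(2015, 6, 2),
    "Readmissions Monthly": date(2018, 1, 31),
}

archive_candidates = [
    name for name, last_run in reports.items()
    if today - last_run > timedelta(days=365)
]
print(archive_candidates)  # ['Legacy Census 2014']
```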

This initial assessment will clarify the current level of risk and the potential inefficiencies that exist in the analytic organization. This information can be shared with executive leadership to make the case for changing to a new organizational model.

It can be difficult to implement this model as it can be incredibly disruptive to the organization. The political capital needed to build the organization may also be lacking.

An alternative, less dramatic solution is to create a central business intelligence group to serve as a consulting group. This group will offer specialized services to functional areas and produce findings to help determine the organization’s risk and readiness for change.

3. Use a Data Operating System with an EDW as the Framework for Clinical Data Management

Using a data operating system (DOS) with an enterprise data warehouse (EDW) is a critical step toward a robust analytic infrastructure. The EDW becomes a safe, central repository of data that is organized and optimized for measurement, analysis, and reporting. Sure, this is a large effort, but the payoff is significant and long-term.

The true value of the data warehouse is to organize data, provide links across disparate data sources (so the analysts don’t have to), and provide access so analysts and clinicians can “fish for themselves.” Aligning the analysts and developing clear clinical data governance and management policies will strengthen the entire analytics environment.

For example, the patient identifier provides a valuable link between the EMR data, departmental sources, and patient satisfaction data. After the data is pre-linked, it can be organized in a way that allows for fast reporting and visualization. When used correctly, the EDW becomes the one place analysts rely on for information, creating a single source of truth for the entire enterprise.
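A minimal sketch of that pre-linking, using hypothetical records keyed on a shared patient identifier:

```python
# Sketch of pre-linking on a shared patient identifier (hypothetical records):
# EMR encounters and patient-satisfaction surveys joined so analysts don't
# have to stitch the sources together themselves.
emr = {"P001": {"dx": "appendicitis", "los_days": 2},
       "P002": {"dx": "pneumonia", "los_days": 5}}
satisfaction = {"P001": {"score": 9}, "P002": {"score": 6}}

linked = {
    pid: {**emr[pid], **satisfaction.get(pid, {})}  # one merged record per patient
    for pid in emr
}
print(linked["P001"])  # {'dx': 'appendicitis', 'los_days': 2, 'score': 9}
```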

Once an EDW is in place, align the analysts and develop some important clinical data governance and management policies to strengthen the entire analytics environment and ensure the greatest return on investment.

Create a New Model to Improve Care and Deliver Better Outcomes

Creating a new model that supports effective clinical data management requires DOS and an EDW to assemble and coordinate data from across the organization, providing the foundation to improve care and deliver better outcomes.

This streamlined model greatly reduces lead time, enabling analysts to provide strategic insight from the disparate data collected across various systems. Healthcare organizations that embrace the value of data analytics, and harness it to create an effective clinical data management model, will be in the best position to survive and thrive in the new era of healthcare.


PowerPoint Slides

Would you like to use or share these concepts? Download this Clinical Data Management presentation highlighting the key points.

The Four Essential Zones of a Healthcare Data Lake

The evolving healthcare data environment created the need for data lakes, but they are a significant IT investment. Understanding the relationship between an enterprise data warehouse (EDW) and a data lake, as well as the structural components of a data lake—the zones—is fundamental to investing in the right technology with the appropriate financial and human resources.

Why a Data Lake Is Necessary

Figure 1: The human health data ecosystem is large, though we use very little of it for improving outcomes.

To see more of the picture, bring it into focus, and understand what really impacts outcomes, we need genomic and familial data, outcomes data, 24×7 biometric data, consumer data, and socio-economic data. The complete ecosystem of data necessary for massive outcomes improvements will increase the total amount of healthcare data tenfold. According to a 2014 IDC report, the healthcare digital universe is growing 48 percent per year. In 2013, the industry generated 4.4 zettabytes (10²¹ bytes) of data. By 2020, it will generate 44 zettabytes. Unfortunately, this data volume would explode the data warehouse of most organizations. Fortunately, a data lake can handle this volume.

The Benefits of a Data Lake

The benefits of a data lake as a supplement to an EDW are numerous in terms of scale, schema, processing workloads, data accessibility, data complexity, and data usability:

A data lake, typically designed using Apache Hadoop, is the preferred choice for larger structured and unstructured datasets coming from multiple internal and external sources, such as radiology, physician notes, and claims. This removes data silos.

A data lake doesn’t demand definitions on the data it ingests. The data can be refined once the questions are known.

A data lake offers great flexibility on the tools and technology used to run queries. These benefits are instrumental to socializing data access and developing a data-driven culture across the organization.

A data lake is prepared for the future of healthcare data with the ability to integrate patient data from implanted monitors and wearable fitness devices.

The Data Lake’s Strength Leads to a Weakness

A data lake can scale to petabytes of information of both structured and unstructured data and can ingest data at a variety of speeds from batch to real-time. Unfortunately, these capabilities have led to a negative side effect. Gartner’s hype cycle for 2017 shows that data lakes have passed the “peak of inflated expectations” and have started the slide into the “trough of disillusionment.” This isn’t surprising. Often, an industry develops a concept thinking it will solve world hunger, then learns its real-life limitations.

Initially, data lakes were predicted to solve all of healthcare’s outcomes problems, but they have ended up just collecting petabytes of data. Now, data lake users see a lot of detritus that can’t be used to build anything. The data lake has become a data swamp.

Understanding and creating zones within a data lake are the keys to draining the swamp.

The Four Zones of a Data Lake

Data lake zones provide structural governance for the assets in the data lake. To define zones, Zaloni excerpts content from the ebook, “Big Data: Data Science and Advanced Analytics.” The book’s authors write that “zones allow the logical and/or physical separation of data that keeps the environment secure, organized, and agile.” Zones are physically created through “exclusive servers or clusters,” or virtually created through “the deliberate structuring of directories and access privileges.”

Healthcare analytics architectures need a data lake to collect the sheer volume of raw data that comes in from the various transactional source systems used in healthcare (e.g., EMR data, billing data, costing data, ERP data, etc.). Data then populates into various zones within the data lake. To effectively allocate resources for building and managing the data lake, it helps to define each zone, understand their relationships with one another, know the types of data stored in each zone, and identify each zone’s typical user.

Data lakes are divided into four zones (Figure 2). Organizations may label these zones differently according to individual or industry preference, but their functions are essentially the same.

Figure 2: Data lake zones.

The Raw Data Zone

In the raw zone, data is moved in its native format, without transformation or binding to any business rules. Often, the only organization or structure added in this layer is a record of which data came from which source system. Health Catalyst calls those areas in the raw zone source marts. Though all data starts in the raw zone, it’s too vast a landscape for less technical users. Typical users include ETL developers, data stewards, data analysts, and data scientists, who are defined by their ability to derive new knowledge and insights amid vast amounts of data. This user base tends to be small and spends a lot of time sifting through data, then pushing it into other zones.
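Since zones can be created through “the deliberate structuring of directories and access privileges,” a raw zone organized into source marts might be laid out as in the sketch below (the paths are illustrative, not Health Catalyst’s actual implementation):

```python
# Sketch: zones realized as a deliberate directory structure, with the raw
# zone organized into per-source "source marts". Layout is illustrative only.
import tempfile
from pathlib import Path

lake = Path(tempfile.mkdtemp()) / "data_lake"
for zone in ("raw", "trusted", "refined", "exploration"):
    (lake / zone).mkdir(parents=True)
for source in ("emr", "billing", "costing", "erp"):
    (lake / "raw" / source).mkdir()  # one source mart per transactional system

print(sorted(p.name for p in (lake / "raw").iterdir()))
# ['billing', 'costing', 'emr', 'erp']
```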

The Trusted Data Zone

Source data is ingested into the EDW, then used to build shared data marts in the trusted data zone. Terminology is standardized at this point (e.g., RxNorm, SNOMED, etc.). The trusted data zone holds data that serves as universal truth across the organization. A broader group of people has applied extensive governance to this data, which has more comprehensive definitions that the entire organization can stand behind. Trusted data could include building blocks, such as the number of ED visits in a certain period, inpatient admission rates from one year to the next, or the number of members in risk-based contracts.

The Refined Data Zone

Meaning is applied to raw data so it can be integrated into a common format and used by specific lines of business. Data in the refined zone is grouped into Subject Area Marts (SAMs, often referred to as data marts). A department manager looking for end-of-month numbers would query a SAM rather than the EDW. SAMs become the source of truth for specific domains. They take subsets of data from the larger pool and add value that’s meaningful to a finance, clinical, operations, supply chain, or other administrative area.

Refined data is used by a broad group of people, but is not yet blessed by everyone in the organization. In other words, people beyond specific subject areas may not be able to derive meaning from refined data. A SAM gets promoted to the trusted zone when the definitions applied to its data elements have broadened to a much larger group of people.

The Exploration (Sandbox) Zone

Anyone can decide to move data from the raw, trusted, or refined zones into the exploration zone. Here, data from all of these zones can be morphed for private use. Once information has been vetted, it is promoted for broader use in the refined data zone.

Zones and Their Data Definitions

For an example of the data type in each zone, consider length of stay (LOS). There are dozens of ways to define LOS using ED presentation time, admit time, registration time, cut time, post-observation time, and discharge time. The clinical definition of LOS for an appendectomy may be from cut time to discharge time, but the corporate definition may be from admit time to discharge time. A SAM that focuses on appendectomy might choose to use the clinical definition, which doesn’t apply to the global definition (i.e., the definition in the trusted zone). For an individual SAM definition of LOS to be promoted to the trusted zone, it needs to be vetted through a broader group of people to confirm it has universal application.
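A worked example of the two LOS definitions, with hypothetical timestamps for a single appendectomy encounter:

```python
# Worked example of the LOS ambiguity: the same encounter yields different
# lengths of stay under the clinical (cut-to-discharge) and corporate
# (admit-to-discharge) definitions. Timestamps are hypothetical.
from datetime import datetime

admit     = datetime(2018, 3, 1, 8, 0)    # registration/admit time
cut       = datetime(2018, 3, 1, 13, 30)  # surgical cut time
discharge = datetime(2018, 3, 2, 10, 0)

clinical_los  = discharge - cut    # SAM / clinical definition: 20.5 hours
corporate_los = discharge - admit  # trusted-zone / corporate definition: 26 hours

print(clinical_los, corporate_los)
```

Both numbers are correct for their own zone; the point is that the SAM’s definition must be vetted more broadly before it can be promoted to the trusted zone.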

Directors who have financial responsibility over a single line of business may need to evaluate their department’s productivity. They may need to see things a certain way, such as excluding corporate overhead, over which they have no control. This is what makes the SAM more specific to one area. The data definition has been vetted and agreed to by a group of people, though it has yet to reach global agreement.

The Right Technology for the Right Zone

Different technologies can run on top of different zones in a data lake. The data lake itself typically runs on Hadoop, which is optimal for handling huge data volumes. Relational databases like SQL Server are more user-friendly and can serve data to a larger user base. SQL queries can run on top of Hadoop to produce the data marts and SAMs in the trusted and refined zones.

Hortonworks refers to a Connected Data Architecture, in which “data pools need to ensure that connected data can flow freely to the place where it is optimal for the business to get value from it.” Zones may not live on the same data technology. Much of the data will live in a data lake, but more refined zones may have a portion of their data that resides in an EDW or smaller data marts.

Partnering the appropriate technology to each zone lowers the barrier to entry for associated user groups.

Data Lakes Are Integral to a Larger Operating System

Earlier, we said that huge data volumes have turned data lakes into data swamps, a problem remedied through a larger healthcare analytics ecosystem. Some, or all, of a data operating system can be deployed over the top of any healthcare data lake. The Health Catalyst® Data Operating System (DOS) (Figure 3) can index, catalog, analyze, and provide insights from the terabytes of growing data assets in a health system: capabilities that can provide IT departments, clinicians, population health managers, financial leaders, and health system leaders with the knowledge they need to produce massive outcomes improvements.

Figure 3: The Health Catalyst Data Operating System.

DOS enables a data lake to be built with the required governance and meaning added to the data so it is easily organized into the appropriate zones. Data can then be used according to zone by the various data consumers in a health system. DOS also allows data to be analyzed and consumed by the Fabric Services layer to accelerate the development of innovative data-first applications.

The Future of Data Lakes

The volume of healthcare data is mushrooming, and data architectures need to get ahead of the growth. Vast volumes of data will continue to flow into the EDW.

A data lake is required to make data accessible to a subset of ETL developers, data stewards, data analysts, and data scientists. Data lakes allow data to be moved into various zones for experimentation and research, or for customization into shared data marts and SAMs.

To prevent data lakes from becoming mired in the petabytes of data now swamping healthcare, the new architecture presented by the data operating system offers a breakthrough in analytics engineering that can renew the life of a data lake and accommodate the big-bang growth of healthcare data.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Five Solutions to Controlling Healthcare’s Cost Problem

For too long, U.S. hospitals have focused on increasing revenue, volume, and growth. At the same time, the healthcare system has wasted hundreds of billions of dollars on supply chain inefficiencies, variation, service duplication, and suboptimal labor management. Unfortunately, the healthcare cost problem has put the industry in a financial quandary.

Recently, Moody’s Investors Service reported that hospital median operating margins fell to just 2.7 percent in fiscal year 2016, and operating expenses grew faster (7.5 percent) than operating revenues (6.6 percent). To address these margin challenges, healthcare systems need to change their business models to include a focus on costs.

While systems can deploy multiple strategies to impact revenue and volume, traditional methods are antithetical to value-based care. Hospitals can raise prices, but Medicare holds reimbursement increases steady at only one or two percent a year. Commercial payers—where hospitals have traditionally profited—provide increases, but those are increasingly tied to quality measures that put hospitals at greater risk. This risk-based payment structure is expanding with the growth of alternative payment models.

The revenue side of the earnings equation has changed, so hospitals need to change the cost side as well. Over the past 20 years, consumer prices for inpatient services increased 195 percent, prices for outpatient healthcare services increased 200 percent, and prices for all medical services jumped 100 percent. Furthermore, consumer prices for prescription drugs doubled, and prices for nursing homes and adult day services more than doubled during this time. In comparison, consumer prices for all items increased just 50 percent over this same period.

With high-deductible plans, consumers are bearing more and more of the healthcare cost burden. Healthcare systems need to identify the root causes of high costs and implement smart, creative solutions to improve their financial health and prosper under value-based care.

Let’s explore what’s behind healthcare’s cost problem and some ways to get it under control.

The Role of Operating Expenses in the Healthcare Cost Problem

Healthcare providers (hospitals and clinics) may have diminished control over revenues, but they can control hospital operating expenses, which comprise all the costs of taking care of patients: labor, supplies, utilities, equipment, buildings, property, and capital. A few of these expense categories are driving the current cost problems:

The registered nurse workforce, the largest healthcare occupation, is projected to grow to almost 3.2 million by 2024, a 16 percent increase over 2014 levels. Other healthcare occupations (occupational therapy assistants, physical therapists, home health aides, nurse practitioners, and physician assistants) are projected to grow at least 30 percent during that time.

The healthcare industry accounted for 15.8 million jobs at the end of 2016. By 2026, that number is projected to reach well over 23 million, a growth rate faster than any other major industry in the U.S., making healthcare and social assistance the nation's largest employment sector.

In 2000, national health expenditures were roughly $1.37 trillion. In 2015, this number had risen to more than $3.2 trillion; labor is a big chunk of this expense. If salaries increase between two and five percent every year, but reimbursement increases by less, the expense and revenue lines will cross soon. Even though everyone expects annual raises, without change to the overall expense structure, this isn’t sustainable. Healthcare systems must figure out how to work smarter rather than harder, and how to be more productive.
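To illustrate how quickly that gap compounds, here is a back-of-the-envelope sketch (the growth rates and starting figures below are illustrative assumptions, not data from this article, apart from the 2.7 percent margin cited earlier):

```python
# Illustrative only: find the year expenses overtake revenue when
# expenses grow 3% annually but reimbursement grows just 1.5%.
def years_until_crossing(revenue, expense, rev_growth, exp_growth, max_years=50):
    """Return the first year in which expenses exceed revenue, or None."""
    for year in range(1, max_years + 1):
        revenue *= 1 + rev_growth
        expense *= 1 + exp_growth
        if expense > revenue:
            return year
    return None

# A hypothetical hospital with $500M revenue at a 2.7% operating margin
# (the Moody's figure cited above) carries roughly $486.5M in expenses.
print(years_until_crossing(500.0, 486.5, rev_growth=0.015, exp_growth=0.03))
```

Even with only a 1.5-point gap between expense and revenue growth, a 2.7 percent margin evaporates within a couple of years, which is the unsustainability the paragraph above describes.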

The healthcare industry has lost its focus on costs partly because it has pushed hard on the patient experience, access to care, and quality issues; however, hospitals need to be more efficient and productive with their resources. For example, to be more service oriented, hospitals have adjusted the hours of operation for outpatient services, offering flexibility by opening the doors for early morning, evening, and weekend hours to accommodate working people. But then volumes become diluted in the middle of the day. Extended hours are challenging from a financial perspective, even though patient satisfaction scores may be high. Healthcare leaders need to redesign their systems to be more effective in terms of productivity and service. It’s a balancing act to synchronize quality, satisfaction, access, and cost components.

Labor shortages will compound the problem

More than half a million experienced nurses are expected to retire by 2022, and 1.1 million new RNs will have to replace the retirees and fulfill the healthcare needs of a swelling patient population. Physicians and many other clinical disciplines are in this same predicament, though less severely.

Healthcare systems will fight to become the employer of choice. If hospitals can’t expand available resources, they must get them from somewhere else (e.g., contract labor), which will double, triple, or quadruple the cost of labor and ultimately impact prices. Suddenly, the labor expense line will be unmanageable, even with fewer people on staff.

Costs of Employed Physicians

Hospitals employed 38 percent of all U.S. physicians in 2015, a 50 percent increase from 2012. From a strategic perspective, there are many good reasons for employing physicians; doing so captures the revenue stream of their patients and facilitates population health activities. Employing physicians increases the payroll, but it also has benefits, so healthcare systems need to diligently monitor both.

Unintentional Costs of New Technology

Ten to 15 years ago, not-for-profit health systems focused on productivity, but along came new enterprise systems and EMRs, which distorted this focus. Healthcare systems were promised many things that didn’t happen with EMR conversions. Often, the technology was installed without sufficient integration because the process was usually rushed. Clinician workflows and methods weren’t studied and optimized, which resulted in workarounds to accommodate the technology. Healthcare did not realize the efficiency gains typically seen in other industries by installing technology and, in some respects, the technology has cost more money than it has saved.

The Cost Burden of Risk and Reimbursement

The risk and reimbursement structure in healthcare adds to the cost burden. The American Hospital Association reports that health plans and systems are slow to pursue capitated payment plans. As shown in Figure 1, only 29 percent of medical payments were linked to alternative payment models (APMs) in 2016 (e.g., shared savings, shared risk, bundled payments, or population-based payments). Payers often reimburse on a fee-for-service (FFS) basis (43 percent of healthcare dollars in 2016), but put providers at risk for various measures, setting up a greater risk component than has ever existed. For example, payers compare competing hospitals on utilization and reward the better performers; it is in their interest to reduce utilization, thereby reducing claims.

Medicare reimbursements generally fall short of actual costs, so health systems and providers struggle to better match their costs with reimbursement. Medicare has never increased its reimbursement rates to match inflation. Medicare also applies restrictions to many clinical processes. For example, when a patient transfers between services, it must be under the same DRG. Or before patients can transfer to a skilled nursing facility, they must have been in the hospital for at least three days.

Medicare, in general, reimburses less than commercial payers do. One long-time mantra states that hospitals should operate at a cost structure that allows them to at least break even at a Medicare reimbursement rate. And even though commercial rates are rising, they aren't rising as quickly as costs. In some respects, the goal for commercial payers is to lower their reimbursement rates to match Medicare's. They may never quite get there, but every time Medicare makes a significant change in how it reimburses, the commercial world eventually follows suit.

Costs Related to Poor Patient Workflow

Finally, there are many reasons why healthcare systems need to be concerned about inefficient patient flow through their hospitals. Poorly managed flow leads to higher volume and overutilization of emergency rooms and intensive care units. It creates surgery delays and longer lengths of stay, which in turn increase infection rates. Healthcare systems need to do a better job of optimizing operating room schedules, filling exam rooms, and generally streamlining physical asset use and managing capacity.

Five Ways to Control Operating Expenses

While the cost problems appear to be out of control and worsening, healthcare systems can minimize them through creative new approaches to how they conduct their operational, clinical, and financial business.

1. Refocus on Labor Management

Hospitals can become more productive by refocusing on labor, a focus that has been lost in recent years for many reasons. Hospitals need to better match resources to demand, making sure they have the right staff with the right skills for the workload.

2. Manage Employed Physicians

Many hospital systems acquire physician practices and then remain hands off in their day-to-day operations, but it’s important to align physician incentives with the rest of the system. Physicians are accustomed to benchmarking through the Medical Group Management Association, so equivalent internal review is appropriate, which means digging into their operations and inquiring about things like how much support staff is needed. Now that the practice is part of the system, what services are duplicated, what needs to be systematized, and what needs to stay at the practice? For example, physician practices may not be accustomed to collecting revenue based on the policies of a larger healthcare system, so these types of processes need to be standardized. Managing employed physicians applies a new level of discipline where it hasn’t been applied before.

3. Change the Patient Encounter Environment

To adapt to changing risk and reimbursement models, healthcare systems need to evolve how they oversee certain patient types. For example, telemedicine is one way to use technology to modify resources, allow centralization, and apply specialized skills. And regulations are changing to accommodate greater reimbursement for telemedicine.

In 2015, Kaiser Permanente handled 59 million patient encounters through its online portals, virtual visits, and apps. That was the first year virtual encounters outpaced in-person encounters. Telemedicine, along with email and other connectivity with clinicians, can take healthcare to a whole new level, reducing the need for office visits. Clinicians can still treat patients appropriately using technology (e.g., conducting exams over video), which changes the workflow of how a patient accesses the system and receives service.

The healthcare Internet of Things is also changing the patient encounter environment. This market is projected to reach $117 billion by 2020 with more than 25 billion connected devices. Wearable monitors, smartphone diagnostic applications, and remote scanning and imaging devices, to name a few, represent new technology that holds potential for reducing operational and clinical costs.

4. Augment Standard Approaches with Technology to Control Costs

Many healthcare systems don’t truly understand the costs of the care they provide. Minimizing variation and standardizing clinical care delivery positively impacts costs. Tools like the CORUS cost management suite, can be used to understand the true cost of providing care across the continuum and can relate those costs to patient outcomes. CORUS gives providers the ability to see clinical activity data at a very granular level.

While standardization is critical to managing costs, clinicians may not react favorably to the notion of a cookbook approach, even though most of them already practice consistency and standardization when they treat patients. Clinicians follow specific steps during every exam, process patient variables, develop diagnoses, and recommend treatments. Clinicians conduct many standard processes for each patient, with appropriate deviation when necessary.

Clinicians tend to be driven by data and competition. Show them data that compares their performance to their peers with quality outcomes, and they will do everything in their power to end up at the top of the list. This is a classic tactic for getting this very important stakeholder group motivated toward understanding and controlling costs for improving outcomes.

Healthcare systems can also do more to improve the bottom line by better managing their revenue cycle. Systems must improve what is commonly referred to as revenue integrity: their ability to appropriately document a medical bill, justify it, and collect it. This boils down to how well the system adheres to clinical documentation improvement (CDI), following the premise that if services are not well documented, then they weren’t rendered and cannot be appropriately reimbursed. As the American Health Information Management Association (AHIMA) says, “Successful clinical documentation improvement (CDI) programs facilitate the accurate representation of a patient’s clinical status that translates into coded data. Coded data is then translated into quality reporting, clinician report cards, reimbursement, public health data, and disease tracking and trending.”

5. Manage Patient Access and Flow Through the Healthcare System

Historically, wherever patients have accessed healthcare—in the ER, the ambulatory setting, or the inpatient setting—is where they have received treatment. Over the past few years, however, this has been better controlled by case management workers who focus on patient navigation and reducing length of stay. Improved control puts healthcare systems on the right path to managing demand.

Healthcare systems must take a business approach to examining capacity, controlling where patients should be and what services they should be receiving. This control also depends on the disease state of the patient. If someone presents with cardiac issues, the protocols are well understood almost all the time. If patients aren’t where they should be, then case managers can place them in the right environment (e.g., specialist’s office or urgent care) where they aren’t using resources needed for emergency situations.

There is variation in the pathways patients follow through a hospital, but there is an expected or typical approach and use of services that each patient needs. Some deviation from the expected is legitimate; for example, the patient presented with comorbidities that needed to be addressed. But sometimes, deviation exists just because something has always been done a certain way. Providers need to use their tools and technology for treating patients of certain disease types in more standardized ways, and then deviate when a patient requires it. Then providers can manage resources more from a capacity perspective rather than just trying to fill beds, and better influence what resources are being consumed in the delivery of care.

Recognizing the Healthcare Cost Problem Is the First Step Toward Solving It

Healthcare systems previously relied on payment increases to meet bottom-line needs, particularly from commercial payers, but now the expense trend is growing faster than the payment trend. The United States already spends more on healthcare per person than any other country, and increasing costs are only putting U.S. healthcare consumers at greater risk.

Healthcare has a cost problem, but the solutions exist. Healthcare systems need to recognize the problem, identify the problem’s sources, and then begin the improvement journey. Systems must pay close attention to the increasing costs of labor, labor shortages, acquiring physician practices, technology implementations, and increasing risks accompanied by decreasing reimbursements. But recognition is only half the battle. Adopting the appropriate technology, embedding the required expertise, recruiting targeted human resources, and spreading best practices all combine to win the cost war.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Hospital Revenue Cycle Management: 5 Ways to Improve
https://www.healthcatalyst.com/hospital-revenue-cycle-opportunities
Wed, 07 May 2014

With cash flows declining, margins tightening, and bad debt increasing, it’s more important than ever to maintain a steady stream of income for your hospital. This goal may seem difficult, but there are many opportunities to significantly improve your revenue cycle management.

As a long-time executive in the healthcare finance realm, I’ve spent years implementing financial solutions for various health systems. Along the way, I’ve discovered a few tried and true solutions. While it’s important to choose the appropriate information system components based on a health system’s size and to educate the staff on the ins and outs of managing revenue, the following five suggestions will also help improve your revenue cycle management.


1. Trend and Benchmark Your Healthcare Data

We all know the maxim: you can't manage what you don't measure. This is true with healthcare data as well. If you start trending and benchmarking your data, you'll be able to identify areas to improve your revenue cycle.

The best way to trend your data is to use the healthcare data operating system (DOS). DOS combines data from your disparate IT systems — clinical, financial, human resources, patient satisfaction scores, and more — and optimizes it for analysis. As you review the data, you’ll be able to use visualizations, which will help you better understand the results as you drill down and discover the root cause of certain trends. When trending, be sure to look at both hospital and physician billing data for a full picture of your revenue cycle.

To find out how you compare to other health systems, you can purchase benchmarking data. An economical option is HFMA’s (Healthcare Financial Management Association) MAP (Measure, Apply, Perform) initiative, which sets the industry standard for revenue cycle excellence. If you’re an HFMA member (a nominal fee), you can access benchmarking data through MAP.

2. Use DOS to Mine Your Healthcare Data

DOS is a powerful solution that enables you to mine your data, discover roadblocks, and determine how to improve. Get a good analyst to go behind the scenes and see what the data reveals — and then work to implement change in your organization. This process must be done in a positive atmosphere. To illustrate: I was helping a client who had a lot of denials that were significantly affecting their revenue cycle. We analyzed their work queues and discovered that many of the queues weren’t being touched; outstanding bills were just sitting there. I loved the response of the manager; he simply said, “How can we improve?” There was no blame game, just an atmosphere of improvement. After doing a workflow analysis, we discovered the roadblocks, set up new queries to automate the manual work, and created exceptions to monitor. With DOS, experienced analysts, and a culture of improvement, this process was easy.

3. Constantly Ask Frontline Staff for Suggestions

After you’ve identified a roadblock, consult your frontline staff for suggestions about the best way to solve the problem. Then, continually check back with them for additional suggestions. Reexamine your workflow on a periodic basis, and ask the following questions:

What workarounds have you put in place?

Are the workarounds still needed?

What could you do to eliminate roadblocks and workarounds?

Frontline staff have a lot of insight to share — and making them active participants in a culture of improvement helps to guarantee sustainable change.

4. Monitor All Payer Contracts

With margins growing tighter, monitoring your contracts and communicating clearly and frequently with payers can make all the difference. Make sure payers aren’t under-reimbursing you, denying too many claims, or putting unreasonable demands on your patient accounting office. Use your analytics system to easily measure the yield. Is it changing over time? What should it be? As we move to value-based purchasing, such monitoring will become far more difficult — and an analytics system absolutely necessary.
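As a rough sketch of that yield measurement (the payer names, field names, and claim amounts below are all hypothetical):

```python
# Hypothetical claims records: contracted (expected) vs. actual payments.
claims = [
    {"payer": "PayerA", "expected": 1200.0, "paid": 1150.0},
    {"payer": "PayerA", "expected": 800.0,  "paid": 800.0},
    {"payer": "PayerB", "expected": 500.0,  "paid": 400.0},
]

def yield_by_payer(claims):
    """Yield = total paid / total contracted amount, per payer."""
    totals = {}
    for c in claims:
        exp, paid = totals.setdefault(c["payer"], [0.0, 0.0])
        totals[c["payer"]] = [exp + c["expected"], paid + c["paid"]]
    return {p: round(paid / exp, 3) for p, (exp, paid) in totals.items()}

print(yield_by_payer(claims))
```

Trending this ratio over time per payer makes under-reimbursement visible: in this toy data, PayerB's 0.8 yield would be the flag to investigate.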

5. Maintain Convenient and Caring Touchpoints with Patients

Use any touchpoint you have with patients as a public relations possibility. Every person who registers or schedules a patient should have skills to do their job efficiently and with a caring attitude. By making it convenient and pleasant for the patient to do business with you, you are far more likely to get correct and complete registration and insurance information. It also pays to offer a well-designed, branded patient portal to increase patients’ satisfaction and give them the opportunity to pay bills online.

Here is a brief example. I worked with a physician group that was having problems with increasing collection time in accounts receivable. We discovered that the root of the problem was a poor process at the registration desk. By simply scripting questions for the registration staff, we were able to decrease collection time in accounts receivable, thus increasing cash flow, while also continuing to provide a good first impression for patients.

The healthcare industry continues to welcome advances in technology, but data analysts and architects know that better IT tools alone won’t help organizations achieve the Quadruple Aim (enhancing patient experience, improving population health, reducing costs, and reducing clinician and staff burnout). Instead of relying disproportionately on tools, healthcare organizations will reach the Quadruple Aim by cultivating a rich analytics ecosystem—one with a synergy of technology, highly skilled people in analyst roles, and an organization that promotes interoperability.

The woodworking industry provides a straightforward example of a productive ecosystem. The tools and raw materials are different in a woodshop compared to a healthcare analytics ecosystem, but the goal is the same: both entities work to transform something rough and undefined into a valuable finished product.

Woodworking and Healthcare Analytics: Both Thrive on Synergy

Woodworking runs as an ecosystem built on a synergy between sophisticated tools, the people who operate them, and an efficient and dynamic process:

In both woodworking and analytics, interoperability is the seamless workflow from station to station or step to step. Interoperability is a function of the shop layout or the analytics organizational structure. If the layout enables projects to progress smoothly between stations, the workflow is efficient and accurate.

Even the most advanced woodworking tools have an operator (just like healthcare IT tools need a skilled data analyst). Both woodworking and data tools are automated to reduce avoidable human error, but neither is designed to operate without human supervision.

More than Tools: Health Systems as Analytics Ecosystems

To thrive in a value-based, analytics-driven environment, today’s healthcare organizations must function as ecosystems. Like a woodshop, the analytics ecosystem includes a community and its environment functioning as a unit, with each contributor’s strength (whether human or technological) affecting the end product.

Healthcare IT risks isolating itself from the analytics ecosystem by focusing too heavily on advances and new tools, and not enough on the people with the skills to effectively leverage these exciting technologies and the environments in which they work. Like a woodshop without tool operators and an efficient layout, the most advanced analytics tools are useless without skilled people to run them and an organization that supports their work. Five main parts make up the analytics ecosystem:

Analytics Ecosystem Part One: Must-Have Tools

Human capital is paramount in the analytics ecosystem, and these highly skilled team members need the right tools to turn raw data into actionable insights. Fortunately, the must-have tools in the analytics ecosystem are foundational technologies that many health systems already have:

An EMR to document the care delivered to a patient.

A costing tool to understand the actual cost of that care delivery.

A patient satisfaction tool to capture how the patient perceives the overall care experience.

A billing and accounts receivable (AR) tool to bill for services rendered and collect and document payment received.

A data operating system (DOS) to empower the data from the previous four transaction systems and put that information onto a common platform that can be used for analytics and more.

IT Tools Are Important; The Inordinate Spend on Them Is Not

Health systems must have the five technologies described above to build out their analytics ecosystem; they don’t, however, have to spend an inordinate portion of their IT budgets to do this. Organizations that spend too much on certain tools end up with an imbalance in their analytics ecosystem and executive pressure to maximize that investment; this can lead to misguided recommendations on how the organization uses the technology. Inordinate spending can also lead the organization to neglect other technologies and the people who support them.

Inordinate spending is a common pitfall of EMR rollouts, as some organizations place more importance on the EMR than other tools. While the EMR is critical to care delivery and improvement, it should be a part of the analytics ecosystem, not the entire ecosystem.

Analytics Ecosystem Part Two: People and Their Skills

The human side of the analytics ecosystem—the people, with their technical skills and contextual understanding of issues and challenges—operates the tools in pursuit of outcomes improvement. These people need five technical skills to drive sustained outcomes improvement:

Data query.

Data movement.

Data modeling.

Data analysis.

Data visualization.

But technical skills alone provide limited value; they need to be coupled with the knowledge of where to find the multiple rich data narratives that surround a patient encounter (e.g., EMR, costing, and claims data). Add to the skills a deep contextual understanding of what the analytics are measuring, and the skills gain extraordinary value.

#1. Data Query

Data query, most commonly written in structured query language (SQL), is how analysts interrogate the data stored in an organization’s transaction systems. Data query allows analysts to explore the relationships between data stored within transaction systems and to establish custom relationships across different transaction systems (e.g., within an EDW or big data in data lakes).

With data query, analysts can move beyond the predefined structures a transaction system comes with, and begin to answer personalized questions about the transaction data. The ability to access and manipulate data within their own systems gives organizations more control over their analytics future.
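A minimal sketch of such a personalized question, using an in-memory SQLite database as a stand-in for a transaction system (the table, columns, and diagnosis data are invented for illustration):

```python
import sqlite3

# In-memory stand-in for a transaction system's encounter table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE encounters (patient_id TEXT, dx_code TEXT, dept TEXT)")
conn.executemany(
    "INSERT INTO encounters VALUES (?, ?, ?)",
    [("p1", "E11.9", "endo"), ("p2", "I10", "cards"), ("p1", "E11.9", "fp")],
)

# A personalized question the vended system's canned reports may not answer:
# how many distinct patients carry a type 2 diabetes diagnosis code?
row = conn.execute(
    "SELECT COUNT(DISTINCT patient_id) FROM encounters WHERE dx_code LIKE 'E11%'"
).fetchone()
print(row[0])
```

The point isn't the specific query; it's that an analyst with SQL access can ask questions the transaction system's predefined reports never anticipated.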

#2. Data Movement

Data movement refers to the extract, transform, and load (ETL) portion of the technical work. It has two objectives:

It brings together multiple data narratives from disparate sources that previously have not talked to one another. Analysts can tell the right story or a more complete story than they can with data from a single source.

It makes data more accessible. Data movement is one of the most expensive parts of the analytics workflow, so users only want to move data when doing so makes actionable information accessible, and to do so efficiently. Because the effort required for data movement is a huge source of waste and frustration for the majority of healthcare analysts, health systems should embrace technologies and practices that minimize the data movement required.
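A toy extract-transform-load pass might look like the following, assuming two hypothetical source extracts that share a patient identifier:

```python
# Extract: records from two disparate source systems (hypothetical data).
emr_rows = [{"patient_id": "p1", "dx": "heart failure"},
            {"patient_id": "p2", "dx": "hypertension"}]
cost_rows = [{"patient_id": "p1", "cost": 8200.0},
             {"patient_id": "p2", "cost": 1100.0}]

# Transform: join the two data narratives on the shared key so the
# clinical and cost stories about each patient sit in one record.
costs = {r["patient_id"]: r["cost"] for r in cost_rows}
merged = [{**r, "cost": costs.get(r["patient_id"])} for r in emr_rows]

# Load: a real pipeline would write these rows to the analytics platform;
# here we simply expose the combined records.
print(merged)
```

Real ETL adds validation, incremental loads, and error handling, but the shape is the same: sources that previously never talked to one another end up telling one story.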

#3. Data Modeling

Data modeling takes a real-world concept and builds a virtual proxy for it in a database. For example, how might a database represent someone as having diabetes? What would the database use in terms of data to represent such a cohort? Are there specific codes that could be leveraged, such as international classification of diseases (ICD) 9/10 or current procedural terminology (CPT) codes that would qualify or exclude someone from the registry?

Best practice in data modeling stores logic at the database level, making it visible and accessible to those in the organization who need it (versus performing data modeling in Excel, where the logic is visible only to the analyst who created it). Making logic available to subject matter experts for each measured domain, for example, helps health systems design more accurate and robust data models. Transparent logic also accelerates much-needed engagement from health system professionals: direct visibility allows them to inspect the logic in the data models and then trust the resulting measurement.
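The diabetes-registry logic described above might be sketched as follows (the code lists are abbreviated and illustrative, not a clinically complete or validated definition):

```python
# Illustrative, abbreviated inclusion/exclusion logic for a diabetes
# registry. A production model would store this at the database level
# and cover far more ICD-10 codes, vetted by subject matter experts.
INCLUDE_ICD10 = {"E10.9", "E11.9"}   # type 1 and type 2 diabetes, unspecified
EXCLUDE_ICD10 = {"O24.419"}          # gestational diabetes excluded

def in_diabetes_registry(patient_codes):
    """A patient qualifies with an inclusion code and no exclusion code."""
    codes = set(patient_codes)
    return bool(codes & INCLUDE_ICD10) and not (codes & EXCLUDE_ICD10)

print(in_diabetes_registry(["E11.9", "I10"]))      # qualifies
print(in_diabetes_registry(["O24.419", "E11.9"]))  # excluded
```

Keeping the inclusion and exclusion sets as named, visible objects (rather than buried in a spreadsheet formula) is the transparency the paragraph above argues for.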

#4 Data Analysis

Data analysis is about making information accessible to the right people at the right time. Data analysis relies on data query, movement, and modeling. Meaningful analysis requires deep contextual understanding of the processes being measured. Analysis without contextual understanding puts an analytic effort at risk of long-term credibility issues.

#5 Data Visualization

Most healthcare professionals interact with an analytics platform through some sort of visualization, such as reporting, key performance indicators (KPIs), dashboards, or ad hoc reports. Data visualization is the vehicle for broad analytics adoption within an organization. Most people who want information to help them do their job don’t have the ability, time, or interest to do data query, movement, modeling, or analysis.

The visualization step uses underlying data models with their embedded cohorts, their rules of inclusion or exclusion, and their associated metrics. Content is more important than how the visualization looks. Visualizations are only as good as the understanding and the trust of the underlying data.

Figure 1 shows the technical skills (data query, movement, modeling, and analysis) along the X axis and the level of proficiency in each skill along the Y axis. The colored bars represent different skill rankings: blue for a senior analyst, red for a midlevel analyst, and green for a junior analyst.

After establishing an understanding of the technical skills analysts need to be effective, organizations must understand the analytics work stream. Figure 2 shows the differences between the various analytics work streams (prescriptive, descriptive, and reactive). The rise along the Y axis represents an increase in analytics complexity; the X axis is a compound scale, combining the technical skills described above with a contextual understanding of how analysts will use that information.

Figure 2: Differences between various analytic work streams.

Reactive Analytics

Reactive analytics show counts of activities or lists of patients. They can include basic calculations on industry-accepted metrics, such as diabetes admissions. Reactive analytics answer anticipated questions; for example, a family practice physician wants to know how many patients with diabetes are on a panel.

Analysts can make minor adjustments to the look and feel of a reactive analytics report; that’s the extent of customization, however, because reactive analytics are generally confined to a single source of data. Reactive analytics fall short when the report writer doesn’t have a solid contextual understanding of what the analytics are measuring and why it matters. Reactive analytics explain some of what has happened or what is happening, but they don’t explain the why—that requires another level of complexity (descriptive analytics).

Descriptive Analytics

Descriptive analytics get users much closer to addressing not just what is happening, but also why it happened. Analysts perform descriptive analytics outside the vended systems—often within a dedicated analytics environment, such as DOS or big data environment. Descriptive analytics leverage highly customizable data models. These data models are populated with multiple sources of data (an EMR, claims, external lab, professional billing, etc.), and the models are organized around a common domain. For example, data provisioning and integration efforts in a DOS platform allow a more comprehensive view of the activities within the entire health system. Work in this arena is an ongoing and iterative process.

For example, if a health system is motivated by regulatory penalties to reduce heart failure readmissions, it can look to the CMS explicit definition of cohort and readmission criteria. Descriptive analytics leverages that CMS construct to determine what gets loaded into the data model. Clinicians will scour and approve available sources to capture: diagnosis codes, admit discharge codes, readmission windows, and patient types (as defined by CMS). The analyst will integrate that data together within a custom data model.
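A simplified readmission check in the spirit of that construct might look like this (the 30-day window follows the standard CMS measure; the admission dates are fabricated):

```python
from datetime import date

# Fabricated index stay and a subsequent admission for one patient.
admissions = [
    {"admit": date(2018, 1, 3),  "discharge": date(2018, 1, 8)},
    {"admit": date(2018, 1, 25), "discharge": date(2018, 1, 29)},
]

def is_30_day_readmission(index_discharge, next_admit):
    """True when the next admission starts within 30 days of discharge."""
    return 0 < (next_admit - index_discharge).days <= 30

print(is_30_day_readmission(admissions[0]["discharge"], admissions[1]["admit"]))
```

The real CMS measure layers on cohort diagnosis codes, planned-readmission exclusions, and patient-type rules, which is exactly why the data model above has to be built and approved collaboratively with clinicians.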

Prescriptive Analytics

Prescriptive analytics make it clear that an issue warrants intervention. Once analysts understand the root cause of an issue, they can begin to identify interventions for meaningful change. Sustained prescriptive analytics require technical and domain experts working side by side on permanent teams.

Each analytics work stream relies on a certain combination of the core technical skills of data query, movement, modeling, analysis, and visualization. Figure 3 shows the required technical skill by analytics work stream.

Figure 3: Required technical skill by analytics work stream.

Certain skills are associated with each work stream. In Figure 3, each bar color corresponds to a skill listed below the graph: data query, movement, modeling, analysis, and visualization.

The graph makes three important points about technical skills in the analytics work stream:

Reactive analytics are an entry point along this analytics continuum.

Skill levels vary across the analytic streams, but all skills are necessary to effectively accomplish the work.

Analysts need high levels of both skill and contextual understanding to effectively prescribe a course of action using analytics.

Matching skill with expected labor output is critical to maintaining workflow. Problems arise when skill and output are mismatched, whether team members aren't challenged or don't have the skill necessary to accomplish the work. Team members who aren't challenged may become disengaged because they're accomplishing tasks too easily.

A large integrated health system recently worked with a systems vendor to assess its intake process for analytics requests and its organization of analytics work. The health system findings were similar to those from other comparable organizations:

In the reactive analytic space, the technical staff was performing work well beneath its abilities.

In the descriptive space, the team was attempting work largely beyond its abilities.

The skill gap in the prescriptive space was even more evident.

Analytics Ecosystem Part Five: Interoperability

Interoperability is a byproduct of the analytic work streams (the analytics equivalent of the wood shop layout). Each work stream requires skilled labor that understands not only its own role, but also its role relative to the surrounding work streams. That understanding can only happen if the leadership responsible for the overall analytics space appreciates the full analytics continuum: the reactive, descriptive, and prescriptive spaces.

Interoperability is a function of the tools and the work stream layout. If leadership fails to identify and stitch together the seams of these analytics work streams, the teams will forever run up against one another in a competitive way, killing analytics interoperability. The layout of the work streams should reflect years of combined experience and months of deliberate planning.

The Analytics Ecosystem: Four Key Takeaways

To thrive in an analytics-driven healthcare environment, organizations must understand four key aspects of the analytics ecosystem:

Organizations must have all three parts of the analytics ecosystem: tools, people, and skills. The total cost of ownership for the ecosystem must consider all three elements. If an organization can only afford the tool without the operators or without investing in growing the operators’ skills, then purchasing that tool will not provide meaningful ROI.

EMRs and EHRs are not competitors in the analytics ecosystem space; they are necessary and complementary tools. By integrating clinical, costing, and financial data with patient satisfaction datasets, the analytics environment will earn quick dividends.

Data analysis is a top priority, and organizations need to ensure other technical skills are present. Support continuous analysis for continuous improvement.

Though finding the skilled labor with data query, movement, modeling, analysis, and visualization skills is challenging and expensive, this investment is imperative to maintain a healthy analytics ecosystem.

In the Analytics Ecosystem, Technology and People Work Together to Change Lives

An effective analytics ecosystem is made up of must-have tools; qualified people with the right skills; reactive, descriptive, and prescriptive analytics; technical skills matched to analytics work streams; and interoperability. Organizations that understand the value of, and work hard to implement, this analytics ecosystem will change lives for the better by making high-quality information available to clinicians and caregivers.

Like the woodshop, a successful healthcare analytics program relies on not only advanced technology, but on the synergy of tools, skilled people, and an organization that promotes interoperability. Tools don’t build cabinets; people do (using tools). Likewise, analytics platforms don’t produce actionable insights; people use these systems to derive knowledge that transforms healthcare.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

In healthcare, there is room for improvement in the communication between care teams. Communications gaps are thought to account for a significant portion of adverse events, particularly during handoffs (such as from the inpatient to ambulatory setting). The Joint Commission has estimated that handoff miscommunication is responsible for 80 percent of serious medical errors, raising the need for tools and practices that support improved communication.

A communications framework that outlines essential elements can help clinicians convey important information and is a key component of a care management plan. One such tool, SBAR (Situation, Background, Assessment, Recommendation), establishes common expectations about the information clinicians share and how they structure the communication. SBAR’s standardized communications approach also helps nurses and physicians build trust around sharing information, and creates open and structured communication between members of the care team.

This article explains how an integrated, effective communication methodology, such as SBAR, can help health systems avoid medical errors and improve outcomes and the patient experience.

An Effective Healthcare Communication Strategy Delivers Critical Information in a Timely Way

Doug Bonacum, the vice president of safety management for Kaiser Permanente in Denver, developed SBAR in an effort to bridge the communication gap between nurses and physicians. Bonacum based the guidelines for consistent and concise handoffs on his experience on a Navy submarine, where crew members often needed to communicate strategy in less than 60 seconds.

As on a submarine, healthcare information handoffs tend to involve complex messages, and the accurate and timely delivery of those messages can be a matter of life or death. As clinicians see more patients in shorter time frames, quality care increasingly relies on an effective communication and handoff methodology. By using a concise format, such as SBAR, clinicians share only the relevant information needed to make an efficient, informed care decision.

An Effective Communication Strategy Must Be Integrated into the Workflow and Culture

A communication tool is most effective when organizations standardize it as a regular part of workflows and make it part of the culture. For example, one health system saw improved SBAR compliance when they gave clinicians SBAR reminder pocket cards and posted summaries of the methodology at each telephone. The entire organization soon adopted SBAR guidelines in all communication.

Organizationwide adoption of a communication methodology is critical in today's interdisciplinary, patient-centered care. Today's healthcare setting requires a collaborative team approach that includes physicians, nurses, nurse care managers, social workers, community health workers, and the patients themselves at the center of the team.

The SBAR Four-Part Communications Toolkit

The SBAR communication model, a leading healthcare methodology, has four parts that streamline healthcare communications:

1. Situation (S)

Under part one: situation, a clinician gives a concise statement about the patient’s current situation; for example:

A 72-year-old female patient was admitted from the emergency department. The patient had fallen at home, resulting in a severe ankle sprain and a fractured wrist that required surgery. As an inpatient, she hasn’t participated in physical therapy for two days.

A 45-year-old male patient has been discharged home and has no family to help with self-care and daily chores. The nurse care manager visits the patient at home and finds the patient sitting in pajamas in a dark room; the home is dirty and disorganized, and the refrigerator is empty.

2. Background (B)

Under part two: background, clinicians provide brief and pertinent information related to the current situation; for example:

In order to discharge the female patient, the patient must be able to climb five steps to enter her home and move (not walk) 100 feet using a cane for support while wearing a walking boot.

The male patient has refused home health nurses over the past 30 days and has lost 10 pounds over the past three months.

3. Assessment (A)

Under part three: assessment, the clinician analyzes the situation and background, and considers intervention options; for example:

When the physical therapist comes in, the female patient may be in pain. She only takes Tylenol as needed at night and has not requested Tylenol during the day.

4. Recommendation (R)

Under part four: recommendation, the clinician requests or recommends an action to address the situation; for example:

For the female patient, the nurse care manager recommends that the physician order pain medication before the patient’s physical therapy, which may make it easier to participate in therapy.

For the male patient, the nurse care manager recommends a follow-up visit with the primary care provider for a depression assessment or referral to a behavioral health specialist.
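For teams that capture handoffs electronically, the four SBAR parts map naturally onto a small structured record. The sketch below (field names are illustrative, not from any specific product) shows one way to ensure every note carries all four parts:

```python
from dataclasses import dataclass

@dataclass
class SBARNote:
    """One structured handoff note; fields mirror the four SBAR parts."""
    situation: str       # concise statement of what is happening now
    background: str      # brief, pertinent history
    assessment: str      # clinician's analysis of the situation
    recommendation: str  # requested or recommended action

note = SBARNote(
    situation="72-year-old inpatient has not participated in PT for two days.",
    background="Must climb five steps and move 100 feet with a cane to go home.",
    assessment="Pain is likely limiting participation; Tylenol taken only at night.",
    recommendation="Order pain medication before the scheduled physical therapy session.",
)
print(note.recommendation)
```

Making the structure explicit is the software analog of the pocket cards and telephone summaries described above: the format itself prompts the clinician for each element.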

A consistent strategy for effective healthcare communication improves efficiency and provides a best-practice method that optimizes the sharing of patient information. Health systems can reduce their risk of medical error by integrating into their care management a communication framework, such as SBAR, that defines the concise, vital, and relevant information for successful care team handoffs.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

In Healthcare Predictive Analytics, Big Data Is Sometimes a Big Mess
Published October 21, 2013

Those in big data and healthcare analytics circles will seldom hear the phrase, “less is more.” In a clinical setting, however, there is an important lesson to learn about the effective execution of predictive analytics: Health systems should not confuse more data with more insight.

More data is simply more—more tables, more lists, more replicates, more clinics, more controls, more rows, more tables of tables and lists of lists, etc. In short, for predictive analytics to be effective in a clinical setting, a specific focus will always trump global utility.

Successful Predictive Analytics in Healthcare Does Not Depend on Big Data

The key to successful predictive analytics implementation is rooted more in upfront planning than in harnessing big data; it begins well upstream of the predictor and its implementation, and includes four parts.

First, healthcare organizations must accurately model the workflow and detail the specific questions they want the computer to address.

Second, health systems need to collect the necessary data specific to and characteristic of the problems they are trying to solve. Gathering this data is often guided by questions such as:

What supplementary data can be leveraged from external and public sources?

The goal of this second step is to stay specific to a system’s original question. Using the computer to help with feature selection can be especially useful during this step.
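As a hedged illustration of letting the computer assist with feature selection, univariate selection in scikit-learn scores each candidate feature against the outcome and keeps only the most informative ones. All data here is synthetic:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic cohort: 20 candidate features, only 3 truly informative.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=3, n_redundant=0,
                           random_state=0)

# Keep the three features most associated with the outcome,
# staying specific to the original clinical question.
selector = SelectKBest(f_classif, k=3).fit(X, y)
print(np.flatnonzero(selector.get_support()).tolist())
```

The point is not the particular scoring function but the discipline: the model sees only features justified by the question, not every column the warehouse happens to hold.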

Third, health systems must recognize the weaknesses and leverage the strengths of various algorithm approaches:

Linear regression is typically used for continuous data.

Logistic regression is for categorical/discrete data.

Naive Bayes deals with missing data much better than many other approaches.

Support vector machine algorithms are powerful as non-linear classifiers (and have excellent performance in binary classification), but they are computationally demanding to train and run, and are sensitive to noisy data.
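The first two pairings above can be illustrated with a toy example: fit a linear regression to a continuous target and a logistic regression to a binary one. All data is synthetic and the variable names are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))

# Continuous target (e.g., length of stay): linear regression.
los = 2.0 + 1.5 * X[:, 0] + rng.normal(scale=0.1, size=200)
lin = LinearRegression().fit(X, los)

# Categorical target (e.g., readmitted yes/no): logistic regression.
readmit = (X[:, 1] > 0).astype(int)
log = LogisticRegression().fit(X, readmit)

print(round(float(lin.coef_[0]), 1))  # 1.5
```

Matching the algorithm to the data type is the first and cheapest of these strength/weakness decisions; the Naive Bayes and SVM tradeoffs above come into play once the data is messier.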

The final step, and perhaps the most important step, is finding the appropriate clinical group and environment for implementation. Without the proper framework in place, and without the willingness to intervene and give context for meaningful use, prediction is not useful; rather, it is often a waste of time and money.

Don’t Trade Utility for Big Data Hype

With so much hype surrounding market buzzwords, such as big data and predictive analytics, it can be daunting for healthcare organizations to sort through all the noise in this space. One guiding principle can help: do not trade useful for glamorous.

In healthcare, the tradeoff of a more generalized prediction model that inputs big data and global features is that targeted utility is lost or diluted. The features that effectively characterize a condition are the same attributes that can train an accurate predictor. But if those features do not stand out above the background noise, then the predictor only finds the noise; for this reason, prediction focused on a specific clinical setting or patient need will always trump a generic predictor in terms of accuracy and utility.

The full power of clinical prediction is best realized when the computational question is carefully defined, specific variables are gathered, a targeted need is met, and participants are willing to act. For predictive analytics, it’s the intervention that matters most. After all, it’s the intervention—not the predictor—that will improve patient care.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Sepsis Treatment: Target Five Key Areas to Improve Sepsis Outcomes
Published December 21, 2017

Given the severe consequences of sepsis, health systems must work diligently to improve the early detection and treatment of their septic patients:

Sepsis kills 258,000 Americans each year.

Sepsis contributes to as much as 50 percent of in-hospital deaths.

The mortality rate for patients diagnosed and hospitalized with severe sepsis or septic shock is 25 […]

Despite these sobering sepsis facts and the industry’s ongoing efforts to improve outcomes for septic patients (mortality, LOS, readmissions, cost per case, etc.), outcomes for this population are getting worse. Although health systems juggle myriad demands and priorities—reporting requirements, population health management, etc.—improving sepsis outcomes should rank at the top of their lists.

So, what can these organizations do to drastically improve outcomes for their septic patients? This article highlights five key areas health systems should focus on to improve sepsis outcomes, from early recognition in the ED to patient stratification. This article also offers a helpful consensus definition of sepsis and useful resources to prepare systems for their sepsis improvement work, whether they’re beginning or midstride.

Start Here: Consensus Definitions of Sepsis and Septic Shock

Sepsis: a life-threatening organ dysfunction caused by a dysregulated host response to infection.

Septic shock: a subset of sepsis in which underlying circulatory and cellular/metabolic abnormalities are profound enough to substantially increase mortality.

Helpful Resources: Surviving Sepsis Guidelines and Sepsis Alliance

Just as sepsis definitions evolve, so do treatments. According to the Surviving Sepsis Campaign (SSC), “the optimum treatment of severe sepsis and septic shock is a dynamic, evolving process requiring further programmatic clinical research to optimize these evidence-based medicine recommendations.” Health systems must continually monitor research and guidelines from the organizations leading the battle against sepsis:

Surviving Sepsis Campaign (SSC)

The battle against sepsis isn’t a new one: the SSC was launched in 2002 to reduce the mortality rate of severe sepsis and septic shock using evidence-based guidelines and performance improvement initiatives. The most recent SSC recommendations, released in 2013, are organized into bundles:

Early identification and treatment of the severe sepsis or septic shock patient.

Blood cultures before antibiotic therapy.

Administration of broad-spectrum antimicrobials within one hour of severe sepsis or septic shock recognition.

The Sepsis Alliance

The Sepsis Alliance, founded in 2007, is the largest sepsis advocacy organization in the U.S. To support its mission—to save lives and reduce suffering by raising awareness of sepsis as a medical emergency—the Alliance produces educational materials (e.g., sepsis information guides, infographics, and videos) and hosts events (e.g., free webinars) to help healthcare professionals and the public learn more about sepsis.

Distilling the industry’s continuously changing clinical and quality improvement knowledge is time consuming; this article does some of the legwork by summarizing five key areas every health system should focus on while working to improve sepsis outcomes.

Sepsis Treatment: Focus on Five Key Areas to Improve Sepsis Outcomes

Having identified sepsis as a promising area for quality improvement (because of its impact on people, health systems, and the entire industry), Health Catalyst has worked with more than a dozen health systems to improve outcomes for septic patients. There are five areas health systems should prioritize to improve sepsis outcomes, which are based on the success of Health Catalyst’s outcomes-driven collaborations and the SSC’s recommendation that healthcare organizations implement process improvements:

#1: Early Recognition in the ED

The ED represents the best opportunity for early recognition because, nationally, 80 to 85 percent of sepsis cases present in the ED. However, despite the overwhelming evidence demonstrating the importance of early recognition, many health systems do not have standardized early recognition processes.

Identifying sepsis demands a high degree of awareness, vigilance, and knowledge. Although challenging, early recognition is possible. A standardized approach to ED care can speed recognition and improve care. There are several possible interventions:

Develop and implement a standardized sepsis screening tool for all patients, which can be as sophisticated as an algorithm firing from an EMR, or as simple as a checklist.

This focus area can reduce average time from ED arrival to recognition of sepsis and initiation of sepsis treatment.
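A checklist-style screening rule can be encoded in a few lines. The sketch below uses the qSOFA criteria from the Sepsis-3 consensus (respiratory rate ≥ 22, systolic blood pressure ≤ 100, Glasgow Coma Scale < 15; a score of 2 or more prompts further evaluation). The vitals field names are illustrative, and this is a screening prompt, not a diagnostic tool:

```python
def qsofa_score(resp_rate, systolic_bp, gcs):
    """qSOFA: one point each for RR >= 22, SBP <= 100, GCS < 15."""
    return (resp_rate >= 22) + (systolic_bp <= 100) + (gcs < 15)

def flag_for_sepsis_eval(vitals):
    """Flag a patient when qSOFA >= 2 (a screening prompt, not a diagnosis)."""
    return qsofa_score(vitals["rr"], vitals["sbp"], vitals["gcs"]) >= 2

print(flag_for_sepsis_eval({"rr": 24, "sbp": 95, "gcs": 15}))  # True
```

The same logic works as a paper checklist or as a rule firing from the EMR; the value is in applying it consistently to every ED arrival.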

#2: Three-Hour Sepsis Bundle Compliance

Despite awareness of the three-hour sepsis bundle guidelines and the bundle’s proven ability to improve outcomes, health systems often struggle to comply with it.

The following three-hour sepsis bundle intervention guidelines—specifically, broad-spectrum antibiotics—have been repeatedly shown to improve sepsis outcomes, and are supported and widely accepted by the medical community:

Centralize/organize equipment to support rapid and appropriate care (e.g., a sepsis trolley).

Implement a standardized protocol that includes reminders.

Create prompts near antibiotic storage.

Define a sepsis threshold (a “Time Zero” or other) and provide visual cues (e.g., clock with targets highlighted, colored blanket on patient bed) for the timing of interventions based off it.
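Once a Time Zero is defined, bundle timing checks reduce to timestamp arithmetic. A minimal sketch with hypothetical times:

```python
from datetime import datetime, timedelta

time_zero = datetime(2024, 3, 1, 10, 0)           # sepsis recognition
antibiotics_given = datetime(2024, 3, 1, 12, 15)  # first broad-spectrum dose

# Three-hour bundle: antibiotics within three hours of Time Zero.
elapsed = antibiotics_given - time_zero
compliant = elapsed <= timedelta(hours=3)
print(elapsed, compliant)  # 2:15:00 True
```

Measuring compliance this way, per patient and per intervention, is what turns the bundle from a guideline into a trackable process.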

#3: Six-Hour Sepsis Bundle Compliance

Tasks associated with the six-hour bundle must be carried out while the patient remains in the ED, is in the process of being admitted, or has already arrived in the intensive care unit. Care of the patient who is critically ill with septic shock is complex, time sensitive, and resource intensive, and requires health systems to be focused and coordinated.

The six-hour sepsis bundle includes interventions for patients who are at greatest risk of death from septic shock:

Include vasopressors and lactate re-measurement on standardized ED and ICU collaborative protocols and order sets.

#4: In-House Recognition of Sepsis

It’s often difficult to recognize patients who become septic in ICUs, but it is even more challenging to recognize sepsis in patients on a floor ward. Early sepsis recognition in already-hospitalized patients can reduce progression to septic shock and mortality, but presents additional challenges: this group of patients already has complex medical or surgical conditions that can confound or delay a diagnosis of sepsis.

Initiating interventions like those listed below can begin to help identify and intervene on this complicated population of patients.

#5: Reducing Sepsis Readmissions

In addition to interventions like boosting post-discharge care (e.g., better care management to treat recurrent or other infections), health systems need to prioritize risk stratification for patients likely to be readmitted. For example, one health system’s model assigns a risk score that indicates the patients most likely to return for readmission; once these patients are identified, the system pairs them with care managers committed to keeping them on the right track.

The interventions listed below can also help reduce sepsis readmissions:

Create protocols for patients identified to be at high risk of readmission, and implement appropriate actions.
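The stratify-then-route pattern described above can be sketched with a simple model: score each patient's readmission risk and route the highest-risk decile to care managers. Everything below is synthetic and illustrative, not the referenced health system's actual model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
# Synthetic features: prior admission count, comorbidity count.
X = rng.poisson(lam=(2.0, 3.0), size=(300, 2)).astype(float)
y = (X.sum(axis=1) + rng.normal(size=300) > 6).astype(int)

# Fit a risk model and score every patient.
model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

# Route the highest-risk decile of patients to care managers.
cutoff = np.quantile(risk, 0.9)
high_risk = np.flatnonzero(risk >= cutoff)
print(len(high_risk))
```

The decile cutoff is a capacity decision, not a clinical one: it sizes the high-risk list to match how many patients the care management team can actually follow.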

Healthcare Must Work Together to Improve Sepsis Outcomes

The severe clinical, financial, operational, and patient experience consequences of sepsis make it an industrywide priority. Improving sepsis outcomes is a group effort, from hospital teams working together to implement sepsis care best practices to the entire industry working together to share what works and what doesn’t. Every healthcare organization should actively participate in industrywide collaboration to improve sepsis outcomes.

By focusing on the five key areas outlined in this article, health systems will improve early detection, action, and intervention for septic patients; health systems will reduce the economic burden sepsis continues to have on the industry; and, most importantly, health systems will save more lives within their communities.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest:

Five Deming Principles That Help Healthcare Process Improvement
Published January 22, 2016

Few people have had more influence on the science and practical application of process management than Dr. W. Edwards Deming. His impact on the automotive industry is legendary, and many other industries have tried with varying degrees of success to implement his principles as well. For years I have followed and admired those that have tried to bring his quality improvement processes to healthcare. I strongly believe that healthcare has much to gain by successfully implementing key Deming principles. Let me share five principles that I believe can make the biggest difference in healthcare process improvement.

1. Quality improvement is the science of process management.

When Deming and others developed their approach to modern quality improvement in the 1940s, they were basically developing a way for modern organizations to deal with the complex challenges that were confronting them. The approach they developed to improvement was remarkably simple, yet extraordinarily powerful. It’s centered on the fact that quality improvement is really about process management. These quality improvement concepts and techniques have been used to transform almost every major industry in the world with dramatic results. The last holdouts, the last bastions of resistance, are primarily healthcare, higher education, and government. Now, it’s happening in healthcare. I believe higher education is imminent; it’s anyone’s guess whether government will ever succumb to these forces.

Now, we all know healthcare is very complex, but it’s not fundamentally different from other industries. Healthcare simply consists of thousands of interlinked processes that result in a very complex system. If we focus on the processes of care one at a time, we can fundamentally change the game and deal with the challenges facing healthcare. Now, this may seem like a tall order, but the Pareto principle tells us that there are probably 20 percent of those processes that will get us 80 percent of the impact. So, the challenge of every organization is to identify that 20 percent, roll up their sleeves, and begin the important work of addressing those challenges.
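The Pareto arithmetic is easy to make concrete: rank processes by impact and keep adding until roughly 80 percent is covered. The processes and figures below are invented purely for illustration:

```python
# Hypothetical annual improvement opportunity per care process, in $M.
impact = {"sepsis": 40, "heart failure": 25, "readmissions": 15,
          "pneumonia": 8, "elective surgery": 7, "other": 5}

total = sum(impact.values())  # 100
running, top = 0, []
# Walk processes from largest to smallest impact until ~80% is covered.
for name, value in sorted(impact.items(), key=lambda kv: -kv[1]):
    top.append(name)
    running += value
    if running / total >= 0.8:
        break
print(top)  # ['sepsis', 'heart failure', 'readmissions']
```

In this toy example, three of six processes (half, rather than exactly 20 percent) deliver 80 percent of the opportunity; the principle is the skew, not the precise ratio.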

2. For quality control in healthcare, if you cannot measure it—you cannot improve it.

Deming clearly understood the importance of data. Meaningful quality improvement must be data-driven. This is particularly true for quality control in healthcare. You’re basically dead in the water if you try to work with healthcare providers and you don’t have good data. I think everybody recognizes that.

Deming said, “In God we trust…and all others must bring data.” I love this quote because it reflects that reality. I’ve had physicians during my career tell me pretty much the same thing, only they’re not quite so polite. They basically say, “Dr. Haughom, John, get lost! Bring the data. And then we’ll decide if we believe it.” So, data is critical if we’re going to have a meaningful impact in healthcare.

3. Managing care means managing the processes of care.

An important application of a Deming principle was put forward by my good friend, Dr. Brent James: managing care means managing the processes of care, not managing physicians and nurses. What James said is very true. One of the big mistakes made in the 1990s with the managed care movement was naively thinking that managing care meant telling physicians and nurses what to do. The reality is that you need to engage clinicians in the process, because they understand the care delivery process and are best equipped to figure out how to improve it over time. For this reason, I strongly believe that these changes will ultimately be very empowering for all clinicians who get involved.

4. The right data in the right format, at the right time, in the right hands.

If clinicians are going to manage care, they need data. They need the right data delivered in the right format, at the right time, and in the right place. And the data must be delivered into the right hands—the clinicians involved in operating and improving any given process of care.

5. Engaging the “smart cogs” of healthcare.

If quality improvement is going to work in healthcare—if we are going to realize value—it means we must engage clinicians. To use Deming’s term, clinicians are healthcare’s so-called “smart cogs.” They are the frontline workers who understand and own the processes of care. And as I said in an earlier slide, we’re very fortunate in healthcare because we have a workforce dominated by clinicians who are extraordinarily committed, very intelligent, and highly educated.

But we live in a challenging time. I once received an email from a fellow physician leader at a leading national delivery system. I’m going to withhold the name of the delivery system, but I can tell you that if you asked knowledgeable people to list the top 10 delivery systems in the country, almost everyone would put this organization on their list. Despite that, this physician wrote to me lamenting how difficult it was for him to get his peer physicians to see a new future. In his email, he succinctly described the problem by saying that his physicians were “historically encumbered and demoralized.” I love the succinctness of his description because what he is basically saying is that they’re clinging to the past and are demoralized because they don’t see a new future. In that short phrase, this excellent physician leader pretty much encapsulated the problem and pointed us toward the solution.

Are Physicians Willing to Change?

A 2011 McKinsey survey clearly demonstrates that the majority of physicians are, in fact, willing to change. McKinsey surveyed more than 1,400 U.S. physicians and found that 84 percent said they were willing to change if a reasonable argument could be made that change was necessary.

So, how do we reconcile this? I believe we need to help clinicians figure out how to give up the past by helping them see a new future and help them understand their role in creating and sustaining that new future. In fact, I believe one can make a very strong case that the future will be very empowering for clinicians of all types if we can successfully inform them, engage them, and inspire them. Applying these key Deming principles to healthcare process improvement can help every healthcare organization show the workforce why change is necessary, what they need to understand to participate in meaningful change, and what success will ultimately look like.

PowerPoint Slides

Would you like to use or share these concepts? Download this presentation highlighting the main points.

Opioid drug overdoses are currently the leading cause of death among Americans under 50, outpacing guns and car accidents. With roughly 64,000 opioid-related deaths in 2016 and 91 Americans dying daily from overdoses (including prescription pain relievers and heroin), the opioid-related death rate has quadrupled since 1999.

Initially, prescription opioid pain killers (morphine, methadone, hydrocodone, oxycodone, etc.) were welcome innovations to treat acute pain, and were presented to clinicians, and to patients, as a low-risk way to effectively treat pain, a longtime challenge for clinicians. Aggressive marketing campaigns and an increased focus on relieving pain from regulatory boards and professional organizations changed both societal expectations about pain management and clinician prescribing patterns. Pain was introduced as a fifth vital sign in 1996; the current opioid problem has its roots around this time.

Risk prediction tools that leverage machine learning have the potential to change how clinicians prescribe opioids, with the goal of preventing overuse, misuse, and abuse.

Machine Learning Offers Deeper Pattern Recognition

Machine learning is relatively new to healthcare, but has already helped organizations improve outcomes and reduce costs:

Risk Assessment After Prescribing

Instruments that assess misuse once opioid treatment has started use more data points than prior-to-prescribing tools, but they don't assess use of other substances (e.g., tobacco, alcohol, or marijuana), and most don't include comorbid conditions that increase the likelihood of misuse (e.g., mental health conditions or a history of substance abuse). One widely used tool, the Current Opioid Misuse Measure, is self-administered by the patient and has a sensitivity of 0.76 (the proportion of patients misusing opioids who are correctly identified) and a specificity of 0.66 (the proportion of patients not misusing who are correctly identified as not misusing). The relatively low specificity means that some patients will be incorrectly labeled as misusing (a false positive) when no problem actually exists.
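To see what those two numbers mean in practice, consider a minimal sketch of how sensitivity and specificity translate into patient counts. The population size and prevalence below are assumptions chosen for illustration, not figures from the Current Opioid Misuse Measure studies.

```python
# Hypothetical illustration of sensitivity/specificity arithmetic.
# The screened population and misuse prevalence are invented.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of true misusers the tool correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of non-misusers the tool correctly clears."""
    return true_neg / (true_neg + false_pos)

# Assume 1,000 screened patients, 100 of whom actually misuse opioids.
misusers, non_misusers = 100, 900
tp = round(0.76 * misusers)        # 76 correctly flagged
fn = misusers - tp                 # 24 missed
tn = round(0.66 * non_misusers)    # 594 correctly cleared
fp = non_misusers - tn             # 306 falsely labeled as misusing

print(f"sensitivity = {sensitivity(tp, fn):.2f}")   # 0.76
print(f"specificity = {specificity(tn, fp):.2f}")   # 0.66
print(f"false positives = {fp}")                    # 306
```

Under these assumptions, a specificity of 0.66 produces 306 false positives among 900 patients who have no problem, which is exactly the mislabeling risk described above.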

Machine learning can improve risk assessment tools’ ability to identify patients at risk for opioid misuse. While it’s too early to know exactly how well machine learning applications will help predict risk (the technology hasn’t been applied in practice yet), studies are showing significant potential:

Mining Twitter Data for Illegal Sales

Using machine learning to mine Twitter data, a team of researchers from the University of California San Diego successfully identified accounts used to illegally sell prescription opioids online. The study demonstrates that technology and machine learning could be used for active surveillance and detection of illegal online activities, and could help authorities shut down the online sale of controlled substances.
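The core idea behind this kind of surveillance is text classification. The toy sketch below trains a classifier to separate suspected sale tweets from benign opioid mentions; the example tweets, labels, and model choice are all invented for illustration and are far simpler than the study's actual methods.

```python
# Toy text-classification sketch: flag tweets that look like illegal
# opioid sales. Training data and labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

tweets = [
    "oxycodone for sale no prescription needed dm me",
    "cheap hydrocodone overnight shipping",
    "my doctor adjusted my pain medication today",
    "great article on opioid addiction recovery",
]
labels = [1, 1, 0, 0]  # 1 = suspected illegal sale, 0 = benign mention

# Bag-of-words features feeding a naive Bayes classifier.
clf = make_pipeline(CountVectorizer(), MultinomialNB()).fit(tweets, labels)

print(clf.predict(["percocet for sale dm for prices"]))  # flags as suspected sale
```

A production system would train on far more data and richer features, but the pipeline shape, text vectorization followed by a classifier, is the same.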

Predicting Potentially Problematic Use at the University of Pittsburgh

Researchers at the University of Pittsburgh used machine learning to identify potentially problematic use of opioids in 194,148 fee-for-service Medicare beneficiaries between 2007 and 2012. The researchers identified five subgroups based on opioid use patterns, including a small subgroup (less than one percent) who used higher dosages of opioids, received prescriptions from a higher number of prescribers, and obtained medication from numerous pharmacies. Most of these beneficiaries were disabled and had more comorbid conditions and concurrent healthcare use. A better understanding of the factors influencing misuse could improve clinicians' decision making and opioid prescribing patterns. Machine learning can also illuminate unknown patterns or confirm suspected patterns or pathways to abuse.
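Subgroup discovery of this kind is often done with clustering. The sketch below is a loose analogue of that approach on invented data: the features (daily dose, prescriber count, pharmacy count) and the synthetic beneficiaries are assumptions for illustration, not the study's actual variables or methods.

```python
# Minimal clustering sketch: separate a small high-dose, multi-prescriber
# subgroup from typical opioid users. All data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

# Each row: [avg daily dose (MME), distinct prescribers, distinct pharmacies]
rng = np.random.default_rng(0)
typical = rng.normal([30, 1.5, 1.2], [10, 0.5, 0.3], size=(95, 3))
high_risk = rng.normal([120, 6.0, 4.0], [20, 1.5, 1.0], size=(5, 3))
X = np.vstack([typical, high_risk])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# The smaller cluster corresponds to the high-dose, multi-prescriber pattern.
sizes = np.bincount(labels)
print(f"small subgroup: {sizes.min()} of {len(X)} beneficiaries")
```

The real study worked with six years of claims data and more nuanced use patterns, but the principle, letting the algorithm surface a small, distinct subgroup rather than defining it by hand, is the same.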

Identifying Patients at the Highest Risk of Overdose

Machine learning also has the potential to help clinicians identify which patients are at the highest risk of overdose. Using machine learning, researchers identified the variables most related to overdose and predicted which patients were more likely to overdose. This information could be valuable to clinicians as they make decisions regarding prescription opioids, and could help clinicians ensure that patients at a higher risk of overdose also receive a prescription for the overdose reversal medication, naloxone. A recent study demonstrated that natural language processing techniques could be used to extract unstructured data from the EHR, automating opioid risk assessments. In addition, machine learning could help identify which patients might benefit from non-pharmacologic, multi-modal therapies, or care management programs.
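A simple version of this kind of risk model can be sketched as a classifier that scores each patient's overdose probability and flags high-risk patients for interventions such as naloxone co-prescribing. The features, synthetic data, and model below are assumptions for illustration, not the cited researchers' actual approach.

```python
# Hedged sketch of overdose-risk scoring on invented patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
# Features: [daily dose (MME), prior substance-abuse flag, benzodiazepine co-rx]
X = np.column_stack([
    rng.normal(60, 30, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
])
# Invented outcome: overdose risk rises with dose and co-prescriptions.
logit = 0.03 * X[:, 0] + 1.2 * X[:, 1] + 0.8 * X[:, 2] - 4.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]   # per-patient overdose risk score

# Patients above a chosen threshold could be flagged for naloxone co-prescribing.
print(f"{(risk > 0.5).sum()} of {n} patients flagged as high risk")
```

In practice the feature set would be far larger, often including unstructured EHR text extracted with natural language processing, as the study above demonstrates.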

The Next Level: A Bigger Impact

These studies confirm machine learning’s potential to combat the opioid epidemic. More precise machine learning tools integrated into the workflow will take opioid misuse risk assessment to the next level and drive real change.

For example, one forward-thinking healthcare organization currently uses a patient pain agreement. To reduce the risk of overuse and misuse, the patient signs a contract to use only one specific clinician for pain management and one specific pharmacy for prescription opioids and other prescribed medications.

The contract is a solid start, but adding a function that predicts when a patient will default on the pain agreement would take the agreement to the next level of proactive risk reduction. Integrated into the health system workflow, the ability to predict noncompliance would help clinicians plan appropriate interventions based on risk over time (e.g., scheduling follow-up appointments during the time of highest risk of default).
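The scheduling idea can be sketched in a few lines: given a model's predicted probability of default at different points after signing, book the proactive follow-up in the highest-risk window. The weekly probabilities below are invented placeholders standing in for a hypothetical model's output.

```python
# Illustrative sketch: schedule a follow-up at the predicted peak-risk week.
# The weekly default probabilities are invented, not model output.

weekly_default_risk = {
    4: 0.05, 8: 0.12, 12: 0.21, 16: 0.34, 20: 0.18, 24: 0.09,
}  # weeks after signing -> predicted probability of agreement default

# Pick the week with the highest predicted risk for the proactive follow-up.
follow_up_week = max(weekly_default_risk, key=weekly_default_risk.get)
print(f"schedule follow-up around week {follow_up_week}")  # week 16
```

The point of the sketch is the workflow, not the numbers: a trained model would supply the risk curve, and the scheduling rule turns prediction into a concrete intervention.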

Combatting the opioid epidemic is uniquely demanding because it's a multidimensional problem with interrelated factors, causes, and effects. Machine learning, however, meets the challenge of reducing opioid misuse risk by enabling a more comprehensive and complete view of the situation (comorbidities, other substance abuse, the amount of medication prescribed, the duration of opioid use, etc.).

Machine learning enables objective, multidimensional, and trainable risk reduction models (versus subjective, patient-administered tools) that will help clinicians make informed decisions about prescribing opioids, as well as identify other pain treatment options when appropriate. These insights will also help clinicians monitor patients and intervene to combat the escalating rate of opioid abuse in the United States.

Additional Reading

Would you like to learn more about this topic? Here are some articles we suggest: