Can digital health bridge the low-income gap?

We’ve known for a few years that income correlates with health in a major way. From the length of life to stroke risk to self-reported health status, the more access, opportunity, and means of upward mobility a person has, the better their projected health outcomes.

Waves of organizations are stepping up to help put things on a better course. Accountable and managed care organizations like Upward Health are working with payers, providers, and local governments to bring doctors, nurses, and community health workers to underserved neighborhoods. In some places, networks like L.A. Care have sprung up to give low-income patients affordable access to doctors and hospitals.

But all of these initiatives live or die by their ability to connect with and engage patients who may be reluctant to take advice from medical professionals, especially from outside their communities. If we’re going to move the needle on low-income health outcomes, we have to connect with people directly through the technology they use every day — their mobile devices.

The Income Barriers to Health

Delivering care to low-income populations — more accurately, populations with low socioeconomic status (low income, low education, low upward mobility, and lack of insurance) — is complicated by some deeply human challenges. Not all people of low socioeconomic status are minorities, of course, but minorities make up a disproportionate share of this group, and the economic gap has widened since the 2007 recession.

Though it varies by city and other factors, on average, minority groups have a higher distrust of doctors than their white counterparts. Even though many of these individuals say they’ve seen a physician in the last year, they may not trust that doctor when it comes to

1) the doctor’s competence to refer them to a specialist when needed;

2) the doctor’s commitment to putting the minority patient’s medical needs above all other considerations;

3) whether health insurance company rules influence the doctor’s decision making; and

4) whether the doctor performs unnecessary tests and procedures.

The reasons for this distrust are complex but entirely valid. There is plenty of evidence of current and historical inequitable health treatment for black Americans. Meanwhile, one-quarter of Latino adults in the United States are undocumented, making them ineligible for Medicaid and afraid to sign up for ACA health insurance. Add in a language barrier or communication gap, and it’s unsurprising minority groups are wary of the healthcare system.

Cultural norms can come into play as well. Back in 2012, the Census found that 72 percent of Hispanics say they never use prescription drugs. The reason seems to be, in large part, a stronger belief among Latinos in natural medicine and alternative practitioners. When a Hispanic person feels ill, they’re likely to ask their families for home remedies instead of turning to medication.

For all these reasons, busing mostly white medical doctors and nurses into low-income minority areas won’t help much on its own. We have to address the trust barrier if we’re going to drive meaningful outcomes for low-income groups.


Spiraling healthcare costs: New solutions to an old problem are emerging

No one would buy a new car without doing some research, shopping around and taking a test drive. So why do Americans seem to accept the high costs of healthcare without conducting the same research?

In comparison to other developed countries, we use the same amount of healthcare services and receive the same quality of care. But we pay twice as much to keep ourselves and our families healthy. At the same time, we’ve become more removed from our own healthcare journeys, and taking back control can seem like too large a task to tackle.

One answer is to move our healthcare marketplace toward a transparent, people-driven healthcare ecosystem. By empowering consumers, we can provide the needed oversight to our own health-related experiences and rein in costs.

Healthcare costs on the rise

Americans pay more for healthcare than any other developed nation in the world. In 2018, the cost of healthcare for a family of four in the U.S. reached an all-time high of $28,166. And families aren’t the only ones who must bear the rising costs.

Employers have also seen their contributions steadily increase. In 2018 per-employee spend for health insurance benefits, including premiums and out-of-pocket costs for employees and dependents, averaged around $14,000, with employers kicking in about 70% of the tab.
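Taken at face value, those figures imply a rough employer/employee split, sketched below (the 70% share is the article’s approximate figure, treated here as exact):

```python
# Rough split of 2018 per-employee health spend between employer and employee.
# Figures from the article; the 70% employer share is approximate.
total_per_employee = 14_000   # premiums + out-of-pocket, per employee ($)
employer_share = 0.70         # "about 70% of the tab"

employer_pays = total_per_employee * employer_share
employee_pays = total_per_employee - employer_pays
print(f"Employer: ${employer_pays:,.0f}, employee: ${employee_pays:,.0f}")
# -> Employer: $9,800, employee: $4,200
```

In other words, even with employers covering the bulk of the tab, a typical worker is still on the hook for several thousand dollars a year.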

Consider this context: if the U.S. healthcare industry were its own nation, it would be the world’s 5th largest economy.

According to the Centers for Medicare & Medicaid Services (CMS), U.S. healthcare costs reached $3.3 trillion, or 17.9% of gross domestic product, in 2016.

That translates to $10,348 for every man, woman and child in the country annually.
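Those CMS numbers are roughly self-consistent. A quick sanity check, assuming a 2016 U.S. population of about 323 million (the population figure is an assumption, not from the article):

```python
# Sanity-check the CMS figures cited above.
total_spend = 3.3e12    # 2016 national health expenditures ($)
gdp_share = 0.179       # healthcare share of GDP
population = 323e6      # assumed 2016 U.S. population

per_capita = total_spend / population
print(f"Per capita: ${per_capita:,.0f}")   # roughly $10,200, close to the cited $10,348
print(f"Implied GDP: ${total_spend / gdp_share / 1e12:.1f} trillion")  # about $18.4 trillion
```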

By comparison, Americans spent just $146 per person in 1960.

But this isn’t news to anyone. For decades healthcare costs have risen faster than the average annual income. And as a nation our healthcare spend is expected to continue to become a larger percentage of our gross domestic product (GDP).

How did we get here?

U.S. healthcare costs didn’t suddenly increase overnight. For more than 100 years, there has been a push and pull between government attempts to implement various versions of government-sponsored healthcare programs and private industry’s attempts to grow and improve what already exists.

For various reasons, the U.S. has never taken the step other developed countries have of completely overhauling and redesigning the entire system, from top to bottom. For the most part, we’ve left it to the market, with piecemeal programs for specific segments.

The result has been a bit of a patchwork of healthcare options, funded by various entities including all levels of government, corporate-sponsored insurance and individuals.

In our current inefficient and expensive system, the consumer is removed from directly paying for most of their healthcare expenses:

Employers are subsidizing costs

Insurance is often paid for directly out of paychecks

Providers charge insurance plans with little involvement from the individual who received healthcare services

Quick history on U.S. healthcare costs

The country’s earliest attempts at formalized healthcare trace back to employer-led plans introduced in the late 1800s. During the Industrial Revolution, unions became more prevalent and advocated for protections focused on their members’ health.

At the time, there was very little organized structure for healthcare, and decisions about costs and payment were made on a trial-and-error basis.

Throughout the early 1900s we started to see more organization in the healthcare industry form:

Medical training became more rigorous and formalized through medical schools

Degrees and certifications began to emerge as ways to distinguish “properly trained” physicians

Regulations surrounding pharmaceuticals began to emerge

We also saw the beginnings of the first medical insurance companies

In addition, the federal government began to propose and provide health benefits, though primarily to veterans. In 1930, the government created the Veterans Administration to administer medical services to current and former soldiers. Over the following decade, the number of VA hospitals increased.

Options for non-veterans

Fast-forward to the Second World War. The federal government had frozen wages to curb inflation during WWII, so U.S. businesses began offering expanded healthcare plans in order to recruit and retain workers.

The deal was sweetened when IRS rulings in the ‘40s and ‘50s made employer-based healthcare tax free. If you think about it, our health insurance is a form of compensation from our employers. But it has been tax-free compensation for more than half a century.

Soon other companies started getting in on the act. By 1953, almost 63% of employees were participating in employer-based plans, bringing the number of Americans covered by some form of health insurance to 190 million.

As health insurance was adopted by the masses, more workers were using their covered healthcare services. Medicine was making remarkable advances, and expensive new procedures and medications became the standard of care to keep people healthy. As a result, healthcare spending rose in tandem with the number of insured, pushing national healthcare expenditures to $12.7 billion, or 4.5% of the Gross National Product (GNP).

By the 1960s, 70% of the population had some form of employer-provided health coverage. Insured Americans enjoyed expensive medicines, specialty care, and for-profit hospitals at record rates, and healthcare spending rose to $27 billion, or 5.1% of GNP.

But those who didn’t work—including more than half of the nation’s senior citizens—were left without any coverage. That’s where Medicare and Medicaid come in.

Medicare and Medicaid dramatically change healthcare for everyone

In an effort to address the disparity between those with employer-provided healthcare and those without insurance, the government enacted Medicare and Medicaid in 1965. These programs provided compulsory hospital coverage and voluntary physician insurance for everyone over the age of 65, as well as state assistance for the nation’s poorest families.

Unfortunately, these programs also led to unintended consequences that added cost and changed the healthcare landscape for everyone else.

As the nation’s newest and largest healthcare administrator, the government now set the rules for how services provided to its program recipients would be reimbursed. Because of the government’s purchasing power, these programs also influenced pricing for most treatments and services provided in the U.S. Research shows that even today, Medicare’s rates have a significant impact on what other insurers pay: a $1 change in Medicare reimbursements yields a roughly $1.30 change in what private insurers pay for the same service.

With the addition of Medicare and Medicaid recipients, the majority of Americans were now covered by some form of health insurance. But demand for services continued to increase, and providers were reimbursed without much attention to price, pushing prices up. By 1970, healthcare spending reached $73 billion, or 7.1 percent of GDP. The recession brought healthcare costs to the forefront, and taxpayers, who were paying nearly three times what they had paid just five years earlier, began to feel the U.S. was in the midst of a healthcare crisis.

By 1980, healthcare spending had tripled again to $257 billion, or about 10 percent of GDP. This trend continued throughout the 1990s. As higher demand for services emboldened providers to raise prices, healthcare costs rose at twice the inflation rate. The majority of employer-sponsored group insurance plans began attempts to switch from “fee-for-service” plans to the cheaper “managed care plans.”

This decades-long process slowly led us to where we are today: a system whose focus has shifted from patients to payers, and where providers and payers control the economics of healthcare. Consumers have little insight into or oversight of the cost structures, and spending continues to escalate.

What are we paying for?

According to the 2016 National Health Expenditures report from the Centers for Medicare & Medicaid Services (CMS), hospital care, physician and clinical services, and prescription drugs comprise more than 60 percent of healthcare expenditures. Residential, home health and nursing care, dental care, and medical equipment make up most of the rest. And the same three sponsors are footing most of the bill: Medicare and Medicaid pay 37 percent, private health insurance pays 34 percent, and out-of-pocket costs pick up 11 percent.

Surprisingly, Americans use the same amount of healthcare (and on average, receive the same level of quality) as residents of other developed nations. We just pay more. According to a study published in the Journal of the American Medical Association, the U.S. spent nearly twice as much as other high-income countries on healthcare as recently as 2016.

Administrative costs. A comparison of U.S. healthcare cost metrics with the United Kingdom, Canada, Germany, Australia, Japan, Sweden, France, Denmark, the Netherlands and Switzerland concluded that the number one contributor to U.S. healthcare costs is administration. Activities related to planning, regulating and managing health systems and services account for about one quarter of total healthcare costs in the United States – compared with a range of 1% to 3% for other countries.

Pharmaceuticals. The second largest culprit is our higher drug costs. Per capita spending for pharmaceuticals was $1,443 in the U.S., compared with a range of $466 to $939 in other nations. U.S. prices for several commonly used brand-name pharmaceuticals were often double the next highest price in other countries.

Defensive medicine. The third biggest cost driver is defensive medicine, estimated at around $650 billion annually. This is the tab for the extra tests doctors order to protect themselves against lawsuits, even when they are confident of the diagnosis. Everyone pays for it through higher insurance premiums, co-pays, and out-of-pocket costs, as well as the taxes that fund government healthcare programs.

Other factors. Other contributing factors include our overuse of expensive specialized procedures. These treatments result in higher spending on technology in more locations, plus the higher fees for specialists who administer them. Add to that our higher average salaries—$218,173 for a general practice physician in the U.S. compared to a range of $86,607-$154,126 in other countries.

Common sense solutions can keep costs in check

When you look at the factors that are driving prices higher, who could blame consumers for being disengaged? Our current healthcare ecosystem is complex and confusing, with multiple tiers of payers, and numerous plans and programs offering a complex array of services.

In addition, prices and coverage vary widely between providers, making comparisons difficult. People are socialized not to push back and challenge medical providers on costs, and in truth, it’s easy to ignore prices when it feels like someone else is paying for it.

It’s not likely that a system that’s taken decades to get this far out of control can be fixed overnight. But the tide is turning. Consumers have had enough of out-of-pocket health expenses and high-deductible insurance plans. They are ready to step up and reshape the system in a way that works for them. As a result, big healthcare providers are getting the message that customers want better price transparency, customer service, and convenience. What works?

Approach healthcare more like other major purchases

As people start to approach healthcare the way they would any other major purchase, looking for pricing that fairly reflects the value they receive, the healthcare industry is being forced to respond in order to stay competitive.

In a people-driven system, being cost-conscious is a first step to managing pricing and quality of care. Informed people can make better choices and plan ahead to get the best value, whether it’s a routine exam or elective surgery.

When consumers are empowered with information, they also feel empowered to ask questions, push back, and make common-sense decisions to manage their own healthcare. People can use the many available online tools to comparison-shop for the best prices and performance ratings from doctors, hospitals, and other providers. Providers can fill in the blanks by offering information about the total cost of any procedure, and the patient’s share, up front.

Harness the power of technology to empower healthcare consumers

Technology is another tool people can use to keep costs down. Data captured by personal devices can help patients make connections between their behavior, their health outcomes, and their costs. They can make proactive decisions that lead to healthier habits and actively manage conditions like diabetes or high cholesterol.

Compare health plan options

By choosing higher-deductible health plans with lower premiums, health-conscious and healthier individuals have an incentive to be smarter consumers, because they pay more out of pocket before coverage kicks in. And depending on how much they use their health insurance, these plans can turn out to be cheaper than conventional plans, which have smaller copays but higher premiums.
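The trade-off is easy to make concrete. The sketch below compares two hypothetical plans; every premium, deductible, and coinsurance rate is an illustrative assumption, not a figure from any real plan (and real plans add wrinkles like out-of-pocket maximums):

```python
# Total annual cost of a health plan: premiums plus the insured's share of care.
def annual_cost(premium, deductible, coinsurance, medical_spend):
    out_of_pocket = min(medical_spend, deductible)  # below the deductible, you pay everything
    # above the deductible, you pay a percentage (coinsurance)
    out_of_pocket += max(0, medical_spend - deductible) * coinsurance
    return premium + out_of_pocket

# Hypothetical plans: high-deductible/low-premium vs. conventional.
hdhp = dict(premium=3_000, deductible=5_000, coinsurance=0.20)
conventional = dict(premium=7_000, deductible=1_000, coinsurance=0.10)

for spend in (500, 5_000, 25_000):
    hd = annual_cost(medical_spend=spend, **hdhp)
    cv = annual_cost(medical_spend=spend, **conventional)
    print(f"${spend:>6,} of care -> HDHP ${hd:,.0f}, conventional ${cv:,.0f}")
```

With these assumed numbers, the high-deductible plan wins for a light user ($3,500 vs. $7,500 at $500 of care) and loses for a heavy one ($12,000 vs. $10,400 at $25,000 of care), which is exactly the “depending on how much they use their health insurance” caveat above.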

Don’t be afraid to ask questions

Engaged consumers aren’t shy about questioning when and where diagnostic tests should be performed, or whether they are really needed at all. They also ask their doctors for less expensive alternatives to brand-name medications and medical devices, and they aren’t afraid to say no to care that seems inappropriate or excessive.

When consumers take control, they can use the system in a way that is appropriate and beneficial to them and their families. For example, they can avoid the hospital for anything other than emergency care. By using urgent care for immediate needs and drugstore “minute clinics” for quick checkups, patients can avoid hospital charges that can be as much as 10 times higher, depending on the procedure.

Controlling our health journey means controlling our costs and data

Over the years, end-users have been taken out of the equation, and no one has been leading the charge to bring costs down. It is time to put the healthcare consumer at the center of the healthcare ecosystem, and we anticipate today’s people-driven health movement will do just that, putting people back in the driver’s seat on their health journey.