Hackers Outsmart Pacemakers, Fitbits: Worried Yet?

Mobile health devices aren't as secure as you might think. Look at how researchers plan to strengthen security for consumer devices and regulated medical devices.


Connected medical devices, together with the proliferation of consumer health-monitoring gadgets, hold great promise for remote patient monitoring and digital health management. Yet many of these devices are laughably -- perhaps even dangerously -- insecure and lacking in privacy protections.

In the last five years, cybersecurity researchers have demonstrated the potential to hack pacemakers, defibrillators, insulin pumps, and other devices that could have life-or-death consequences. The wireless data links used to retrieve data and send instructions are often unencrypted. Authentication methods are often weak, and the password might even be hard-coded into the device firmware. Some of the greatest concerns revolve around wireless communications for devices that patients take home with them (sometimes implanted in the body).

Consumer health monitoring devices turn out to have similar vulnerabilities. Though hacking a Fitbit won't have consequences on the same scale as compromising a pacemaker, increasing interconnectedness will demand more attention to security and privacy across the spectrum of medical and consumer health devices.


The Heart of the Problem

Some of the most sensational findings in medical device security revolve around devices that keep the heart beating: pacemakers and implantable cardiac defibrillators. The threat was brought to light by a 2008 paper. Before his death this year, the security researcher Barnaby Jack demonstrated that it was possible to hack a pacemaker remotely and cause it to deliver a life-threatening jolt.

The intended role of wireless communications is to allow these devices to be reprogrammed or to report diagnostic information without the need for surgery. Because they are designed to operate for a decade or more on a single battery, these devices have little power to spare for the kind of encryption you take for granted on your PC. Research focuses on the most frugal methods of protecting data links.
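One frugal approach researchers study is message authentication rather than full encryption: a short tag proves a packet came from a party holding the device key, at a fraction of the cost of encrypting everything. The sketch below is purely illustrative (the key, packet format, and tag length are hypothetical, not any manufacturer's actual protocol), using a truncated HMAC to keep per-packet overhead small:

```python
import hmac
import hashlib

# Hypothetical device key, provisioned at manufacture. In a real device this
# would live in protected storage, never hard-coded like the passwords the
# researchers found in firmware.
DEVICE_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")

def tag_packet(payload: bytes, key: bytes = DEVICE_KEY, tag_len: int = 8) -> bytes:
    """Append a truncated HMAC-SHA256 tag; a short tag keeps radio and CPU
    cost low -- relevant when the device must run a decade on one battery."""
    tag = hmac.new(key, payload, hashlib.sha256).digest()[:tag_len]
    return payload + tag

def verify_packet(packet: bytes, key: bytes = DEVICE_KEY, tag_len: int = 8):
    """Return the payload if the tag verifies, else None."""
    payload, tag = packet[:-tag_len], packet[-tag_len:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()[:tag_len]
    # compare_digest avoids leaking tag bytes through timing differences.
    return payload if hmac.compare_digest(tag, expected) else None

pkt = tag_packet(b"HR=72;BATT=81")
assert verify_packet(pkt) == b"HR=72;BATT=81"   # genuine packet accepted
tampered = pkt[:-1] + bytes([pkt[-1] ^ 1])       # flip one bit of the tag
assert verify_packet(tampered) is None           # forgery rejected
```

Truncating the tag trades forgery resistance for power and bandwidth; picking that trade-off for a ten-year implant battery is exactly the kind of question this research weighs.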

"For about a decade now, with wireless communication for medical devices, the assumption by the manufacturers has been that the telemetry interface is proprietary, and nobody is going to know what these bits mean," Denis Foo Kune, a visiting scholar at the University of Michigan, told us. Kune is working with professor Kevin Fu, one of the foremost medical device security researchers. The security-by-obscurity strategy was partly based on the fact that the devices use radio frequencies that few other devices could access, Kune said. "However, since about 2008, researchers have been showing that they could use software-defined radios to detect the bits coming out and play them back."
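The replay attack Kune describes works because a recorded packet is just as valid the second time it is sent. A standard countermeasure, sketched here in illustrative form (not any device's real scheme), is a monotonically increasing sequence number that the receiver checks before acting on a command:

```python
# Illustrative replay protection: each command carries a sequence number,
# and the receiver rejects anything at or below the last accepted value,
# so a captured-and-replayed packet fails even though its bits are valid.

class ReplayGuard:
    def __init__(self) -> None:
        self.last_seq = -1  # highest sequence number accepted so far

    def accept(self, seq: int) -> bool:
        if seq <= self.last_seq:
            return False  # replayed (or stale) packet: ignore it
        self.last_seq = seq
        return True

guard = ReplayGuard()
assert guard.accept(1)       # fresh command accepted
assert guard.accept(2)       # next command accepted
assert not guard.accept(1)   # a replayed recording is rejected
```

The sequence number must itself be covered by whatever integrity check the link uses; otherwise an attacker simply rewrites it.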

Through methodical hacking, it's often easy to decode and manipulate communications, Kune said. Even when communications are better protected, his research has shown the potential of hacking device sensors so they feed false data to higher-level applications, potentially leading to the administration of an unnecessary shock to the heart or an overdose of medicine.

The cybersecurity researchers are trying to raise enough of an alarm to change things without being alarmist. Yet I have to wonder how long it will be before a real hack of someone's pacemaker comes to light, or some other life-critical incident occurs.

Or is this really too far-fetched? Granted, there are easier ways to kill someone.

I'm a privacy advocate, and privacy and security are two big issues for me. The data-selling epidemic in the US is bad: you have the likes of Walgreens making about a billion a year selling data, and Fitbit's business model is built around selling data profiles. I have a campaign for the FTC to license and excise-tax all data sellers so it can regulate them. It can't do that until there's an index of who sells data, and that's what a license would provide.

Wellness companies owned by insurance companies love to get hold of data like this, and it will end up in some analytics program meant to "fix" you in some fashion or another, or potentially, in time, to deny someone access to something.

I am all over the FTC all the time about this. There's a new "open" device out there that lets you choose the platform you use with it, and in time that may be a plus, since it would be out in the open for everyone to jump in and fix flaws, security or otherwise.

Healthcare providers just don't get it. They refuse to see the need to fully secure their protected health information from unauthorized users -- and from authorized users who abuse their access privileges. As a result, they don't allocate enough budgetary resources for securing medical data.