February 2014 - Posts

One of the best ways to prevent a HIPAA data breach is to use disk encryption like AlertBoot. Encryption is not only synonymous with data security when it comes to stored information; other than the outright destruction of PHI, it is the only way a covered entity can take advantage of the safe harbor provision found under the Breach Notification Rule (BNR). Indeed, HIPAA (and many state privacy laws) defines encrypted data as not being PHI or PII (hence why it cannot be subject to the BNR). This is not lost on people looking to limit their risk exposure as a covered entity.

However, because HIPAA does not outright require the use of encryption on a computer that stores PHI, one can find instances where PHI is not encrypted even though it should be.

EEG Machines Include a Computer

Saint Vincent Hospital in Indiana has notified approximately 1,100 patients that their medical information was lost when a laptop computer was stolen, according to phiprivacy.net. According to the notice, the computer in question was “password protected.”

Many people often confuse password protection (a form of access control) with encryption. Could it be that St. Vincent is also confused? I doubt it for the following reason: if they thought that password protection was equivalent to encryption, they would also be under the impression that they’d be covered under the safe harbor provision and wouldn’t have sent the breach notification letters.

Ergo, since the letters were sent out, it’s quite apparent that St. Vincent is aware that they’re not covered. Hence, they know that password-protection is not encryption.

(What’s the technical difference between encryption and password protection? Encryption modifies the underlying data. If someone were to run across it, they still wouldn’t be able to understand it. With password protection, there is no data modification. Essentially, password protection is the equivalent of drawing a curtain across a board. If the curtain happens to get pulled aside by mistake -- say, because your shoe got snagged on the drawstrings -- you have full access to the contents of the board.)
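To make the distinction concrete, here is a toy sketch in Python. It is purely illustrative -- the XOR scheme and the `toy_encrypt` helper are invented for this example and are not real cryptography -- but it shows the key point: encryption changes the bytes sitting on the disk, while password protection leaves them untouched behind a gate.

```python
import hashlib
from itertools import cycle

def toy_encrypt(data: bytes, password: str) -> bytes:
    """Toy cipher: XOR the data against a keystream derived from the
    password. Applying it twice with the same password decrypts."""
    key = hashlib.sha256(password.encode()).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

record = b"Patient: J. Doe -- EEG results"

# Encrypted: the stored bytes themselves are gibberish without the key.
ciphertext = toy_encrypt(record, "s3cret")
assert ciphertext != record
assert toy_encrypt(ciphertext, "s3cret") == record  # round-trips

# "Password protected": the stored bytes are unchanged. Anyone who
# sidesteps the login prompt (pulls the drive, boots another OS)
# reads them as-is -- the curtain has simply been drawn aside.
stored_under_password = record
assert stored_under_password == record
```

A real deployment would use a vetted cipher such as AES rather than this XOR toy; the sketch only exists to show that the data itself is transformed.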

Usually, not encrypting a laptop with sensitive data is grounds for a severe excoriation from, well, anyone. More and more people are becoming aware of the importance of data security, and of the basics of how to achieve it.

However, the St. Vincent case could be classified as a bit unusual. The laptop in question was hooked up to an EEG machine. Chances are that this laptop was never taken out of the medical facility (possibly not even out of the room it was in). In such circumstances, people tend to think that the risk of losing the laptop is low (not an unjustifiable position), which in turn leads to more relaxed attitudes where the application of encryption is concerned.

The Many Ways Laptops Can Go Missing

As I noted, it’s not an unjustifiable position. The chances of a laptop hooked up to a medical device going missing are extremely low. If the laptop goes missing, the EEG machine would probably go missing, too.

But then again, in today’s era of USB ports, what could be easier than to pull a bunch of wires and then steal the one thing that can be easily unloaded on Craigslist? The one thing that any Joe Schmoe could be carrying around in a hospital setting? And be able to literally walk away with via the front door?

There are two ways of thinking about risk. The first is “what are the chances of this happening?” The second is “what are the consequences if it did happen?” Combine the two, and one generally arrives at a good way of deciding whether something is worth doing or not.
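Those two questions combine into a back-of-the-envelope expected-loss calculation. Every figure below is invented for illustration; the point is only the shape of the comparison.

```python
def expected_loss(probability: float, consequence_cost: float) -> float:
    """Chance of the event happening, times the cost if it does."""
    return probability * consequence_cost

# "What are the chances?" -- a laptop docked to an EEG machine rarely
# walks away, so assume a low probability (made-up number).
p_theft = 0.01

# "What are the consequences?" -- notification letters, an OCR
# investigation, possible fines and lawsuits (made-up number).
breach_cost = 1_500_000

# Assumed per-device cost of deploying disk encryption (made-up).
encryption_cost = 200

# Even at a 1% theft probability, the expected loss far exceeds
# the cost of simply encrypting the machine.
assert expected_loss(p_theft, breach_cost) > encryption_cost
```

Under these assumed numbers, the expected loss (15,000) dwarfs the cost of the control, which is the usual argument for encrypting even "low-risk" machines.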

More Than 500 Means OCR Will Investigate

Incidentally, the breach of 1,000+ patients' data is not just a breach of patient data. Yes, there may be consequences for the data breach itself. But the Office for Civil Rights investigates all PHI breach incidents involving more than 500 people. The investigation could uncover a host of problems other than the lack of encryption on EEG machines.

Sometimes, an epiphany hits you out of the blue. Of course, that's why it's called an epiphany, but still... if one lives in or visits an environment like Las Vegas, it's kind of hard to understand why my epiphany didn't come a lot sooner. But it hit me once I started reading the subtitle to an article by channel4.com, "Two of the UK's largest pawn brokers are selling second-hand phones which still contain texts..."

This is the sort of data breach that services like AlertBoot are trying to prevent, via the use of smartphone encryption and other forms of endpoint protection.

Device Data Breach: Lost, Missing, Stolen, or... Handed Over?

Many data breaches are due to an unlucky event. For example, a laptop computer gets stolen from someone's car. A smartphone is left behind on the train. But there are many instances where a data breach takes place because the device's owner consciously decides to give up the goods.

Take laptops, for instance. You'd imagine that if someone offered to return a laptop, the owner would jump at the chance to recover it. Turns out that's not always the case. I've heard over the years of instances where a laptop left behind at a hotel couldn't be returned to the owner because the owner didn't want it anymore (the hotel obviously has a way to track down the client).

The machine was insured, so "having lost it" became an excellent excuse for getting a brand new laptop. After all, you don't want to be stuck paying those insurance premiums without ever getting something back – the ROI on that would be very negative.

Devices that are sold to pawnbrokers, as covered by channel4.com, present a parallel. Here you have an instance where objects are not exactly sold as much as they're handed over for safekeeping for a set period. Most people think they'll get them back (or so they hope; if there's a casino involved, which is often the case in Vegas, chances are that this won't happen).

Perhaps the smartphone was insured, perhaps it wasn't. Perhaps the device's owner will claim it as stolen or lost (in order to appease the company that got it for him; he certainly can't say that he pawned it off).

Regardless, here you have a data breach that is not an unlucky event. It's also not really an "internal attack." It's just...stupidity. It's kind of hard to avoid stupidity.

The easiest way to prevent data breaches of this nature is to secure and protect the data prior to giving a device to a person. But this is easier said than done. For example, even if a passcode is set up on a smartphone, what is to prevent the user from deactivating it (or changing it to something incredibly vulnerable, like a passcode that is literally "1" as opposed to "1234")?
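One way around that concern is to enforce the passcode policy centrally rather than trust the user. The check below is a hypothetical sketch of the kind of rule an endpoint-management service could apply; the function name and thresholds are invented for illustration.

```python
def passcode_meets_policy(passcode: str, min_length: int = 4) -> bool:
    """Reject passcodes that are too short or trivially repetitive."""
    if len(passcode) < min_length:
        return False           # catches the literal "1" passcode
    if len(set(passcode)) == 1:
        return False           # catches "1111", "0000", etc.
    return True

assert not passcode_meets_policy("1")      # too short
assert not passcode_meets_policy("1111")   # repetitive
assert passcode_meets_policy("1234")       # minimally acceptable
```

The point of central enforcement is that the device refuses to accept a non-compliant passcode in the first place, rather than relying on the user's good judgment.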

It's because of concerns like these that companies and organizations are using security software like AlertBoot to protect endpoints, regardless of whether they're laptop computers, desktop computers, smartphones, tablets, or other portable data storage devices.

Among other things, AlertBoot's cloud-based endpoint security allows an administrator to install encryption (remotely, if necessary); prevent its unauthorized uninstallation; set up password policies; perform remote lockouts and data wipes; and use a host of other features that enhance security before and after a device goes MIA.

The addition of a reporting engine, integrated into the service from the ground up, means admins can keep an eye on the security status of all protected devices. It can also be used as documented proof of encryption if a laptop or smart phone is lost or stolen.

The Information Commissioner’s Office (ICO) in the UK has released a compendium of numbers for data breaches from April 2013 to December 2013. While laptop disk encryption and similar crypto technologies are the best way to prevent data breaches in the US (based on public records like HIPAA’s “Wall of Shame”), it looks like the UK is more prone to acts of maladroitness, and hence requires a solution that relies less on silicon chips and lines of code.

Health Sector Most Data Breach Prone

The data tracks 43 “sectors” that include health, insurance, housing, local government, estate agents, telecoms, etc. Of the 43, the health sector leads the way in terms of reported data breaches, with 160 incidents in Q4 of 2013. The local government and education sectors come in second and third, with 55 and 36 data breaches, respectively.

Ten sectors had zero data breaches in Q4.

Incident Type

The top three data breach incidents were “disclosed in error” (hence my “maladroitness” comment in the introduction) followed by “lost or stolen paperwork” and “other principle 7 incident.” (See this page for more information on principle 7).

Computer loss or theft comes in fourth place, barely behind principle 7 incidents. “Disclosed in error” accounts for nearly 50% of all incidents, and the top three categories together account for nearly 70% (as you would expect from a Pareto distribution).

This contrasts heavily with data breach incidents in the US, where the loss of digital devices that store data (laptops, external hard drives, flash drives, backup tapes, etc.) tends to account for most data breaches. Indeed, online hacking incidents have recently surpassed the loss and theft of digital devices in terms of sheer individuals affected, yet in the ICO’s summary “technical security failing (including hacking)” comes in at a lowly sixth place.

Cultural Effects?

What could account for this disparity between the US and the UK? Could the answer lie in the UK having and using better digital security than the US? Perhaps the answer lies in cultural factors.

For example, Japan is supposedly a country where the personal computer revolution never happened (at least not in the way it did in the US) because cellphones dominate the scene. Hence, PCs are not as prevalent in the office as they would be in the US. From here on, it’s a matter of statistics: if you don’t have PCs to begin with, then there’s very little need for external drives, flash drives, and backup tapes. Ultimately, the number of data breaches attributed to digital devices can only be lower.

While I do not know what the situation is like in the UK, I imagine that cultural factors could account for the discrepancy (lesser use of digital devices; better adherence to computer usage policies and the law; people not reporting data breaches; etc.).

This just points towards each country needing to approach their data security needs in their own unique way.

(Of course, sometimes it makes sense to follow another nation’s tactics and strategies. Most experts agree that Target’s Thanksgiving Day data breach would have been impossible — or limited — had “chip and PIN” credit cards been the norm in the US, as they currently are in the UK.)

One of the puzzling aspects of the HIPAA Security Rule is that the use of HIPAA data encryption is not a requirement. Rather, it's classified as an "addressable" issue. This means that PHI encryption is "optional" in the sense that you can opt to use something else that's as good as encryption.

In other words, you've still got to protect the PHI at a level that compares to (or exceeds) AES 128 encryption. But in a world where you already have encryption, why would you opt for something else? Especially when that alternate protection does not afford you safe harbor from the HIPAA/HITECH Breach Notification Rule?

For example, let's say that you opt to lock up an external hard drive in a safe instead of encrypting it, which could be a perfectly good way of complying with the Security Rule (I write "could" and not "would" because I don't know whether this is true. I am unaware of HHS/OCR giving its official imprimatur on the practice).

What could go wrong, right?

Kmart Burglarized, Has Data Breach

According to lehighvalleylive.com, a Kmart in Wind Gap, Pennsylvania experienced an unusual type of data breach on January 4. A man robbed the retailer at gunpoint and,

...left with more than cash. A bag stolen from a safe contained money and electronic media that backed up the store pharmacy's computer system, the retailer said today. The media contained confidential information related to customer prescriptions: names, addresses, dates of birth, prescription numbers, insurance cardholder IDs and drug names. A relatively small number of those prescriptions may have included customers' Social Security and/or driver's license numbers

Kmart has already contacted affected clients about the data breach. Now, had the backup "electronic media" been protected with HIPAA-compliant encryption, the company wouldn't have had to do that.

Nor would they face the possibility of getting sued over the data breach (which Kmart will probably be able to get summarily dismissed in the courts, if rulings over the past five years are indicative of anything).

Nor would they have to submit themselves to a HHS/OCR investigation into the incident.

PHI Encryption: The Carrot and the Stick

It's commonly noted that HIPAA covered entities should seriously consider the use of encryption. The problem with this attitude is that it doesn't really mesh with the spirit of HIPAA. When it comes to ePHI data security, encryption software should be the default and not the fallback option. It's when looking to secure ePHI in some other way that one should seriously consider the ramifications.

Midland Independent School District in Texas had a data breach when an administrator's car was "broken into" (it's the correct expression, but the car was not locked so it wasn't as much "breaking and entering" as much as it was "opening and entering"). Information for approximately 14,000 past and present students at Midland ISD was compromised because a laptop computer was stolen (no, the device was not protected with laptop encryption software like AlertBoot).

Usually, the problem is a result of an employee not knowing or understanding a particular organization's computer security and usage policies. However, in this case, it looks like there's less than meets the eye: according to cbs7kosa.com, Midland ISD didn't have such policies in place.

SSNs and DOBs Stolen

The data breach was caused when a laptop computer and an external hard drive were stolen from a school administrator's unlocked car. Approximately 14,000 students' SSNs and dates of birth were breached in this information security incident. Students affected include "all current students from seventh grade through high school seniors, along with graduates dating to the class of 2008" according to nbcdfw.com.

Despite reports all around that encryption software was not used, nbcdfw.com notes that "[Midland school Superintendent Ryder] Warren said the information in the computer was coded and password-protected, but not on the external hard drive."

Of course, password-protection is not really protection. Easy ways to get around it include using a Linux CD to bypass the "protection", using a Windows recovery CD, or even hooking a laptop's internal hard drive to a second computer. This is the main reason why data breach laws provide exceptions from legal action when computer encryption software is used, but not when "password protection" is used.

However, I've learned over the past five years or so that, if people confuse password protection with encryption, they're liable to confuse the two in the other direction as well. Regardless, seeing how the external hard drive was not protected at all, there is no room for giving the benefit of the doubt on this one.

A simple thing like using file encryption or volume encryption on the external hard drive would have prevented this particular data breach. (Incidentally, this is available for free when you use AlertBoot full disk encryption because, well, copying data off of a computer is one of the classic ways that FDE can be "bypassed".)

"BYOD" Program in Place

The fact that a proper computer usage policy wasn't in place is astounding when you consider the nature of the information that was being handled by the administrator and the fact that she was authorized to take it home as necessary:

There really was no specific policy in place about employees bringing sensitive data home.

Superintendent Dr. Ryder Warren says various employees work remotely and they do have data on laptops and hard drives. He says it's on a "need to know" basis, depending on their job requirements. [cbs7kosa.com]

According to the same article, the administrator needed that information on her person:

"She handles graduation accountability rates which are based on TEA, cohort data, which is what was on the hard drive. She often conducts home visits of currently coded dropouts to assist them in reenrolling in school, so she is basically always in her car," Warren said.

One may ask, "What's the use of computer security policies if they're rarely followed – people ignore them all the time – and read even less?" The point is that an organization that creates such a policy also produces ways to sustain it. For example, if Midland had created a policy where (a) employees are authorized to transport student data and (b) they're required to use encryption when doing so, then (c) the school has to provide some means of complying, either by providing employees with the proper tool or by showing them where they can get it (there is free FDE software available, like TrueCrypt).

Why would I freely advertise a free option over our own AlertBoot? Easy. TrueCrypt is pretty good, but it lacks the built-in foundation that makes a solution ideal for organizations looking to mass-deploy encryption across multiple computers: the ability to push encryption installations and policy updates, remote management, and encryption key backup, to mention a few examples.

Furthermore, laws and regulations generally require documentation (preferably unassailable) that encryption was used in the event that a laptop is lost. With TrueCrypt, there is no easy way to prove its installation after the fact, whereas a solution like AlertBoot keeps detailed logs, via the cloud, of a machine's encryption status. In fact, AlertBoot's encryption reports are regularly used by clients the world over as proof of encryption when devices are lost.

How bad can a medical data breach get? Apparently, it can cause your company to fold. LabMD, which has twenty employees, has announced that it will wind down its operations due to the overbearing reach of the FTC. The company's CEO has issued a press release blaming the FTC for his company's woes, accusing it of overreaching and abusing its power.

Update (05 May 2014): Dissent at phiprivacy.net argues that the FTC may very well have overplayed its hand. Definitely worth a read. Update (29 May 2014): A Georgia court opinion (uploaded to phiprivacy.net) finds the "the FTC caused our business to go bankrupt" claim questionable. More at the bottom of this post.

Summary

You can find a summary of LabMD's recent trials and tribulations by googling it. The short version of the story is that the FTC went after LabMD when the latter was caught with a data breach. LabMD went to the courts, arguing that the FTC didn't have the right to be going after it. It lost time and again, and finally decided to toss in the towel.

But as I read the stories, it seemed to me that the FTC is not necessarily overplaying its hand.

HIPAA Applies to Covered Entities

One of the judo moves up LabMD's sleeve was the argument that the FTC had no jurisdiction because the information at the center of the controversy happens to be medical in nature. Thus, the argument goes, LabMD's failings are under the purview of HIPAA, which means the Department of Health and Human Services (and its Office for Civil Rights) should be bringing forward any action, not the FTC.

Seeing how OCR has the ability to levy fines of up to $1.5 million and other penalties, including audits for 20 years, I'm not quite sure why LabMD is making this particular argument. Regardless, let's get something straight here: just because medical data has been compromised doesn't mean that HIPAA applies.

Anyone who has taken a somewhat in-depth look into the issue knows that HIPAA applies to "covered entities" (and thanks to HITECH and the Final Omnibus Rule, "business associates"). What is a covered entity? Well, medical entities like hospitals, dentists, clinics, and others. But, and this is key, not all medical entities are HIPAA covered entities (my emphasis):

Covered entities are defined in the HIPAA rules as (1) health plans, (2) health care clearinghouses, and (3) health care providers who electronically transmit any health information in connection with transactions for which HHS has adopted standards. Generally, these transactions concern billing and payment for services or insurance coverage. For example, hospitals, academic medical centers, physicians, and other health care providers who electronically transmit claims transaction information directly or through an intermediary to a health plan are covered entities. Covered entities can be institutions, organizations, or persons. [nih.gov]

In other words, if there's no electronic transmission, you're not a covered entity under HIPAA even if you operate a medical business. For example, perhaps you're a dentist who is stuck in the analog age and will only take cash payments. HIPAA doesn't apply to you.
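The covered-entity test above is essentially a small decision procedure. As a purely illustrative sketch (the function and its inputs are invented paraphrases, not a legal tool):

```python
def is_covered_entity(is_health_plan: bool,
                      is_clearinghouse: bool,
                      is_provider: bool,
                      transmits_electronically: bool) -> bool:
    """Rough paraphrase of the HIPAA covered-entity definition:
    health plans and clearinghouses always qualify; providers
    qualify only if they electronically transmit covered
    transactions."""
    if is_health_plan or is_clearinghouse:
        return True
    return is_provider and transmits_electronically

# The cash-only, analog dentist from the example: a provider, but no
# electronic transactions -- not a covered entity.
assert not is_covered_entity(False, False, True, False)

# The same dentist billing insurers electronically would qualify.
assert is_covered_entity(False, False, True, True)
```

The sketch makes the article's point mechanical: the deciding input for a provider is the electronic-transmission flag, not whether the data is medical.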

So, the question is not whether the breached information was medical in nature. The question is whether LabMD is a covered entity. The second question is whether LabMD was a covered entity when the data breach occurred, since things have changed quite a bit since the introduction of HITECH:

In a statement released in September 2013, an HHS spokesperson said "[HHS] decided not to join FTC in their investigation of these peer-to-peer sharings and we did not independently receive complaints […] This was pre-HITECH, so there was and is no obligation on LabMD with respect to our breach notification requirements - whether any exist under state law would be for the state to determine." [dataguidance.com]

While the above HHS comment is about breach notifications under HIPAA, it stands to reason that if LabMD was not subject to the Breach Notification Rule, then it probably was not subject to HIPAA, either. If HIPAA applies to you, then BNR applies to you as well (no ifs or buts).

Disingenuous: It's Not About "Unfair" Trade Practices

There's another aspect of LabMD's arguments that doesn't quite mesh well with me. This is a quote from LabMD's CEO:

Following a four year investigation of LabMD, the FTC filed an administrative suit alleging LabMD’s patient information data security was an “unfair” trade practice. Absent any established or uniform data security standards; absent Congressional approval to regulate data security practices; absent a consumer victim from any alleged LabMD security breach; all without alleging that LabMD violated HIPAA privacy regulations . . . [healthitsecurity.com]

The problem with this argument lies in the quotation marks. As a lay person who's also a Gen-Xer, those quote marks hold a special place in my heart. I can already visualize a pair of hands in the air, the curve of the phalanges, the up-and-down motion.

The LabMD CEO's accusation sounds like the FTC views the lack of PHI security on LabMD's part as an "unfair" advantage over the competition, as if the word unfair is being misapplied, as if it doesn't make sense.

But that's not what it's about. Under Section 5 of the FTC Act, deceptive maneuvers also constitute "unfair practices":

(a) Declaration of unlawfulness; power to prohibit unfair practices; inapplicability to foreign trade

(1) Unfair methods of competition in or affecting commerce, and unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful.

(2) The Commission is hereby empowered and directed to prevent persons, partnerships, or corporations, except banks, savings and loan institutions described in section 57a (f)(3) of this title, Federal credit unions described in section 57a (f)(4) of this title, common carriers subject to the Acts to regulate commerce, air carriers and foreign air carriers subject to part A of subtitle VII of title 49, and persons, partnerships, or corporations insofar as they are subject to the Packers and Stockyards Act, 1921, as amended [7 U.S.C. 181 et seq.], except as provided in section 406(b) of said Act [7 U.S.C. 227 (b)], from using unfair methods of competition in or affecting commerce and unfair or deceptive acts or practices in or affecting commerce.

Let's say that you as a company promise customers that their information will be kept secure. That you hold its security paramount. That it will not be released to third parties. And so on, etc.

If it turns out that the company has a track record of lousy data security, would that constitute a deceptive act? What if the FTC investigated the issue and found that, not only does the company have a lousy record, it still has lousy security, which invalidates the promise regarding data security?

Precedent for Deceptive Practices

The same reasoning – deceptive claims – was used when the FTC and HHS went after Rite Aid in 2010. What happened there? In a nutshell, Rite Aid was found dumping sensitive information with nary a thought given to data security, despite having promised to hold personal information in a secure manner. Rite Aid settled with both government agencies.

But that's neither here nor there. The question is whether the FTC is wrong in trying to enforce data security standards on commercial companies.

Is it? Perhaps. But I think most people will agree that LabMD didn't have a leg to stand on when considering the bigger picture. What does it matter whether it's the FTC or HHS/OCR that comes after the company? After all, it's not as if the company is being falsely accused of having had a data breach. (They had two.)

One way or another, the company was destined to be fined and to agree to periodic audits. It seems to me that LabMD would have been better off paying for encryption software than paying its attorneys to essentially change who's waiting at the finish line.

Questionable Claim (Updated 29 May 2014)

Copied here is the court's reasoning why it questions the veracity of LabMD's claims that the FTC was the reason for winding down the private lab's operations. Footnote 8 in LabMD v. FTC Opinion and Order (my emphases):

LabMD's claim that the FTC investigation had a crippling effect on its business is questionable in light of Mr. Daugherty's testimony at the Preliminary Injunction hearing. In 2010, the FTC began its investigation into LabMD's data security practices. Four years later, in January, 2014, LabMD decided to no longer provide cancer detection services, which is the essence of its business operations.

Preliminary Injunction Hr'g Tr., at 6:20-25. LabMD continued to operate as a going concern throughout the FTC investigation until the end of 2013. In 2013, LabMD retained 25 to 30 employees on its payroll, and it continued to generate a profit margin of approximately 25% until 2013, when the company experienced a loss of half a million dollars. Id. at 11:1-25. The company "never had problems getting insurance prior to 2013." Id. at 12:6-8. The evidence presented at the Preliminary Injunction hearing demonstrates that an insurer's decision to deny tail risk coverage to LabMD on account of the FTC investigation and administrative proceeding was not made until January 13, 2014, which is a week after LabMD had decided to discontinue its cancer detection services. See Pl.'s Ex. 15, attached to Pl.'s Ex. List. At the Preliminary Injunction hearing, Mr. Daugherty conceded that the implementation of the Affordable Care Act, and its resulting effect on cost containment and market consolidation, negatively impacted LabMD's operations and "creat[ed] huge anxiety, destruction, consolidation in our customer base." Id. at 52:9-21. Mr. Daugherty also conceded that LabMD's future "depend[ed] on Obamacare, and other than that I don't know." Id. at 54:1-4.