April 2010 - Posts

St. Jude Heritage, in Fullerton, CA, has notified 22,000 patients that their personal data may have been compromised when five laptop computers were stolen. The information was not protected via full disk encryption software like AlertBoot endpoint security solutions, although password protection was used.

Theft Happened in February, Notifications Sent Last Week

A total of 22 computers were stolen, but only the five mentioned above contained sensitive patient data.

The theft occurred two months ago, but patients are being notified only now because St. Jude had to reconstruct what was on the stolen computers: SSNs, dates of birth, and, in some cases, health-related information such as diagnoses.

Credit monitoring and fraud alert services are being offered by St. Jude.

The Problem with Password Protection

Password protection doesn't provide information security. Earlier this week I noted that the HHS--the organization charged with enforcing HIPAA and patient data security--uses encryption software on all laptops, as well as on desktops that weren't secured physically. (HIPAA Encryption: What Does the HHS Use?)

And there's a reason for that: as a federal agency, it has to comply with a government standard known as FIPS 140-2, which is maintained by the National Institute of Standards and Technology. According to NIST's guidelines, password protection does not afford data security.

Now, the information that is retained by the HHS is probably not too different from what St. Jude's had. And yet the HHS declines to use password-protection and goes for something much stronger. What does this tell you about St. Jude's data protection policies?
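To make the distinction concrete, here's a minimal sketch (the `open_via_login` gate, the password, and the data are all invented for illustration) of why password protection alone doesn't secure the bytes on disk:

```python
# Sketch: "password protection" is just a software gate; the bytes on
# disk stay in plaintext. A thief who pulls the drive (or boots another
# OS) never calls the gate at all. All names and data here are made up.
import os
import tempfile

PASSWORD = "s3cret"  # hypothetical login password

def open_via_login(path, password):
    # The "protected" path: the application refuses to read the file
    # unless the password matches. The file itself is untouched.
    if password != PASSWORD:
        raise PermissionError("wrong password")
    with open(path, "rb") as f:
        return f.read()

# Store a sensitive record the way an unencrypted laptop would: in the clear.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"SSN: 078-05-1120")
    path = f.name

# The thief skips the gate entirely and reads the disk directly:
stolen = open(path, "rb").read()
print(stolen)  # plaintext, no password needed

os.remove(path)
```

Full disk encryption changes the picture precisely because the stored bytes are ciphertext: reading the drive directly, without the key, yields gibberish.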

Encryption Software: California and Federal Laws Afford Safe Harbor

St. Jude took two hits by not using disk encryption programs to protect its data. The first is the increased risk to patient data. The second is its inability to take advantage of the safe harbor that the law provides.

St. Jude is located in California, and under California law, an entity does not need to make a breach public if the information is protected using encryption. Furthermore, HIPAA rules, amended by the HITECH Act, also provide the same protection from disclosure.

Needless to say, using encryption kills two birds with one stone, and not taking advantage of such protection is short-sighted.

Today's big news in our industry is the acquisition of PGP and GuardianEdge by Symantec. The two acquirees (that is a word, my spell-check notwithstanding) offer a number of data security products, and are competitors to AlertBoot in the drive encryption software arena.

The companies are being bought for $300 million and $70 million, respectively. The deal is all-cash, according to Barron's and other sources.

Ongoing Consolidation Trend - PointSec, SafeBoot, and Others

The above is the continuation of a consolidation wave in the data security industry that started, as I recall, with Check Point's acquisition of PointSec back in 2007. Another that comes to mind is McAfee's acquisition of SafeBoot. And, of course, the Sophos-Utimaco merger (AlertBoot is powered by Sophos's SafeGuard encryption).

In an internal discussion, our CEO pointed out that this is just further proof of the importance of endpoint encryption, and that the market is poised for growth.

No argument there. The importance of data encryption has always existed, but the downward pressure on the cost of computing hardware and software--kicked into overdrive in the Internet bubble era--meant that data security would need to go to the next level sooner rather than later.

Not the "next level" in terms of technology, but in terms of adoption: Most homes didn't own a computer prior to 1995, but that's not the case today. Most people didn't need to think, or even know, about data security 15 years ago; today, it's a different story. Where technology in communications goes, data security soon follows.

Will The Merger Cause Problems?

As you may know, most companies don't successfully execute mergers and acquisitions, even when the takeover is friendly. It tends to work better in simple industries, like a candy company taking over a gum company. Encryption and data security are not simple industries.

I've heard (all unconfirmed, of course) that there was plenty of talent fallout in the PointSec acquisition (despite which, the acquisition became a great success, according to some). McAfee had problems integrating SafeBoot encryption support for about a year, and is supposedly working on technical glitches to this day. Both are a testament that even the most complementary technologies need some hashing out, regardless of the level of professionalism and dedication involved.

Will Symantec experience similar problems? It's been pointed out by ars technica that "both companies already have OEM partnerships with Symantec, so effective integration into Symantec's existing software line-up is likely to be unproblematic."

Perhaps so. On the other hand, I've read my fair share of management case histories where it's pointed out that M&A failures are not just about technological matches, but also depend on organizational culture (behavioral as well as psychological), inter-office politics, etc.

You might be of the opinion that it's just all b-school hocus-pocus, but when I read that stuff, it tends to ring a bell. The technology might be a great match, but if the people working on integrating that technology are focused on something else, like turf wars...well, I think expecting delays, problems, etc. is not unreasonable.

Plus, there is the fact that Symantec has acquired not one but two competing technologies that must now be incorporated into its platform. I'd bet on the side of problems appearing: in the long run, it's the winning bet, like the blackjack strategy where you're supposed to hit on anything under 17.

How Does this Affect AlertBoot?

The key differentiator for AlertBoot encryption software has always been the fact that you can deploy laptop disk encryption over the internet without the overhead associated with typical disk encryption solutions, such as:

- Buying extra servers
- Getting a secure space, like a cage at a data center or a locked closet
- Buying licenses your company doesn't end up using because it doesn't have 50 employees (the 10 hot dogs, 8 buns mismatch, I sometimes call it)
- Etc.

(By the way, if you're wondering what the 10-8 mismatch is all about, read this explanation. I think it gets as close to the truth as anything I've ever read on the subject.)

Of course, Symantec could decide to become a competitor in the most direct way possible. However, it would still face difficulties in building some of AlertBoot's value-adds, such as integrated encryption reporting, which was conceived as part of the product design process from the start, whereas Symantec would have to deal with legacy systems.

So, for the time being, we're feeling pretty comfortable since nothing has really changed out there. I mean, does a desert fox care about global warming? I know polar bears would, right off the bat.

In fact, a better question might be: with the acquisitions, where is the customer left? For example, both PGP and GuardianEdge have clients who've taken up their respective disk encryption offerings. Will a particular set of clients have to re-encrypt their computers with a different solution? And what will happen to PGP's open-source efforts? I'm sure answers will be forthcoming, but we'll have to take a wait-and-see approach.

I'm poring over the latest report released by The Ponemon Institute, and I'm wondering whether there's more to the story than meets the eye. In addition to localized reports for the US, UK, Germany, France, and Australia, The Ponemon Institute has released a global report. Among other things, it shows that companies that are more aggressive in protecting their data--such as by using drive encryption software like AlertBoot to protect the data on lost laptops--tend to have lower data breach costs.

The Findings

The report concludes that countries with data breach notification laws tend to have higher data breach-related costs. Of the five countries, the US and Germany are the only nations that have data breach notification laws.

I've read that a similar law is coming to the UK soon, due to EU requirements, which implies there should be a similar one passed in France. Australia is preparing to pass such a law as well, though no date has been announced.

The report's summary table essentially sums up the 2009 Annual Study: Global Cost of a Data Breach. Note how the US and Germany, the two countries with breach notification laws, have higher average costs associated with data breaches.

The "average cost per record" includes every possible cost: ex-post response; detection and escalation, such as hiring data forensic experts; lost business to customer turnover, due to the breach; etc.

"Ex-post response" is defined as "activities to help victims of a breach communicate with the company to ask additional questions or obtain recommendations in order to minimize potential harm. Redress activities also include ex-post response such as credit report monitoring or the reissuing of a new account or credit card."

It includes setting up and operating hot-lines and legal defense costs, but not the actual act of notifying customers; for example, by mailing letters about the incident.
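As a toy illustration of how the per-record figure described above is built up (every component figure below is invented):

```python
# Hypothetical breakdown of "average cost per record": sum every
# breach-related cost component, then divide by records breached.
# All figures are made up for illustration.
records = 10_000

costs = {
    "detection_and_escalation": 80_000,   # e.g., hiring forensics experts
    "notification": 30_000,               # mailing letters about the incident
    "ex_post_response": 90_000,           # hotlines, credit monitoring, legal
    "lost_business": 200_000,             # customer churn after the breach
}

total = sum(costs.values())
per_record = total / records
print(f"Average cost per record: ${per_record:.2f}")
```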

While Germany had the highest ex-post cost, it's interesting to note that, as a percentage of the overall cost, it trails France; the US ranks at the bottom. In terms of pure dollars, though, the US and Germany rank highest.

I'm not sure what the correct interpretation ought to be. Perhaps it just means that, in countries with data breach notification laws, companies involved in an information security incident put more effort into identifying what went wrong and notifying customers, and less emphasis on offering post-notification services.

Or perhaps it means that, regardless of the country, the amount necessary for post-notification services tends to be about the same, and countries with breach notification rules have significant extra expenses related to getting in touch with those affected by a breach.

What About the Cost of Living and Other Factors?

The opinion that data breach laws lead to higher costs makes sense. After all, without such a law, a company doesn't have to make an announcement. No one's the wiser and the company can save all that money that would have been spent on mailing letters, fielding customer inquiries, defending against lawsuits, etc.

Whereas if you have a "make your breach public" rule, then you'd have all those extra expenses--and lost future opportunities because a number of customers don't want to do business with you.

Makes sense, right? Except, I'm wondering whether that's true. For example, did the analysis incorporate the cost of living in each country? One dollar in the US is not necessarily its equivalent in the UK, regardless of what the exchange rate happens to be.

In other words, it's about purchasing power parity: does the same product cost the same in two different countries, and if not, which one is overvalued? This ties directly to whether one dollar in the US equals one dollar in the UK.

Restated, if it costs a company US$10 to mail 100 letters in the US, can a UK company also mail those 100 letters for the UK pound equivalent of $10? Or could it cost less? If so, then the average cost could be lower regardless of whether there was a breach law or not.
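A quick back-of-the-envelope calculation shows how the comparison can flip depending on which rate you use (all rates and costs below are made up for the example):

```python
# Purchasing-power illustration: the same 100 letters may cost different
# "real" amounts in two countries, so converting at the market exchange
# rate can overstate or understate breach costs. All figures invented.

us_cost_usd = 10.00   # 100 letters mailed in the US
uk_cost_gbp = 5.00    # 100 letters mailed in the UK

market_rate = 1.55    # assumed USD per GBP at the market exchange rate
ppp_rate = 1.40       # assumed USD per GBP at purchasing-power parity

at_market = uk_cost_gbp * market_rate   # what the raw conversion reports
at_ppp = uk_cost_gbp * ppp_rate         # what it "really" costs in US terms

print(f"UK cost at market rate: ${at_market:.2f}")
print(f"UK cost at PPP:         ${at_ppp:.2f}")
```

With these invented numbers, the UK company mails the same letters for less than the US company in either conversion, which is exactly the kind of effect that could depress average breach costs independently of whether a notification law exists.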

Too Tall An Order

It would be nice to see such factors incorporated into the Ponemon reports. However, I'll readily admit that it would be outside the scope of The Ponemon Institute's objectives.

ICO deputy commissioner David Smith pointed at the National Health Service (NHS) as being the worst offender when it comes to data breaches. It shouldn't come as a surprise. I've read and covered more UK breaches related to medical facilities than I'd like to admit, and often commented that the use of laptop encryption software like AlertBoot would have been ideal.

30 Breaches Per Month

The deputy commissioner noted that the NHS had 960 data breaches in a little over two years, which "works out round about 30 a month." He also noted that, while the numbers have dipped just slightly, it's still problematic in the sense that the NHS has been pretty consistent year in, year out.

Makes sense. When you're batting .170 on average, a "jump" to .180 that lasts one season would be attributed to luck, not taken as a sign of improvement.

Incidentally, the above figures mean that the NHS is responsible for approximately one-third of all data breaches reported in the UK. Since reporting such breaches is still voluntary, those figures could change drastically:

Smith admitted there were plans to force an audit on the NHS to get the real picture on both the number of data losses and where they are going wrong, but he felt all organisations needed to buck up when it came to training and awareness around data protection issues. [itpro.co.uk]

It's not necessarily something I would want to do. It reminds me of my spring sophomore year in college, when I decided to clean out my dorm room, even the hard-to-reach places: once I saw how bad it was, I called it a day and only cleaned the "visible stuff" from then on. It rocks to be a guy.

Obviously the government can't be caught engaging in such actions.

Data Breach Notification Law Coming Soon to the UK

The deputy commissioner also noted that a data breach notification law will be in place within 18 months. While it will be strictly for telcos initially, "there's no logical reason to confine it to them."

If this is true, and assuming that the UK sees similar activity to what the US saw after states started passing their own breach notification laws, one can expect an upsurge in companies using encryption software.

The site modernhealthcare.com has noted today that approximately two-thirds of the breaches listed at a HHS site involve storage devices, whereas only 3% involve hackers. The complications resulting from the former can be mitigated by using hard disk encryption like AlertBoot endpoint security software.

HHS Airs Breaches Involving More than 500 People

As I noted back in February, passage of the HITECH Act meant that the Department of Health and Human Services now has to make public any breach notifications involving the protected health information of more than 500 people.

Obviously, HIPAA-covered entities must notify the HHS regarding the same.

Now that there are about 6 months' worth of notifications (the first breaches that are listed date from September 2009), modernhealthcare.com did some analysis, and found that:

The average breach at a hospital involved 6,251 records; at a physician's office, 4,496 records.

The median breach size across all entities was 2,667 records, which may be a better barometer for making sense of the data--at least, that's what my stats professor said about data sets with extreme points, if I recollect correctly.
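The mean-versus-median point can be seen with a toy data set (the record counts below are invented, not from the report):

```python
# Why the median can be a better barometer than the mean when a few
# extreme breaches are in the mix. Record counts invented for the example.
from statistics import mean, median

breach_sizes = [600, 900, 1_200, 2_700, 3_100, 4_000, 250_000]  # one outlier

print(mean(breach_sizes))    # dragged far upward by the single huge breach
print(median(breach_sizes))  # stays near the "typical" breach
```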

It's interesting to note that the total sum for the fourth bullet lies at 61%, meaning that 39% of breaches are accounted for by other storage devices: external hard drives, CDs and other disk media, tapes, etc.

If such breaches were to be included, the number of breaches that could be prevented via encryption software would be over 33% of all reported breaches. In fact, it might be even more than that since we're dealing with instances where 500 or more records were involved.

On the one hand, some people may be disappointed that disk encryption programs can prevent only a portion of all possible breach incidents. On the other, being able to alleviate 30% or more of security risks is not a bad deal.

HIPAA-governed entities may need to use data encryption software to protect sensitive data--known as ePHI, electronic protected health information--depending on what their security needs happen to be. Encryption is not a requirement, at least not yet (although some beg to differ).

Which is weird because HHS apparently holds itself to a higher standard.

HHS Won't State "Use This"

Generally, the guidance given by HHS, formally the Department of Health and Human Services, is that HIPAA entities do not need to encrypt ePHI as long as there are other adequate ways of securing the information: locked doors, laptops fixed to a desk somehow, etc.

HHS's guidance won't specifically state which encryption software programs to use. This is common when government entities give out recommendations, since they have to appear impartial--they wouldn't want to be accused of favoring one provider over another.

So, HIPAA-entities are often stuck trying to figure out which encryption package to choose, knowing that the wrong choice could mean that heads will roll when something goes wrong.

What Does HHS Use? Does It Use Anything at All?

Unsurprisingly, HHS also has policies governing encryption for itself. What is surprising, though, is that HHS requires the use of encryption on its machines, specifically laptops. From the "HHS Standard for Encryption," (Dec 2008):

"...applies to all Department of Health and Human Services (HHS) employees, contractors, and others acting on behalf of HHS. Media is subject to this encryption standard until it is sanitized or destroyed."

"All sensitive information stored on government-furnished desktops and non-government-furnished desktops used on behalf of the Department shall be secured either through a FIPS 140-2 compliant encryption solution or through adequate physical security and operational controls at the desktop’s residing location."

Note how there is no encryption waiver for laptops. Also note how, in all instances, they require the encryption solution to be FIPS 140-2 "compliant" (we'll get to this later--there is a difference between compliance and validation).

What does this mean to you, HIPAA-entity that wants to be in compliance? This means that perhaps first and foremost, you ought to make sure the encryption product suite you're looking at has been FIPS 140-2 validated. After all, what's good for the goose is good for the gander, right? Seeing how HHS oversees HIPAA enforcement, the health department can't hold it against you for using a solution that holds the same FIPS 140-2 validation (er, I think. Speak with your lawyers, although I'd bet a pretty penny that common sense would rule in this case).

What's the Deal with FIPS 140-2?

The purpose behind FIPS 140-2 is to ensure that an encryption module actually works. There is a lot of money to be made in the computer security business, and it brings forth its share of willing and accidental digital snake-oil salesmen, the latter truly believing their solution works...until someone proves it otherwise.

This "proving otherwise" is the idea behind FIPS 140-2: after going through a battery of tests, if a particular encryption package passes, it's assumed to be safe for protecting information. In fact, it's accepted by the US and Canadian federal governments as a product that can be used "for the protection of sensitive information." The FBI won't be using something that's not FIPS validated, for example.

So who does the testing and gives the passing ones the shiny FIPS 140-2 sticker? The National Institute of Standards and Technology (NIST, for the US) and the Communications Security Establishment Canada (CSEC, for--wait for it--Canada).

What's the Deal with FIPS 140-2 Validation?

This brings us to "validation" vs. "compliance." FIPS-validated means that an encryption package went through the NIST-CSEC battery of tests and passed. FIPS-compliant just means that a package falls in line with FIPS requirements but has not actually gone through the testing.

Based on the above, it's quite obvious that you want a FIPS-validated solution. So why does the HHS require that their encryption solution be FIPS-compliant? Because the HHS uses footnotes.

If you read the footnote to what they mean by "FIPS 140-2 compliant," you'll see the following:

The cryptographic module used by an encryption or other cryptographic product must be tested and validated under the Cryptographic Module Validation Program to confirm compliance with the requirements of FIPS Publication 140-2 (as amended). For additional information, refer to http://csrc.nist.gov/cryptval. [my emphasis]

So, make sure that the encryption package you're looking at is FIPS 140-2 validated. Compliance is good; validation is better (and what counts).