October 2011 - Posts

According to a new survey, protecting a company's data is more stressful than going through a divorce or getting fired. I didn't think I'd live to see the day when I could claim that data encryption software like AlertBoot can be used to relieve stress.

72% Think Divorces are Easier

For the survey, conducted by Websense, 1,000 IT managers and 1,000 non-IT employees were polled in the US, UK, Canada, and Australia. Results show, among other things:

72% think that protecting company data is more stressful than a divorce

14% think that protecting company data is more stressful than losing their job

86% say their job would be at risk if there were to be a security incident

24% say that executives' data has been breached

37% say that data has been lost by employees

34% of employees who compromise data wouldn't tell their boss

What impresses me most about this particular survey is that they were able to find all these people who work in IT, have gone through a divorce, and were willing to share that. Just kidding; however, I am questioning how the questionnaire was drafted. For example, did the people who answered this question actually go through a divorce? Or was it one of those supposition questions: "if you had to compare it to a divorce, regardless of whether you have been divorced in the past..."

34% Wouldn't Tell Their Boss of a Breach

34% wouldn't tell their bosses if they compromised data (accidentally compromised it, that is; I imagine that 100% of those who do it on purpose would not tell their bosses). This particular statistic is much more arresting than the ones on divorce and whatnot.

First, it leads to the further question, "does this mean that the problem will be rectified while keeping the boss in the dark?" Or does it mean that the problem is allowed to fester, or what?

Second, it poses problems for companies trying to do the right thing. If they don't know about a breach, how can they do something about it? For example, how are they going to notify a bunch of people (or 10,000 of them) about a data breach they don't know happened? Furthermore, notification is not something that can be carried out while keeping the boss in the dark, because the boss is bound to find out about it (especially when the firm gets sued).

The 37% figure for data lost by employees is not particularly surprising. It just goes to show that solutions like encryption software are still pertinent.

The theft of a laptop computer could potentially affect 100 youths in Newcastle. According to a press release by the UK Information Commissioner's Office (ICO), a laptop was stolen from a contractor that was working with the Newcastle Youth Offending Team. The laptop was not protected with disk encryption software.

Contract in Place, No Follow Up

The theft took place in January of this year. A laptop being used by a Newcastle Youth Offending Team contractor was stolen during a home burglary, leading to the breach of names, addresses, dates of birth, and the names of schools. The contractor was working on a "youth inclusion program."

While there was a contract in place for the protection of data, Newcastle Youth Offending Team did not ensure that the agreement was being followed. Due to the breach, Newcastle Youth Offending Team had to sign an Undertaking, promising to make sure that contractors and other data processors working on their behalf would protect laptops, mobile devices, and other portable machines with encryption software.

Third Party Compliance - How to Check Up

That contractors, business associates, and other third parties can cause an organization a data breach is not news. It's happened often enough in the past; chances are that it will happen many times more in the future as well.

The ICO has faulted Newcastle Youth Offending Team for the lack of encryption, but realistically, how can the Youth Offending Team monitor what the contractor does? It could ask for some kind of certification, I guess. But then, they had a contract: are we to believe that one official document can be trusted over another?

Of course, there is a way around this: certain encryption suites, such as AlertBoot, feature a central management console that is accessible over the internet. In fact, we've signed up customers based on the fact that they can just tote around a laptop and show their own prospective clients that, yes, they do use full disk encryption to protect their laptops and other computers.

With such a tool at one's disposal, proving compliance would literally be a couple of clicks away.

Henry Ford Health System has announced a data breach. On August 8, it was discovered that a computer had been stolen from one of their laboratories. It was not revealed whether drive encryption software like AlertBoot was used, although password-protection was present.

520 Patients Affected

The breach occurred sometime between August 5 and August 7 and was discovered on August 8. A computer was stolen from the Infectious Diseases lab at Henry Ford's corporate offices.

Information on the computer included patients' names, physicians' names, medical record numbers, and test results. The required substitute notice also notes that the "information stored on the computer was between the time period of 2009 and August 2011."

Again, the presence of encryption software was not specified. However, when you consider that Henry Ford went public with the information, it's doubtful that they did use it, for several reasons:

Under HIPAA/HITECH, breaches involving more than 500 patients must be made public, leading to widespread PR issues unless encryption is used.

They mention password-protection but not encryption.

This is Henry Ford's third data breach in a 12-month period.

In other words, had encryption been in place, Henry Ford would have had a very clear-cut case for not making the breach public; the use of encryption software would have given them the legal option to exercise such a right. (It should also be noted that it's not just a legal loophole: encryption software really does provide effective data security.)

Let us posit that Henry Ford wanted to notify patients despite using encryption. I can understand that. But why do so by airing one's dirty laundry in public, especially when that laundry has been aired twice already? They could just reach the patients directly; certainly, there is nothing in HIPAA / HITECH that prevents them from doing that.

One would imagine that in the course of nearly a year, Henry Ford would have succeeded in deploying encryption on all computers. In fact, Henry Ford notes that they "protect our patient health information using a variety of security measures including but not limited to encryption, password authorization and digital signatures."

So, what happened here? An oversight, where one computer was not encrypted? Or perhaps the computer that was stolen in this case was a desktop computer (most businesses seem intent on encrypting laptops and other "designed to be portable" devices; desktops get stolen, too, you know).

Regardless of how it happened, I think it's safe to say that Henry Ford really needs to look into this issue of data protection. An organization can only take so many PR hits.

According to a new Ponemon survey, the negative effects of a data breach last about one year. That is, it takes that long to restore a company's reputation to its pre-breach levels. The use of laptop encryption software like AlertBoot and other data security tools will minimize the risk of such a thing happening. Is it worth it? With average damage of 12% to brand value, I'd say so.

Some Stats

The survey polled 850 execs. According to survey results:

The brand is tarnished for one year, on average (that is, that's how long it takes if you're looking to restore your brand)

Brand value loss ranges from 12% to 25% in extreme cases

42% did not have a data breach response plan before the breach

Of course, the value of a brand is not something that can be precisely assessed, so the numbers above are a little suspect. What's not suspect is the 42% figure. While I was aware that many companies only react to breaches (e.g., encryption software for laptops is deployed company-wide after a laptop with sensitive information is lost or stolen), I didn't realize this was the case nearly half the time.

A Caveat for SMBs?

It should be noted that the above survey was conducted on very large companies:

Depending upon the type of information lost as a result of the breach, the average loss in the value of the brand ranged from $184 million to more than $330 million, with an average brand value prior to the breach of $1.5 billion. Hence, the minimum brand damage was a 12 percent loss, increasing to nearly a one-quarter loss of the brand value in some instances.[prnewswire.com]

What does this mean for SMBs? It could be that, for such concerns, recovering from a data breach could be nigh impossible if it affects a large percentage of clients. Other dangers apply as well: for example, it's common knowledge that SMBs have a harder time defending themselves against lawsuits due to the disproportionate impact of litigation on, well, pretty much everything.

About four years ago, when TJX had its massive data breach, I reported that TJX's revenues had actually gone up since the breach. I noted that perhaps this could be explained by the fact that the breach had coincided with the beginning of what is now called the "Great Recession." Had a concern like Whole Foods been in the same position at the same time, it's doubtful that it would have had the same happy outcome.

However, a large company like Whole Foods would have a chance to restore its reputation due to its resources. A small company could go out of business altogether.

When you consider such an outcome, it's not an exaggeration to note that SMBs require data protection tools even more than larger companies, even if they don't hold as much customer data.

While carrying out a study on discarded computer hard disks and the data contained in them, the Cyber Security Research Institute (CSRI) happened upon a computer drive that was used by The Sun, one of the newspapers that belongs to News International. The kicker? Apparently, the news organization uses disk encryption and a third party contractor to wipe data.

30% of Discarded Drives Contain Data

The hard drive contained names, home addresses, and mobile phone numbers of The Sun staff and high-profile individuals. According to theregister.co.uk, "The Sun's PC came into the hands of CSRI via a third-party disposal firm that had failed to wipe the data."

The Sun contacted theregister.co.uk and noted that,

"All our drives are encrypted and we have a policy to only dispose of end-of-life hardware in a secure way through a 3rd party supplier. We are contacting the CSRI to find out more about the drive that has been passed to them."

Taking the two statements into consideration, I infer that this unnamed third party is... selling used hard drives? After wiping them, I guess (but not doing a great job of it)?

Decrypting Drives Before Wiping Them?

My question, though, is: if these drives were encrypted, how was CSRI able to access the information? Encryption software prevents unauthorized access to data. In fact, that's all encryption does.

The obvious answer is that The Sun got rid of the encryption before handing the drive over to the vendor. But, that's a bad idea, as the above shows.

Not only do you risk a data breach from the third party supplier (it's not as if those never happen), there is also the problem of your data not being protected between the time you decrypt the hard drive and the time the contractor collects the device.

So, the proper method for getting rid of drives would be to hand them over to a company dedicated to destroying data (if you choose to go that way) while they're still encrypted.
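The logic above is the principle behind "crypto-erase": on a fully encrypted drive, destroying the key is as good as destroying the data, so there is no need to decrypt before disposal. Here is a minimal sketch of that principle; the `keystream_xor` cipher is a deliberately simplified stand-in for real disk encryption (AES-XTS and the like), not something to use in practice.

```python
# Toy illustration of crypto-erase: if a drive's contents are encrypted,
# destroying the key renders the data unrecoverable. The SHA-256-based
# keystream below is a stand-in for a real disk cipher, for demonstration only.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XOR-ing data with a hash-derived keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        stream.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

disk_key = secrets.token_bytes(32)        # the disk encryption key
plaintext = b"names, addresses, dates of birth"
ciphertext = keystream_xor(disk_key, plaintext)  # what sits on the platter

# With the key, the data is recoverable:
assert keystream_xor(disk_key, ciphertext) == plaintext

# "Destroy" the key (in practice: wipe the key slot, token, or TPM entry).
disk_key = None
# Whoever ends up with the discarded drive now holds only ciphertext.
```

The point for the disposal workflow: the drive can leave the building still encrypted, and the data is protected during transit, during storage at the contractor, and even if the contractor botches the wipe.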

There is, as far as I can tell, no HIQA requirement or direct recommendation to use data encryption software like AlertBoot. However, it appears that the use of encryption has not gone unnoticed. If you delve deeper into HIQA's publications, you see the glowing embers. Maybe it's only a matter of time before they become a full-blown bonfire.

Ireland's HIQA - Promoting Quality and Safety

The Health Information and Quality Authority (HIQA) is a government-funded agency ("Authority") that, among other things, is responsible for quality and safety in patient healthcare. HIQA is also responsible for monitoring, advising, and evaluating patient health data issues.

A quick search of the site and its many publications (available as PDFs) shows that words related to encryption (encryption, encrypt, crypto, etc.) don't show up very often. It would appear that HIQA doesn't place too much of an emphasis on the use of encryption, despite consistently noting that patient data must be protected.

Starting with Q.13 in the Self Assessment - Level 1 section ("Is access to personal health information restricted to those who need to access it?"), the tool delves into role-based access to data, culminating in Q.18: "Are all portable electronic devices that are capable of handling or displaying personal health information and databases password protected and encrypted?"

The answer, as the introductory remarks to the Self Assessment note, should be "yes." Despite the fact that you can barely find references to it, encryption software turns out to be a very important component in patient data safety.

HIQA Encryption and Protection

Mind you, it's not "password protected or encrypted" but "password protected and encrypted." When it comes to encryption, it cannot be otherwise: if you're using encryption to protect data, chances are you're handed a password as well. The converse is not true.
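Why the converse matters: an application-level password gate does not change the bytes on disk. This sketch (the file path and record contents are made up for illustration) writes data that a hypothetical app would guard with a login prompt, then reads the raw bytes back directly, the way a thief with the stolen drive would.

```python
# Password protection without encryption: the "protected" data is still
# sitting on the disk in plaintext. (Hypothetical file and record.)
import os
import tempfile

record = "Patient: J. Doe, DOB 1970-01-01, test result: positive"
path = os.path.join(tempfile.mkdtemp(), "records.txt")

with open(path, "w") as f:
    f.write(record)   # the app may demand a password before *it* opens this file

# An attacker skips the app entirely and reads the disk directly:
with open(path, "rb") as f:
    raw = f.read()

assert record.encode() in raw   # the plaintext is right there on the drive
```

With encryption, that same raw read would yield ciphertext; with password protection alone, it yields the record. Hence "and", not "or".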

Besides the use of encryption, Q.22 asks whether "servers and files, both paper and electronic" are securely locked away when not being used. A guidance prompt notes that this is a requirement under the Data Protection Acts of 1988 and 2003.

Personally, I find it surprising that encryption isn't spotlighted more. Granted, encryption is not a panacea; however, I should note that it's one of the easiest ways of complying with a multitude of access requirements (questions 13 through 17 on the Self Assessment tool):

Is access to personal health information restricted to those who need to access it?

Is there a mechanism in place to audit and validate staff access to personal health information?

Do all staff members that have access to electronic records have individual login details and passwords?

Is there a requirement that individual passwords are updated and changed regularly?

An encryption suite with central management, for example, covers the following:

Only people who should have access to a computer (e.g., a nurse station staffed by specific nurses only) can access it

All users have their own usernames and passwords

People can be prompted automatically to change their password, say, every 6 months

Requirements for passwords can be set up, including forbidding the use of palindromes or usernames in the password
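The last point above, forbidding palindromes or the username in the password, is easy to enforce in software. Here is a minimal sketch of such a validator; the specific rules and the minimum length are illustrative assumptions, not HIQA-mandated values.

```python
# Illustrative password-policy check: minimum length, no palindromes,
# and no embedding of the username. Thresholds are assumptions.
def password_ok(password: str, username: str, min_length: int = 8) -> bool:
    p = password.lower()
    if len(password) < min_length:
        return False
    if p == p[::-1]:                # forbid palindromic passwords (case-insensitive)
        return False
    if username.lower() in p:       # forbid passwords containing the username
        return False
    return True

assert not password_ok("racecar", "nurse1")        # palindrome, and too short
assert not password_ok("nurse1-secret", "nurse1")  # contains the username
assert password_ok("Tr0ub4dor&3", "nurse1")        # passes all three rules
```

A real deployment would fold checks like these into the directory service or the encryption suite's central management console, alongside the periodic forced change mentioned above.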

Again, even with all of the above, it is not guaranteed that an organization will never suffer a data breach. For example, let's say that employees decide to trade usernames and passwords. One of them goes rogue, logging into systems as someone else. The rogue employee is not going to show up on audit logs. There are limitations to what technology can do.

However, the use of proper encryption products can go a long way in seriously lowering data breach risks.