June 2009 - Posts

What's the cost of a data breach? According to the latest news, no less than a cool $10 million if the information security breach approaches TJX-like scales. If you'll recall, TJX had a massive breach of customer credit-card information nearly 3 years ago. The entire thing stemmed from the fact that their encryption software was not updated to industry-accepted standards.

Settling With 41 States

The parent company of T.J. Maxx--which also includes Marshall's and Bob's Stores--has agreed to settle with 41 states. The total payout to states will be $7.25 million. An additional $2.5 million is being used to fund state "projects that 'advance' effective data security and technology," according to computerworld.com.

The company has also agreed to upgrade its encryption and to limit how long credit card information is stored.

And, as is usually the case with settlements, the company has essentially denied any legal wrongdoing. Of course, in this case, it's probably true: as far as I know, TJX never broke the law. They were supposedly in breach of PCI-DSS, which is a security standard in the credit card payment industry (i.e., it ain't the law. But, hey, I'm not a lawyer...and some state AGs think otherwise).

Other Costs Related to the Data Breach

The $10 million being paid out in this case is on top of other costs associated with the breach. TJX also agreed to settle with Visa back in 2007, paying out $41 million, no doubt to cover the costs of issuing new cards to the 94 million customers affected (costs reportedly in the $65 million to $80 million range).

Plus, there was the settlement of a customer class-action lawsuit that resulted in the "customer appreciation sale," a three-day shopping event promising big, big savings. There was a lot of disapproval regarding this: critics noted that the sale would actually benefit TJX, since lower prices drive higher traffic. On the other hand, no retailer wants to sell merchandise at lower prices that will eat into its profit margins.

So, it looks like a massive breach will leave a company $51 million in the red, and drag management's attention away from its actual business operations for two years.

All because, if I recall correctly, a paltry (in comparison) $2 million couldn't be found to upgrade the company's wireless encryption.

Forty-five thousand current and former students and staff members (and some dependents) are the victims of a data breach at Cornell University due to the theft of a computer. The university has set up a site (a FAQ, actually; see address at the end of this post) with details. It can be inferred from the site that data encryption software like AlertBoot was not used to protect the contents.

Unfortunate, really, because this means that potentially 45,000 SSNs and names are at a high risk of being divulged. At $1 each, that's a cool $45,000 on the black market. If I were criminally inclined, I'd probably try to gain access to the computer--encryption or not.

The Breach - Outside the University's Control?

Before I begin with my negative criticism, as I'm wont to do, I'd like to point out that there was only so much Cornell could have done to prevent this data breach. According to point number five on the FAQ, a staff member who, apparently, works in the IT department was using the information for troubleshooting purposes. And he did it on an unprotected computer.

Also, Cornell has information security policies and guidelines that "do not allow unencrypted confidential personal data to be stored on any computer device that is not in a physically secured location."

And, if someone were enforcing and tracking compliance with such measures, who would it be? Probably the IT department. So...who's gonna police the police?

The only way to prevent the breach would have been to force encryption on all computers used by staff at the university. And, this would have been (relatively) easy to do with a centrally managed encryption suite. But then, the IT department would also have been in charge of that.

So, there's very little the university could have done.

I've Got A Beef

That said, there are a couple of things I'm not crazy about in the FAQ, namely points 2 and 4.

2. If I didn't receive an e-mail or letter, does this mean that my information was not on the stolen computer?

Yes. We have conducted a very thorough analysis of a backup of the data on the computer. If you did not receive an e-mail notification or a notification letter, we did not find your personal information in the backup data from this computer.

Maybe it's just me, but where's the guarantee that Cornell has current and valid addresses for all 45,000 people? My understanding is that the figure includes former students and staff as well. You've got to admit there's a chance someone out there won't receive an e-mail or a letter, yet still be a victim of this latest breach--unless the irresponsible IT staff member took care to use only names verified to have a valid address, which is quite unlikely.

Although I'll grant it's possible the project the guy (or gal) was working on involved only donors who had given money recently, say, within the past three months.

4. Has the data been misused?

To date, we have no knowledge that the personal identity information contained on the computer has been misused or exploited. We will update this website promptly if we learn otherwise.

I've read such statements before, and I've never, ever liked them--and this is why: I don't know about you, but this is the first time I've heard about this story. And apparently, this is news to the rest of the world: otherwise, it wouldn't have been classified as "breaking news." In fact, I believe Cornell's own WVBR may have been the first to report about this breach--and I'm pretty sure that the FAQ-site went up after WVBR's initial report.

So, what are the chances that the data may have been misused but not be on people's radar? Pretty high, I'd say. Even if an affected Cornellian found that, say, a mortgage had been granted by someone using his SSN, how would he know that it was related to this latest incident as opposed to some other data breach? He wouldn't know because this is the first time he's hearing about the Cornell breach.

In turn, it means that Cornell doesn't know about this either (the person hasn't had a chance to call in to complain), so of course the university has no knowledge of misuse...

Manchester City Council has been found in breach of the Data Protection Act (DPA) by the Information Commissioner's Office. Two laptop computers that did not feature hard disk encryption were stolen. Had data security software like AlertBoot endpoint encryption been used, the personal details of 1,754 employees would not have fallen into the wrong hands.

Batteries Were Being Charged

According to the signed undertaking, the two laptops were stolen while their batteries were being charged in the main office. Weird. Don't they have outlets in more secure areas?

To boot, the computers were not chained to a desk, did not feature data encryption, as mentioned above, and didn't even have something as lowly as password protection in place. And, since these laptops were stolen, I presume no one was watching over the two computers while they were being charged, either.

Conclusion: there was absolutely no data security in place whatsoever. The presence of any of the above (with password protection coming in last) would have meant that the risks of a full-blown data breach were greatly mitigated. Have two or more in place, and the risks would have been mitigated further. (My guess is the presence of a person watching over the laptops would have prevented the data breach outright.)

Impossible Security

The problem is, you can't realistically have people guarding computers 24/7 throughout the year. To begin with, it costs too much. Plus, people are quite fallible. I think it was only a month ago I read a case where a security guard (the only security guard) left his post in the middle of the night to get a late night snack from McD's. Someone broke into the building while the guard was gone, and laptops were stolen.

So what can one do? Well, security generally requires layers, so that if one layer is penetrated, another set of layers will further obstruct access to the information.

Besides having a person watching over computers, companies can do the following to minimize the risks of a data breach:

Restrict what type of data is saved on computers: if a computer doesn't have sensitive data, and it's stolen, that's not a breach; it's just good, old-fashioned theft. I'm not saying that's a good thing, or that it's acceptable--but it should be pointed out that the ramifications of a computer theft are less severe than those of a data breach.

Use encryption: if you have to have sensitive information on a portable medium, at least make it so unauthorized people will find it nearly impossible to access the information.

Charles Schwab and Co. has filed a letter with the Attorney General of New Hampshire, according to pogowasright.org. An employee (now fired) took from company premises a hard disk with confidential information, which was subsequently stolen. It's not mentioned whether drive encryption software was used. Normally, I'd say that this is a strong signal that it wasn't.

Of course, only Charles Schwab can say whether the contents were protected with encryption. Since they haven't revealed this point, I can only guess (well, I guess I could call their PR department and ask for a clarification...)

Anyhoo, I think it should be pointed out that NH's data breach notification laws don't provide safe harbor for companies that use encryption. In other words, the fact that the company alerted the AG is not necessarily indicative of encryption not being used. (This law differs from the one in California, for example: there, if a lost computer held data on California residents but full disk encryption had been used, public disclosure is not necessary...for now.)

Plus, companies--especially established, big-name companies--in the finance sector have been at the forefront of protecting digital information. I'm not saying that all of them managed to get encryption in place; I'm just saying the odds are pretty good that this particular disk was encrypted.

Relying on Company Policies

On the other hand, the letter to the AG also notes that the employee took the disk off company premises when he shouldn't have. This wouldn't be the first time a company decided not to encrypt a computer, backup tape, external hard drive, or other data storage device because it was protected by locked doors and security guards.

Since it's not going anywhere, why install additional protection?

Short answer: because a lot of people don't do what they're told to do (or is it, they do what they're told not to do?)

This is not to say that such policies are unnecessary or ineffective. For the most part, if you hire smart people, educating them tends to work great. And, in the long run, it tends to work better and ends up being cheaper than a software-based solution.

But, mistakes happen. And when something needs to be protected absolutely, you need to go the extra mile.

Following on yesterday's post about laptop thefts at Bord Gais, here's another case that again shows encrypting across an entire company or organization is no easy matter. The Health Service Executive (HSE) office in Ireland has been the victim of a break-in, and fifteen laptops were stolen. Thankfully, thirteen of the laptops featured drive encryption (something similar to AlertBoot).

Two Laptops Not Encrypted

Of course, if only 13 out of 15 are encrypted, it means two laptops are unencrypted. Of those two, one had no sensitive information stored on it. The other, however, contained financial information of citizens "who had contacted community welfare officers." [irishtimes.com]

The HSE has been a victim of data breaches prior to this incident. They had a data breach in September 2008 and a second one a mere two weeks later, which prompted the HSE to review their data security practices.

It looks like the HSE had promised to encrypt all of their laptops, which the above clearly shows is not the case.

Also, it looks like they should have taken a look at their physical security as well: apparently, ten offices were broken into, but the police are not sure how.

Deploying Encryption Across An Entire Company

As I pointed out in yesterday's post, deploying encryption across an entire company is not an easy task. This is even more so if the selected encryption software is not optimized for deployment across multiple computers.

Consider a free encryption software package you can find on the internet that's not optimized for mass deployment. You'd have to have an IT guy visit every computer at your company.

He'd have to install the encryption software (let's assume that there's no way to stop the encryption process once it's installed, so he doesn't actually have to stick around for full disk encryption to finish its job), make a copy of the encryption key, note which computer was encrypted, and come back later to verify the encryption status.

I've done this for one computer before (a friend's), and even if you get the hang of it, it takes about 10 minutes, from start to finish.

Now, I'd tack on another 10 minutes for walking around, turning on computers, turning on monitors, etc., for a total of 20 minutes per computer that needs encryption. This assumes our IT guy won't face the problems of company laptops that were taken home, people who ask him to come back because they're busy on a project and need the computer, going to another floor or building or county, etc.--the real-world issues that delay security roll-outs.

This means a guy can encrypt maybe three computers an hour. Assume he works 8 hours a day with a one-hour break, and he can encrypt about 21 computers a day. If the company has 500 computers, it will take him a month--again, assuming things go right (they never do).

Note that he probably has other responsibilities, so he can't actually use all 7 hours just for encrypting stuff.
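The back-of-the-envelope arithmetic above can be sketched in a few lines of Python. The 20-minute and 7-working-hour figures come straight from the estimate; everything else is just ceiling division:

```python
# Back-of-the-envelope estimate of a manual, machine-by-machine
# encryption rollout. Figures come from the estimate in the text.

MINUTES_PER_MACHINE = 20       # install, key escrow, logging, walking around
WORK_MINUTES_PER_DAY = 7 * 60  # 8-hour day minus a one-hour break

def rollout_days(total_machines: int) -> int:
    """Whole working days needed to visit every machine once."""
    machines_per_day = WORK_MINUTES_PER_DAY // MINUTES_PER_MACHINE  # 21
    return -(-total_machines // machines_per_day)  # ceiling division

print(rollout_days(500))  # 24 working days -- roughly a month
```

And that 24-working-day figure is the best case: it evaporates the moment the IT guy gets pulled onto anything else.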

However, if an encryption software suite designed for mass deployment is used, the time required to protect all computers is cut down significantly. AlertBoot, for example, uses the internet to distribute encryption software. The end user of the computer can easily initialize the encryption process, meaning IT personnel don't have to literally visit each computer.

Instead, IT can monitor the encryption status of computers, and follow up with those particular employees who can't carry out the instructions. Plus, because everything is centralized, running reports to find out who's not in compliance; managing different encryption keys; and restricting access due to changes in employee status are easy to carry out.
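To make the "running reports to find out who's not in compliance" part concrete, here's a hypothetical sketch of that kind of centralized check. The field names and status values are invented for illustration; they're not taken from AlertBoot or any real product:

```python
# Hypothetical sketch of a centralized compliance report: given status
# records reported back by each endpoint, list the machines that are
# not yet fully encrypted so IT can follow up with those employees.
# All field names and statuses are invented for illustration.

def non_compliant(machines):
    """Return hostnames of machines whose disks are not fully encrypted."""
    return [m["host"] for m in machines if m["status"] != "encrypted"]

fleet = [
    {"host": "hr-laptop-01",  "status": "encrypted"},
    {"host": "fin-laptop-07", "status": "encrypting"},   # still in progress
    {"host": "it-laptop-03",  "status": "not_started"},
]

print(non_compliant(fleet))  # ['fin-laptop-07', 'it-laptop-03']
```

The point isn't the three-line function; it's that a report like this is only possible at all when status flows back to one place, instead of living in an IT guy's notebook.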

Four laptops belonging to Bord Gais in Ireland (I think that's Irish for "gas board") were stolen. One of them was not encrypted and held the bank account details of 75,000 customers. It looks like the lack of data protection, however, was an oversight, since there was already a program in place for installing data encryption software on the energy company's laptops.

Stolen Two Weeks Ago

The four laptops were stolen on June 5 from Bord Gais offices in Dublin, but customers were not alerted, according to The Irish Times, because the gardai (the police) had a lead on who may have stolen the computers, and it could have compromised the investigation. A public announcement was made today (one assumes the investigation is over), and customers will be contacted individually next week.

It sounds like most of the 75,000 affected were customers who had switched over from the Electricity Supply Board (Bord Gais has recently gone into the electricity business, and had a pretty popular campaign to push its services).

One thing to note is that Bord Gais was not targeted specifically: other buildings adjacent to their offices were burgled as well.

Inexcusable?

Labour Party spokeswoman Liz McManus has noted that "the loss of four laptops, including [one] containing the details of some 75,000 customers may be excusable, but the abject failure to encrypt the customer details on the computer, is not." [irishtimes.com]

I agree...but I also disagree. It's true that encryption software should have been installed on the one computer, at least. And, when you consider the type of information on that one computer, it should have been installed on the double.

But, it's not as if Bord Gais was not implementing encryption. It's been doing so since July of last year; this particular computer just happened to slip through the cracks.

I don't know how many computers Bord Gais owns, but if they've got more than one thousand of them, I can see how it took them a while to encrypt all of them.

Encrypting a computer is not that hard; do it for a group of computers, and it becomes a logistical nightmare. Not only must one encrypt the computer (which requires someone sitting in front of the computer), one must also keep track of all the different encryption keys. And, how is one supposed to tell whether a particular computer was not encrypted?

But, depending on which encryption software Bord Gais decided to go with (I can officially say that it's not AlertBoot), it could have taken them longer than a year to protect all of their corporate laptops. This tends to be the case when people choose an encryption package that was not really designed for corporate settings, where a great number of computers must be protected.

So, while not having that one laptop computer encrypted is inexcusable, it's also understandable: it was just too soon to have all loose ends tied up.