Consider what would have happened if this incident had occurred in the airline industry. We would expect a team of accident experts from the TSB to conduct a detailed, open enquiry into the matter and come to firm conclusions about what went wrong and why. They would then disseminate all new knowledge and best practice, to be mandated throughout the industry.

For a data breach, what we get is a webinar. “Candid” webinars with the CEOs of breached companies are simply not good enough. We need publicly funded and accountable organizations to pick through the burning rubble of yet another data breach and force the industry to improve.

Openness and learning lessons from the mistakes of others are the norm in aviation. Aviation regulations are also much more rigorous and are often written through exceptions. For example: Rule Zero: no one may fly. Rule One: ... except if you are a registered, compliant airline. Rule Two: ... except …

The data world needs a similar culture and framework. Perhaps we could start with Rule Zero: no organization may hold data. What do you think Rule One should be?

Monday, 27 July 2009

Should we panic? Researchers at Black Hat have just announced plans to package their hacking knowledge so that it becomes part of Metasploit, the generic exploit framework. This will certainly lower the skill level needed to attack databases, to the point where even a script kiddie should be able to use it.

The reality is that, yes, databases are very vulnerable, and that system owners, from database administrators to network security professionals, should be taking action. However, the deployment posture of databases changes their threat landscape, especially for external attack. Very few databases are directly connected to the outside world; they are nearly always accessed via applications. In that situation, the standard 'pre-packaged' exploits can have limited efficacy.

It is well known that the application layers are the worst offenders for holding exploitable bugs and security issues. Because each application has been built to do a specific job, each has its own unique set of security issues. With some access to the application, it does not take much effort for an attacker to get at data they should not. Web sites are simply applications that provide access to everyone, including external attackers. Databases are also threatened by insiders, often using their own login accounts. Sometimes data is accessed inappropriately out of curiosity rather than malice, whilst highly privileged users make accidental mistakes that result in corrupted data.

So back to my question at the top: should we panic about script kiddies attacking our databases? No, but neither should we believe our databases are effectively secured. We must take calm and consistent measures to pro-actively defend our databases from poorly written applications, nosy internal users, and the bumbling, error-prone, highly privileged administrator.

Friday, 24 July 2009

Today, news is emerging of a credit card breach at the Japanese arm of global insurer Alico, with the credit card data of approximately 110,000 customers affected. Of those, more than 1,000 customers have seen fraudulent charges on their credit cards, and it was the credit card companies that alerted Alico to the alleged theft.

An Alico spokesperson said that the company has yet to determine how the data could have been leaked. That statement, and the fact that the credit card companies alerted the company to the breach, show how difficult it can be to determine how a breach occurred, even when you know that one did.

This breach brings to mind RBS WorldPay and Heartland, where customers saw fraudulent charges on their credit card bills before the companies realized they had been breached. As Alico looks for the source of its breach, we are reminded that in this threat environment personal data is constantly under attack. The link to criminal elements shows that these breaches are carried out with the express intent of grabbing personal financial data for fraudulent use. Alico is now left playing “catch up”, unable to stop additional damage to its customers because their data has already been compromised. We hope the company, and the industry as a whole, take this as a lesson in the importance of knowing the location and status of their data at all times, because that data will always be an attractive target.

As they say, "an ounce of prevention is worth a pound of cure" -- now is the time to apply preventative measures to protect data.

Wednesday, 22 July 2009

Today, three units of HSBC were fined £3 million for losing customer data. Two of the breaches affected more than 180,000 people and, although no customer has reported any loss from these incidents, the Financial Services Authority is sending a strong message to all UK financial services firms. At issue is how careful HSBC was with the customer data, rather than the outcome of these breaches. That point should resonate with all financial services firms, and with anyone else that handles customer data: no HSBC customer suffered a loss, yet the FSA has still taken the company to task for being careless and for failing its customers.

For all financial services firms especially, this ruling should be given strong consideration. If these types of breaches can occur at the world’s largest bank, ranked by Forbes as the sixth-largest business in the world, then they can happen anywhere. Details from the breaches describe an alleged environment in which poor employee training in preventing and dealing with identity theft, as well as lax encryption standards, prevailed. These controls are “Identity Theft 101,” meaning that even the smallest, most regional bank, mortgage company, or insurance firm would be expected to have them in place. There is some good news for HSBC in this: by cooperating, they have seen their fine reduced from a potential £4.5 million, savings that can be used for better protection.

The bad news for all of us is that the fine was really insignificant in the grand scheme of things. A few million pounds is still only loose change for these organizations, even in these times.

Thursday, 16 July 2009

This week, the after-effects of Twitter’s May breach became known, with confidential employee and company information being acquired and sent to TechCrunch. At a macro level, the incident shows the potentially embarrassing data that exists in every company, data that no executive, shareholder, customer, employee or partner would want to see revealed.

For Twitter, these included the names of senior executives who interviewed for positions at the company while currently employed elsewhere, earnings projections, new product information, and floor plans.

What is interesting to those of us in the security industry is the gap between what the breach initially appears to be (a security failure in the cloud) and what it really is: an exploit of the password recovery system and other features of Google Apps.

The cloud is no more or less secure than any other environment, and what happened at Twitter could easily have occurred in a traditional implementation. This breach indicates, at the very least, that traditional password protection practices were not being followed. That is not surprising given the stress on current IT budgets, which results in security updates and practices being delayed. For every organization that holds information that could be deemed embarrassing if made public (so, everyone), Twitter serves as a reminder that open does not mean secure, and that protection needs to come from applying the appropriate care at the level of the data itself.
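The post does not say which password practices Twitter skipped, but as one illustration of the “traditional password protection practices” mentioned above, here is a minimal sketch of salted, iterated password hashing using only the Python standard library. The function names and iteration count are my own choices, not anything from the breach reports.

```python
import hashlib
import secrets

def hash_password(password, iterations=200_000):
    """Derive a salted hash of the password with PBKDF2-HMAC-SHA256.

    Storing (salt, digest) instead of the plain password means a stolen
    credential table cannot be read directly, and the per-user salt
    defeats precomputed rainbow tables.
    """
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=200_000):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return secrets.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("password123", salt, digest)
```

The point is not this particular algorithm but the principle the post argues for: care applied at the level of the data itself, so the credentials remain protected even after the perimeter fails.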

Thursday, 9 July 2009

Our good friends at the Ponemon Institute have issued some important and sobering findings, indicating that 70 percent of UK organizations have experienced at least one data breach in the past year. Equally alarming is the fact that fewer than half of these breaches were made public, as there was no legal or regulatory requirement for disclosure.

The report finds a direct correlation between an organization’s likelihood to experience data loss and its lack of a consistent, organizational-wide strategy and enforcement of data protection and encryption policies.

Many of the organizations surveyed indicated that data protection was among their top priorities, but what caused many to fall victim to data loss was the lack of a unified, consistent approach to protecting data across every access point and device in the organization. That fragmented posture should fade quickly over the coming months. Legislation, increasing awareness, and the requirements of standards like PCI-DSS will push companies that have not yet taken a unified approach to consider one, strongly.

If outside pressures are not enough, then financial ones will be. Recent research by Ponemon found that the average UK data breach costs a total of 1.7 million pounds Sterling; the equivalent of 60 pounds Sterling for every record compromised.
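Putting the two Ponemon figures together gives a rough sense of scale; the back-of-envelope arithmetic below is mine, not the report’s, and assumes both figures describe the same average breach.

```python
# Figures quoted from the Ponemon research: average total cost of a UK
# data breach, and average cost per compromised record.
total_cost_gbp = 1_700_000
cost_per_record_gbp = 60

# Implied average number of records compromised per breach.
implied_records = total_cost_gbp / cost_per_record_gbp
print(f"Implied average breach size: {implied_records:,.0f} records")
```

In other words, the quoted averages imply a typical breach of roughly 28,000 records, which is why the per-record figure adds up so quickly.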

These numbers are too costly for us all. Personally, I don’t care about the cost – I simply want my data held safely!

Wednesday, 8 July 2009

The security world, government agencies and many others are abuzz with reports that North Korea is behind a series of powerful cyber attacks targeting government agencies in the US and South Korea, as well as a host of other organizations, including Nasdaq and the Washington Post.

The timing would point to North Korea, which has been increasingly aggressive in its dealings with the United States over the past few weeks. However, it is doubtful that North Korea has the ability to launch this devastating an attack on such a large scale.

The question, then, is who would plot and execute this type of strategic hit at two major world governments, as well as some very well-known companies? The answer might be found in a series of cyber attacks that US and UK government organizations endured in the middle part of this decade.

At the time, both countries were complacent in their security measures, without realizing that their actions were being monitored by entities that launched extremely targeted attacks to penetrate their systems. It took two to three years before the details and those purportedly behind the attacks were revealed outside security circles.

Today’s situation is analogous in that it will take at least that long before we realize the “who” and “why” behind these attacks. Until then, the news reminds us how important and continuous our efforts to protect government and private data must be.