S.C. Security Blunders Show Why States Get Hacked

Governor blames data breach on Russian hackers and the IRS, but states' by-the-book IT ethos shows rules and regulations are the real culprit.

This holiday season, millions of people who live or work in South Carolina have a special treat in store: the potential for their identities and savings to get misused.

That's thanks to the state's Department of Revenue having stored 3.3 million bank account numbers, as well as 3.8 million tax returns containing Social Security numbers for 1.9 million children and other dependents, in an unencrypted format. After a single state employee clicked on a malicious email link, an attacker -- unnamed Russian hackers have been blamed -- was able to obtain copies of those records. The state has now urged anyone who has filed a tax return in South Carolina since 1998 to contact law enforcement officials.

How could this happen? After attackers owned South Carolina's revenue systems, they were able to conduct weeks of reconnaissance undetected. That's because the Department of Revenue had opted out of the state's optional intrusion-detection-monitoring program. Thankfully, the U.S. Secret Service spotted some identity theft cases and seems to have traced the stolen information back to state tax returns.

Why wasn't South Carolina better prepared? The answer is simple: the state's leadership, from the governor on down, failed to take information security seriously or to correctly gauge the financial risk involved. As a result, taxpayers will pay extra to clean up the mess. Beyond the $800,000 that the state will spend -- and should have already spent -- to improve its information security systems, $500,000 will go to the data breach investigation, $740,000 to notify consumers and businesses, $250,000 for legal and PR help, and $12 million for identity theft monitoring services. Of course, such services only help to spot ID theft; they don't prevent or fully resolve it. Taxpayers get to do that themselves.

Had the state done the right thing, it could have avoided the data breach and saved more than $13 million, which is a tad ironic, given that Gov. Nikki Haley holds an accounting degree from Clemson University.

Haley said the breach isn't the state's fault. Instead, she blamed the IRS, noting that while the state complied with IRS regulations, the IRS doesn't require stored Social Security numbers to be encrypted. Helpfully, the governor has now written a letter to the IRS recommending that it begin requiring that Social Security numbers get encrypted.

According to The New York Times, Haley has also claimed that a hacker somehow managed to breach the state's state-of-the-art security defenses. Then again, maybe not: In a press conference, she later admitted that the state's revenue systems not only lacked encryption and strong access controls, but also are based on a lot of 1970s-era equipment.

"This is a new era in time where you can't work with 1970s equipment," Haley said. "You can't go with compliance standards of the federal government."

Shocking news, and evidently the governor still doesn't understand information security -- or the folly of "security by compliance," that is, mistaking regulatory compliance for adequate information security.

South Carolina isn't the only state treating information security as an afterthought. Earlier this year, hackers breached a Utah state server, stealing the Social Security numbers of 280,000 people and health information for 500,000 state residents, none of which was encrypted.

Last year, Texas discovered that 3.5 million records, including people's names, mailing addresses, Social Security numbers, and in some cases dates of birth and driver's license numbers, had been exposed. None of that personally identifiable information (PII), which was available for a year on a public-facing website, had been encrypted.

How could the South Carolina breach happen in the wake of the Utah and Texas breaches? The answer seems to be weak rules and regulations. According to a former Texas state IT employee, "a major flaw that occurred in Texas with that data leak incident was the complete and utter failure of three agencies to properly secure their confidential PII per best practice, common sense and agreed-to methodology."

That's because the state's agencies used only information security practices and procedures explicitly required by chapter 202 of the Texas administrative code (aka TAC 202). "If their government code states they don't have to, or makes something an option, state agencies just don't and won't do it -- it costs time and $$$ tax dollars," said the former Texas state employee, via email.

Never mind if encrypting the data would require five minutes and common-sense thinking. "Yes, encrypting payloads with Winzip, 7Zip or GPG/PGP is way easy to do," he said, but without a rule or regulation, state agencies won't bother. On a related note, the state employee said he left for a private-sector job after his efforts to get the Texas Department of Information Resources to update TAC 202 to require encryption for data, whether it's at rest or in transit, were rebuffed.

Gov. Haley, by blaming the IRS for not requiring her state to encrypt Social Security numbers, has hit upon a solution to this problem: Put stronger rules and regulations in place. Require anyone who transmits or stores PII to encrypt the data, safeguard it with access controls and use intrusion monitoring to detect unfolding hack attacks. Just in case that move doesn't make government agencies' responsibilities clear, add extra penalties for any agency that suffers a data breach, regardless of the security controls it has in place. Maybe that step will finally stop the buck-passing and focus state governors' thinking on information security.

With today's sophisticated exploits, the only information a hacker needs to breach a system is the consumer's login credentials. Apparently what happened within the Department of Revenue in S.C. is that the hacker sent employees emails with an attachment that, if opened, would allow malware carrying a Trojan to enter that consumer's client machine. Note: Most of today's malware is not detected by antivirus software. Once the malware is on the consumer's client, it waits for a specific login page to be called up; when it is, the Trojan puts the consumer on a fake login page and, using a real-time keylogger, steals everything the consumer enters. It then immediately replays the stolen credentials on the real login page and gains access. So consumers don't have to give out any information; all they need to do is click on the email attachment and it is "Game Over." What the IT folks must put in place is an authentication solution that requires a credential the consumer has but does not enter. An obvious example we are all familiar with is a smartcard, but smartcards and USB tokens are expensive and cumbersome to use and manage. Software solutions designed after a smartcard, such as SoundPass, also provide a dynamic virtual token that automatically sends a required credential the consumer neither knows nor enters, and are affordable and user friendly.
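The idea of a credential the user possesses but never types can be illustrated with a standard time-based one-time password (TOTP, RFC 6238), which a hardware or software token computes from a shared secret and the current time. This is a generic sketch using only Python's standard library, not the SoundPass product mentioned above; the secret below is the RFC 6238 test vector, not a real key.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since the Unix epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset taken
    # from the low nibble of the last digest byte.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890",
# T=59 seconds, 8 digits -> "94287082".
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))
```

Because the code is derived from the secret and the clock, a keylogger that captures one value gets a credential that expires within seconds, which is the property the commenter is describing.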

What kind of recourse is there for those who had their identities stolen due to lack of foresight and planning by the government? It's unacceptable that there are guidelines in place that private industry must abide by that government agencies ignore.

The real problem is there is no tangible penalty for allowing these breaches to occur. The state's CIO should spend a few years in prison; this would encourage other states' CIOs to actually focus on absolutely preventing such things, not just looking to see if they are "in compliance with regs." I don't know if the state's tech people were too stupid to know they were exposed, if they just didn't care, or if they figured "well, it's not in the budget so I'm off the hook." Facing criminal prison time might just get their attention.

I don't see that the general computing population will ever be "tech savvy" enough to deal with hacking. What this means is that "type approved" hardware and software will have to be used for commercial computing. Liability for defects can then be shifted from the user to where it belongs: to those who have the ability to deal with the problem.

Had the state of SC been conducting proper monthly SAR reviews, with random spot-audits to ensure the process was being done correctly, this would have never happened.

A bit over a decade ago, I helped Sun Microsystems to put a SAR program in place, rolling out a website and business process for the entire corporation to use to better ensure ongoing improvement to corporate security.

A "Security Adequacy Review" is a monthly assessment done by all departments. It reports on the current state of, and the improvements made to, every single computing resource in each department. At the time, it was state-of-the-art. However, it wasn't rocket science - it was relatively straightforward. It was also mandatory.

The exec at Sun who I implemented this for was the VP of Corporate Security. His job depended on doing this right.

I know it's harder in government to effect change than it is in a company... but if you want this fixed properly, find the appropriate person within the state government, make their job depend on doing it right, and it'll start being done right.

In the business world, we have long observed that tech-savvy CEOs give their companies a competitive advantage. There's a parallel in government -- mayors, governors, congressmen, and presidents who grasp the value and importance of IT and cybersecurity give their constituents an advantage. In New York, for example, Bloomberg is supporting public Wi-Fi and mobile app competitions, while trying to raise the city's profile as a global center of tech innovation. Going forward, successful political leaders and candidates will be those who promote well-conceived tech initiatives and policies. John Foley, InformationWeek Government

These breaches should never occur in the first place; this kind of data should never be accessible that close to the Internet. Human nature says you can't count on best practices, which few choose to implement anyway. The technology to prevent this has existed for a long time. These states and companies should set up their online systems to pass only through intermediary servers that block direct access to back-end data and limit the information passed back and forth, with encryption. As for Internet access from inside the agency's network, the systems holding this data should not have such access: you keep accounting systems separate from the ones that have Internet access. It's easy to fix; it just takes good engineering, and managers with enough common sense to make sure it's done right, instead of the usual casual approach we see so often.
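The "limit the info passed back and forth" idea above can be sketched as a field allowlist enforced at the intermediary tier: the internet-facing server can request records, but the middle tier strips anything not explicitly approved, so SSNs and bank accounts never leave the back-end network even if the web tier is compromised. This is a minimal illustration with invented field names, not any state's actual architecture.

```python
# Hypothetical middle-tier filter. Only fields on the allowlist are ever
# forwarded to the internet-facing tier; everything else stays back-end.
ALLOWED_FIELDS = {"name", "filing_status", "refund_amount"}

def filter_record(record):
    """Return a copy of a back-end record containing only allowlisted fields."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

backend_record = {
    "name": "J. Taxpayer",
    "filing_status": "single",
    "refund_amount": 120.00,
    "ssn": "123-45-6789",          # never forwarded to the web tier
    "bank_account": "000111222",   # never forwarded to the web tier
}

print(filter_record(backend_record))
```

The design choice here is deny-by-default: a new sensitive column added to the database is automatically withheld until someone deliberately adds it to the allowlist, the opposite of the opt-in security posture the article criticizes.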

You know, if there were a big penalty for these breaches instead of just a PR mess, the people in charge would care about these things. But there isn't. This problem will never go away, no matter how big a deal is made about these breaches.

Every time I hear of a breach it just sickens me. It's pure stupidity to allow these to happen in the first place. Even academics get hit; you would think they would have the most secure systems and the newest innovations in protecting systems. But they don't.

The best protection for the consumer is: don't give out the info. And when you do, make the other party aware, in writing, that they are responsible for the data collected. I do this anytime I give out financial data, with an attachment stating that the info is not to be shared without permission and that the receiving party is held responsible for any breaches of trust. It sometimes makes a difference. People think twice when giving out data, or even approving things, to make sure the party they are dealing with is the right party.
