FCC invests $10M in new network security but leaves backdoor unlocked

In August of 2011, while in the middle of upgrading its network security monitoring, the Federal Communications Commission discovered it had already been hacked. Over the next month, the commission's IT staff and outside contractors worked to identify the source of the breach, finding an unspecified number of PCs infected with backdoor malware.

After pulling the infected systems from the network, the FCC determined it needed to do something dramatic to fix the significant security holes in its internal networks that allowed the malware in. The organization began pulling together a $10 million "Enhanced Secured Network" project to accomplish that.

But things did not go well with ESN. In January, a little less than a year after the FCC presented its plan of action to the House and Senate's respective Appropriations Committees, a Government Accountability Office audit of the project, released publicly last week, found that the FCC essentially dumped that $10 million in a hole. The ESN effort failed to properly implement the planned fixes and left the newly deployed software and systems misconfigured. The commission didn't even take advantage of all the features of the malware protection it had selected, leaving its workstations still vulnerable to attack. In fact, the full extent of the problems is so bad that the GAO's complete findings have been restricted to limited distribution.

"As a result of these and other deficiencies, FCC faces an unnecessary risk that individuals could gain unauthorized access to its sensitive systems and information," the report concluded. And much of the work done to deploy the security system must be redone before the FCC's systems approach anything resembling the security goals set for the project.

The FCC's leadership acknowledges there's a lot left to be done. "The GAO's review of this project covers a period of time during which the Commission faced an unusual level of urgency, and we look forward to sharing our further progress with Congress and GAO at a later time, when these security initiatives are more fully deployed and developed," FCC Managing Director David Robbins wrote in response to the GAO's findings. But the commission also has some personnel issues to address—all of this is transpiring as the FCC looks for a new chief information officer. Ironically, the FCC's CIO Robert Naylor stepped down in January to take a new job; he is now the CIO of a cyber security firm that caters to the intelligence community.

Measure once, cut twice

The FCC is a small organization as government agencies go, with about 2,000 employees and a budget request for 2013 of $340 million. It relies heavily on outside help for its IT operations—and on more outside help to figure out how to buy that help. Acquisition for the ESN project was managed by Octo Consulting Group, a company led by three former Gartner executives and the former CIO of the Department of Agriculture's Forest Service. The company claims on its website to have "designed the FCC Cyber Security Strategy, and managed and executed three defining Cyber Security contracts." The consulting firm also provided contracting support for the FCC's CIO as all of the commission's major IT support contracts were set to expire in mid-2012.

Update: "Octo was responsible for providing 'acquisition support to the FCC' for the ESN contract (i.e. Assisting FCC Acquisition & Contracts personnel with developing the Statement of Work used to acquire the hardware and services for the $10M ESN contract you referenced)," Octo Consulting Group president Mehul Sanghani said in an email to Ars. "Once the contract was awarded, Octo was also tasked with providing project management support to supplement the FCC IT staff that was tasked with overseeing the work." The actual work on ESN was done by MicroTech and subcontractor Booz Allen Hamilton.

At the time of the discovery of the network intrusion in 2011, the FCC's network security was dated at best. The ESN project, which was originally projected to be completed this month, is intended to "enhance and augment FCC’s existing security controls through changes to the network architecture and by implementing, among other things, additional intrusion detection tools, network firewalls, and audit and monitoring tools," according to the GAO. The program was also supposed to provide the FCC with an ongoing "cyber threat analysis and mitigation program" that would do continuous risk assessment and reduction and control the damage from attacks that managed to breach the commission's security measures.

Contracts to do the work on ESN were awarded in April of 2012, just two months after plans for the project were submitted to Congress. By June, all of the security hardware and software licenses had been purchased. Implementation was in full swing.

But apparently the work was done so quickly that no one bothered to check it. While new security hardware and software was deployed, the GAO found that "FCC did not effectively implement or securely configure key security tools and devices to protect these users and its information against cyber attacks… Certain boundary protection controls were configured in a manner that limited the effectiveness of network monitoring controls."

The rush to get things in place also led to some other sloppy work. The GAO's auditors found that passwords to gain access to some of the network monitoring systems "were not always strongly encrypted." And while tools had been put in place to detect malware and block malicious network traffic, the tools had been left only partially configured.
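An aside on what "strongly encrypted" should mean for credentials: the standard practice is not reversible encryption at all, but a slow, salted one-way hash. A minimal sketch using Python's standard-library hashlib and hmac modules (the iteration count and salt size are illustrative assumptions, not anything from the GAO report):

```python
import hashlib
import hmac
import os

# Iteration count is illustrative; production guidance recommends higher.
ITERATIONS = 100_000

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted digest; store the (salt, digest) pair, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```

Because the salt is random per password, identical passwords produce different digests, and the constant-time comparison avoids leaking timing information during verification.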

The mishandling of security is being raised as an issue by some who do business with the FCC, especially because news of the original breach was never disclosed to the public—even as the FCC was formulating a proposed rule that would require people with commercial interests in broadcast stations to submit their Social Security numbers to an FCC database. As Harry Cole, a communications lawyer with the firm Fletcher, Heald, and Hildreth, put it in a post to the firm's blog, "it seems extraordinarily inappropriate for the Commission, knowing of those vulnerabilities, to then propose that a huge number of folks must provide to the FCC the crown jewels of their identity, their social security numbers."

29 Reader Comments

IME, working in IT, this FCC fiasco is the rule rather than the exception. Almost every org is vulnerable. It is very hard to guard against determined hackers. It is very hard not to give office users local admin rights, and prevent crappy software installation, which is the Achilles' heel of computer security. Also, it is very hard to fire IT workers and developers who are eager but naïve or apathetic, and in a perfect position to open unintentional entry points into the network, because such naiveté is difficult to quantify in HR terms.

Edit: hacking is basically asymmetrical warfare. The IT people in a given org need to all be competent, and they need to be perfect every time. To compromise what they've built, it only takes one good hacker, and they only need to get lucky once.

Best case, perhaps such an incident will inspire the FCC to step up its game on in-house networking knowledge and experience. (If they could get money for it...) For us, the public, it only illustrates how ill-equipped they are to manage such things. And yet, we DO expect them to manage it, even if they haven't the teeth or knowledge base for it. So, either 'Net affairs are misplaced with them or ... well, we don't need another agency or regulatory body, so they really need to staff up and transition.

I would guess that in this case, the failure was not so much with the FCC's IT staff, but with outside contractors who love nothing more than to over-bid and under-deliver when it comes to taxpayer-funded contracts.

Would be nice if we could claw back our tax money because of this major cluster fk! The GOV keeps making it easier and easier for China, Russia, black/white hackers, malware writers, etc. to access our networks and private information, whether it is civilian or GOV/DoD.

At the time of the discovery of the network intrusion in 2011, the FCC's network security was dated at best.

Kinda faded out near the end, but this line really makes me worry about the state of US computer infrastructure security. Why would Russia or China send soldiers to die when they can just shut off all our electrical/water/etc. remotely and laugh about it?

hacking is basically asymmetrical warfare. The IT people in a given org need to all be competent, and they need to be perfect every time. To compromise what they've built, it only takes one good hacker, and they only need to get lucky once.

So true, and for most of these organizations, they are hardened once, but vulnerability is a dynamic beast, where being perfectly hardened at one time may leave you wide open just a year down the road. This seems like a Sisyphean ordeal, where they struggle to secure themselves and spend millions of dollars, and then have to start all over again 2 years down the road.

IME, working in IT, this FCC fiasco is the rule rather than the exception. [...] It is very hard to guard against determined hackers. It is very hard not to give office users local admin rights, and prevent crappy software installation, which is the Achilles' heel of computer security. Also, it is very hard to fire IT workers and developers who are eager but naïve or apathetic, and in a perfect position to open unintentional entry points into the network, because such naiveté is difficult to quantify in HR terms.

You blamed everyone but management.... Are you sure you're Dilbert, and not someone with pointy hair?

I wish people would get it through their heads that IT security specifically is about being REACTIVE. You can only be proactive to a point, and then all you are doing is adding layers of complexity which only serve as attack vectors themselves at some future date.

There comes a point where sensitive material simply should NEVER become part of an information system - especially an interconnected information system. Until the powers that be realize this, they will unfortunately keep throwing piles of money into the fire.

IME, working in IT, this FCC fiasco is the rule rather than the exception. [...] It is very hard to guard against determined hackers. It is very hard not to give office users local admin rights, and prevent crappy software installation, which is the Achilles' heel of computer security. Also, it is very hard to fire IT workers and developers who are eager but naïve or apathetic, and in a perfect position to open unintentional entry points into the network, because such naiveté is difficult to quantify in HR terms.

You blamed everyone but management.... Are you sure you're Dilbert, and not someone with pointy hair?

CEO: hey Developer Bob, can you please verify we aren't open to SQL injection attacks?

That will never come out of the mouth of anyone in upper management, ever. They may, and should, ask IT to make security their priority. That's as high-level as upper management can or should get involved in securing the network and systems. The rest is up to IT, from the manager/CIO on down.

It is possible, and in fact very probable, to have a security mandate from management, and a secure system on paper, and still be wide open to attacks. Actually, this very news story is precisely about that.

The difference between being hacked and being jacked by run-of-the-mill Zeus-deployed malware is significant, IMHO. Being intentionally hacked with directed intent is a lot more intense than opening an infected PDF, or visiting an infected website and being jacked by WinAntivirus 20xx, etc. I just get the sense that "security analysts" are very aware of the difference, but it is in their commercial interest to lump everything into a "The Chinese are coming" type of fear campaign, and they are merely taking advantage of the fact that internal IT and managers have very little experience in mitigating such "attacks".

So what is the actual solution? It cannot be for thousands of small groups to badly reinvent the wheel. There need to be standard, best-of-breed approaches everyone can follow. "Did not configure correctly"? Really? Why is the software/hardware not good right out of the box?

Microsoft spent years getting their security house in order. We need the US to do the same. Some kind of open source project with hardware and software recommendations + how to set it up correctly?

IME, working in IT, this FCC fiasco is the rule rather than the exception. [...] It is very hard to guard against determined hackers. It is very hard not to give office users local admin rights, and prevent crappy software installation, which is the Achilles' heel of computer security. Also, it is very hard to fire IT workers and developers who are eager but naïve or apathetic, and in a perfect position to open unintentional entry points into the network, because such naiveté is difficult to quantify in HR terms.

You blamed everyone but management.... Are you sure you're Dilbert, and not someone with pointy hair?

Dilbert was a developer. IT was another horrible department. But yeah, he sounds like management.

IME, working in IT, this FCC fiasco is the rule rather than the exception. [...] It is very hard to guard against determined hackers. It is very hard not to give office users local admin rights, and prevent crappy software installation, which is the Achilles' heel of computer security. Also, it is very hard to fire IT workers and developers who are eager but naïve or apathetic, and in a perfect position to open unintentional entry points into the network, because such naiveté is difficult to quantify in HR terms.

You blamed everyone but management.... Are you sure you're Dilbert, and not someone with pointy hair?

Dilbert was a developer. IT was another horrible department. But yeah, he sounds like management.

Or 13 years working as an IT engineer, with 11 years posting on Ars about it, pretty much every day.

[...] Edit: hacking is basically asymmetrical warfare. The IT people in a given org need to all be competent, and they need to be perfect every time. To compromise what they've built, it only takes one good hacker, and they only need to get lucky once.

Sometimes it doesn't even take a good hacker, or even a hacker at all. If the weakest link in the chain is already fractured or broken, then it only takes luck, and sometimes not even that, to compromise a system. Who needs to kick down a door if it's already open?

I would guess that in this case, the failure was not so much with the FCC's IT staff, but with outside contractors who love nothing more than to over-bid and under-deliver when it comes to taxpayer-funded contracts.

One problem for many agencies is the tendency not to keep enough technical competency in-house. This is especially true with IT security, because if you are forced to rely heavily on outside contractors, you are at their mercy. Some will be excellent, some mediocre, and some absolutely worthless. If you lack sufficient proficiency, you will have a difficult time determining the contractors' qualifications.

I saw this many years ago with contractor laboratories versus having some in-house competency (a small in-house laboratory) to review the outside vendors.

hacking is basically asymmetrical warfare. The IT people in a given org need to all be competent, and they need to be perfect every time. To compromise what they've built, it only takes one good hacker, and they only need to get lucky once.

So true, and for most of these organizations, they are hardened once, but vulnerability is a dynamic beast, where being perfectly hardened at one time may leave you wide open just a year down the road. This seems like a Sisyphean ordeal, where they struggle to secure themselves and spend millions of dollars, and then have to start all over again 2 years down the road.

It is not clear what the actual problems at the FCC are.

But there are some basic measures that should be taken no matter who you are: maintain an updated, fully patched machine (OS and applications); install and maintain appropriate security applications for the type of machine (desktops vs. servers, and the type of server); and have an intelligent access policy that limits what each user can reach. Then monitor for security information appropriate to your situation and act accordingly. The exact scope varies depending on whether you are dealing with your home network or a larger organization.

My question is whether even these basics were being carried out. The article is unclear, but I would not be surprised if they were not, since the GAO implies the FCC was deficient in its security procedures.

Also, the FCC provides public access to its databases, and I have no idea whether they are hardened against SQL injection. I am fairly certain they use an RDBMS such as MS SQL Server for their website, because their data files are set up for a relational database.
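For what it's worth, the usual defense is parameterized queries, so user input is never spliced into the SQL text. A minimal sketch in Python with the standard-library sqlite3 module (the table and column names are made up for illustration; nothing here reflects the FCC's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE licensees (callsign TEXT, owner TEXT)")
conn.execute("INSERT INTO licensees VALUES ('WXYZ', 'Example Corp')")

user_input = "WXYZ' OR '1'='1"  # a classic injection attempt

# Vulnerable pattern: concatenating user input directly into the SQL string
# would let the OR clause match every row.
#   conn.execute("SELECT owner FROM licensees WHERE callsign = '" + user_input + "'")

# Safe pattern: the driver passes the value separately from the SQL text,
# so the injection attempt is treated as a literal (non-matching) callsign.
rows = conn.execute(
    "SELECT owner FROM licensees WHERE callsign = ?", (user_input,)
).fetchall()
print(rows)  # -> [] : the injection string matches nothing
```

The same placeholder discipline applies to any RDBMS driver; only the placeholder syntax (`?`, `%s`, `@name`) varies.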

IME, working in IT, this FCC fiasco is the rule rather than the exception. [...] It is very hard to guard against determined hackers. It is very hard not to give office users local admin rights, and prevent crappy software installation, which is the Achilles' heel of computer security. Also, it is very hard to fire IT workers and developers who are eager but naïve or apathetic, and in a perfect position to open unintentional entry points into the network, because such naiveté is difficult to quantify in HR terms.

You blamed everyone but management.... Are you sure you're Dilbert, and not someone with pointy hair?

CEO: hey Developer Bob, can you please verify we aren't open to SQL injection attacks?

That will never come out of the mouth of anyone in upper management, ever. They may, and should, ask IT to make security their priority. That's as high-level as upper management can or should get involved in securing the network and systems. The rest is up to IT, from the manager/CIO on down.

It is possible, and in fact very probable, to have a security mandate from management, and a secure system on paper, and still be wide open to attacks. Actually, this very news story is precisely about that.

The problem is whether upper management, once told it needs to budget money for security, will actually come through. Their actions or inaction say more about their priorities than the "official" policy ever will.

So what is the actual solution? It cannot be for thousands of small groups to badly reinvent the wheel. There need to be standard, best-of-breed approaches everyone can follow. "Did not configure correctly"? Really? Why is the software/hardware not good right out of the box?

Microsoft spent years getting their security house in order. We need the US to do the same. Some kind of open source project with hardware and software recommendations + how to set it up correctly?

Some of this can be done relatively easily, such as prompting (demanding) that all default passwords be changed. But some of the configuration is situational, and best practices can only be a guide. The problem is that in many areas there is no easy, one-size-fits-all method that will work; only intelligent application of best practices and design.

I see by this morning's WSJ that we can expect an executive order from the President tomorrow authorizing the Federal government to "help" critical infrastructure industries with cyber security. Wunnerful.

I wish people would get it through their heads that IT security specifically is about being REACTIVE. You can only be proactive to a point, and then all you are doing is adding layers of complexity which only serve as attack vectors themselves at some future date.

The point is that you make sure you go up to that point with being proactive. Make sure everything stays patched, make sure the necessary controls and procedures/policies are in place, etc. Implement a vulnerability management program that has some teeth, so if goals aren't met departments get penalized, and so on and so on.

Sean Gallagher / Sean is Ars Technica's IT Editor. A former Navy officer, systems administrator, and network systems integrator with 20 years of IT journalism experience, he lives and works in Baltimore, Maryland.