My Gaithersburg, Md., listening post has picked up strong signals that there’s a new mandatory cybersecurity standard in town. The Framework for Improving Critical Infrastructure Cybersecurity, developed in 2014 by the National Institute of Standards and Technology in close cooperation with industry, has always been a voluntary guide for organizations of all types and sizes to apply common best practices in risk management.

But a low-key change has taken place that sources say has shifted the NIST CSF from a purely voluntary practice to a mandatory standard for Federal agencies. For the first time, the government has linked the Federal Information Security Modernization Act metrics to the CSF. In fact, the fiscal 2016 FISMA metrics leverage the NIST CSF as a standard for managing and reducing risk, and are organized around the CSF’s five major functions of identify, protect, detect, respond, and recover.

“Since they tied the FISMA reporting metrics in 2016 to the Cybersecurity Framework, guess what? It’s now a de facto standard,” a source close to the development of the CSF since the beginning told The Situation Report. “You have to use it. It’s voluntary, but there’s no way of getting around it and still being compliant with FISMA.”

What’s In a Name?

Well, if you’re the focal point of the Federal government’s effort to get all of government and the private sector speaking the same language when it comes to cybersecurity, then names matter a lot. Officials at NIST are seriously considering dropping “Critical Infrastructure” from the title of the Framework for Improving Critical Infrastructure Cybersecurity in an effort to boost adoption across a broader swath of industry. The framework is credited with significantly raising the security bar across industries, but some in the private sector apparently have questioned whether a critical infrastructure guide applies to them.

The Open Source Battle at DHS

The Department of Homeland Security’s chief information officer, Luke McCormack, was put in a tough position recently when he had to publicly flip-flop on the department’s official position on the use of open source software.

McCormack was forced to post to GitHub a strong formal endorsement of a draft White House policy for publishing Federal source code in the open. “We believe moving towards Government-wide reuse of custom-developed code and releasing Federally-funded custom code as open source software has significant financial, technical, and cybersecurity benefits and will better enable DHS to meet our mission of securing the nation from the many threats we face,” McCormack wrote, reversing the concerns expressed a week earlier by members of his own team.

Those DHS IT officials had called out the misguided geeks at the White House, noting that most security companies do not publish their source code because doing so would allow hackers to develop highly targeted attacks.

“Government-specific examples: citizenship anti-fraud rules that are coded into software, identification of special codes used to flag law enforcement actions, APT threat indicator scripts, Mafia having a copy of all FBI system code, terrorist with access to air traffic control software, etc. How will this be prevented?” a DHS IT official stated.

And what about protecting taxpayers’ interest in government-developed software? That’s right, some at DHS would like to know how the White House will prevent commercial entities from using taxpayer-funded software components in commercial systems that companies then sell back to the government.

McCormack may have caved to White House pressure, but The Situation Report has picked up on a rear-guard action to stop the White House open source push, an effort that puts the nation’s security at risk through a deliberate decision to ignore the security issues surrounding software provenance and the threat of inheriting version-specific vulnerabilities in open source code.

“Given that national security systems are exempted from this policy, and virtually all DHS systems are deemed mission/business essential, any release of code is potentially exploitable,” DHS IT officials wrote. “To avoid having our in-house developed code becoming open source, we will have to either get the DHS CIO approval to the exceptions or declare our in-house developed systems to be National Security Systems and take them off the Sensitive But Unclassified (SBU) blue line and put them on classified networks, thus increasing our costs of operation and support.”

Digital Service Rebellion

My forward observers report signs of a massive rebellion against the U.S. Digital Service by the career Federal IT employees who are being blacklisted, maligned, and generally pushed aside for not being “from the Valley.” MeriTalk plans to bring you an exclusive look at this insurgency—penned by a current Federal IT insider who believes the Obama administration has gone too far in its attempt to import change from Silicon Valley. Stay tuned.

When it comes to resilience, the government is setting the bar too low. The Cybersecurity Framework and its five major functions of identify, protect, detect, respond, and recover ignore anticipation and avoidance.
The value proposition for resilience rests on the ability to anticipate, avoid, withstand, minimize, and recover from the effects of adversity, whether natural or man-made, under all circumstances of use.
Instead, the government is settling for the operations of withstanding, minimizing, and recovering. Why is this a problem?
The most consequential threat to resilience lies in the cascading and propagating triggers hidden in the complexity of critical-sector interactions and dependencies inherent in the system of systems that makes up the critical infrastructure. Without anticipation and avoidance, cascade triggers are left unattended.

Maybe FISMA needs a congressional revision to ensure the credibility of the trust relationships underlying basic data processing, which is now moving into cloud computing—by the commenter’s account an entirely open architecture—to create transparency in global business integration within cloud-to-cloud (C2C) models.
Turning C2C into G2G or C2G is a hard challenge. Unfortunately, in today’s IT world, cybersecurity is indefensible due to the rise of new “ghost computing.” Ironically, ghost computing can serve as a nominally legal business practice for many dubious information service providers.
By the way, NIST sets the pathway for FedRAMP to implement Federal cloud computing, but the IT industry won’t want to see any new cloud computing regulations alter the C2C model.