NIST to release SCAP FDCC scanner list

Related Links

GCN InSight eSeminar:

Government Computer News will present Matt Barrett, National Institute of Standards and Technology senior computer scientist and information security researcher, at 2 p.m. Tuesday, Feb. 5 in an eSeminar on the implementation and use of the Security Content Automation Protocol (SCAP). GCN contributing writer Jason Miller will moderate.

On Feb. 1 the National Institute of Standards and Technology will release a list of validated scanners that check for Federal Desktop Core Configuration compliance. The scanners all use the Security Content Automation Protocol (SCAP) to automatically scan desktop computers and return the results, said Peter Mell, NIST's SCAP validation program manager, at an FDCC workshop held yesterday in Gaithersburg, Md.

Last July, the Office of Management and Budget issued a clarification memo stating that agencies must monitor their desktop computers with SCAP tools "as they become available."

The scanners will ensure that the computers' configurations stay within the guidelines set by the FDCC, a set of OMB-mandated security configuration settings developed by NIST and the National Security Agency.

On Feb. 1, agencies will have to submit to OMB a report of all the desktop computers running the Microsoft Windows XP and Windows Vista operating systems, as well as the number of those that are FDCC-compliant, according to OMB FDCC lead Wendy Liberante, who gave an impromptu clarification at the event. On March 31, agencies must submit a report to NIST on the status of those Windows desktop computers.

As of January, however, no SCAP products have been validated by NIST, which just set up the validation program last summer.

SCAP is a framework "for automating and standardizing vulnerability management, measurement and policy compliance," Mell said. It predates FDCC and can be used to check computers against other mandates, such as the Federal Information Security Management Act.

Although the SCAP validation process ranges across 12 different functions, this upcoming set of validated tools will be scanners, Mell noted. He did not speculate how many products would be validated, though the final testing is being done on about five.

"NIST is not recommending these products. We are not mandating these products. What we are doing is validating that the products correctly implement SCAP," Mell said. The validation will examine whether all the settings in the FDCC are checked, as well as whether they are checked using the procedure that Microsoft and the government recommend.

"We have encoded in SCAP not just what to check for, but exactly the way the tool should go about checking those things," Mell said.

The only settings SCAP tools won't be able to verify are the handful of FDCC items that must be checked manually. Of the 729 settings that make up the FDCC, seven such manual checks exist for Windows XP and nine for Windows Vista, according to Drew Buttner of Mitre. Buttner said NIST is working with Microsoft to find ways to check these items without human intervention.
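Those figures imply that the overwhelming majority of FDCC settings can be scanned automatically. A quick back-of-the-envelope calculation (using only the numbers cited above; the per-setting breakdown is not in the article) shows the automated coverage per operating system:

```python
# Figures cited in the article: 729 FDCC settings total, of which
# 7 (Windows XP) and 9 (Windows Vista) require manual verification.
TOTAL_SETTINGS = 729
MANUAL_CHECKS = {"Windows XP": 7, "Windows Vista": 9}

for os_name, manual in MANUAL_CHECKS.items():
    automated = TOTAL_SETTINGS - manual
    pct = 100 * automated / TOTAL_SETTINGS
    print(f"{os_name}: {automated} of {TOTAL_SETTINGS} settings "
          f"automatable ({pct:.1f}%)")
```

In other words, SCAP scanners can cover roughly 99 percent of the FDCC checklist on either platform without human intervention.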

"For those we've submitted questions to the OS vendor, and we're waiting to get those answers back," Buttner said.

Although no products have yet been validated, a few vendors, including McAfee and SignaCert, have released their own SCAP-based FDCC scanners. The SCAP validation, however, will ensure standardized reporting and product interoperability.

"The beauty of SCAP is that you can throw away the tool you bought, buy another SCAP-validated tool, put the same content into that, and be assured that the content will process correctly in that new tool," Mell said. "You're no longer locked in to the same tool."

At the workshop, NSA technical director Paul Bartock talked about a pilot program that a SCAP development team held last December at Maxwell-Gunter Air Force Base outside Montgomery, Ala. The test involved three products scanning a set number of machines, some deliberately misconfigured.

In initial tests, "the tools reported about 90 percent of the same information," Bartock said. The differences in results were then used to make modifications to the SCAP protocol. "We knew the tools ingested the SCAP data correctly and performed the checks," he said.

About the Author

Joab Jackson is the senior technology editor for Government Computer News.