Government Against Full Disclosure of Vulnerabilities

The President's special advisor for cyberspace security tells security professionals that only software vendors and the
government should be notified of security flaws in software before a patch is released.

The government is urging "white hat" hackers to search for security flaws in software, but also wants them to only pass information
about those flaws on to software vendors and the government, not to the rest of the security community as is common practice today.

Speaking at the annual Black Hat Conference of Information Technology Professionals in Las Vegas Wednesday, Richard Clarke,
President Bush's special advisor for cyberspace security, said security professionals have an obligation to disclose security
vulnerabilities responsibly. They should first report a vulnerability to the vendor that makes the affected software, then tell
the government if the vendor doesn't take action.

Only after a patch for the vulnerability is distributed, he said, should others be notified about the vulnerability.

"It's irresponsible and sometimes extremely damaging to release information before the patch is out," Clarke said in his speech.

The debate over "full disclosure" has raged through the security community for years, with many vendors taking the stance that
information about security flaws should not be communicated to the world, and many security professionals arguing that such
disclosure is essential, especially when vendors are unresponsive and "black hat" hackers are capable of finding flaws on their own.

Elias Levy, co-founder and chief technology officer of SecurityFocus.com, which publishes BugTraq (the most widely used mailing
list in the security community for communicating about security vulnerabilities), has long argued in support of full disclosure.

"Back in 1993, the Internet was actually far less secure than it is today because there was little or no dissemination of
information to the public about how to keep malicious users or hackers from taking advantage of vulnerabilities," Levy said in
October 2000 after Network Computing named him one of "The 10 Most Important People of the Decade."

"Full disclosure was born out of an effort to release the details of security fixes so that members of the public could repair
problems on their own without waiting for vendors to respond. BugTraq played a major role in that effort, and thereby changed the
nature of security forever. Today we take for granted that what the hackers know the public should know, too."

In an interview for NPR's Morning Edition program Thursday, Clarke noted that the law allows security professionals to
discover flaws, but it assumes that reverse engineering code -- unless done by a computer security professional -- is done for
"nefarious" purposes.

But security professional Richard Smith, who runs ComputerBytesMan.com, said the issue
is a complicated one, and that bug-finders are only one part of it. "The bug-finders are part of this problem, the
vendors are part of the problem, and BugTraq is part of the problem," he told internetnews.com.

Smith advocates "limited disclosure," in which information about flaws would be distributed, but not exploit code that shows how to
take advantage of those flaws.

"In general, I'm against the idea of releasing exploit code," Smith said. "I don't think it really does anything. Writing exploit
code -- it just seems that it's helping out the bad guy."

He added, "When I find a problem, the first person I contact is the vendor."

He said Clarke's proposal would be a good one in a perfect world with perfect communication systems. That, however, is not always,
or even often, the case. It is frequently difficult to find the right person at a vendor to notify of a flaw, he said, and even a
vendor that has been notified will not necessarily respond.

"There clearly is, on the vendor side, a culture that fixing bugs is right out there with taking out the garbage," Smith said. In
other words, it's a task most programmers don't want to do; they would rather work on "fun" new code than fix old code, he
said.

Also, one solution does not fit all problems, Smith said. While he approves of giving a vendor time to develop a patch before
going public with a remotely exploitable flaw, he noted that there are often "workarounds" that can pull a vulnerability's fangs
before it is exploited and before a patch can be developed.