The SANS Institute and MITRE Corp. issued an update to the CWE/SANS Top 25 Programming Errors list last week, focusing on mitigation techniques that can be adopted into the security development lifecycle to help avoid multiple classes of security bugs. But one expert says that while the programming error list helps improve software development, actually getting companies to implement a more secure software development process is a different story. Developers can become overloaded, said Chris Wysopal, co-founder and chief technology officer of Veracode Inc. Still, many companies can take smaller steps to introduce security into the software-building process, said Wysopal, a secure software coder who helped create L0pht Heavy Industries, one of the first security research think tanks, which later became part of @stake, a consulting and research boutique acquired by Symantec Corp. in 2004.


If you were part of a small software development team, how would you apply the list of programming errors?
I think there are two ways to apply the top 25. When you are doing the testing, before you deliver or deploy the software, you can make sure you have a testing process that checks for all the defects or bugs in the top 25. Make sure you have software development tools, a manual process or other technologies that can test for those; automation is always better. On the developer side, both the architects designing the software and the developers building it should look at that Monster Mitigation list and make sure they are using those techniques, because they will prevent many different flaws from getting into the software. You have to be preventative, and you have to test to see if anything got in.

Does the issue of speed and getting a project completed on time make it difficult to apply both preventative techniques and testing?
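To make the prevent-and-test idea concrete, consider improper neutralization of SQL input (CWE-89), one of the entries on the Top 25. The sketch below is an illustration added here, not from the interview: a minimal Python example of the flaw and of the parameterized-query mitigation that the list's guidance recommends, using an in-memory SQLite database as a stand-in for a real application.

```python
import sqlite3

# In-memory database standing in for a real application's user store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name):
    # CWE-89: building SQL by string concatenation lets crafted input
    # rewrite the query, e.g. name = "x' OR '1'='1" returns every row.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # Mitigation: a parameterized query keeps input as data, never as SQL.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # the injection leaks both rows
print(find_user_safe(payload))    # matches nothing
```

A static analysis tool of the kind Wysopal describes flags the concatenated query automatically; the parameterized form is the preventative technique a developer applies up front.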
The biggest roadblock to secure software is actually just doing the work in the process. We have these lists, we have the information about what not to do, we have the mitigation list to prevent security errors, and there are good static and dynamic testing tools out there. The challenge has been trying not to disrupt the development process and not to make the job take longer or require more people. To me, the challenge is: how can we insert this information and testing into the development process in a lightweight way and still have the software done on time? That's really the challenge the industry is faced with right now.


Software development tools are getting better at detecting vulnerabilities, but as they get better, do they put too much pressure on development teams to correct the errors and cause problems?
Asking them to eliminate all 25 programming errors is going to be a challenge. Veracode Inc. doesn't have a black-and-white rating system for software. Only for the highest-assurance software, which we call level-five business-critical software, do we say you need to eliminate all 25. Certainly there's a need for lesser standards for software that is not business critical; it's not running the stock exchange or flying a plane. You can't have a one-size-fits-all list, so I recommend making the target increasingly easier to hit for software where the stakes are not life or limb.

A few years back, when Hewlett-Packard Co. acquired SPI Dynamics Inc., IBM Internet Security Systems acquired Watchfire Corp. and Microsoft began really pushing secure software development, did the environment change at all?
When I started at Veracode three and a half years ago, we told potential customers that they needed to do static analysis in their software development lifecycle; a lot of people didn't know what that was. They didn't understand why they needed to be thinking about security as they were building the software. Some companies got it, like Microsoft and some of the other large product companies, but 95% of the people building software looked at me like I had two heads. That has changed a lot. I think the OWASP Top 10, the top 25 and other ways of describing the problem that focus on talking to developers have made them aware that if they don't do something as part of the software development process, they're going to end up with these vulnerabilities in their products. I also think it has made customers aware, so they can ask developers to do something about it.


Has that awareness gone up the ladder to the business side of the house?
It has in certain verticals. We've seen it in financials and in government. Whether those organizations are building the software internally, outsourcing it or buying some sort of custom software, they're really aware that they need to put requirements on the development teams to focus on security, because if they don't, they know it just won't be [secure].

Is older legacy software another issue causing major problems, where development teams are forced to go back and take a look at the software and some of them don't have the source code?
That is a major problem. Michael Howard at Microsoft said the highest correlation to how many security bugs are in a piece of software is how old it is. The older the software, the more likely it is to be problematic, because it was built in a way people didn't understand, whether at the platform level, the language level or in the libraries used; no one was thinking about security. So the older the software gets, the worse it is. That is a big challenge, because the way we build software is to reuse old code. No one builds software from scratch; it's pretty rare unless it's for a completely new platform or something genuinely new. In most businesses there's a lot of reuse: shared libraries or just reused routines that have been around for a while. If you put in a process to secure your application, you can't ignore the fact that it's not just the new code. You can't put something in a developer's hands and say, whenever you write a line of code you're going to make sure it's secure, because that's only going to solve half the problem. You have to solve the problem of all the old code that's being reused. The other part of it is that there are whole packages of software running in organizations that haven't been touched in years. That is a ticking time bomb that companies are going to have to address eventually.
