
Thursday, February 18, 2010

Infrastructure vs. Application Security Spending

A recent study published by 7Safe, the UK Security Breach Investigations Report, analyzed 62 cybercrime breach investigations and states that in “86% of all attacks, a weakness in a web interface was exploited” (vs. 14% infrastructure) and that the attackers were predominantly external (80%). These results are largely consistent with the US-based Verizon Data Breach Incident Report (2008), which tracks over 500 cases. Combine this knowledge with the targeted Web-related attacks against Google, Adobe, Yahoo, the US House of Representatives, TechCrunch, Twitter, Heartland Payment Systems, bank after bank, university after university, country after country -- the story is the same. It’s a Web security world.

It has been said before, and it’s worth repeating: adding more firewalls, SSL, and the same ol’ anti-malware products is not going to solve this problem!

The reason that Web security problems persist is not a lack of knowledgeable people (though we could use more), perfected security tools (they could be much better), or effective software development processes (still maturing). A fundamental reason is something I and others have been harping on for a long while now: organizations spend their IT security dollars protecting themselves from yesterday’s attacks, at the network/infrastructure layer, while overlooking today’s real threats. Furthermore, these dollars are typically spent counter to how businesses invest their resources in technology. To illustrate this point I’m going to borrow inspiration from Gunnar Peterson (Software Security Architect & CTO at Arctec Group).

Now, let’s take a look at five of the top Web-based companies, which make all their money online, and by extension whose core technology value is rooted in Web code. In 2009 their revenues were: Google $23.6B, Yahoo $6.5B, eBay $8.7B, Amazon $24.5B, Salesforce.com $1.1B -- $64.4B in total. Keep in mind that the 2009 eCommerce market is said to be about $130B. To protect these Web-enabled systems, let’s use Gary McGraw’s (CTO of Cigital) 2007 software security revenue number of $500 million. He takes into account the bulk of white and black box testing tools, professional services consultancies, and web application firewall vendors. Since Gary hasn’t updated his numbers yet, let’s adjust up to $750M to reflect market growth between then and 2009.

So in effect ~$750M is being spent to protect $64.4B in eCommerce revenue.
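The back-of-the-envelope math is easy to check. A quick sketch using the revenue figures quoted above (in billions) and the adjusted $750M spend estimate:

```python
# Back-of-the-envelope check of the spend-to-revenue ratio above.
# 2009 revenues in billions of USD, as quoted in the post.
revenues_b = {
    "Google": 23.6,
    "Yahoo": 6.5,
    "eBay": 8.7,
    "Amazon": 24.5,
    "Salesforce.com": 1.1,
}

total_revenue_b = sum(revenues_b.values())  # ~64.4
security_spend_b = 0.75  # $750M: McGraw's 2007 $500M, adjusted up for growth

ratio = security_spend_b / total_revenue_b

print(f"Combined revenue: ${total_revenue_b:.1f}B")
print(f"Software security spend: ${security_spend_b * 1000:.0f}M")
print(f"Spend as a share of revenue: {ratio:.2%}")  # about 1.16%
```

In other words, just over one cent of software security spend for every eCommerce dollar it protects.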

This is further supported by analyst findings. According to Gartner, 90 percent of IT security spending goes to perimeter security such as firewalls. And to my mind, even that ~1.2% figure is way aggressive. The percentage is likely much lower, because the $750M is actually spread out among the financial, healthcare, insurance, education, energy, transportation, and government verticals. Perhaps this is also why so many application security professionals wage holy wars over which solution is the best, worst, most important, or should be purchased first.

For the last ten years, I’ve directly experienced application security professionals fighting amongst themselves for scraps off the Big-Security-Vendor’s table. Vulnerability scanners vs WAFs, black vs white box, pen-test vs source code review, certifications vs real world experience, developer training vs secure frameworks, and so on. FAIL! All these solutions are necessary and more! The only way I see to truly improve application security is for organizations to do one (or both) of the following:

1. Reallocate a portion of current infrastructure security spend to application security.

2. Grow overall IT security spending and increase application security to more than a rounding error.

Again, tally up each layer. If you are like most businesses, you will find that you spend most on Network, then Host, then Applications, then Data. Congratulations, Information Security: you are diametrically opposed to the business!
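To make the mismatch concrete, here is a minimal sketch contrasting where the cited reports say attacks land with where Gartner says the dollars go. The two splits come straight from the figures quoted in this post (86/14 from 7Safe, 90/10 from Gartner); treat them as rough:

```python
# Where attacks land (7Safe report, cited above) vs. where IT security
# dollars go (Gartner figure, cited above). Rough, two-bucket split.
attack_share = {"application": 0.86, "infrastructure": 0.14}
spend_share = {"application": 0.10, "infrastructure": 0.90}

for layer in ("application", "infrastructure"):
    print(f"{layer:>14}: {attack_share[layer]:.0%} of attacks, "
          f"{spend_share[layer]:.0%} of spend")
```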

If security spending can be justified on areas where attacks no longer occur, then perhaps we can justify more time, money, and effort on application security in the areas where the attacks are being waged.

9 comments:

Great post Jeremiah - there is far too little reasoned analysis of security spending and resource allocation. The data you and others have presented makes clear that security dollars are not being aligned with threats.

Of course a good chunk of security spending is motivated by compliance and contractual requirements versus actual risk reduction. And the regulations and standards are in turn completely focused on network security rather than application security issues.

The much more technical PCI Standard explicitly prioritizes network security over application security in the PCI Prioritized Approach. And most of the security language you see in RFPs is likewise network security focused rather than app sec focused.

I think that part of the solution lies in awareness raising and representation at the early stages of legislative and industry discussions around standards. Of course that is easier said than done!

Thanks Boaz. Though I’m not sure compliance is the original culprit here. More so, "best practices" and hard-to-break infosec habits are the real enemies. These end up being rolled into compliance mandates and laws because no risk thinking goes into them. Their crafters just borrow and go.

Some excellent thoughts in this post and clearly these ideas need to be explored more. Having said that, experience tells me a couple of things that impact the spending you described.

1. App folks are under so much pressure to get things done that they tend not to worry about the security aspects of their code and hope that someone else will deal with it later. This generally falls to the network team.

2. Applications are not the target but the transport. They provide a way to get to the real target - the data. While app security should be improved, better internal security and database security would be a huge help.

3. For many organizations, traditional application security is not scalable. That is why the appetite for network controls is growing. While we may argue good, better, best among ourselves, the folks holding the purse-strings are looking for good enough.

4. Therefore, security spending lags. This is a business where we are trying to quantify phantom risks and prove negatives, which is really hard. In a world where executives think even having a password is too much of a pain, it’s hard to explain the need for protection against ephemeral attacks that are too complex for non-technical folks to understand.

I've had CIOs tell me that they didn't want to know what was going on in their networks because then they'd have to fix it! Ignorance can be bliss... for a time...

1) I think that particular problem is a general lack of managerial / business application security mandate. No awareness. If the organization doesn’t value or expect it, then I have a hard time blaming developers for not going out of their way to "secure their code."

2) Half agree. Put in all the controls you want, but if I can hack the app, all the crypto, passwords, and access control won’t help all that much -- because the app has access and I am the app.

3) Can’t say that I disagree. In fact, I’ve written exactly that before: http://jeremiahgrossman.blogspot.com/2009/08/web-security-is-about-scalability.html

4) That's exactly why we need to raise the awareness of application security and tie it to today's leading threats. They only used to be emerging.

I am not sure you need to reach up to 10% for web applications. They are fundamentally different from normal applications. At the very least, web applications can roll out fixes much more quickly than traditional applications can. I do agree that web application security deserves a little more spending, but the numbers in your post (10 vs. 1, for example) are not justified -- it’s not an apples-to-apples comparison.

Also, I would hope and pray that we learned something from the security disaster that was the 'deploy first, fix later' application era of the 90s. Thus, decreasing spending on security as we move to newer applications actually makes sense.

"The reason that Web security problems persist is not a lack of knowledgeable people (though we could use more), perfected security tools (they could be much better), or effective software development processes (still maturing)."

One reason for the difference between spending on infrastructure vs. application security is the fact that sys admins and IT personnel have the knowledge and training to put a system in place, and they can pretty easily justify the need for a firewall, etc. In addition, infrastructure security controls appear in a lot of network diagrams and design documents, so execs are familiar with them. How many design documents or requirements documents have any application security controls in them?

My point is: a main reason for the difference in spending is the security knowledge of the average sys admin / IT engineer (whoever is responsible for the infrastructure) vs. the average knowledge of the software developer / technical project manager / product manager (whoever is responsible for the application).

What happens if we roll the clock back 10 years, or even 7 years on this? Then what do the attacks look like?

I think I can make a fairly compelling argument that the attacks have moved to the application layer *because* of improvements at the OS and Network layers. Remote OS vulns are way down. People with wide open SQL-servers to the internet are way down.

This means that the attacks have moved. Do people really believe that if we turned off network firewalls and stopped patching desktops and servers that they wouldn't get targeted for attacks, and we could redeploy all of that money to our SDL programs? If so, I also have some real estate in Florida to sell you :)

I'm not arguing that there aren't issues of spending, and that it shouldn't be rationalized, but to say that we need to completely shift the balance of the spending to applications is ridiculous.

@Sherif For the most part I’m sure you are right. Sys admins and IT personnel tend to run for their comfort zone. I’m just pointing out that these habits run counter to what the business invests in. Time for CIOs to realize this fact, especially in light of what is really going on out there.

@Security Retentive no, you are not misreading. Your logic is sound, perhaps the attackers did move on, but at the same time today we must adjust. One might also consider that if our "applications" were secure -- 10 years ago -- we may not need so many firewalls protecting weak apps. As you've said before, maybe we made the right choices with the best available knowledge back then. Right now, we are not.

While the "numbers" might say the applications are the hardest hit, the corporate view is that the infrastructure is my foundation. It is a lot harder to make changes to my infrastructure than it is to my applications.

There's also the perception that an infrastructure person (at this point in time) should be aware of security issues. That isn't the case with developers.