Are 20% of Developers Responsible for 80% of the Vulnerabilities?

That’s the question I recently posed in a Twitter exchange with @securityninja and @manicode. For those unfamiliar, the Pareto principle, also known as the 80/20 rule, holds that roughly 80% of the effects come from 20% of the causes. The phenomenon shows up in economics, agriculture, land ownership, and so on. I think it may also apply to developers and software security, particularly software vulnerabilities.

Personal experience would have most of us agreeing that not all developers are equally productive. We’ve all seen a few developers generate far more useful code in a given amount of time than others. If that’s the case, it may stand to reason that the opposite is also true: a few developers are responsible for the bulk of the shoddy, vulnerable code. Think about it: when vulnerabilities are introduced, are they fairly evenly attributable across the developer population, or clumped together within a smaller group?

The answer, backed up by data, would have a profound effect on general software security guidance. It would let us more efficiently allocate security resources in developer training, standardized framework controls, software testing, personnel retention, etc. Unfortunately, very few people in the industry have the data to answer this question authoritatively, and then only from within their own organization. Off the top of my head, I can only think that Adobe, Google, or Microsoft might have such data handy, but they’ve never published or discussed it.

In the meantime I think we’d all benefit from hearing some personal anecdotes. From your experience, are 20% of developers responsible for 80% of the vulnerabilities?

I think it might be true, but this is hard to measure. An important aspect that e.g. The Daily WTF tries to capture is the number of bad devs. Bad devs are productive; they finish tasks in a moderately productive fashion. But they deliver buggy crap with unreadable source code, and living with their crap is orders of magnitude less productive than doing it right.

A bad day with tainted source code: you estimate a trivial task at 1 hour, maybe multiply by 4 as a margin for the usual “stuff is always a pain in this source code.” Once you start coding, you run into unsolvable problems, revert everything, and restart by refactoring and fixing crap in order to make development possible. One to three days later, the 1-hour task is completed, but at least those few lines of code don’t suck any more.

What’s worse: most devs say “that code is hard to understand,” but have no clue how to refactor it. You do a bit of refactoring with them, and they are surprised that unreadable becomes readable.

Also, if good devs hit a “hard” problem, they typically go to the docs or Google and search for how to solve such issues.

Many (bad) devs do NOT search for how a problem should be solved. They just totally mess up the source code and design, creating complicated, unmaintainable crap. They write a comment about it, typically presenting their complete failure as clever. And it’s not just hard tasks! It is beyond my comprehension why some devs reinvent buggy solutions for even the most basic tasks, like string replace, number-to-string conversion, etc.
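As a minimal sketch of the wheel-reinvention described above (the function name and cases here are invented for illustration), here is a hand-rolled number-to-string conversion next to the built-in it duplicates:

```python
# Hand-rolled integer-to-string conversion -- the kind of reinvention the
# comment describes, and easy to get subtly wrong (zero, negative numbers).
def int_to_str_reinvented(n: int) -> str:
    if n == 0:
        return "0"
    sign = "-" if n < 0 else ""
    n = abs(n)
    digits = []
    while n:
        digits.append(chr(ord("0") + n % 10))
        n //= 10
    return sign + "".join(reversed(digits))

if __name__ == "__main__":
    # The stdlib already does all of this in one call: str(value).
    for value in (0, 7, -42, 12345):
        assert int_to_str_reinvented(value) == str(value)
    print("all conversions match str()")
```

The point is not that the hand-rolled version can’t be made correct, but that every reinvention is a fresh chance to introduce the edge-case bugs `str()` already handles.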

http://blaufish.wordpress.com/ Blaufish

Sorry, I think I misread the question, tired 🙂

Anyway, probably true. Good devs rarely write obviously bad code like SQLi or XSS vulnerabilities. Bad devs do. Because they can’t code well, they make trivial security fails just as often as they make other trivial coding fails.
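A hedged sketch of the “trivial sec fail” in question, using Python’s built-in `sqlite3` with an invented table, showing string-built SQL next to the parameterized query that avoids it:

```python
import sqlite3

# Invented example data for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret'), ('bob', 'hunter2')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is concatenated into the SQL text, so the
# attacker's OR clause becomes part of the query and matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + attacker_input + "'"
).fetchall()

# Safe: the driver binds the input as a value, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(vulnerable)  # both rows leak
print(safe)        # empty: no user is literally named the injection string
```

The fix is a one-character change in discipline (a `?` placeholder instead of concatenation), which is exactly why this class of bug separates devs who know the basics from those who don’t.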

However: complex PoE, logic vulns, etc. Once you get past the basics, that stuff is more commonly written by someone who can code properly, and not all good devs are well trained in security. Besides, training doesn’t solve everything. People still err; good, well-trained ones just err less often.

http://www.whitehatsec.com/ Jeremiah Grossman

@Blaufish: thanks for the insights. Clearly there is a void of intelligence in this area that would be really nice to have. Perhaps it’s time SAST tools cross-referenced vulnerability findings with SVN check-ins from a historical perspective. That might give us the data we need.
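As a hedged sketch of what that cross-reference might yield once findings are attributed to authors (e.g. by mapping each finding’s file and line back through the VCS history), here is the tally that would test the 80/20 claim; the function name and the per-developer counts below are invented:

```python
def pareto_share(findings_by_author, dev_fraction=0.2):
    """Return the share of findings attributable to the most-flagged
    `dev_fraction` of developers, given {author: vuln_count}."""
    counts = sorted(findings_by_author.values(), reverse=True)
    top_n = max(1, round(len(counts) * dev_fraction))
    return sum(counts[:top_n]) / sum(counts)

# Hypothetical attribution data -- what a SAST/VCS cross-reference
# might produce for a ten-developer team.
findings = {"dev_a": 40, "dev_b": 25, "dev_c": 5, "dev_d": 3,
            "dev_e": 2, "dev_f": 2, "dev_g": 1, "dev_h": 1,
            "dev_i": 1, "dev_j": 1}

print(f"top 20% of devs account for {pareto_share(findings):.0%} of findings")
# → top 20% of devs account for 80% of findings
```

With real check-in data in place of the invented dictionary, the same tally would answer the post’s question for a given organization.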

fmk

The 80/20 rule might apply to other aspects of code quality, but it seems that security ignorance is much more evenly distributed. I’d say that 20% of our developers are creating 80% of our security-conscious, non-vulnerable code. I’ve been a software quality engineer for less than 3 years and I’ve focused on security testing for closer to 1. That’s the entirety of my professional experience, so it amazes me when significantly more experienced people (senior web application developers) have almost no understanding of common vulnerabilities. When I find flaws, I frequently have to spend time explaining why things like XSS or CSRF are even bugs worth noting.

Jeremiah Grossman

@fmk: thanks for sharing your experiences; others have offered very similar stories. Given that assumption, what security strategies do you think have the widest impact?

fmk

@Jeremiah: As I said, I haven’t been in the business for that long, nor have I worked for a bunch of different organizations, but this is how it looks to me. I think testing, especially negative testing, needs to be emphasized a lot more as part of a developer’s responsibility. Developers are good at making things work. If you ask them to output ‘B’ for input ‘A’, they can get that done, but their solution won’t necessarily handle input ‘C’ gracefully. “They’re just supposed to send ‘A’, why would I ever get a ‘C’ here?” When we do push them into writing unit tests, there will be few or no negative cases. All of the emphasis is on “How do I make this work?” and “How do I demonstrate that it works?” There’s too little “How is this going to break?” and “What could go wrong?” Obviously, in many cases input that is not properly validated represents a security vulnerability, but I don’t think there’s anything particularly special about a security bug compared to any other bug. If your code receives an inappropriate value because some other code has a flaw, because a user made an honest mistake, or because the user is malicious, the code should handle it gracefully. Trying to educate them about specific security issues won’t help until they’ve accepted that they should go looking for trouble in their solutions.
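A minimal sketch of that gap, assuming a hypothetical input handler (`parse_quantity` and all the inputs here are invented): the single positive case most devs write, followed by the negative cases the comment argues are missing.

```python
def parse_quantity(value: str) -> int:
    """Hypothetical input handler: accepts only a positive integer string."""
    qty = int(value)              # raises ValueError for non-numeric input
    if qty <= 0:
        raise ValueError("quantity must be positive")
    return qty

# The positive case most devs write: input 'A' yields output 'B'.
assert parse_quantity("3") == 3

# The negative cases: what happens when a 'C' shows up instead?
for bad in ("three", "-1", "0", "3'); DROP TABLE orders;--"):
    try:
        parse_quantity(bad)
    except ValueError:
        pass                      # rejected gracefully, as it should be
    else:
        raise AssertionError(f"accepted bad input: {bad!r}")

print("positive and negative cases both covered")
```

Note the last “bad” input: the same negative test that catches an honest typo also catches the malicious case, which is fmk’s point that security bugs are mostly ordinary robustness bugs.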

I think I’ve contradicted my previous comment a bit, but I tried to put a little more thought into this one 🙂

…which show up through static analysis. Quite often, these are the guys the company absolutely relies upon to put out the bulk of their functionality, moving them to new development again and again in order to get a good quick start at things, leaving the slower, more careful coders the lower-recognized work of maintaining their messes.

Calandale

@fmk – I know we weren’t adequately teaching testing techniques in colleges and universities up to three years ago.

I doubt that’s changed, except in a few standout schools.

Fausto

I would say the Pareto rule doesn’t apply here. Since almost all (I mean all) developers introduce security bugs into code (accidentally if you want, but they do it anyway), I would say 20% of the developers introduce 80% of the most dangerous vulnerabilities, the ones with the greatest impact. Something like that, I think, would be more accurate. What do you think?

http://emergentchaos.com Arthur

I keep asking large orgs using products like Fortify/Coverity etc. to start tracking these stats internally, even if they don’t cover business logic flaws and the like. I would love to see some real stats here.

http://spohnconsulting.com Dan

The sheer quantity of security breaches, and the fact that it’s business as usual for operating system companies to push updates as frequently as once a week, makes me wonder if the 80/20 rule really applies here at all. Like all things computer-related, security is a field in and of itself. Perhaps all code should be reviewed specifically for security before it is made publicly available.

http://www.terminal23.net LonerVamp

Darn, fmk beat me to saying that I think it might be more accurate to say 20% of the devs write 80% of the more secure code. In a strange twist, I’d say 80% of developers are responsible for 80% of the bad code. That’s not meant to reflect any disdain for developers, but rather just how hard that requirement can be.

It’s been a while since I was in school and took any programming classes, but I still don’t expect that students are taught anything about security. You’re taught how to do a strcpy way before you’re told how to do it the fancy, secure way. Code security is probably all practical experience, which means making the mistakes in the first place and having them corrected (either through internal identification or public attacks). (It doesn’t help when we switch and swap developers like light bulbs, either, or measure their success by dumping out code on time that just meets requirements.)

What makes this really tough to measure is that there are innumerable latent weaknesses in code sitting on servers, sites, and apps right now that just haven’t been exposed as shoddy.


Someone

Also, keep in mind that probably 20% of developers are writing the code that’s used by 80% of the market. How many devs are in charge of crapola browser code compared to how many people use it? So I think you’re right if you’re talking about penetration, for sure. Think about Matt’s Script Archive. Even today I was dealing with his crap code from 2000. That guy may have been the single greatest contributor of widely deployed bad code of anyone on the planet. You can still find his crap code everywhere even 12 years later, e.g. inside of cPanel. *shudder*

