The tech world is all too familiar with Twitter's "fail whale" and has become accustomed to Gmail outages (which are inevitably chronicled on Twitter). And while sometimes it's infrastructure (such as routers and switches) rather than software that fails, it often seems as if we too readily accept that software will inevitably break down.

Mark Donsky, director of product management at Coverity, recently noted that a static analysis of open-source projects performed on the company's Scan site showed a 71.9 percent correlation between the number of lines of code and the number of defects found.

This is, of course, not an open-source problem but a general issue that arises as more code is integrated into products. I've been told that Windows, as the product has grown over the years, is now developed with two quality-assurance people for every engineer.

Coverity is focused on software integrity and advocates static analysis early in the development cycle. While testing of all kinds, including static analysis, is obviously a good idea, the tools and methods vary dramatically by engineering organization. The Software Engineering Institute (SEI) at Carnegie Mellon University and the Object Management Group (OMG) recently paired up to form a consortium to establish standards for software quality.

Donsky pointed out four areas that support the case for static analysis:

Software is larger and more complex
Code bases today are orders of magnitude larger than they were just a few years ago. If you compare Windows NT 3.1 with Windows XP, the code base has grown 10-fold from 4 million lines of code to 40 million lines of code. If that weren't enough, there's added complexity with multithreaded applications thanks to readily available multicore processors.

Software defects carry a higher cost and risk
Perhaps as a result of the increased complexity of software, it automates and powers more mission-critical functions than it used to. So, when software defects are leaked into production, it costs the business more to recover from them. Recent estimates from NIST indicate that a defect discovered in the field costs 30 times as much to fix as one found during the development phase.

Software pervasiveness increases risk mindshare
Software underpins virtually everything today, be it air traffic control systems, vehicles, or medical devices. A single software defect can lead to billions of dollars in damages and lasting harm to a brand. In any given week, software failures are the direct cause of airport terminal delays, transportation system meltdowns, and high-profile security breaches.

Agile development needs automation and breakpoints
Agile software development requires automated testing methodologies to be baked into the development process. In agile development, both coding and testing occur continuously and responsibility for software integrity falls on the full project team. Static analysis throughout this process can give teams an advantage.

Ultimately, software defects offer no positives for the end user. Effective testing and analysis solutions can protect software organizations from leaking defects into production and making the kind of front-page news that everyone wants to avoid.

About the author

Dave Rosenberg has more than 15 years of technology and marketing experience that spans from Bell Labs to startup IPOs to open-source and cloud software companies. He is CEO and founder of Nodeable, co-founder of MuleSoft, and managing director for Hardy Way. He is an adviser to DataStax, IT Database, and Puppet Labs.