Intuit’s Five Tips on Static Analysis Deployment

On January 30th, I attended a very insightful talk given by John Ruberto, product development leader at Intuit, about his experiences introducing static analysis into a mature codebase. What made this talk stand out was its broad coverage of the challenges in getting static analysis embraced by the development team. The journey began with a bad experience with an older-generation tool that produced so many false positives it was deemed DOA and never gained traction.

Yet, Ruberto’s root cause analysis of past bugs showed that a whopping 39% of Intuit bugs found during system test were caused by programming errors. Further inspired by Capers Jones’ research that the combination of static analysis, code review and unit testing could result in a defect removal efficiency of 97%, Ruberto’s team decided to give static analysis another try.
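One way to see how stacking techniques can reach a number like 97%: if each stage catches some fraction of defects and the stages are treated as independent, a defect escapes only by slipping past all of them. The per-stage efficiencies below are purely illustrative assumptions, not Capers Jones' published figures; the `combined_dre` helper is my own sketch of the arithmetic.

```python
def combined_dre(*efficiencies):
    """Combined defect removal efficiency of stages in series:
    a defect escapes only if it slips past every stage."""
    escape = 1.0
    for e in efficiencies:
        escape *= (1.0 - e)
    return 1.0 - escape

# Hypothetical, illustrative per-stage efficiencies:
# static analysis 60%, code review 85%, unit testing 50%.
print(round(combined_dre(0.60, 0.85, 0.50), 2))  # 0.97
```

The point of the model is that no single technique needs to be anywhere near 97% on its own; modest stages multiply into a strong combined filter.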

The team was quickly reminded of the bittersweet truth of dropping static analysis on a large, mature code base. After an overnight run, you will be exposed to thousands of bugs you never knew existed! Now the fun begins!

Here, Ruberto made a savvy decision so as not to overwhelm the developers: rather than defeat ourselves trying to fix the thousands of legacy bugs, adopt a “no new defects” policy for new code. The tool was integrated into the central build so that every morning a new set of defects was reported. The beauty is that those defects are attributed to code changed within the past 24 hours, making them an easy fix since the change is still fresh in the developer’s mind.
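A “no new defects” gate like this is typically built by diffing each scan against a recorded baseline of legacy findings. The sketch below is one minimal way to do that, assuming the analysis tool can emit findings as JSON with `checker`, `file`, and `fingerprint` fields; the talk did not describe Intuit's actual data format, so those field names are hypothetical.

```python
import json

def new_findings(baseline_path, current_path):
    """Gate for a 'no new defects' policy: suppress legacy findings
    recorded in a baseline file, report only findings introduced since.
    Each finding is assumed to be identified by a stable
    (checker, file, fingerprint) triple."""
    with open(baseline_path) as f:
        baseline = {(d["checker"], d["file"], d["fingerprint"])
                    for d in json.load(f)}
    with open(current_path) as f:
        current = json.load(f)
    return [d for d in current
            if (d["checker"], d["file"], d["fingerprint"]) not in baseline]
```

Run nightly after the scan, a non-empty result is what gets reported to developers in the morning, while the legacy baseline stays quietly parked until the hardening phase.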

Down the road, when they had time to spare, they embarked on a “hardening” phase where they fixed a remarkable 3,000+ legacy defects over an 8 month period. They were able to attack these bugs with abandon because the team had cut their teeth on bugs in new code first, thus gaining confidence in the tool.

Ruberto closed the talk with five tips for deploying static analysis (my comments in parentheses):

Choose the right tool for your codebase – (this is an obvious one since different tools perform differently depending on the underlying language)

Focus on new code, let “legacy” follow – (this was key because it allowed developers to ease into using the tool and not be crushed by a mountain of legacy defects)

Focus on developer productivity, not finding fault – (shows good sensitivity to human nature: this tool is to help you become more productive, not to put you on the wall of shame)

Automate issue management – triage, assignment, verification, closure – (eliminate the middleman! Intuit set up the tool so it would query the SCM system after each scan, find the owner of the buggy code and assign the bug to that developer. Every morning, each developer would get an email only if he/she created a bug in the last 24 hours. All running on auto-pilot! Sweet!)
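That last tip is easy to prototype. The sketch below routes each finding to the last developer who touched the offending line, using `git blame` as a stand-in for whatever SCM query Intuit actually used (the talk didn't name the system); the shape of the `findings` dicts is likewise an assumption carried over from a JSON-style scan report.

```python
import subprocess
from collections import defaultdict

def assign_findings(findings, repo_dir):
    """Group findings by the author of the last change to each line,
    per `git blame` (an assumed stand-in for Intuit's SCM query).
    `findings` is a list of dicts with 'file' and 'line' keys."""
    inbox = defaultdict(list)
    for finding in findings:
        line = finding["line"]
        out = subprocess.run(
            ["git", "blame", "-L", f"{line},{line}",
             "--porcelain", finding["file"]],
            cwd=repo_dir, capture_output=True, text=True, check=True).stdout
        # The porcelain format carries an "author-mail <addr>" line.
        author = next(l[len("author-mail "):].strip("<>")
                      for l in out.splitlines()
                      if l.startswith("author-mail "))
        inbox[author].append(finding)
    return inbox  # one morning email per developer with a non-empty list
```

From here, a nightly job would iterate over `inbox` and send each developer only their own findings, which is the “auto-pilot” loop described above.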

All in all, a great talk, chock full of lessons for anyone contemplating static analysis.

To learn more about how Ruberto implemented static analysis at Intuit, including tool evaluation criteria, best practices for driving developer adoption, and ROI realized, join our upcoming webcast on March 5th.

Thanks for the kind comments about our experiences.
Another great outcome from this practice: the developers were writing more resilient code in the first place. After a few months of this process, the tool was finding less than half the bugs it did when we started, even though we were writing just as much code. With the rapid feedback loop, the developers were learning the patterns detected by static analysis.