There is an industry effort to define a "watch list" of common mistakes that lead to security flaws. Co-led by the folks behind the Common Weakness Enumeration at MITRE and the SANS Institute, the CWE/SANS Top 25 (full listing here) is being used as a requirements document for the security of purchased applications by the State of New York, among others.

It's not perfect--it omits backdoors and other intentional security flaws, among other categories--but it's better than nothing, by a long shot.

Disclaimer: I work at Veracode and was a co-author of the report that the original article was about.

If you take a look at the full report (registration required), you'll see that the application pool from which the report was drawn was 47% Java, 31% C/C++ (on Windows, Red Hat Linux, and Solaris), and 22% .NET. Other data (industry, supplier type) is provided to help characterize the application pool. We acknowledge the inherent selection bias (the applications in the report come from our customers) in the methodology section.

You can test the quality of code you are developing yourself with a simple source code scanner, but testing third-party code is a little more challenging. I don't know of many significant applications that are built entirely in house, with no dependencies on third-party libraries.
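To make the "simple source code scanner" idea concrete, here is a minimal sketch of what such a tool might do: grep C source for a small, hypothetical watch list of calls flagged by CWE entries (the function names, CWE mappings, and `scan_source` helper below are my own illustration, not part of the Top 25 report or any Veracode product).

```python
import re

# Hypothetical watch list: a few C calls commonly flagged
# in CWE entries (e.g. CWE-120, buffer copy without size check).
RISKY_CALLS = {
    "gets": "CWE-242: inherently dangerous function",
    "strcpy": "CWE-120: unbounded string copy",
    "sprintf": "CWE-120: unbounded formatted write",
    "system": "CWE-78: possible OS command injection",
}

# Match any watch-listed identifier followed by an opening paren.
PATTERN = re.compile(r"\b(" + "|".join(RISKY_CALLS) + r")\s*\(")

def scan_source(text, filename="<input>"):
    """Return (filename, line_no, call, note) for each risky call found."""
    findings = []
    for line_no, line in enumerate(text.splitlines(), start=1):
        for match in PATTERN.finditer(line):
            call = match.group(1)
            findings.append((filename, line_no, call, RISKY_CALLS[call]))
    return findings

if __name__ == "__main__":
    sample = "int main(void) { char buf[8]; gets(buf); return 0; }"
    for fname, n, call, note in scan_source(sample, "demo.c"):
        print(f"{fname}:{n}: {call} -> {note}")
```

A pattern match like this is only a starting point, of course; it has no notion of data flow, so it can't tell a tainted input from a constant, which is exactly why scanning third-party code you can't read or instrument is the harder problem.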