The Top Nine Performance Fallacies in Oracle Press Releases

Another week, another Oracle press release on performance. Whether you call it Groundhog Day, déjà vu, or same old same old, I'm just getting tired of writing about these. So I decided to do something about it. Here are the top nine questions you should ask yourself when you read one:

1. Is Oracle comparing themselves with themselves?

2. Is Oracle comparing themselves with another vendor's result — from 2 years ago?

3. Is Oracle using an industry standard benchmark or just their own internal performance test?

4. If an industry standard benchmark is used, is it one that anyone cares about?

5. Is Oracle using an "independent" benchmark that they OWN?

6. Are the products used in the performance test even available?

7. Do the comparisons take into account different hardware and software configurations?

8. Does the performance claim have a footnote with all the details?

9. Do the metrics cited make any sense?

I totally believe in reuse. Please refer back to this blog entry whenever you encounter an Oracle press release on performance.

Oh, and everyone always says it’s got to be a Top Ten list. There has to be a tenth fallacy. Let us know yours here.

************************************************

The postings on this site solely reflect the personal views of the author and do not necessarily represent the views, positions, strategies or opinions of IBM or IBM management.

2 Responses

In their latest TPC-C benchmark result, Oracle satisfied the letter of the law but not the spirit of the law regarding their software license and maintenance costs (which are included in the $/tpmC price/performance calculation). I believe they chose unusually low-cost software licenses and a bare-minimum support and maintenance plan that users would be very unlikely to choose for any real database application. Oracle had to do that to achieve a good price/performance result, given the very large number of SPARC servers and cores required to reach their leading throughput number.
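To see why the software and support pricing matters so much, here is a minimal sketch of the TPC-C price/performance metric: total priced-system cost (hardware plus software licenses plus maintenance/support over the pricing period) divided by throughput in tpmC. All dollar and throughput figures below are invented for illustration and are not taken from any published result:

```python
# Sketch of the TPC-C price/performance metric ($/tpmC).
# All numbers are hypothetical, chosen only to show the effect the
# commenter describes: cheaper licenses and a bare-minimum support
# plan shrink the numerator while throughput stays the same.

def price_per_tpmc(hardware_cost, software_cost, support_cost, tpmc):
    """Return price/performance in dollars per tpmC."""
    return (hardware_cost + software_cost + support_cost) / tpmc

# Same hardware and same throughput, two software/support pricing choices:
full_price = price_per_tpmc(10_000_000, 8_000_000, 4_000_000, 5_000_000)
cut_rate   = price_per_tpmc(10_000_000, 2_000_000,   500_000, 5_000_000)

print(f"${full_price:.2f}/tpmC vs ${cut_rate:.2f}/tpmC")
# → $4.40/tpmC vs $2.50/tpmC
```

The throughput (tpmC) is identical in both cases; only the pricing assumptions change, yet the headline price/performance figure improves by more than 40%. That is the lever the comment is pointing at.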