Performance & Best Practices

Wednesday May 01, 2013

Oracle produced a world record single-server SPECjEnterprise2010
benchmark result of
27,843.57 SPECjEnterprise2010 EjOPS using one of Oracle's SPARC T5-8
servers for both the application and the database tier.
This result directly compares the 8-chip SPARC T5-8 server
(8 SPARC T5 processors) to the
8-chip IBM Power 780 server (8 POWER7+ processors).

The 8-chip SPARC T5 processor-based server is 2.6x faster
than the 8-chip IBM POWER7+ processor-based server.

Both Oracle and IBM used virtualization to
provide 4 chips for the application tier and 4 chips
for the database tier.

The server cost/performance of the
SPARC T5 processor-based server
was 6.9x better than that
of the IBM POWER7+ processor-based server:
$10.72 per EjOPS for the SPARC T5-8 server
compared to $73.83 per EjOPS for the IBM Power 780.

The total configuration cost/performance (hardware + software)
of the SPARC T5 processor-based server was
3.5x better than that of the IBM POWER7+ processor-based server:
$56.21 per EjOPS for the SPARC T5-8 server compared to
$199.42 per EjOPS for the IBM Power 780. The IBM system had
1.6x better performance per core, but this did not reduce the total
software and hardware cost to the customer. As shown by this
comparison, performance-per-core is a poor predictor of characteristics
relevant to customers.
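The cost/performance figures above follow directly from the published list prices and EjOPS scores in the disclosure statement. A quick sketch of the arithmetic:

```python
# Cost/performance derivation from the published figures in this post.
# Prices and EjOPS scores are taken from the disclosure statement below.

sparc_ejops = 27_843.57   # SPARC T5-8
ibm_ejops = 10_902.30     # IBM Power 780

sparc_server_price = 298_494      # server-only hardware list price
ibm_server_price = 804_931

sparc_total_price = 1_565_092     # hardware + software list price
ibm_total_price = 2_174_152

# Server-only cost/performance in $ per EjOPS
sparc_server_cpp = sparc_server_price / sparc_ejops
ibm_server_cpp = ibm_server_price / ibm_ejops
print(f"server $/EjOPS: SPARC {sparc_server_cpp:.2f}, IBM {ibm_server_cpp:.2f}")
print(f"server cost/perf advantage: {ibm_server_cpp / sparc_server_cpp:.1f}x")

# Total-configuration cost/performance in $ per EjOPS
sparc_total_cpp = sparc_total_price / sparc_ejops
ibm_total_cpp = ibm_total_price / ibm_ejops
print(f"total $/EjOPS: SPARC {sparc_total_cpp:.2f}, IBM {ibm_total_cpp:.2f}")
```

Running this reproduces the $10.72 vs. $73.83 server figures and the $56.21 vs. $199.42 total-configuration figures quoted above.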

The total IBM hardware plus software cost was $2,174,152 versus the
total Oracle hardware plus software cost of $1,565,092. At this price
IBM could only provide 768 GB of memory, while Oracle was able
to deliver 2 TB in the SPARC T5-8 server.

The SPARC T5-8 server requires only 8 rack units, the same
rack space as the IBM Power 780. In this
configuration IBM has a hardware core density of 4 cores per rack unit,
compared with 16 cores per rack unit for the
SPARC T5-8 server. This again demonstrates why
performance-per-core is a poor predictor of characteristics relevant to
customers.
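The core-density figures imply the core counts behind the per-core comparison, which can be checked with the same published EjOPS scores:

```python
# Per-core performance implied by the core densities quoted above.
# Core counts are derived from cores-per-rack-unit x 8 rack units.
rack_units = 8
ibm_cores = 4 * rack_units     # 32 cores in the IBM Power 780 configuration
sparc_cores = 16 * rack_units  # 128 cores in the SPARC T5-8

ibm_per_core = 10_902.30 / ibm_cores
sparc_per_core = 27_843.57 / sparc_cores

# IBM's per-core edge, despite the SPARC T5-8 winning on total EjOPS and cost
print(f"IBM per-core advantage: {ibm_per_core / sparc_per_core:.1f}x")
```

This reproduces the 1.6x per-core figure cited earlier, even though the SPARC T5-8 delivers 2.6x the total throughput at lower total cost.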

The virtualized SPARC T5 processor-based server
ran the application tier servers on 4 chips using
Oracle Solaris Zones and the database tier in a 4-chip
Oracle Solaris Zone. The
virtualized IBM POWER7+ processor-based server
ran the application in a 4-chip
LPAR and the database in a 4-chip LPAR.

The SPARC T5-8 server ran the Oracle Solaris 11.1 operating system and
used Oracle Solaris Zones to consolidate eight Oracle WebLogic
application server instances and one database server instance to
achieve this result. The IBM system used LPARS and AIX V7.1.

This result demonstrated less than 1 second average response times for
all SPECjEnterprise2010 transactions and represents Java EE 5
transactions generated by 227,500 users.

IBM also has a non-virtualized result (one server for the application and one server for the database). The IBM PowerLinux 7R2 achieved 13,161.07 SPECjEnterprise2010 EjOPS, making the SPARC T5-8 server 2.1x faster. The total configuration cost/performance (hardware + software) of the SPARC T5 processor-based server was 11% better than that of the IBM POWER7+ processor-based server: $56.21 per EjOPS for the SPARC T5-8 server compared to $62.26 per EjOPS for the IBM PowerLinux 7R2. As shown by this comparison, performance-per-core is a poor predictor of characteristics relevant to customers.

Benchmark Description

SPECjEnterprise2010 is the third generation of the SPEC organization's
J2EE end-to-end industry standard benchmark application. The new
SPECjEnterprise2010 benchmark has been re-designed and developed to
cover the Java EE 5 specification's significantly expanded and
simplified programming model, highlighting the major features used by
developers in the industry today. This provides a real-world workload
driving the application server's implementation of the Java EE
specification to its maximum potential, allowing maximum stress
on the underlying hardware and software systems. The benchmark exercises:

The web zone, servlets, and web services

The EJB zone

JPA 1.0 Persistence Model

JMS and Message Driven Beans

Transaction management

Database connectivity

Moreover, SPECjEnterprise2010 also heavily exercises all parts of the underlying
infrastructure that make up the application environment, including hardware,
JVM software, database software, JDBC drivers, and the system network.

The primary metric of the SPECjEnterprise2010 benchmark is jEnterprise Operations
Per Second (SPECjEnterprise2010 EjOPS). It is calculated by adding the metrics of the
Dealership Management Application in the Dealer Domain and the
Manufacturing Application in the Manufacturing Domain. There is
NO price/performance metric in this benchmark.

The database ran in a separate Oracle Solaris Zone bound to a resource pool
consisting of 64 cores (4 CPU chips). The database shadow processes
were run in the FX scheduling class and bound to one of the four CPU chips
using the plgrp command.

The Oracle WebLogic application servers were executed in the FX
scheduling class to improve performance by reducing the frequency
of context switches.

The Oracle log writer process was run in the FX scheduling class at
processor priority 60 to use the Critical Thread feature.
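As a rough illustration only (not the exact commands from the benchmark submission), the Solaris tuning described above maps onto commands like the following; the PIDs and lgroup ID are placeholders:

```shell
# Move a WebLogic server JVM into the FX scheduling class (placeholder PID):
priocntl -s -c FX -i pid 12345

# Run the Oracle log writer in FX at priority 60 so it can benefit from
# the Critical Thread feature:
priocntl -s -c FX -m 60 -p 60 -i pid 23456

# Set the home locality group of a database shadow process to the lgroup
# for one CPU chip (placeholder lgroup ID and PID):
plgrp -H 1 34567
```

These commands require Oracle Solaris and appropriate privileges; consult the priocntl(1) and plgrp(1) man pages for the exact options on a given release.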

Disclosure Statement

SPEC and the benchmark name SPECjEnterprise are
registered trademarks of the
Standard Performance Evaluation Corporation.
Results from www.spec.org as of 5/1/2013.
SPARC T5-8, 27,843.57 SPECjEnterprise2010 EjOPS;
IBM Power 780, 10,902.30 SPECjEnterprise2010 EjOPS;
IBM PowerLinux 7R2, 13,161.07 SPECjEnterprise2010 EjOPS.
Oracle server only hardware list price is $298,494 and total hardware plus
software list price is $1,565,092 from http://www.oracle.com as of 5/22/2013.
IBM server only hardware list price is $804,931 and
total hardware plus software cost of $2,174,152 based on
public pricing from http://www.ibm.com as of 5/22/2013.
IBM PowerLinux 7R2 server
total hardware plus software cost of $819,451 based on
public pricing from http://www.ibm.com as of 5/22/2013.

About

BestPerf is the source of Oracle performance expertise. In this blog, Oracle's Strategic Applications Engineering group explores Oracle's performance results and shares best practices learned from working on Enterprise-wide Applications.