README

Athletic is a benchmarking framework. It allows developers to benchmark their code without littering it with microtime() calls.

Athletic was inspired by the annotation format that PHPUnit uses. Benchmark tests extend the AthleticEvent class and are annotated with specific docblock parameters. The benchmark suite is then run with the Athletic command-line tool.

WARNING: No longer maintained

Athletic is no longer maintained, so it may have bugs or other problems. I would recommend using PhpBench, which was inspired by Athletic. It is far more capable and actively maintained.

Why Benchmark?

Because fast code is good! While premature optimization is certainly evil, optimization is always an important component of software development. And sometimes you just really need to see if one solution to a problem is faster than an alternative.

Why Use Athletic?

Because it makes benchmarking easy! Athletic is built around annotations. Simply create a benchmarking class and annotate a few methods:
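For example, a minimal Event might look like this (the class name, method names, and iteration counts here are illustrative, not from the project):

```php
<?php

use Athletic\AthleticEvent;

class StringBuildingEvent extends AthleticEvent
{
    /**
     * @iterations 1000
     */
    public function interpolation()
    {
        $name   = 'World';
        $result = "Hello, {$name}!";
    }

    /**
     * @iterations 1000
     */
    public function concatenation()
    {
        $name   = 'World';
        $result = 'Hello, ' . $name . '!';
    }
}
```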

Without Athletic, you have to litter your code with microtime() calls and build timing metrics yourself. Or you end up building a benchmark harness remarkably similar to Athletic (but probably with less syntactic sugar...because who builds throw-away code to read test annotations and fancy output?)

Why can't I use Xdebug?

Xdebug is an excellent profiling tool, but it is not a benchmarking tool. Xdebug (and, by extension, its cachegrind output) will show you what is fast or slow inside your method, and is indispensable for actually optimizing your code. But it is not useful for running 1000 iterations of a particular function and determining the average execution time.

Installation

Athletic is installed via Composer. You can find out more about how to install Composer, configure autoloading, and other best practices for defining dependencies at getcomposer.org.
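Athletic was typically added as a development-time dependency (athletic/athletic is the package name as published on Packagist):

```shell
composer require --dev athletic/athletic
```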

Usage

To begin using Athletic, you must create an Event. This is the analog of a PHPUnit Test Case. An Event will benchmark one or more functions related to your code, compile the results and output them to the command line.

Let's look at how this works.

First, we have a PHP file that is included in your project's repo, just like unit tests. In this example, the Event is saved under the Vendor\Package\Benchmarks\Indexing namespace. It uses classes from your project located at Vendor\Package\Indexing. It also uses a class from the Athletic framework.

```php
class IndexingEvent extends AthleticEvent
{
    // ...
}
```

Next, we declare an indexing class that extends \Athletic\AthleticEvent. This is important because it tells Athletic that this class should be benchmarked. AthleticEvent is an abstract class that provides code to inspect your class and actually run the benchmarks.

Next, we have some private variables and a setUp() method. The setUp() method is invoked once at the beginning of each benchmark iteration. This is a good place to instantiate variables that are important to the benchmark itself, populate data, build database connections, etc. In this example, we are building two "Indexing" classes and a sample piece of data. (More details about setup and tear down appear further down in this document.)

Finally, we get to the meat of the benchmark. Here we have two methods annotated with @iterations in their docblocks. The @iterations annotation tells Athletic how many times to repeat the method. If a method does not have an @iterations annotation, it will not be benchmarked.
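Assembled, an Event like the one described above might look as follows. The indexer class names and the index() method are placeholders standing in for your project's own code:

```php
<?php

namespace Vendor\Package\Benchmarks\Indexing;

use Athletic\AthleticEvent;
use Vendor\Package\Indexing\FastIndexer; // placeholder project class
use Vendor\Package\Indexing\SlowIndexer; // placeholder project class

class IndexingEvent extends AthleticEvent
{
    private $fastIndexer;
    private $slowIndexer;
    private $data;

    // Runs before each iteration: build the objects and sample data.
    public function setUp()
    {
        $this->fastIndexer = new FastIndexer();
        $this->slowIndexer = new SlowIndexer();
        $this->data        = array('key' => 'value');
    }

    /**
     * @iterations 1000
     */
    public function fastIndexing()
    {
        $this->fastIndexer->index($this->data);
    }

    /**
     * @iterations 1000
     */
    public function slowIndexing()
    {
        $this->slowIndexer->index($this->data);
    }
}
```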

The default formatter outputs the Event class name, each method name, the number of iterations, average time and operations per second. More advanced formatters will be created in the near future (CSVFormatter, database export, advanced statistics, etc).
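The suite itself is run with the bundled command-line tool, which produces the formatted output described above. The flags below (-p for the directory containing your Events, -b for an autoload bootstrap file) are as best recalled from the project's own examples; check the tool's --help output if in doubt:

```shell
php ./vendor/bin/athletic -p ./benchmarks -b ./vendor/autoload.php
```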

Further Information

SetUps and TearDowns

Athletic offers several methods for setting up and tearing down data and variables:

| Method | Description |
| --- | --- |
| classSetUp() | Invoked once at the beginning of the Event, before anything else has occurred. |
| setUp() | Invoked before each iteration of the method being benchmarked. |
| classTearDown() | Invoked once at the end of the Event, after everything else has completed. |
| tearDown() | Invoked after each iteration of a benchmark has completed. |

There are two levels of setup and tear down to prevent "state leakage" between benchmarks. For example, an object that caches calculations will perform faster on subsequent calls to the method.

If the goal is to benchmark the initial calculation, it makes sense to place the instantiation of the object in setUp().

If the goal, however, is to benchmark the entire process (initial calculation and subsequent caching), then it makes more sense to instantiate the object in classSetUp() so that it is only built once.
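The distinction can be sketched like this. The MemoizingCalculator class is hypothetical and stands in for any object that caches its results:

```php
<?php

use Athletic\AthleticEvent;
use Vendor\Package\MemoizingCalculator; // hypothetical caching calculator

class ColdCacheEvent extends AthleticEvent
{
    private $calculator;

    // A fresh object before every iteration: each run hits a cold cache.
    public function setUp()
    {
        $this->calculator = new MemoizingCalculator();
    }

    /**
     * @iterations 500
     */
    public function initialCalculation()
    {
        $this->calculator->calculate(42);
    }
}
```

To measure the warm-cache path instead, move the instantiation into classSetUp(), so the same object (and its internal cache) is reused across all iterations.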

Calibration

Athletic uses Reflection and variable functions to invoke the methods in your Event. Because variable functions carry some internal overhead, Athletic performs a "calibration" step before each iteration: it calls an empty calibration method and times how long that takes. That time is then subtracted from the iteration's total time, providing a more accurate result.
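The idea behind calibration can be sketched outside the framework like this. This is a simplification of the technique, not Athletic's actual code:

```php
<?php

// Time an empty callable to estimate variable-function call overhead.
$calibrate = function () {};
$start = microtime(true);
$calibrate();
$overhead = microtime(true) - $start;

// Time the code under test the same way...
$benchmark = function () {
    return array_sum(range(1, 100));
};
$start = microtime(true);
$benchmark();
$elapsed = microtime(true) - $start;

// ...then subtract the call overhead for a more accurate figure.
// (Clamped at zero in case of timer jitter.)
$adjusted = max(0.0, $elapsed - $overhead);
```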