If you need a development or QA test tool,
why not use JUnit or XmlUnit?
Both of those tools require programming.
PerformanceMeters, by contrast, is configured
entirely via properties and XML,
so no programming is needed.

If you need a performance test tool,
why not use
JMeter,
LoadRunner
or another general-purpose performance tool?
For some users, it's enough to note that LoadRunner costs money,
while PerformanceMeters is free.
Aside from the cost, however, LoadRunner is only
useful for HTTP testing out of the box.
Testing XDBC servers (using XCC or XDBC) would require development work.
However, LoadRunner is still the right answer for some applications,
since it supports web browser simulation
and much more sophisticated test scenarios than PerformanceMeters does.

Setting up PerformanceMeters

PerformanceMeters is a Java program,
and requires a Java 5 runtime environment.
If you don't have Java 5,
download
and install it.

Since PerformanceMeters connects to a MarkLogic Server,
you'll need the MarkLogic XCC libraries, too.
You can download the XCC Java zip archive
here.
After you download the zip archive, unpack it and find the jar files.
You can put them anywhere on your disk, but for this tutorial I'll assume
they are in the current directory.

Running PerformanceMeters

Now that we have a Java environment and all the libraries we need,
let's try it out with the simplest possible invocation.

Java command lines can get pretty ugly. I'll keep using them
for this tutorial, but you might want to put all of that into
a shell script (or a batch file, on Windows).
Remember, I put all my jar files in the current directory.
You might have used another location:
if so, just change the classpath in your command line to match.
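The invocation looks something like this. This is a sketch: the exact jar file name and main class name below are my assumptions, so check the README for the real ones.

```shell
# Unix shell shown; on Windows, use ';' as the classpath separator
java -cp .:xcc.jar:meters.jar \
    com.marklogic.performance.PerformanceMeters
```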

OK, so we copied that command-line and pasted it into a shell.
What happened?

Now we're getting somewhere: PerformanceMeters started up,
but exited immediately because it didn't see a required part of its
configuration. Let's learn more about that configuration.

Configuring PerformanceMeters

PerformanceMeters is configured via properties.
In Java, properties are
simply name-value pairs, such as SIZE=1
or PATH=/lib/java.
You can find a list of all the properties
that PerformanceMeters understands in the README.
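As a quick illustration of how properties behave (this sketch is not part of PerformanceMeters), Java code reads system properties set with -Dkey=value via System.getProperty, which takes an optional default value:

```java
public class PropsDemo {
    // Reads connection settings the same way -Dkey=value properties work:
    // the second argument to getProperty supplies the default value
    // used when the property was not set on the command line.
    static String connectionString() {
        String host = System.getProperty("host", "localhost");
        String port = System.getProperty("port", "8003");
        return host + ":" + port;
    }

    public static void main(String[] args) {
        System.out.println(connectionString());
    }
}
```

Running `java PropsDemo` with no -D flags prints the defaults; `java -Dport=9000 PropsDemo` overrides just the port.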

PerformanceMeters already
gave us some clues about the properties we need to set, though.
The first line of the error message tells us the defaults for a handful
of important properties.

host=localhost
port=8003
user=admin
password=admin
inputPath=null

According to the error message, we have to set inputPath
to proceed. It should be set to the filesystem path
to an XML file. That XML file will describe the tests
that we want to run:
we can start with an
example
from the PerformanceMeters source code.
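In case the bundled example isn't handy, here's a sketch of what such a scenario file looks like. The element names here are my assumptions based on the description that follows, so compare against the real example in the source code:

```xml
<script>
  <test>
    <name>hello world</name>
    <query>"hello world"</query>
  </test>
  <test>
    <name>current date</name>
    <query>fn:current-date()</query>
  </test>
  <test>
    <name>arithmetic</name>
    <query>1 + 1</query>
  </test>
</script>
```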

This is fairly easy to read: the entire document describes
a script, or test scenario, which is made of individual tests.
Each test has a name and a query: the name can be any text,
while the query must be a valid XQuery expression.
When PerformanceMeters sees this script, it will run
each test individually.

How will each test run? That's where the host, port,
user, and password come in. If those
properties aren't set, PerformanceMeters
will use the default values: that is, it will try
to submit each test to an XDBC server on localhost:8003.
If there is no XDBC server on localhost:8003,
or if the default user and password aren't valid,
you'll see another error.

Sure enough, I don't have an XDBC server on port 8003.
But I do have one on port 9000: let's tell PerformanceMeters to use it.
If you don't have an XDBC server, you'll need to configure one:
it can be on any port, as long as you set the property to match.
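Overriding the defaults on the command line looks like this (again a sketch; the jar file name and main class name are assumptions):

```shell
java -DinputPath=tests.xml -Dport=9000 \
    -cp .:xcc.jar:meters.jar \
    com.marklogic.performance.PerformanceMeters
```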

Hey, it worked! We can see that all three tests ran successfully,
and there were no errors. PerformanceMeters also wrote the results
to an XML file in the current directory
(we could change the location using the outputPath property).

Here we see much the same information as above:
a summary of the number of tests, the number of errors,
and some overall performance metrics.
In addition, we can see a result
element for every test that was run,
showing its timing information and other useful data.

Now that we have a basic working test scenario,
let's look at other ways to customize it.

After a while, you'll get tired of typing
all these properties. This is an easy problem to solve:
instead of using the -Dkey=value technique,
just put your properties into a file.
This also makes it very easy to re-run the same test
scenarios.
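A properties file holds the same key=value pairs, one per line. A sketch (the file name is arbitrary, and the XDBC port should match your server; see the README for how to point PerformanceMeters at the file):

```properties
# meters.properties
host=localhost
port=9000
user=admin
password=admin
inputPath=tests.xml
```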

Customizing the Output

The results above were in XML. This is great if you want XML,
but if you plan to import the results into a spreadsheet,
you might prefer comma-separated values (CSV).
PerformanceMeters supports arbitrary output plug-ins
via the Reporter interface,
but you don't have to write one yourself:
just set reporter=CSVReporter and run the test again.

This time, the output file name ends with ".csv",
and we can see that the output is suitable for OpenOffice Calc,
Microsoft Excel, or other consumers of CSV files.

What if we want to record the actual results of every test?
That's easy: just set recordResults=true
in your test properties.

By default, PerformanceMeters will only report errors
if the server refused the connection, or threw some exception.
We can check every expected-result element in our test scenario
by setting checkResults=true
(for this to work, you must also set recordResults=true).
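A test with an expected result might look like this sketch. As before, the element names are my assumptions based on the prose, so verify them against the bundled example:

```xml
<test>
  <name>hello world</name>
  <query>"hello world"</query>
  <expected-result>hello world</expected-result>
</test>
```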

In many test situations, it's useful to
report more timing information than just minimum, maximum, and average.
Performance requirements are often expressed in terms of
95th-percentile response times, or 98th-percentile response times.
PerformanceMeters supports this via an optional property:
set reportPercentileDuration=95
or reportPercentileDuration=98
in your test properties.

Of course, the 95th percentile of three test iterations
isn't very interesting. Let's look at more configuration
options, this time aimed at customizing test execution.

Customizing Test Execution

By default, PerformanceMeters runs every test in the scenario,
and runs it only once. We can change this behavior
to run a timed test, instead:
setting testTime=60 will run a 60-second test.
Now our 95th-percentile response times are more interesting:

We can also tell PerformanceMeters to shuffle the tests,
instead of running them sequentially, by setting
the isRandomTest=true property.
When running random tests, it's sometimes useful
to repeat the same sequence of pseudo-random numbers.
You can arrange for this to happen by setting randomSeed
to the same value every time.

What if you want to test multi-threaded performance?
That's easy, too: set the numThreads
property to the number of threads you'd like to run.
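Putting the execution options above together, a timed, randomized, multi-threaded run might use properties like these (comments go on their own lines, since java.util.Properties treats an inline # as part of the value):

```properties
# run a timed, 60-second test
testTime=60
# shuffle the test order
isRandomTest=true
# fixed seed, for a repeatable pseudo-random sequence
randomSeed=42
# run four concurrent test threads
numThreads=4
```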

You can also use PerformanceMeters to test
with the older XDBC API, or to test HTTP servers,
or to perform simple document fetches.
To do this, set the testType
property.

Helpful Hints

When writing your test queries, remember that you are writing
XQuery expressions as XML text. This is usually fine,
but sometimes you'll need to escape certain expressions.
For example, suppose we want to change our
"hello world" test, above, to output a p element
with the "hello world" message.
Our first attempt doesn't work:
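A sketch of the problem and the fix (using the query element name assumed earlier): the first form fails because the XML parser treats the p element as markup inside the scenario file, not as query text. Escaping the angle brackets, or wrapping the query in a CDATA section, fixes it.

```xml
<!-- doesn't work: <p> is parsed as a child element, not as query text -->
<query><p>hello world</p></query>

<!-- works: the escaped markup reaches the server as XQuery text -->
<query>&lt;p&gt;hello world&lt;/p&gt;</query>

<!-- also works: a CDATA section leaves the markup untouched -->
<query><![CDATA[<p>hello world</p>]]></query>
```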