Thursday, December 13, 2012

When you run long-running applications like web applications, it is good to know some statistics about them, such as the number of requests served, request durations, or the number of active requests. It is also useful to have more generic information: the state of your internal collections, how many times some portion of code is executed, or health checks such as database availability or any other kind of connection to an external system.

All this kind of instrumentation can be achieved by using native JMX or by using a modular project like Metrics. Metrics provides a powerful way to measure the behaviour of your critical components and report the results to a variety of systems like JConsole, the system console, Ganglia, Graphite, or CSV files, or to make them available through a web server.

To install Metrics, we only have to add the Metrics dependency. In this example we are going to use Maven.
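The dependency declaration might look like this (at the time of writing Metrics was published under the com.yammer.metrics group id; the version shown is only an example):

```xml
<dependency>
    <groupId>com.yammer.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>2.2.0</version>
</dependency>
```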

Now it is time to add some metrics to our code. Metrics offers six types of metrics:

Gauges: an instantaneous measurement of a discrete value.

Counters: a value that can be incremented and decremented. They can be used in queues, for example, to monitor the number of pending jobs.

Meters: measure the rate of events over time. You can specify the rate unit, the scope of events or event type.

Histograms: measure the statistical distribution of values in a stream of data.

Timers: measure the amount of time it takes to execute a piece of code and the distribution of its duration.

Health checks: as the name suggests, they centralize our service's health checks of external systems.
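As a sketch of the last type, a health check for database availability might look like this with the Metrics 2.x API (the Database interface and the class names here are hypothetical placeholders for your own connection code):

```java
import com.yammer.metrics.HealthChecks;
import com.yammer.metrics.core.HealthCheck;

// Hypothetical abstraction over our database connection.
interface Database {
    boolean isConnected();
}

// Health check that reports whether the database is reachable.
public class DatabaseHealthCheck extends HealthCheck {

    private final Database database;

    public DatabaseHealthCheck(Database database) {
        super("database"); // name shown in the health check report
        this.database = database;
    }

    @Override
    protected Result check() throws Exception {
        if (database.isConnected()) {
            return Result.healthy();
        }
        return Result.unhealthy("Cannot connect to database");
    }
}
```

The check is then registered once at application startup with `HealthChecks.register(new DatabaseHealthCheck(database));`.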

So let's write a really simple application (in fact it is a console application) which sends queries to the Google Search service. We will measure the number of requests, the number of characters sent to Google, the last word searched, and a timer measuring the time between sending a request and receiving its response.

The main class where measures will be applied is called MetricsApplication, and it is responsible for connecting to Google and sending the entered word.

The first thing we can see is the counter instance. This counter counts the number of characters sent to Google over the whole lifetime of the application (that is, until you stop it).

The next property is a meter that measures the rate of sending queries over time.

Then we have a timer that times calls to the sendQueryToGoogle method and tracks the distribution of their durations over time.

And finally a LinkedList for storing all the queries sent. This instance is used in a gauge to return the last query executed (the last inserted element).
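Putting those four measures together, a sketch of the MetricsApplication class might look like the following with the Metrics 2.x API. The metric labels and method names are assumptions reconstructed from the description above, and the actual HTTP call to Google is omitted:

```java
import java.util.LinkedList;
import java.util.concurrent.TimeUnit;

import com.yammer.metrics.Metrics;
import com.yammer.metrics.core.Counter;
import com.yammer.metrics.core.Gauge;
import com.yammer.metrics.core.Meter;
import com.yammer.metrics.core.Timer;
import com.yammer.metrics.core.TimerContext;

public class MetricsApplication {

    // Counts every character sent to Google over the lifetime of the application.
    private final Counter numberOfCharacters =
            Metrics.newCounter(MetricsApplication.class, "character-counter");

    // Measures the rate at which search queries are sent.
    private final Meter searchRequests =
            Metrics.newMeter(MetricsApplication.class, "search-requests",
                    "requests", TimeUnit.SECONDS);

    // Times each call to sendQueryToGoogle and tracks the duration distribution.
    private final Timer responseTime =
            Metrics.newTimer(MetricsApplication.class, "response-time",
                    TimeUnit.MILLISECONDS, TimeUnit.SECONDS);

    // Stores every query sent; the gauge below exposes the last one.
    private final LinkedList<String> executedQueries = new LinkedList<String>();

    {
        Metrics.newGauge(MetricsApplication.class, "last-search", new Gauge<String>() {
            @Override
            public String value() {
                return executedQueries.isEmpty() ? "" : executedQueries.getLast();
            }
        });
    }

    public void search(String query) {
        numberOfCharacters.inc(query.length());
        searchRequests.mark();
        executedQueries.addLast(query);

        TimerContext context = responseTime.time();
        try {
            sendQueryToGoogle(query);
        } finally {
            context.stop();
        }
    }

    private void sendQueryToGoogle(String query) {
        // Open an HTTP connection to Google Search and read the response (omitted).
    }
}
```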

Notice that for each measure we set a class, which is used as a folder in JConsole, and a label, which is used as the metric's name inside that folder.

Let's see a screenshot of JConsole with the previous configuration after an execution of three searches:

By default all metrics are visible via JMX. But of course we can also report measurements to the console, an HTTP server, Ganglia, or Graphite.
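For example, reporting all registered metrics to the system console is a one-liner with the Metrics 2.x API (the ten-second period here is an arbitrary choice):

```java
import java.util.concurrent.TimeUnit;

import com.yammer.metrics.reporting.ConsoleReporter;

public class Reporting {

    public static void main(String[] args) {
        // Print all registered metrics to standard output every ten seconds.
        ConsoleReporter.enable(10, TimeUnit.SECONDS);
    }
}
```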

Also note that in this example we are mixing business code and metrics code. If you plan to use Metrics in your production code, I suggest moving the metrics logic into aspects (AOP) whenever possible.

We have learned an easy way to monitor our applications without using JMX directly. Also keep in mind that Metrics comes with built-in metrics for instrumenting HttpClient, JDBI, Jetty, Jersey, Log4j, Logback, or web applications.

We Keep Learning,

Alex.

But I know that one and one is two, And if this one could be with you, What a wonderful world this would be. (Wonderful World - Sam Cooke)

Thursday, December 06, 2012

Yesterday my daughter was born in Barcelona. We named her Ada in honor of Augusta Ada King, who is considered the author of the first computer program. Both are Sagittarius: Ada was born on December 10 and my daughter on December 5, almost the same day.

Tuesday, December 04, 2012

Acceptance tests are used to determine whether the requirements of a specification are met. They should be run in an environment as similar as possible to the production one. So if your application is deployed on Openshift, you will need an account parallel to the one used in production for running the tests. In this post we are going to write an acceptance test for an application deployed on Openshift which uses MongoDb as its database backend.

The application deployed is a very simple library which returns all the books available for lending. It uses MongoDb to store all information related to books.

So let's start by describing the goal, feature, user story, and acceptance criteria for the previous application.

Goal: Bring reading to as many people as possible.

Feature: Display available books.

User Story: Browse Catalog -> In order to find books I would like to borrow, As a User, I want to be able to browse through all books.

Acceptance Criteria: Should see all available books.

Scenario:

Given I want to borrow a book

When I am at catalog page

Then I should see available books information: The Lord Of The Jars - 1299 - LOTRCoverUrl , The Hobbit - 293 - HobbitCoverUrl

Notice that this is a very simple application, so the acceptance criteria are simple too.

For this example, we need two test frameworks: one for writing and running acceptance tests, and another for managing the NoSQL backend. In this post we are going to use Thucydides for ATDD and NoSQLUnit for dealing with MongoDb.

Thucydides is a tool designed to make writing automated acceptance and regression tests easier.

Thucydides uses the WebDriver API to access HTML page elements. It also helps you organise your tests and user stories through a concrete programming model, creates reports of executed tests, and measures functional coverage.

To write acceptance tests with Thucydides, the following steps should be followed.

First of all, choose a user story from one of your features.

Then implement the PageObject class. PageObject is a pattern that models a web application's user interface elements as objects, so tests can interact with them programmatically. Note that in this case we are coding "how" we access the HTML page.

The next step is implementing the steps library. This class contains all the steps required to execute an action. For example, creating a new book requires opening the addnewbook page, inserting the new data, and clicking the submit button. In this case we are coding "what" we need to satisfy the acceptance criteria.

And finally, code the chosen user story following the defined acceptance criteria and using the previous step classes.

NoSQLUnit is a JUnit extension that helps us manage the lifecycle of the required NoSQL engine, maintain the database in a known state, and standardize the way we write tests for NoSQL applications. NoSQLUnit is composed of two groups of JUnit rules and two annotations. In the current case, we don't need to manage the lifecycle of the NoSQL engine, because it is managed by an external entity (Openshift).

So let's get down to work:

The first thing we are going to do is create a feature class, which contains no test code; it is used as a way of representing the structure of the requirements.

Note that each implemented feature should be contained within a class annotated with the @Feature annotation. Each nested class of a feature class represents a user story.
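For our library application, the requirements structure might be sketched like this (the class names are assumptions derived from the feature and user story described above):

```java
import net.thucydides.core.annotations.Feature;

// Requirements structure only: no test code lives here.
public class Application {

    @Feature
    public class DisplayAvailableBooks {

        // User story of the DisplayAvailableBooks feature.
        public class BrowseCatalog {
        }
    }
}
```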

The next step is creating the PageObject class. Remember that the PageObject pattern models the web application's user interface as objects. So let's look at the HTML file to inspect which elements must be mapped.

The most important thing here is that the table tag has the id listBooks, which is used in the PageObject class to get a reference to the table and its data. Let's write the page object:

With @DefaultUrl we set which URL is being mapped, with @FindBy we map the web element with id listBooks, and finally the getBooksTable() method returns the content of the generated HTML table.
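A sketch of that page object could look like the following (the class name and the URL are assumptions; the only element mapped is the listBooks table mentioned above):

```java
import net.thucydides.core.annotations.DefaultUrl;
import net.thucydides.core.pages.PageObject;
import net.thucydides.core.pages.WebElementFacade;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.support.FindBy;

// Assumed URL of the page listing all books on the deployed application.
@DefaultUrl("http://localhost:8080/books/GetAllBooks")
public class BooksPage extends PageObject {

    // Maps the HTML table whose id is listBooks.
    @FindBy(id = "listBooks")
    private WebElementFacade booksTable;

    public BooksPage(WebDriver driver) {
        super(driver);
    }

    // Returns the visible text of the generated HTML table.
    public String getBooksTable() {
        return booksTable.getText();
    }
}
```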

The next thing to do is implement the steps class; in this simple case we only need two steps: one that opens the GetAllBooks page, and another that asserts that the table contains the expected elements.
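The two steps might be sketched like this (the class and step names are assumptions; the page object class BooksPage is the one mapped in the previous step):

```java
import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.MatcherAssert.assertThat;

import net.thucydides.core.annotations.Step;
import net.thucydides.core.pages.Pages;
import net.thucydides.core.steps.ScenarioSteps;

public class BookSteps extends ScenarioSteps {

    public BookSteps(Pages pages) {
        super(pages);
    }

    // Opens the page that lists all available books.
    @Step
    public void opens_catalog_page() {
        BooksPage page = getPages().get(BooksPage.class);
        page.open();
    }

    // Asserts that the rendered table contains the expected book information.
    @Step
    public void should_see_books_information(String expectedInformation) {
        BooksPage page = getPages().get(BooksPage.class);
        assertThat(page.getBooksTable(), containsString(expectedInformation));
    }
}
```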

And finally, the class for validating the acceptance criteria:
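A sketch of that test class is shown below. The class names, the dataset file name, the database name, and the default URL are assumptions; the structure (runner, @Story, MongoDbRule, @Steps, @UsingDataSet) follows what is described next:

```java
import static com.lordofthejars.nosqlunit.mongodb.MongoDbConfigurationBuilder.mongoDb;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.openqa.selenium.WebDriver;

import com.lordofthejars.nosqlunit.annotation.UsingDataSet;
import com.lordofthejars.nosqlunit.core.LoadStrategyEnum;
import com.lordofthejars.nosqlunit.mongodb.MongoDbRule;

import net.thucydides.core.annotations.Managed;
import net.thucydides.core.annotations.ManagedPages;
import net.thucydides.core.annotations.Steps;
import net.thucydides.core.annotations.Story;
import net.thucydides.core.pages.Pages;
import net.thucydides.junit.runners.ThucydidesRunner;

@RunWith(ThucydidesRunner.class)
@Story(Application.DisplayAvailableBooks.BrowseCatalog.class)
public class WhenUserBrowsesCatalog {

    // localhost works here thanks to Openshift port forwarding:
    // we are really talking to the remote MongoDb instance.
    @Rule
    public MongoDbRule mongoDbRule =
            new MongoDbRule(mongoDb().databaseName("books").build());

    @Managed
    public WebDriver driver;

    @ManagedPages(defaultUrl = "http://localhost:8080")
    public Pages pages;

    @Steps
    public BookSteps bookSteps;

    @Test
    @UsingDataSet(locations = "books.json", loadStrategy = LoadStrategyEnum.CLEAN_INSERT)
    public void user_should_see_available_books() {
        bookSteps.opens_catalog_page();
        bookSteps.should_see_books_information("The Lord Of The Jars");
    }
}
```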

There are some things to consider in the previous class:

@Story should receive a story class nested in a class annotated with @Feature, so Thucydides can generate the report correctly.

We use MongoDbRule to establish a connection to the remote MongoDb instance. Note that we can use the localhost address thanks to Openshift's port-forwarding capability; although localhost is used, we are really managing the remote MongoDb instance.

Using @Steps, Thucydides will create an instance of the previous step library.

And finally, the @UsingDataSet annotation populates data into the MongoDb database before running the test.

Note that NoSQLUnit maintains the database in a known state by cleaning it before each test execution and populating it with known data defined in a JSON file.
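Based on the scenario above, that dataset file might look like this (the collection name and field names are assumptions; NoSQLUnit's MongoDb dataset format maps collection names to arrays of documents):

```json
{
    "Book": [
        { "title": "The Lord Of The Jars", "numberOfPages": 1299, "coverUrl": "LOTRCoverUrl" },
        { "title": "The Hobbit", "numberOfPages": 293, "coverUrl": "HobbitCoverUrl" }
    ]
}
```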