Read it to Believe it — Addition of a Single Letter Increased Sales by 20%

Have you ever accidentally hit gold? Like you were trying to figure out how the slot machine works and surprised yourself by hitting the jackpot? Sounds too good to be true? Well, good things can happen when you split test.

The following A/B test was conceived and made live by a conversion consultant while he was sitting in a meeting with his client, trying to demonstrate how easy it is to set up tests in Visual Website Optimizer. What followed was something no one expected.

The Company

Aarhus Teater is one of the oldest and largest theatres in Denmark. Its theatre tickets are sold online at AarhusTeater.dk. On the upper right corner of the website’s homepage, there’s a link to the ticket e-shop. The link text said “Køb Billet”, which translates to “Buy Ticket” in English.

Take a look at the homepage:

The test

Aarhus Teater hired Klean.dk to optimize its website. Jesper Sorensen, a consultant with Klean, took on the testing work. But first he had to demonstrate to the decision makers at Aarhus Teater how VWO works and how powerful these tests can be.

And thus was run the simplest A/B test on the surface of the earth. Jesper changed the link text from “Buy Ticket” to “Buy Tickets” – a mere addition of a letter. Here’s how the variation looked:

The results

The results were beyond anyone’s imagination. Changing the link text from “Buy Ticket” to “Buy Tickets” increased Aarhus Teater’s sales by 20%. The variation had a 98% chance to beat the original. The test ran on over 10,000 visitors for 20 days before Aarhus Teater implemented the change on the website.
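For readers curious how a “chance to beat original” figure like 98% arises, here is a minimal sketch of a two-proportion z-test with the normal approximation. The post does not publish the raw conversion counts, so the numbers below are hypothetical, chosen only to roughly match the reported ~10,000 visitors and ~20% lift; a real VWO report would show the actual counts.

```python
# Hedged sketch: conversion counts below are HYPOTHETICAL (the case study
# does not publish them); they merely illustrate the calculation.
from math import sqrt, erf

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def chance_to_beat(conv_a, n_a, conv_b, n_b):
    """One-sided probability that variation B beats control A,
    using a pooled two-proportion z-test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return norm_cdf(z)

# Hypothetical split: 250 sales from 5,000 control visitors
# vs. 300 sales from 5,000 variation visitors (a 20% lift).
prob = chance_to_beat(250, 5000, 300, 5000)
print(f"Chance to beat original: {prob:.1%}")
```

With these assumed counts the probability comes out near 98%, which shows why roughly 10,000 visitors can be enough to call a 20% lift on a ~5% baseline — but, as the commenters below note, smaller lifts or smaller baselines need far more traffic.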

“This test was meant to be a test case/showcase for the decision-makers at Aarhus Theatre, and we’re naturally very happy that it turned out the way it did,” said Jesper.

Why the variation worked

“I wish I could say the test was part of a cunning plan – but it was not,” Jesper said. Though it was a shot in the dark, the shot was aimed with enough precision to increase conversions. Here’s why it worked:

1) The element tested was crucial

Though it was a seemingly small test, it had an outsized effect because the element tested directly affects sales.

“I chose the link because it has direct effect on the sales. I set up the test at the very meeting with the decision makers as part of my demonstration. That said, I have done some conversion work on a number of websites and I usually have a good eye for spotting elements to test,” said Jesper.

2) ‘Buy Tickets’ is clearer

People seldom go out for movies or plays alone. They go out in groups of two or three. To them, the message of “Buy Ticket” can be slightly misleading, whereas “Buy Tickets” clearly conveys that they can buy multiple tickets at once. Also, as Jesper points out, “Tickets signal that you can buy tickets for several shows.”

A word of caution

We at VWO in no way encourage random testing. We encourage people to run intelligent and educated tests depending on their traffic, target audience and respective goals. We publish all kinds of case studies to showcase the sheer power of A/B testing and how even small changes can sometimes result in big wins. On that note, let me wish you happy testing.

A lover of the written word, I plan to be the planet's first sit-down comedian. When I am not rethinking a misplaced comma, I write about conversion optimization and website usability. You can follow me @mohitanagpal

(4) Comments

I get this kind of result once in a while too. I change a single word and conversions are 10% higher with more than 95% probability.

BUT… when you let the test run much, much longer, that changes. 10k visitors is not enough in my opinion.

From my perspective you are making a rookie mistake.

Or do you really think that if tomorrow every website in the world changed the words ‘buy ticket’ in their menu to ‘buy tickets’, there would be 20% more purchases and all these theaters would get 20% more visitors…

You should add sample size and number of conversions to such “case studies” so that savvy blog readers can draw their own conclusions about the statistical significance. Such numbers would also add to the credibility of your examples.

I suppose the experiment was done using the asynchronous tracking code (the default option). It causes the content being tested to flicker after the page is shown, which draws attention to it. You need to repeat the experiment with the synchronous tracking code; until then I’ll assume the experiment is not valid. And even better – try a prominent button instead of the link.

I can’t say I agree with the final “Word of Caution”. I have found that, while intelligent, theory-based testing should certainly make up the majority of one’s strategy, a systematic habit of running an occasional ‘random’ test is a great way to make sure there aren’t things you’ve overlooked.