In December 2007, the Obama campaign ran a single experiment on the campaign website's splash page that would ultimately raise an additional $60 million. The team tested four different buttons ("Join Us Now", "Sign Up Now", "Sign Up", and "Learn More") against three videos and three images, resulting in 24 combinations.

Without the experiment, the team would have gone with the combination everyone strongly favored, which achieved a sign-up rate of only 8.26%. The winning combination lifted that rate to roughly 11.6% (a 40.6% improvement!) and ultimately generated more than 7 million email addresses. That is a difference of more than 2.8 million email addresses, and since every person who signed up donated about $21 on average, we are talking about raising a substantial amount of money!
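As a back-of-the-envelope check, the figures quoted above hang together. A quick sketch of the arithmetic (all inputs are the numbers from the story; nothing else is assumed):

```python
# Figures quoted above; the winner's rate is derived by simple arithmetic.
baseline_rate = 0.0826          # sign-up rate of the favored combination
relative_lift = 0.406           # 40.6% improvement of the winning combination
winner_rate = baseline_rate * (1 + relative_lift)

extra_signups = 2_800_000       # additional email addresses from the lift
avg_donation_per_signup = 21    # average dollars donated per sign-up

extra_raised = extra_signups * avg_donation_per_signup
print(f"winner sign-up rate: {winner_rate:.1%}")
print(f"extra funds raised: ${extra_raised:,}")
```

The product comes out just under $59 million, in line with the roughly $60 million attributed to the experiment.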

Now, with another presidential election underway, and knowing what Obama's team was able to achieve, you would think that every single candidate uses A/B Testing, right?

Well, wrong. The only candidate in the 2016 Presidential Election Campaign seriously employing the power of A/B Testing to improve his website's conversion is Bernie Sanders! While Hillary Clinton has Optimizely installed on her website and does a decent job running her experiments, Sanders, as a man of the people, works hard to fine-tune his imagery and messaging to get his message across in the most effective manner.

Over the last few months, he has tested different slogans that reflect his core values and messages, such as:

“If the environment were a bank, it would have been saved by now.”

“Our economy is rigged and our political system is corrupt.”

“The Koch brothers shouldn’t be allowed to buy our politicians.”

“Nobody who works 40 hours a week should be living in poverty.”

He also tested the button color and text. From a pure conversion perspective, I love the splash page above. Humans tend to follow the gaze of the person they are looking at. Next time you talk to someone and they suddenly look somewhere else, try not to look in that direction yourself; you will notice how hard that urge is to resist. So, back to Bernie: your eyes follow his gaze and end up on the green "I Agree" button. The color choice is positive and supports the message of the button, whereas red or blue would probably underperform here.

How To Build A Successful A/B Testing Experiment

The first thing you should know about A/B Testing is that since you can test virtually anything under the sun, you will have to prioritize and decide what exactly to test. That is the hardest part! So the next question is: where should you start? The biggest mistake a lot of companies make is to randomly pull some levers and hope for significant results. While the urge to dive right in is understandable, you will be much more successful with purposeful, deliberate planning.

1) Set Goals & Key Performance Indicators

The million-dollar question is: how do you know which goals to pick?

Most people who dip their toes into A/B Testing for the first time say they want to improve their website! But how do you measure website improvement? To answer that, you will have to dig deep and figure out what the ultimate goal of your website (or of the part of your website you want to run tests on) actually is.

As with anything else, it pays to set specific, measurable, attainable, relevant and timely (SMART) goals. (To learn more about how to correctly set SMART goals, check out this blog post or download our SMART goal template.) From SMART goals you can derive key performance indicators (KPIs) that tell you whether an experiment actually contributes to your business.

Your goals could be anything from the number of completed purchases, the number of products added to the cart, or the percentage of abandoned carts on e-commerce websites, to page views, articles read, forms completed, and much more on lead-generation or content websites. Once you have picked your goal, double-check that it is actually relevant to your business's success.

2) Identify Any Bottlenecks

Now that you know how you will keep score and determine a winner, you can roll up your sleeves and dig into your analytics to identify the bottlenecks and obstacles that keep you from reaching those goals.

For the Obama campaign, the primary target of the A/B Testing was NOT the number of donations or even the average donation amount, but the number of email sign-ups. The team had found that while people organically reached the website and donated, the chance of a donation improved significantly once a visitor joined the email list. So the bottleneck was getting visitors' email addresses.
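To make the bottleneck hunt concrete, here is a minimal sketch of how you might scan a conversion funnel for its weakest step. All of the step names and counts below are made up for illustration; in practice they would come from your analytics tool:

```python
# Hypothetical funnel counts pulled from analytics (numbers are illustrative).
funnel = [
    ("visited splash page", 100_000),
    ("watched video", 42_000),
    ("signed up for emails", 8_300),
    ("donated", 2_100),
]

# Conversion between consecutive steps; the weakest link is the bottleneck.
rates = []
for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
    rates.append((f"{prev_name} -> {name}", n / prev_n))

bottleneck = min(rates, key=lambda r: r[1])
for step, rate in rates:
    print(f"{step}: {rate:.1%}")
print("Bottleneck:", bottleneck[0])
```

With these made-up numbers, the video-to-sign-up step converts worst, so that is where an experiment would have the most room to help.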

3) Carefully Craft A Hypothesis & Plan Your Experiments

You know what you want and what is holding you back. Now it is time to figure out how to overcome that obstacle and come up with candidate page variations for testing!

Let's look at the presidential election examples again. There is a significant difference between President Obama's and Bernie Sanders' tests, beyond the fact that we only know the outcome of one. Obama's team had a set of variables (4 buttons, 3 images, and 3 videos) that made up one cohesive experiment: each of the 24 combinations would have made sense on its own. There was a strategy behind it with a simple goal: drive more email sign-ups that could later be used to raise funds.

Bernie's tests, on the other hand, seem to cover a variety of unrelated things: messaging, button colors, and images. Of course, we cannot know for sure without insider knowledge, but there appears to be no overarching strategy behind the testing, and that makes A/B Testing less successful! (I would love to hear more from the Bernie Sanders campaign about their results...)

The best way to construct a working hypothesis is to run user interviews, usability tests, focus groups and the like to understand what is going on; however, you often will not have that chance. So put yourself in your visitors' mindset, look at the page, and ask yourself: What would I like or dislike? What can be improved? Are key pieces of information or calls to action hidden below the fold? What is the key message on the page, and how does it differ from what it should be?
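Once your variations are live, a standard way to decide whether a challenger actually beat the control on a conversion KPI is a two-proportion z-test on the raw counts. The article does not spell out the statistics, so treat this as a minimal sketch; the visitor counts are made up, and the conversion rates simply echo the Obama figures quoted earlier:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variations convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical 50/50 split: 10,000 visitors per variation,
# converting at 8.26% (control) and 11.6% (challenger).
z, p = two_proportion_z(826, 10_000, 1160, 10_000)
print(f"z = {z:.2f}, p = {p:.2g}")
```

At these sample sizes the difference is overwhelmingly significant, which is why the Obama team could trust a 40.6% lift rather than dismiss it as noise.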

4) Prioritize To Maximize Your ROI

When constructing your hypothesis, it is crucial to prevent goal cannibalism, meaning you should not sacrifice one important goal to achieve another. In the example below, the donation landing page from the Clinton Bush Foundation to help Haiti's earthquake victims, including an image on the page initially decreased the average donation per page view, because the image sat at the top and pushed the form below the fold.

Especially in a situation like this, where every second counts in raising funds for lifesaving rescue missions, you need to construct your hypothesis very carefully. The campaign raised more than $1 million in less than 48 hours by testing eight variations of the donation page, ultimately changing the button text from "Submit" to "Help Haiti", reducing the number of form fields and, of course, placing the image beside the donation form.

So when it comes to prioritizing (and you will have to prioritize rigorously to run successful experiments), emphasize the return on investment (ROI).
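One simple, widely used way to put numbers on that prioritization is ICE scoring, where each candidate test gets an impact and a confidence rating and an effort estimate, and is ranked by impact × confidence ÷ effort. This scheme is not from the article itself, and the candidate ideas and scores below are entirely made up:

```python
# Hypothetical test ideas: impact and confidence on a 1-10 scale,
# effort in person-days (all numbers are illustrative).
candidates = [
    {"idea": "button text",         "impact": 6, "confidence": 7, "effort": 1},
    {"idea": "hero image vs video", "impact": 8, "confidence": 5, "effort": 5},
    {"idea": "shorter signup form", "impact": 9, "confidence": 8, "effort": 3},
]

for c in candidates:
    c["score"] = c["impact"] * c["confidence"] / c["effort"]

ranked = sorted(candidates, key=lambda c: c["score"], reverse=True)
for c in ranked:
    print(f'{c["idea"]}: {c["score"]:.1f}')
```

Cheap, high-confidence tests (like button text) tend to float to the top, which is exactly the ROI emphasis described above.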

Conclusion

Now you are ready to start testing! In summary: to maximize the success (and return on investment) of your A/B Testing, limit your experiment to one particular area that needs attention or improvement. Start by setting realistic goals, identify any potential bottlenecks, carefully build a set of hypotheses for your experiment, and then prioritize! Good luck!

Want to leverage data-based science, A/B Testing and user feedback to build the best possible website? Find out how Growth-Driven Design can help your business reach its goals. Click below to book a coaching session with us.