How We Increased Donations By 17.5% With This One Small Change #DMA

A/B testing is the practice of showing two slightly different versions of the same thing (an A and a B) to two small groups of people, working out which version performs best, and then showing the winning version to everyone else. You can use A/B testing in many different marketing channels – from social media to email newsletters to your website; you could even A/B test conversations you have all the time (like fundraising pitches to donors)!

There are 7 key steps to any A/B test:

Goal: A/B testing depends on having a single optimisation goal that defines which version wins. It could be a click, an email open, time on page, a download, or nearly any other single metric.

Hypothesis: Google famously A/B tested 41 different shades of blue to figure out which one got the most clicks, and then changed all links to that colour blue. As arts organisations, we just don’t have that kind of time. So a good hypothesis helps us focus on the things that matter most: the high-value goals (ticket sales, fundraising, attendance, etc.), and the aspects of the marketing channel that are likely to matter most (images, headlines, big noticeable changes).

Segment: A/B tests usually apply to a small segment of your audience – they’re the “test group” that helps you make a decision about which version (A or B) is most effective; it might be 10% of your email subscribers or a “week’s worth” of your website visitors.

Split: one half of your segment sees the “A” version, the other half sees your “B” version. It’s important that these two groups are very similar to each other (which is why we usually assign members to A or B at random).
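One common way to do the random assignment in practice is to hash each visitor’s identifier, so the split is effectively random but the same visitor always sees the same version. A minimal sketch (the experiment name and visitor ids here are illustrative, not from any particular tool):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "donation-page") -> str:
    """Deterministically bucket a visitor into "A" or "B".

    Hashing visitor id + experiment name gives a stable 50/50 split:
    the same visitor always lands in the same group on every visit.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the assignment depends only on the id, you can recompute a visitor’s group at any time without storing it anywhere.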

Show: Don’t forget there should be only one small change between your A version and your B version. If you change multiple things at the same time, they’re likely to cancel each other out and you won’t know which version won, or why. Once you’ve got your A and B versions, you need to show them to the A and B groups for a period of time. This calculator from Optimizely helps you figure out how long the test needs to run before you can trust the winner.
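Calculators like this one essentially estimate how many people each version needs to be shown to. A rough back-of-the-envelope version uses the common rule of thumb n ≈ 16·p·(1−p)/δ² per variant (roughly 80% power at 5% significance); this is a hedged sketch of that heuristic, not what Optimizely’s calculator does internally:

```python
import math

def visitors_needed(baseline_rate: float, min_detectable_change: float) -> int:
    """Rough per-variant sample size via the n ≈ 16·p·(1−p)/δ² rule of thumb
    (about 80% power at 5% significance, two-sided)."""
    p = baseline_rate
    delta = min_detectable_change
    return math.ceil(16 * p * (1 - p) / delta ** 2)

# E.g. a 7.4% baseline, hoping to detect a 1.3-point change:
n = visitors_needed(0.074, 0.013)  # several thousand visitors per version
```

The takeaway: the smaller the change you want to detect, the (much) longer the test has to run.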

Measure: each of your A and B versions will have its own conversion rate, which equals “success metric” divided by “number of people exposed to the test.” In other words: 14 successful downloads divided by 100 people who saw version B gives a conversion rate of 14%. If version A’s conversion rate is 20%, then version A clearly wins.
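That division is all the measurement step is; a minimal sketch using the numbers from the example above:

```python
def conversion_rate(successes: int, exposed: int) -> float:
    """Conversion rate = successes divided by people exposed to the version."""
    return successes / exposed

# The worked example from the text: 14 downloads out of 100 visitors.
rate_b = conversion_rate(14, 100)  # 0.14, i.e. 14%
```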

Change: once you know the winning version, you need to roll it out to the rest of your audience as a permanent change.

Let’s take a look at this 7-step process in action, using the example above.

Goal: complete the donation transaction

Hypothesis: website visitors pay more attention to the left side of a page and by the time they get to this donation page, they no longer need to be convinced (so the “a gift of hope” panel is distracting)

Segment: this test ran for one month, to 100% of website visitors to this page

Split: visitors to this donation page were randomly shown the A version or the B version, using the tool Google Optimize

Show: the single change between these two versions is the reversal of the left and right panels

Measure: after one month, version A had 7.4% of visitors complete a donation, and version B had 8.7% of visitors complete a donation. So over the course of a year, just by changing to version B we would see a 17.5% relative increase in online donations ((8.7 − 7.4) / 7.4 ≈ 17.5%)
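Note that the headline figure is a relative improvement (how much bigger 8.7% is than 7.4%), not an absolute one. A minimal sketch of that arithmetic:

```python
def relative_lift(rate_a: float, rate_b: float) -> float:
    """Relative improvement of B over A, expressed as a percentage."""
    return (rate_b - rate_a) / rate_a * 100

# The donation-page rates from the test above (in percent):
lift = relative_lift(7.4, 8.7)  # roughly 17.5% more donations
```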

Change: after the test finished, we made a permanent change to the donation page (and started a new A/B test!)

For more tips on how to A/B test in social media, emails, and your website, check out this presentation from a recent workshop I facilitated.