A/B Test: What Kind of Guarantee Boosted the Opt-in Rate 48.32%?

Advertising a money-back guarantee can definitely convince hesitant customers to open their wallets. It makes intuitive sense, but there’s also some data behind this. For instance, a 2011 study in the Journal of Retailing found that money-back guarantees “evoke a positive emotional response, thereby increasing consumers’ purchase intentions and willingness to pay a price premium.”

But can you increase your revenue simply by changing the way you present that guarantee?

And beyond that: can you drive additional engagement even before the purchase stage by mentioning the guarantee?

Almost certainly. In today’s featured A/B test, one LeadBox™ for a coupon outperformed the other by 48.32% when it framed its money-back guarantee differently.

Version A invited customers to “Put it to the test risk free. You have nothing to lose but your competition.” Version B was more straightforward: “Guaranteed to improve performance or your money back.”

Which LeadBox™ do you think increased the opt-in rate by a relative 48.32%?

Note one complicating factor before you guess: the headline color also changed between variations (red in Version A, blue in Version B), so consider whether this could have had an impact, too.

Go down to the comments and tell us which one you’d choose and why—then vote below to see if you were right!

Version B, with a 96.07% probability of outperforming Version A, produced a relative 48.32% conversion-rate lift.
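A "probability of outperforming" figure like 96.07% typically comes from a Bayesian comparison of the two opt-in rates. As a rough illustration only (the post doesn't publish raw traffic numbers, so the visitor and conversion counts below are hypothetical, and LeadPages' internal calculation may differ), here's how such a probability can be estimated by sampling from Beta posteriors:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=50_000, seed=42):
    """Monte Carlo estimate of P(rate_B > rate_A) under uniform Beta(1,1) priors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Posterior for each variant: Beta(conversions + 1, non-conversions + 1)
        rate_a = rng.betavariate(conv_a + 1, n_a - conv_a + 1)
        rate_b = rng.betavariate(conv_b + 1, n_b - conv_b + 1)
        if rate_b > rate_a:
            wins += 1
    return wins / draws

# Hypothetical counts chosen so the relative lift matches the post's ~48.3%:
p = prob_b_beats_a(conv_a=60, n_a=1000, conv_b=89, n_b=1000)
lift = (89 / 1000) / (60 / 1000) - 1  # relative conversion-rate lift
```

With these made-up numbers, the relative lift works out to about 48.3% and the probability that Version B truly outperforms Version A comes out well above 90%, mirroring the shape of the result reported above.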

Although we can’t say with certainty what was behind this increase, here are a few of my speculations:

1. Although cleverly worded, the guarantee language in Version A was potentially less clear than in Version B. Customers who are motivated by a guarantee are likely to feel more comfortable purchasing when the terms of that guarantee are clearer.

2. Version A emphasized the idea of risk, whereas Version B led with a promise to improve performance.

3. Version A’s headline was red instead of blue, which may have caused visitors to subconsciously view it as a warning message rather than an offer.

Not all customers are the same, but consider testing how you frame your guarantee the next time you offer one.

What Do You Think?

Did this test’s results surprise you? Why do you think Version B increased conversions so dramatically? Leave a comment below and let us know your thoughts.

If you’re new to LeadPages, you should know that all Pro and Advanced users can run any A/B test inside LeadPages in just five clicks.

Do you have a LeadPage like this one that you would like to test? If so, you can set up the exact same type of test in under a minute. You can also A/B test your headlines, body copy, calls to action, and just about any other change you can think of.

Watch the quick video below for an introduction to enabling split testing on your LeadPages account.

https://youtube.com/watch?v=3h3pQKLagng


I’ve been reading over and over again about this A/B Test, but nobody seems to have given me a clear explanation of what the heck it is. I think (but am not sure) it means putting out two alternatives of one subject and seeing which performs best. How you’d determine the “winner” is again a mystery. Above you are asking us to choose one, which is fine. But how do I know which of my own two alternatives is best? Please consider doing a post on just an explanation of this subject.

Daphne Sidor

Yes, I think you have the right idea! Great questions. I’ll try to answer them as clearly as I can.

A/B testing (also called split testing) is a way to see which of two variants of a web page (or other web asset) performs best. To determine a winner, you need to decide how you’ll measure success and find a way to track it. In this case, we’re using the opt-in rate as our success metric.

Note that the poll in this post isn’t actually *determining* which version won the split test—it’s more of a fun way to test your predictive skills. Plus, it’s interesting to see whether the way people voted matches up with the actual results. Sometimes the answer is intuitive, sometimes not. (Which is why it’s important to split test in the first place.)

Most people will use some kind of software to run their split tests, since it’s otherwise generally difficult to accurately track the kind of data you’d need to determine a winner. LeadPages Pro and Advanced members get built-in split testing for their landing pages and LeadBoxes. You can set this up with just a few clicks, and LeadPages will automatically track your opt-in rate for both versions and declare a winner once a statistically significant difference in opt-in rate appears between the two versions. (If you are a Pro or Advanced LeadPages member, our Support team will be happy to help you set up your first A/B test.) There are also standalone split-testing services available, though many of them are a little pricey for smaller businesses.
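To make "statistically significant difference" a little more concrete, here is a minimal sketch of one common way to check it, a two-proportion z-test. The counts are hypothetical (the post doesn't publish raw traffic numbers), and LeadPages' built-in winner detection may use a different method:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Test whether two opt-in rates differ significantly.

    Returns the z statistic and a two-sided p-value.
    """
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_b - rate_a) / std_err
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 60/1000 opt-ins for A vs. 89/1000 for B
z, p = two_proportion_z_test(60, 1000, 89, 1000)
significant = p < 0.05  # conventional 5% significance threshold
```

With these example numbers the difference clears the conventional 5% threshold, which is the kind of condition a split-testing tool waits for before declaring a winner.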