Kissmetrics Blog

A blog about analytics, marketing and testing

4 A/B Testing Mistakes That Can Kill Your Business – And How to Avoid Them

Everyone agrees that optimization and testing are important keys to success. And everyone thinks they know the best ways to use them.

However, in my experience, testing is a double-edged sword. Done right, the benefits pour in; done wrong, it can drive a business into the ground.

The following are four serious and, sadly, common testing pitfalls I see businesses make every day. The good news is that, once identified, they are easy to avoid, leaving you free to pursue optimization that gets you where you need to go.

Mistake #1 – Optimizing for Maximum Conversion at the Expense of Your Promise

As the saying goes, left unchecked, all optimization leads to gambling and porn. While a bit extreme, the adage makes a clear point: when you optimize solely for conversions, it’s easy to lose touch with what you really do.

There is a dangerous allure to the quick win that pulls focus from the true value of your product. It can happen little by little, like a subtle current; but, eventually, you look up and you’re miles away from your core value. If you find yourself thinking solely about maximum conversion at the expense of everything else, you’re setting yourself up for failure.

Successful products share a common attribute: they are a must-have experience. The experience people can’t live without is what inspires them to share with their friends. When the must-have experience resonates with your audience, you’re on your way to product/market fit.

The experience delivers a promise, and that promise must anchor all optimization efforts. Optimization that doesn’t align can badly damage your business by diffusing the message and confusing customers.

An easy example is the late-night TV ads that promise miraculous results, like the “Shake Weight.” They may make a lot of one-time sales with their amazing promises, but customers who receive a product that falls far short of those promises are sure to be disappointed.

Optimizing for conversions at the top of the funnel creates a glut of dissatisfied customers who trash your business on the back end. The promises may drive conversions, but they don’t create lasting value for your business.

Solution #1 – Anchor Your Optimization in a Promise Your Product Can Keep

Instead of worrying about how you can tweak your landing page to be more and more aggressive, focus on creating a compelling hook or promise that is true to your product and offers real benefits to the consumer. Resist the urge to optimize on value propositions and promises that aren’t congruent with what your product really delivers.

Create a testing plan that keeps the core experience at the forefront. This ensures optimizations are in line with your product vision and converted customers are there because of the promise of the must-have experience. So, rather than feeling disappointed or tricked, customers will feel like they got exactly what they were looking for, which creates real value for your business.

How do you understand what optimizations may or may not be relevant and properly aligned? Ask your customers. You can use surveys – we humbly suggest Qualaroo – customer development calls, keywords on inbound traffic, or heat maps. These signals can point you to the parts of the product that most resonate with your audience. Use that data to create the hooks and promises that are most likely to trigger positive responses.

Mistake #2 – Putting Conversion in the Way of the Must-Have Experience

Organic and sustainable growth comes from customers who love the must-have experience of your product. They use it regularly; they pay for it; they give you feedback to make it better; and they tell their friends. This is growth nirvana: a world of passionate users accelerating growth that keeps on going.

But the key to the kingdom is the must-have experience that users fall in love with, not the ad or the white paper. If you’re optimizing for conversions from your ad traffic but simultaneously making it harder for users to access the must-have experience, you’re shooting yourself in the foot.

For example, if your must-have experience comes from testing the product and playing with it, but you’re optimizing for conversions to a drip marketing campaign and digital whitepaper download, you’re probably doing it wrong.

You may be optimizing the conversion rate on the front end, but you’re adding friction to the user’s quest to get to the must-have experience. This friction blocks people from the must-have experience, thus starving that engine of growth for your business.

Solution #2 – Get Users to Your Must-Have Experience as Fast as Possible

Focus your conversions on getting visitors to the must-have experience with the least friction possible. This means optimizing your user flow to remove unnecessary steps and complexity in order to maximize the number of visitors who make it to the must-have experience. Ask yourself “What are the absolute required pieces of information to get started?” and “How can I eliminate extra clicks?”

Funnel analysis can show you where the big drop-offs are in your current process. Look for ways to eliminate the big bottlenecks so that you get the biggest lift from your efforts. Complement your funnel research with surveys and customer development to get a clear understanding of the user dynamics in your funnel, and use the feedback to improve conversion.
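The drop-off analysis described above can be sketched in a few lines. This is a minimal example, not any particular analytics product's API; the step names and visit counts are hypothetical placeholders for the numbers your own analytics tool would supply:

```python
# Hypothetical funnel step counts, e.g. exported from your analytics tool.
funnel = [
    ("Ad click",         10_000),
    ("Landing page",      8_200),
    ("Sign-up form",      3_100),
    ("Account created",   1_400),
    ("First experiment",    600),  # the must-have experience
]

def drop_offs(steps):
    """Return (from_step, to_step, drop_rate) for each funnel transition."""
    results = []
    for (name_a, count_a), (name_b, count_b) in zip(steps, steps[1:]):
        drop = 1 - count_b / count_a  # fraction of users lost at this step
        results.append((name_a, name_b, drop))
    return results

for a, b, rate in drop_offs(funnel):
    print(f"{a} -> {b}: {rate:.0%} drop-off")
```

With these made-up numbers, the biggest bottleneck is the landing-page-to-sign-up transition, so that is where fixes would yield the biggest lift.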

Optimizely understands the importance of a direct and simple flow. They’re currently testing paid search ad units that ask users to input the URL they want to try Optimizely on. When a visitor enters a URL and clicks “try it out,” they hit a landing page with a sign-up form.

Right behind the form, though, they can see the website and the experiment builder waiting for them. No landing page with an email confirmation. No extra steps. From the ad to the testing interface, the visitor is ready to go with two clicks. That’s taking people directly to the must-have experience.

Compare this to Adobe and their A/B testing tool. Their ad shows up in the same search and takes you to this landing page. This page asks for everything under the sun just to get a white paper. It’s completely disconnected from the must-have experience and as full of friction as you can imagine. It’s no wonder that Optimizely is one of the first names that comes to mind in the testing space, while most people have no idea that Adobe offers A/B testing.

Hello Bar also gets visitors right to the must-have experience. When you click “try it out,” you are taken right to the Hello Bar builder. You can customize it and even preview it on your site before completing the account setup process. This allows visitors to get the must-have experience first, before jumping through registration hoops. And once you see Hello Bar on your own site, you’re far more likely to sign up for the service.

Mistake #3 – Wild Goose Chases and Random Testing

Too often, tests start with a random, offhand question from an executive who has given little thought to how the results actually will be used to improve the business. These can be in the form of micro-optimizations (see mistake #4) or tests that aren’t focused on creating quantifiable learning. When you take a “test whatever” approach, you’re missing out on discovering what truly is preventing conversions.

While you may catch lightning in a bottle through sheer serendipity, it is more likely you will end up with a heap of inconclusive data, leading to little learning and little true optimization. The effort and the lack of payoff can frustrate the team and take the momentum out of the program entirely. This leads to stagnation in optimization and growth.

Solution #3 – Structure Your Testing around Core Hypotheses

To keep from ending up on fruitless chases without any actionable results, start by identifying points of confusion for the user in either the hook/promise or the funnel itself. User testing and surveys can help you determine where to focus your optimizing efforts first.

Once you have user feedback, create your own hypotheses about what changes will move the needle or provide learning to improve your business. From those assumptions, you can build a testing plan that helps you work through the optimizations that will prove or disprove those hypotheses. If your testing is not anchored in a plan, you’re liable to eat up your business’s valuable time and resources and, in the worst case, grind progress to a halt.

When you are testing, always start with the same questions:

Where are users confused in the funnel?

What’s our hypothesis for the test?

Is this test likely to impact results?

Is this test the best test we can run?

Does it make sense to test this in light of what we’ve learned so far?

How long will it take to learn from this test?

With these questions, you can refine your testing activities to the specific ones that truly can help your business.
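The last question above, how long a test will take, can be answered with a back-of-the-envelope calculation before you commit to running it. A minimal sketch, assuming a standard two-variant test and the common 16 * p * (1 - p) / delta^2 rule of thumb (roughly 80% power at a 5% significance level); all the numbers in the example are hypothetical:

```python
import math

def sample_size_per_variant(baseline_rate, min_detectable_lift):
    """Rough per-variant sample size for an A/B test on a conversion rate.

    Uses the rule of thumb n ~ 16 * p * (1 - p) / delta^2, which
    approximates 80% power at a 5% significance level.
    `min_detectable_lift` is relative (0.20 = detect a 20% lift).
    """
    p = baseline_rate
    delta = p * min_detectable_lift  # absolute difference to detect
    return math.ceil(16 * p * (1 - p) / delta ** 2)

def days_to_run(baseline_rate, min_detectable_lift, daily_visitors, variants=2):
    """Estimate calendar days, splitting traffic evenly across variants."""
    n = sample_size_per_variant(baseline_rate, min_detectable_lift)
    return math.ceil(n * variants / daily_visitors)
```

For example, with a 5% baseline conversion rate, a 20% relative lift you want to detect, and 1,000 daily visitors, this estimates roughly 7,600 visitors per variant and about 16 days of runtime. If that answer is "months," the test probably isn't the best one you can run.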

In a great example, DHL focused on imagery and form placement in their landing page A/B tests and drove a massive lift. Notably, instead of starting with the promise, they left the copy unchanged. The test was run to see whether a long-term champion template that had plateaued could be unseated by a new challenger. From the blog post on the conversion:

The A/B split tested a long-standing “winning” template against a Challenger. The Challenger template increased the visibility of the form, moving it into the top right corner, adjacent to the courier image. Additionally, a friendly male courier image replaced the logistics image.

By having a hypothesis around where the page could be improved, they were able to focus on the elements they thought would be most likely to move the needle. Read the full post here.

Mistake #4 – Micro-Focused Testing

It’s easy to think of testing and immediately go to button color tests or copy tests. After all, these are the kinds of tests most often cited in blog posts about optimization. These tests are easy to run, and you can think of dozens of variables to implement. The trouble is you can run micro-optimizations for months and, in the process, leave a pile of money on the table with little gain.

This narrow testing limits you to optimizing around a local maximum, while a broader perspective on testing can help you find the true maximum lurking just out of sight. Like rearranging the deck chairs on the Titanic, micro-testing gives you incremental improvement in your landing page conversion, but, down below, the fundamental economics of your acquisition funnel and revenue model are in flames.

Solution #4 – Test Broadly

Take a broader, macro view of your testing and optimization strategy. Test and optimize everything, from your business model to your method for delivering the must-have experience, to find the upside that really will move the needle.

When you create your testing plan, step back and look at the big picture. Ask yourself what tests you can run that will give you insights into how to improve the performance of your business:

How quickly can you get someone to the must-have experience?

How can you change your user flows and funnels to reduce friction and improve the rate at which visitors become users?

Once you’ve identified those broader tests, start to drill down into smaller-scale optimizations around landing pages and their elements.

SmartShoot focused on optimizing their pricing and products page by testing new products that didn’t even exist yet. By understanding what features their customers really wanted, and by thinking of optimization at a broader level, they were able to gain a 233% increase in conversions. If they had focused on optimizing only the layout or design of their pricing page, they might have found incremental lift but missed this massive win.

Putting It All Together

Testing is crucial to the success of your business, but equally important is ensuring that you’re taking an approach to testing that will move you toward success. Focus on tests aligned with your core experience to ensure that conversions create happy customers who will help spread your message.

Also, optimize your conversion funnels to get visitors to the must-have experience with the least frustration possible. Be rigorous in your plan and test broadly to avoid the trap of being too narrow or micro-focused in your approach. If you avoid these mistakes, you’ll be able to focus on the big opportunities and test your way to results that really move the needle.

Have an optimization that really worked for you? Show it off in the comments.

About the Author: Sean Ellis is currently the CEO of Qualaroo, a marketing software company that empowers marketers to better engage, understand and convert their website visitors. Prior to founding Qualaroo, he was the first marketer at Dropbox, Lookout, Xobni, LogMeIn (IPO), and Uproar (IPO) and also held interim marketing executive roles at Eventbrite, Socialcast, and Webs. Follow him on Twitter.


* snorted my coffee up my nose when I read “left unchecked, all optimizing leads to gambling and porn” – that’s so true and makes me wonder if porn and gambling website owners are guru optimisers by nature. Great article.

Very interesting read. It’s great that you mentioned the impact of case studies (which are, of course, very interesting) that show the miraculous effects of just changing the button color. They can lead to narrowly focused split testing, and it’s really interesting that you offered a remedy for that.

Yep Anna, I’ve wasted plenty of time testing buttons in the past. My biggest improvements tend to come when I truly understand what is preventing conversions and address those issues. Rarely is the size, shape, or color of the button a factor that prevents conversions.

Robert, agreed. It’s amazing that people optimize by trying to claim their product is something people want when often it is completely the wrong fit. Best to understand why people love it, find new people with similar needs, and connect the dots. Creating something people love is really hard. When you do it, this user love needs to guide your optimization efforts.

I think that all these mistakes generally fall under the fact that many people aren’t looking at the big picture when conducting A/B testing and end up focusing too much on little things.

Losing focus on the value of your product or service, or on the actual customer experience, can cause someone to make detrimental changes to their website. Not only does this make the user experience worse, but you also end up wasting a lot of time and resources on A/B testing that only makes things worse.

Thanks for reminding us to stay on track, and I’ll definitely keep this post in mind for the future!

Nice post, Sean. Thanks for the briefing on A/B testing mistakes. Loved reading it. I’ve shared a guest post on A/B testing that might be a good addition to your article. Here’s the link: tech.pro/blog/1710/ab-and-multivariate-testing–a-quick-go-through

