What do these experts have to say about good tests, bad tests and everything in between? We’ve distilled their wisdom down to seven applicable tips.

1. Get all your colleagues involved and invested in A/B tests

In her past position as Head of Conversion Rate Optimization at Shopify, Tiffany Da Silva found that she wasn’t working as closely with her coworkers as she would have liked. Because she knew that more hands on deck meant greater potential, she asked her SEM guy to set up and run a test with her – start to finish. She explained to him:

For you to understand what I do and for me to understand the problems that you’re having right now, we need to pick one keyword and follow it the whole way.

The results were dramatic: “When we actually worked together and did this test for ecommerce and created this one page, we were able to see a 33% increase [in conversions] for just that keyword.”

That was enough to get her coworker on board. Help from different departments yields fresh perspectives and a different set of experiences.

And having others invested in the process – start to finish – can result in big wins that foster more enthusiasm. As Da Silva puts it:

He’ll tell other people, and before long I have the guy who does Facebook coming up to me, and the guy who does SEO coming up to me, and we’re all working together to create really big tests.

2. Gamify your testing process

Put weird teams of people together, put $100 on the table, and say whoever designs the winning test wins.

Joanna Lord, VP of Consumer Marketing at Porch and former VP of Growth Marketing at Moz, agrees: “I lost $10 to my CEO last week.”

If you don’t want to use cash bets, you can still find fun ways to gamify the testing process.

Lord’s team uses a Plinko board: “At the bottom of the Plinko board are all of these things you can win, like gift certificates or Amazon gift cards.” People who create winning tests get to play Plinko and claim their prize.

When you gamify, remember that “winning” and “losing” only apply within the game itself. After all, you never really lose an A/B test as long as you learn something from it.

5. Don’t lose track of your company’s voice

Joanna Lord warned testers not to lose their company’s point of view as they optimize their pages for conversion.

“That balance between conversion and the point of view of your company is essential,” Lord explains. “Make sure that you’re not over-indexing on conversion and losing what’s special and beautiful and rare about your brand or your voice.”

At Porch, Lord starts by putting together a list of winning words that represent the brand and weighs them against their conversion potential.

Lord described an A/B test that appeared to indicate that the word “connection” led to high conversion rates, until the Porch team realized that the people who converted on “connection” weren’t qualified leads:

Those people were not revisiting. They were not engaging, their time on site was lower, their bounce was higher, their revisit rate was lower.

Understanding what type of language your most loyal customers relate to will help you optimize for the right type of conversions:

We might lose a little of that up-front conversion, but we’re winning in the long haul.
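Lord’s point can be made concrete with a simple scorecard that weighs up-front conversion against a lead-quality signal like revisit rate. The numbers below are purely hypothetical, for illustration only – they are not Porch’s actual data:

```python
# Hypothetical numbers for illustration only -- not Porch's actual data.
# Each variant records visitors, sign-ups (the "conversion"), and how many
# of those sign-ups came back within 30 days (a proxy for lead quality).
variants = {
    "connection": {"visitors": 10_000, "conversions": 520, "revisits": 78},
    "control":    {"visitors": 10_000, "conversions": 430, "revisits": 155},
}

def scorecard(v):
    """Return (conversion rate, revisit rate among converters)."""
    conv_rate = v["conversions"] / v["visitors"]
    revisit_rate = v["revisits"] / v["conversions"]
    return conv_rate, revisit_rate

for name, v in variants.items():
    conv, revisit = scorecard(v)
    print(f"{name}: {conv:.1%} convert, {revisit:.1%} of converters revisit")
```

In this made-up example, “connection” converts better up front but its converters revisit far less often – exactly the pattern that made the Porch team discount an apparent winner.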

6. Don’t assume that “wins” apply across all customer segments

Even if a test was conclusive for one of your customer segments, the takeaways won’t necessarily apply across the board.

Hoeppner learned this the hard way. When he was testing sliders on his ecommerce site, he found that they lost to a static image.

However, he’d only been testing for his US users. When he ran the test with Canadian users, he got very different results.

Surprise: “Canadians like sliders.”

The bottom line? Best practices can help guide your next A/B test – but err on the side of caution and test to validate assumptions.
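One way to validate across segments is to run the same significance check separately per segment rather than on the pooled traffic. The sketch below uses a standard two-proportion z-test with hypothetical slider-vs-static counts (the numbers are invented to mirror the anecdote, not Hoeppner’s real data):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: (conversions, visitors) for slider vs. static image,
# split by country. Invented numbers, chosen to mirror the anecdote.
segments = {
    "US": {"slider": (400, 10_000), "static": (480, 10_000)},
    "CA": {"slider": (130, 2_000), "static": (96, 2_000)},
}

for country, arms in segments.items():
    z = two_proportion_z(*arms["slider"], *arms["static"])
    # |z| > 1.96 ~ significant at the 5% level; the sign shows which arm won
    print(f"{country}: z = {z:+.2f}")
```

With these invented counts, the static image wins significantly in the US while the slider wins significantly in Canada – the pooled result would have hidden one of those two conclusions.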

7. Watch out for downstream impacts

A test that leads to a conversion increase in one part of the funnel may have negative impacts further down the funnel.

“Whenever we change something, there’s some other downstream impact,” Hoeppner explained. “We’ll see click-through rates go up, and then dramatic drops in average order size.”

Newton’s third law of motion tells us: every action has an equal and opposite reaction. It’s true for A/B testing, too. When you make changes based on the results of an A/B test, make sure you track both the positive and negative reactions that occur throughout the rest of your conversion process.

Look for the downstream effects, and make sure they aren’t hurting your conversion rates or sales.
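A simple guardrail for the trade-off Hoeppner describes is to judge a variant on revenue per visitor, which folds order rate and average order size into one number. The funnel figures below are hypothetical, used only to show how a click-through win can still be a revenue loss:

```python
# Hypothetical funnel numbers for illustration only.
# A variant can raise the order rate while shrinking average order size;
# revenue per visitor catches the net downstream effect.
def revenue_per_visitor(visitors, orders, revenue):
    return revenue / visitors

control = {"visitors": 5_000, "orders": 200, "revenue": 24_000.0}
variant = {"visitors": 5_000, "orders": 260, "revenue": 20_800.0}

for name, f in (("control", control), ("variant", variant)):
    order_rate = f["orders"] / f["visitors"]
    aov = f["revenue"] / f["orders"]   # average order value
    rpv = revenue_per_visitor(**f)
    print(f"{name}: {order_rate:.1%} order rate, ${aov:.2f} AOV, ${rpv:.2f}/visitor")
```

In this made-up case the variant lifts the order rate from 4.0% to 5.2%, but average order value falls enough that revenue per visitor drops – the kind of downstream hit that a top-of-funnel metric alone would miss.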

Ready to learn more?

Then get ready to improve your testing and your teamwork. Remember that it’s never too late to become a smarter A/B tester. As the Chinese proverb puts it:

The best time to plant a tree was 20 years ago. The second best time is now.

How are you going to improve your next A/B test?

About Nicole Dieker

Nicole Dieker is a freelance writer and copywriter. She writes the "How A Freelance Writer Makes A Living" column for The Billfold, and her work has also appeared in The Toast, Yearbook Office, Boing Boing, The Penny Hoarder, and The Freelancer by Contently. Follow her on Twitter @HelloTheFuture.