A/B Testing

A/B testing compares two or more versions of a webpage, app screen, or other digital experience to determine which one performs better. Metrics such as conversion rate and user engagement reveal whether a specific version has a neutral, positive, or negative effect. Results help you improve campaigns, customer experience, and conversion, and sharpen audience targeting.

Test as many variations as you need, but keep traffic requirements in mind if you want results to be statistically significant for each variation. You can also compare variations across different audience segments.
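To see why traffic requirements matter, here is a minimal sketch of a standard sample-size calculation for a two-proportion test. The function name and the fixed significance level (alpha = 0.05) and power (0.80) are illustrative assumptions, not part of any specific testing tool:

```python
import math

def sample_size_per_variation(p1, p2):
    """Approximate visitors needed per variation to detect a change
    from baseline conversion rate p1 to variant rate p2.

    Assumes a two-sided test at alpha = 0.05 with 80% power
    (z-scores 1.96 and 0.84 are hard-coded for those choices).
    """
    z_alpha = 1.96  # two-sided, alpha = 0.05
    z_beta = 0.84   # power = 0.80
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from 10% to 12% needs a few thousand visitors
# per variation; smaller lifts need far more traffic.
print(sample_size_per_variation(0.10, 0.12))
```

Note how the required traffic grows as the expected difference shrinks, which is why testing many variations at once can stretch a limited traffic budget thin.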

Which design elements should be tested?

Any page element, such as shapes, colors, sizes, and messaging, can be tested. You can test entire digital experiences, single pages, or full customer journeys for their impact on metrics and conversion goals.

How is A/B testing different from multivariate testing?

An A/B test compares complete versions against each other, typically changing one element at a time. Multivariate tests examine multiple combinations of elements at once, which can reveal the relative contribution each element makes as the elements interact to drive engagement.
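A quick sketch of how combinations multiply in a multivariate test. The element names and variations below are hypothetical examples:

```python
from itertools import product

# Hypothetical page elements and their variations (illustrative only)
elements = {
    "headline": ["Save time", "Save money"],
    "button_color": ["green", "orange"],
    "image": ["photo", "illustration"],
}

# A multivariate test splits traffic across every combination,
# so the number of test arms grows multiplicatively: 2 * 2 * 2 = 8 here.
combinations = [dict(zip(elements, values))
                for values in product(*elements.values())]
print(len(combinations))
```

Because each added element multiplies the number of arms, multivariate tests need substantially more traffic than a simple A/B comparison to reach significance on every combination.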

What is a null hypothesis?

The null hypothesis is the assumption that there is no statistically significant difference in engagement or conversion rates across the versions being tested. If a test fails to reject it, any observed difference is likely due to sampling or experimental error.
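The null hypothesis is usually evaluated with a significance test. Below is a minimal sketch of a two-proportion z-test; the function name is an illustrative assumption, and real tools often use exact or sequential methods instead:

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the null hypothesis that versions A and B
    share the same underlying conversion rate.

    conv_a, conv_b: conversion counts; n_a, n_b: visitor counts.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided tail probability of the standard normal
    return math.erfc(abs(z) / math.sqrt(2))
```

A large p-value (conventionally above 0.05) means the data are consistent with the null hypothesis; a small one suggests the versions genuinely differ.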