SUMMARY: How can you improve your lead generation efforts? Learn from your customers. A/B testing is one way to do that.

To help lead generation marketers learn how to use A/B testing, we conducted a live test at MarketingSherpa Lead Gen Summit 2013 in which the audience even helped shape the test.

In today's case study, learn how to improve your A/B testing efforts in a unique, behind-the-scenes look that explores not only what we learned from the test, but also how we pulled off the entire project.

CHALLENGE

Conducting an A/B split test from launch to results analysis in two days may not sound like a walk in the park, but for the MECLABS team and attendees of Lead Gen Summit 2013, it was a chance to gain testing insights and learn about lead gen in an interactive format.

The goal of this Summit's live test was to learn how the presentation of incentives and form fields impacted lead gen rate, and also to provide lead gen marketers with an example of how to run an A/B test.

Background

Every new project or campaign should be built on the shoulders of all of the projects that came before it.

In this case, presenting a live, interactive test has become a tradition at MarketingSherpa Summits. From Optimization Summit 2011 through Optimization Summit 2012 to this year's Optimization Summit 2013, designing a live test for a Summit audience has been streamlined into a process that helps audience members gain actionable insights and take back ideas for their own testing efforts.

This year's live test at Lead Gen Summit 2013 was the culmination of lessons learned from past tests, considerations of what does and doesn't work, as well as teamwork and collaboration of multiple departments at MECLABS, parent company of MarketingSherpa.

In partnership with Act-On, a marketing automation company and premier sponsor of Lead Gen Summit 2013, MECLABS ran a two-day test focused on discovering which incentives and form fields yield the highest lead gen rate. However, the main goal of the test was to gain new insight, regardless of the outcome. The test was developed by Kyle Foster, Research Manager, and Brittany Long, Research Analyst, both of MECLABS.

"We wanted to build a test that can’t lose," Foster explained, emphasizing how test design should be focused on learning about the customers. In this way, even if treatments produce a negative result, the increased customer intelligence can be applied to future campaigns for future gains.

The result of weeks of strategy and planning was an A/B split test of incentive offers and form field additions to discover which approach yields the greatest number of leads.

Here are the steps the team took to design and launch this test.

Step #1. Collaborate to develop strategy for test

Cross-departmental collaboration can be crucial to effective A/B testing.

According to Foster, the beginning stages of the test strategy revolved around meetings with team members to nail down what would be the best possible test to run at Lead Gen Summit. Foster met with team members involved in previous Summit live tests to discover what worked, and what didn't, in those tests.

From there, the test developers met with the content and marketing teams to receive feedback on the test thus far, and build on their ideas. Then, the team met with members from the MECLABS optimization team. The optimization team supports the development of test plans, experiment designs, wireframes and idea development at MECLABS.

The next step was to present the test design in a Peer Review Session, or PRS. These sessions are an opportunity for any team member or members to present ideas and gain feedback from colleagues they might not normally collaborate with.

This was a beneficial part of the process because Foster and Long had produced several ideas at this point of the test development as to what should be tested and how it should be presented on the landing pages. Through deliberation and collaboration with many members of the MECLABS team, a final test design was proposed and approved.

Lead generation marketers often use incentives in the lead capture process, so it was decided this experiment would focus on the best way to present these incentives, along with determining whether presenting incentives in a way that connoted more value could encourage prospects to include more information in the lead gen form.

Test objectives:

Determine if a single incentive versus a selection of incentives is more effective for generating leads on a landing page.

Discover whether a choice of incentives increases the perceived value of the incentive and encourages visitors to offer more information for higher-quality leads.

Test hypotheses:

Treatment #1 versus Control: By offering prospects a choice between three incentive options, we will add to the perceived value of the offer and increase the lead gen rate.

Treatment #2 versus Control: By increasing the perceived value with the additional incentive options, we can collect additional information without negatively impacting the overall lead gen rate.

Research question:

Which incentive approach is more effective for generating leads?

The chosen incentive for this test was a choice of MarketingSherpa Quick Guides, a $45 value, which would be offered for free after the visitor filled out a short or long form.

Final test design

The live test consisted of a control landing page and two treatment pages. On the control page, visitors would be presented with a MarketingSherpa Quick Guide and a short form to complete. Treatment #1 contained the same short form as the control, but visitors would have the choice of one of three Quick Guides. Finally, Treatment #2 had a choice of incentives, but also required a longer form for visitors to complete before receiving the incentive.

By adding a choice of incentive, the team wanted to see if giving visitors their pick would raise the perceived value of the incentive. In the treatments, the choice was indicated by tabs on the page showing the different incentives available.

However, the team also wanted to test whether adding more fields to the lead capture form would add more perceived cost (time, in this case) than the perceived value of the incentive could overcome.

Step #2. Incorporate feedback into the test design

At this point in test development, all of the ideas and concepts for the pages were finalized and sent to the development team at MECLABS.

The first step for that team was to evaluate the compatibility of Act-On's marketing automation platform with what the team had planned.

Once the development team understood how the platform worked, the test design process went to a design team to plan the look and feel of the pages.

After all of these elements were developed, the pages were built, analytics were set up, and a quality assurance (QA) process began on tracking and development. The entire process from building the pages through QA testing spanned two weeks.

Step #3. Present the test at Lead Gen Summit and let the audience determine final versions of the treatments

During the opening welcome on day one of Summit, Foster presented the live test to the audience.

In addition to learning the objectives of the test, attendees received an audience sheet.

The sheet gave attendees the opportunity to choose which incentives would be offered to visitors of the pages, how many additional form fields should be added, and what those additional form fields should ask visitors.

Step #4. Launch the test

After collecting the surveys, the team sent the results back to the lab in Jacksonville Beach, Fla., for the development team to launch the test. Out of 153 surveys, the audience's choices determined the incentives and form fields for the treatments.

With all of these variables in place, the test launched Tuesday at 1 p.m. Eastern time on day one of Summit and concluded the afternoon of day two. The team had two hours from the time the surveys were received until launch to incorporate the audience's changes into the pages.

To drive traffic to the pages, an email was sent through the Act-On platform to both MarketingSherpa newsletter subscribers as well as a list from Act-On, totaling around 300,000 recipients.
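Random assignment is what makes a split-test comparison valid: each recipient who clicks through should have an equal chance of landing on the control or either treatment. As a rough sketch of that step (the actual allocation mechanism inside the Act-On platform was not published, so the even three-way split here is an assumption):

```python
import random

# The three landing pages from the experiment
PAGES = ["control", "treatment_1", "treatment_2"]

def assign_page(rng=random):
    """Assign an incoming visitor to one landing page at random
    (assumed even split; the real allocation was not published)."""
    return rng.choice(PAGES)

# Simulate a send on the scale of the combined email lists
counts = {page: 0 for page in PAGES}
for _ in range(300_000):
    counts[assign_page()] += 1
```

With an even split, each page would see roughly 100,000 recipients, which is what gives the later level-of-confidence figures their statistical power.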

Updates

The live test process didn't stop there. The sciences team in Jacksonville Beach evaluated the experiment as results streamed in from around the world during Lead Gen Summit, and collaborated with Foster, who was on-site, to craft updates with interpretations of the results.

After lunch on the first day, the MECLABS team updated attendees on how the test was performing. At this point, the chosen incentives and form fields were revealed to the audience, as well as the predicted winner of the test.

On day two of Summit, attendees were updated first thing in the morning. The control landing page had a conversion rate of 58%, while Treatment #1 had a rate of 57%. However, the statistical level of confidence for Treatment #1 was only 33%. Finally, Treatment #2 had a conversion rate of 51% at a 99% level of confidence.

Step #5. Overcome challenges

Not every test can run perfectly without a hitch, and this test was no different.

One challenge the MECLABS team had was releasing the test and getting the audience's changes in quickly.

However, the test was accidentally released via LinkedIn before the audience's changes were implemented. Luckily, the mistake was caught and the post retracted, the channel was small, and the test was released correctly and on time.

Another complication was an issue with tracking conversions with analytics software. While there were a few hiccups, the team managed to recover and track conversions on the pages.

"The importance of a QA process when testing cannot be understated. If we would have tested the analytics platforms two hours before we launched, instead of afterward, we could have had those problems fixed," Long said.

Step #6. Evaluate the results

In the end, there was no significant difference between the control (single offer, short form) and Treatment #1 (choice of offer, short form). Treatment #2 underperformed both the control and Treatment #1.

From the results, marketers learned that a choice of incentive does not necessarily increase the perceived value of that incentive. Having the choice between one and three MarketingSherpa Quick Guides did not make a significant difference in the number of leads generated from the form.

Also, when comparing the control page with Treatment #2 (choice of offer, long form), a takeaway from the results is that the perceived value of the Quick Guide did not outweigh the perceived cost of having to fill out a longer form with more information.

The top choices for incentive selected by visitors to the site were:

Lead Generation Quick Guide at 64%

Email Marketing Quick Guide at 18%

Content Marketing Quick Guide at 18%

RESULTS

The results of the live test are as follows:

Control (single offer, short form) — 58% conversion rate.

Treatment #1 (choice of offer, short form) — 57% conversion rate, a -1.3% relative difference from the control with a 31% level of confidence. When compared to Treatment #2, it had a 12% relative difference with a 99% level of confidence.

Treatment #2 (choice of offer, long form) — 51% conversion rate, a -11.9% relative difference from the control with a 99% level of confidence.

Treatment #2, with the additional form field asking for job title, reduced the lead gen rate by 11%.
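The "relative difference" and "level of confidence" figures above come from standard two-proportion comparisons. A minimal sketch of the arithmetic, using hypothetical sample sizes of 1,000 visitors per page (the article does not report actual visitor counts):

```python
import math

def compare_pages(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: relative difference of B vs. A and an
    approximate two-tailed level of confidence that the rates differ."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    relative_diff = (p_b - p_a) / p_a
    # Pooled standard error under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    confidence = math.erf(abs(z) / math.sqrt(2))  # P(|Z| < |z|)
    return relative_diff, confidence

# Hypothetical: control at 58% and Treatment #2 at 51%, 1,000 visitors each
rel, conf = compare_pages(580, 1000, 510, 1000)
```

With these assumed sample sizes, Treatment #2 shows roughly a 12% relative decrease at better than a 99% level of confidence, consistent with the reported result, while the single-point gap between the control and Treatment #1 yields a much lower confidence, which is why that comparison was treated as a tie.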

(Editor’s note: Due to the teaching nature of the live test, two validity threats were intentionally introduced that might have skewed test results. Priming of a small number of test recipients likely took place, since members of the Lead Gen Summit audience may have received test treatments after learning about the test; this may have introduced a small selection effect. Also, because the test was conducted in such a short time frame to fit within Lead Gen Summit, a history effect may have been introduced.)

"The Control and Treatment #1 performed similarly, but when you compare it to Treatment #2, it had a significant difference for both the control and Treatment #1. So if anything, we take away that Treatment #2 lost," Long said.

However, "it depends on what you mean by lost, because Treatment #2 had the additional form field, it was an 11% relative decrease in lead gen. But to some people, extra information, like knowing job title, might be worth taking an 11% loss. That's up to the organization," she added.

"Even though the treatment didn't win, it doesn't mean it wasn't a successful test," Foster said.

Another element the team analyzed when reviewing the results was the tabbed layout approach. The team knew a drop-down would hide the incentive choices, so using the tabs on the top of the page was a way for visitors to see all of the available incentives.

Visitors chose the Lead Generation Quick Guide most frequently on the treatment pages, but it was also the default choice, first in the order of tabs.

"If I could keep testing this, I would test the tabs in the layout. I'm not sold that this was the best approach. I want to see if the tabs created friction, not the additional form field," Foster said.

Another element the team called into question after the test was the value of the Quick Guide, and the importance of offering an incentive relevant to your audience.

"I feel like for those on the MarketingSherpa mailing list, they understand what a Quick Guide is, and know the value that's attached. But, maybe those on the Act-On list did not know what a Quick Guide was and asked, 'Why do I want these? Is it reputable?'" Long said.

Overall, the goal of the test performed at Lead Gen Summit was to gain new customer insight, which is exactly what occurred.

"Even if you're losing, it could still be successful. It's more about the learning. Even though lead gen was flat, it gave us information about what our customers wanted," Foster said.

From one tester to another, Long explained her key takeaways from the test for marketers to apply to their own testing efforts.

"Be decisive about your choices about what you're going to test before you run the test, and make sure that you have safeguards in place for your analytics, or anything really, so that you know the data you're collecting is going to be accurate and valid," Long said.

