SUMMARY: Best practice typically states that you should sell your products or services on the landing page, rather than in an email. But does that stand true for generating leads by giving away free content?

The MECLABS live test team set out to answer that question, and to show email marketers how to use A/B testing, at MarketingSherpa Email Summit 2014. Read on to see how MECLABS and live test sponsor BlueHornet planned, designed and executed the email and landing page live test.

While performing a live test has become a tradition at MarketingSherpa MarketingExperiments Optimization Summit, this was the first year the MECLABS team undertook the project at Email Summit. The Email Summit 2014 live test was sponsored by BlueHornet.

CHALLENGE

The MECLABS live test team was charged with what could be considered a daunting task. They had to develop not just one test, but three test options that would engage and interest the audience and provide actionable insights about email testing and the MECLABS email list.

With the three completely separate test options to choose from, this year's test took a slightly different turn. Benjamin Filip, Senior Manager of Data Sciences, MECLABS, said the MECLABS live test team chose this path for three reasons.

"First, your hypothesis drives your test treatment. Ö Another reason was time and resources. It's honestly easier to have all the treatments ready to go and call back to the office and say we're going to run this, this and this," Filip explained. "Plus, we didn't think picking elements on a page was as intriguing as picking what you want to learn, so we thought we would get more audience engagement this way."

Step #1. Uncover gaps in customer knowledge and behavior

The beautiful part about testing is that you have the opportunity to not only earn lifts and increase metrics, but to truly learn about your customers.

"As a tester, you get to be your own teacher, if you will, and pick tests that make you want to learn. And structure tests that give you the knowledge youíre trying to gain," Filip said.

The team uncovered three specific areas where they wanted to know more about MECLABS customers' behavior:

What impact do relevance and specificity really have on the customer response?

What stage of the conversion process is optimal for presenting detailed information about the incentive download: email or landing page?

What is perceived as a greater cost by the customer: social sharing or lead form completion?

To make the most of the email test, the team asked for additional information they did not have about the subscribers, as they were all already subscribed to a MECLABS list. Each form asked for subscribers' area of interest.

"That was one of the easiest ways to segment for future Summits, so getting their interests would help us determine which Summit they would be interested in," Lauren Pitchford, Optimization Manager, MECLABS.

Step #2. Craft possible research questions

For an effective test, you "have to have a goal, a question you want to answer," Filip said.

Once they had customer insight gaps in mind as the questions to answer, the team set out to craft more specific research questions to test:

Which selling point, in the email or on the landing page, will generate more form completions?

Which type of incentive — generic or specific — will entice people to share more information about themselves?

Which action is the audience more likely to take: provide more information about themselves, or share my product with their friends and followers?

"The more specific your question, then the better you can test and the more you can isolate whatís causing the change," Filip said.

He suggested thinking beyond wanting to increase conversion — a broad goal all marketers have in any situation — and narrow that goal down to increase your potential insights.

Step #3. Brainstorm ways to answer those questions

Brainstorming began with the live test team. As they worked through the ideas, the team would ask a series of questions to make sure the ideas were right for the test:

Is this really going to work?

If it does work, will it make sense?

Is it going to actually teach us something, or is it just a cool test?

If you're building a test based on a properly crafted research question, then you shouldn't have issues with the test having the capability to teach something. That's the importance of the question — to guide your experiment in the right direction.

After developing their initial ideas, the team then took those ideas to their peers. A big part of the MECLABS culture is receiving feedback on ideas from others around the office. Formally, this is referred to as Peer Review Sessions, or PRS. But informal, spur-of-the-moment peer review occurs, as well.

"You can just call a bunch of people over to your desk and say, 'What do you think about this?' And meet for 10 minutes and you've had a PRS," Pitchford said.

Some ideas they presented to PRS were confirmed, while others were met with confusion.

"And other times, you think you have a good idea and are ready to go, and then it didn't make sense to other people. In a way it's good, because it catches the error early but itís frustrating because you have to start over again," she said.

One of her preferred outcomes of PRS was receiving new ideas the live test team hadn't thought of. However, even that could be a double-edged sword. As the new ideas flow in, you have to eventually settle on a final selection.

Step #4. Build out test plans

First, the process started with the live test team creating wireframes of what each treatment would potentially look like. This included the general layout and page copy. At this time, the email copy was also written.

Here's a brief breakdown of the test options:

Option A. The power of relevance

The treatment for Option A allows visitors to select an interest they have, which lets them pick the MarketingSherpa Quick Guide topic they would like to receive.

Option B. The impact of sequence

The treatment for Option B takes all the details of the Quick Guide incentive from the landing page and places them in the email. This test option requires different email copy, but the subject line and "from" field still remain the same to keep open rates similar and to help ensure test validity.

Option C. The perception of cost

The treatment for Option C requires a different "cost" than most lead generation techniques. Subscribers on the MarketingSherpa list who receive the treatment must share the Quick Guide download landing page with their social networks to download the incentive. Then, anyone who clicks on the social shares will be taken to the control page to capture the potential new leads.

After mock-ups were created, Design and Development came onboard to help in the rest of the test creation process.

"Our team created mock-ups based off those wireframes for the three different test strategies, and we then took those mock-ups and presented them back to the live test drivers," said Steve Beger, Senior Development Manager, MECLABS.

From there, design made updates from the feedback from the live test team.

"Once they were satisfied with all the mock-ups that were provided, then we went and sliced out those designs and prepped them for development," Beger said.

After familiarizing themselves with the sponsored platform, Beger and his team implemented the design slices into the platform. The webpages were hosted on the MarketingSherpa website, as it allowed the most flexibility for testing regardless of the platform. Beger also worked with the BlueHornet team to upload the email copy into its platform.

The last stage involved editing and testing the webpages and emails. Both the landing pages and emails were reviewed by a copy editor. Then, with the changes in place, the team tested the pages and emails for bugs or other issues.

Step #5. Present the test options to the audience

On day 1 of Email Summit, Filip and Pitchford, along with Austin McCraw, Senior Editorial Analyst, MECLABS, presented the test options to the live Email Summit audience. The audience was then asked to vote for the option they were most interested in.

Here are the results of the audience vote:

Option A: 23%

Option B: 46%

Option C: 31%

With almost half of the vote, Option B was the audience favorite. The team then set into action to launch the selected test option.

Step #6. Launch the test

During the original setup process, two placeholder pages were created for testing purposes. Once the development team learned of the Summit audience's choice, they set out to clear the webpages and emails of any signs of "control" or "treatment" so the end users had no indication that they were part of a test.

With the selected test option in place, a final quality assurance pass was performed to make sure the pages worked and the metrics were being recorded by Google Analytics. Once the QA process was complete, the QA data was cleared so the test could start with a clean slate.

Once the test was launched, teams were allocated to monitor the real-time collection of data in the analytics platform and to ensure the data was being collected properly on the system internally. The team wanted to make sure that the test elements were split properly, with even deliverability numbers.
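The even-split requirement above can be sketched in code. This is a minimal illustration, not the team's or BlueHornet's actual mechanism: it assumes hypothetical subscriber IDs and uses a hash of the ID so that each subscriber is deterministically assigned to the same variant, then sanity-checks that a batch splits roughly 50/50 before sending.

```python
import hashlib

def assign_variant(subscriber_id: str) -> str:
    """Deterministically assign a subscriber to control or treatment.

    Hashing the ID (rather than random assignment) means a resend or
    re-check always lands the subscriber in the same variant.
    """
    digest = hashlib.md5(subscriber_id.encode("utf-8")).hexdigest()
    return "control" if int(digest, 16) % 2 == 0 else "treatment"

# Sanity-check the split on a batch of hypothetical subscriber IDs.
counts = {"control": 0, "treatment": 0}
for i in range(10_000):
    counts[assign_variant(f"subscriber-{i}")] += 1

print(counts)  # both counts should be close to 5,000
```

A monitoring team can run the same check against the delivery logs after launch to confirm the send actually split as planned.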

Monitor incoming results for validity threats

When performing a test, you always want to go in with eyes open to problems. Some common validity threats the MECLABS team looks for are:

History Effect

Instrumentation Effect

Selection Effect

Sampling Distortion Effect

For the live test, Filip said the team kept in mind that it was the MECLABS email list, so subscribers were already familiar with the brand and MarketingSherpa Quick Guides. That familiarity could potentially inflate the key performance indicators, including clickthrough and capture rate. Additionally, Summit attendees participating in the live test construction could cause some concern.

"Obviously people at Summit, some of them got the email, so they knew what was coming, they knew it was a test. They knew which [test treatment] they got, so if they really wanted the control to win, then they could say, 'Oh, I got the treatment, Iím not going to fill it out.' I don't think too many people do that, but itís still a validity concern," Filip said.

Overall, Filip said, "This test was pretty clean. We had plenty of people make it through the test. We validated at a high level. We didn't see any flip flop. It was a pretty straightforward test."

RESULTS

After 24 hours, the MECLABS live test team had a conclusive test.

Open rate:

Control: 8.1%

Treatment: 8.1%

No relative difference, which meant the test was successfully split in sending

Clickthrough rate:

Control: 17.1%

Treatment: 15.9%

The treatment saw a 6.6% decrease in clickthrough, with a 92% level of statistical confidence

Lead capture rate:

Control: 0.7%

Treatment: 0.8%

The treatment increased leads captured by 17%, with a 99% level of confidence

It appears that while fewer people who received the treatment email clicked through to the landing page, those who did were either more motivated or better understood the value of the MarketingSherpa Quick Guide.
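Confidence levels like those reported above are typically derived from a two-proportion z-test. The sketch below assumes hypothetical list sizes (the article does not publish them) and uses the Python standard library only; because the published rates are rounded, the numbers are illustrative rather than a reconstruction of the actual result.

```python
import math

def two_proportion_confidence(x1, n1, x2, n2):
    """Two-sided confidence level that two conversion rates differ,
    using a pooled two-proportion z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # Confidence = 1 - p-value, via the standard normal CDF (erf).
    return math.erf(z / math.sqrt(2))

# Hypothetical sends: ~0.7% capture for control, ~0.8% for treatment.
control_leads, control_sends = 350, 50_000
treatment_leads, treatment_sends = 400, 50_000

lift = (treatment_leads / treatment_sends) / (control_leads / control_sends) - 1
confidence = two_proportion_confidence(
    control_leads, control_sends, treatment_leads, treatment_sends)

print(f"relative lift: {lift:.1%}, confidence: {confidence:.1%}")
```

With larger (or smaller) real list sizes, the same observed rates would yield a higher (or lower) confidence level, which is why sample size matters as much as the rates themselves.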

"By no means did we come away with a definitive answer. That's how all testing is — you have your assumptions," Pitchford said. ďBut we definitely learned that bringing the specific details of the [Quick Guide] farther in the funnel worked better."

Filip added, "With the MarketingSherpa audience, right now, we know that if we're going to emphasize giving free content that we should say we're giving free content and say something about what that content is in the email, not to just wait until they get to the landing page."

He also warned that not everyone can gain a takeaway from this discovery. However, many lead gen and content-focused marketers could find success by running a similar test on their own lists. For e-commerce, it could be a bit difficult, but the same theory could possibly be applied to free shipping.

As far as follow-up tests, Filip said, "Future tests would be [a] test saying 'free' in the subject line to see if that would have gotten more people to open the email."

With the test successfully complete, Pitchford concluded, "It's all about the team. If you don't have the team on board, then it can all fall apart quickly."

