SUMMARY: Most email marketers are happy to have a major testing breakthrough every once in a while. But see how a financial services brand recently scored big with multiple tests in this Case Study, the first of a two-part series.

First: How an A/B test showed that past response data can be used to send emails at list members' individually preferred times. Includes tips on list growth efforts, subject line tests and segmentation.

Lisa Friedman, Sr. Director, Marketing, Mint.com, could see from her open and clickthrough rates that her email list was underperforming. This low engagement meant the free personal financial management service was losing opportunities to move subscribers down the funnel toward third-party paid services, which earn the site an affiliate fee.

Friedman and her team suspected email messages weren't reaching subscribers at times when they were ready to engage. They also wanted to encourage a viral effect by having users refer their friends to the service.

"One of the things we really were looking to do was better target our users and get them to spread the word about our service," Friedman says. "We wanted to test out different tactics and psychologies to improve our email program's performance."

CAMPAIGN

Friedman and her team also wanted to dramatically increase new registrants at the site (and subsequently, grow the email list). As you'll see, they certainly had a lot on their hands.

To get the job done, they took the following five steps:

Step #1. Build a bigger list

Before the initiative started early this year, the young brand knew they hadn't been as committed to their email program as they could have been. Simply put, it was time to focus on list growth.

Friedman and her team came up with a six-pronged campaign to drive traffic to the site. They used:

- Social media sites
- A blog on the flagship site
- Search engine optimization
- Public relations, including TV and radio news segments
- Word of mouth

Visitors attracted by these promotions landed a single click away from a registration page designed to be simple to complete. It included a handful of fields:

- Email address
- Confirm email address
- Zip code
- Password
- Confirm password

Step #2. Program personalized send times

Next, the team worked with their email service provider (see useful links below) to analyze recipient behavior on a rolling basis and predict the best delivery time for each address on the mailing list.

The team began delivering messages to recipients at the time of day they had shown they were most likely to open and click.

Here are a few examples of how the program worked:

- If recipient A displayed a tendency to open emails at 5 p.m., Mint.com's campaigns, alerts and triggered messages would be sent at that time.

- If recipient B often opened emails at 3 a.m., all of that recipient's messages would be sent at that time.

- If recipients started opening the emails at different times of the day, the system adapted and began sending according to this newly exhibited behavior.
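The per-recipient timing logic described above can be sketched in a few lines. This is only an illustration of the idea, not Mint.com's or its email service provider's actual system; the function name and the 6 a.m. fallback are assumptions:

```python
from collections import Counter
from datetime import datetime

def preferred_send_hour(open_times, default_hour=6):
    """Return the hour of day at which a recipient most often opens email.

    open_times: datetimes of the recipient's past opens.
    Falls back to a default (here, 6 a.m.) when there is no open history.
    Recomputing this over a rolling window lets the send time adapt
    when a recipient's behavior changes.
    """
    if not open_times:
        return default_hour
    counts = Counter(dt.hour for dt in open_times)
    return counts.most_common(1)[0][0]  # the modal open hour
```

In practice a real system would weight recent opens more heavily and handle time zones, but the core signal is the same: the hour at which this recipient has historically engaged.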

Step #3. Run A/B test

Before implementing the timing system across the board, Friedman and her team ran a month-long A/B test to see if it indeed worked:

- Half of the list received messages at their behaviorally established times
- Half received messages at 6 a.m. EDT
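A 50/50 split like the one described can be produced with a seeded shuffle. A minimal sketch (the function name and seed are illustrative, not from the case study):

```python
import random

def ab_split(addresses, seed=42):
    """Randomly assign a list of addresses to two equal-sized test cells.

    Shuffling a copy keeps the assignment unbiased; fixing the seed
    makes the split reproducible for the length of the test.
    """
    pool = list(addresses)
    random.Random(seed).shuffle(pool)
    mid = len(pool) // 2
    return pool[:mid], pool[mid:]  # e.g. (personalized-time cell, control cell)
```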

"We had been using 6 a.m. EDT as our send time, and that's why it was the control send time," Friedman explains. "We always sent it then so it would be in the morning inboxes on the East and West Coasts."

Step #4. Remove inactive names

The team knew they weren't going to improve opens and clickthroughs by optimizing send times alone. They needed to take a hard look at names that showed a fading interest in the brand.

Friedman and her team decided to create a segment of names that had opened an email during the last four months. They targeted these recently active subscribers from that point forward.
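The four-month activity segment amounts to a simple recency filter. A sketch, assuming a roughly 120-day window and a per-subscriber `last_open` timestamp (both illustrative details, not from the case study):

```python
from datetime import datetime, timedelta

def recently_active(subscribers, now, window_days=120):
    """Keep only subscribers whose most recent open falls inside the window.

    subscribers: dicts like {"email": ..., "last_open": datetime or None},
    where None means the subscriber has never opened a message.
    """
    cutoff = now - timedelta(days=window_days)
    return [s for s in subscribers
            if s["last_open"] is not None and s["last_open"] >= cutoff]
```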

Step #5. Test subject lines

Lastly, Friedman wanted to get a better grasp on which types of subject lines and offers made the Mint.com audience respond. Her team tested two incentive-based subject lines/offers against a generic control.

- The control subject line: "Spread the word."

- Test subject line A: "Spread the word and win an iPod."

- Test subject line B: "Want early access to new Mint.com features?"

The email body copy reflected the subject line messaging, but all three emails used the same design template. The creative reflected the aim of the campaign, which was to drive referrals.

RESULTS

The lead-generation campaign and its simple registration page worked miracles for the team's list growth: roughly 3,300 new site users/email subscribers gained per day for six consecutive months.

Then, the A/B test showed that basing send times on behaviorally indicated preferences was a winning approach. The personalized send times delivered improvements over the 6 a.m. control time on two all-important data fronts:

- Open rate increased 7%
- Clickthrough rate increased 13%
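If those figures are read as relative lifts, they come from a straightforward comparison of test and control rates. A hypothetical worked example (the raw rates below are invented for illustration; only the formula is general):

```python
def relative_lift(test_rate, control_rate):
    """Percent improvement of the test cell over the control cell."""
    return (test_rate - control_rate) / control_rate * 100

# Hypothetical: a 21.4% open rate against a 20% control is a 7% relative lift.
print(round(relative_lift(0.214, 0.20), 1))
```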

"We were looking to improve our open and clickthrough rates, and that's what [we] did," Friedman says. "Obviously, if the timing is right, that's important. Having that edge makes a big difference."

Friedman adds that segmenting the list to people who opened or clicked in the previous four months was also part of the numbers lift. "One of the reasons why it was so successful is because it targeted active users of our product. We didn't just send it to everybody."

While specific metrics for the subject line test weren't available, Friedman says her team was intrigued to see the audience open the "Want early access to new Mint.com features?" subject line at the highest clip. It showed her team that emphasizing the site's usability upgrades and features was a strong way for her brand to break through the inbox clutter.

"They use our service and like it," Friedman explains. "And they wanted to get a sneak peek at the new features offered more than [they wanted to] win an iPod Nano."

Finally, even though they hadn't had a refer-a-friend program before and therefore had no benchmark to compare the test performance against, the team was confident they had accomplished a key goal by increasing viral sharing.

"The more sharing we get, the more referrals, and ultimately, more registered users," Friedman says. "And thatís the bottom line."


The views and opinions expressed in the articles of this website are strictly those of the author and do not necessarily reflect in any way the views of MarketingSherpa, its affiliates, or its employees.