Lessons Learned From 2,345,864 Exit Overlay Visitors

Back in 2015, Unbounce launched its first ever exit overlay on this very blog.

Did it send our signup rate skyrocketing 4,000%? Nope.

Did it turn our blog into a conversion factory for new leads? Not even close — our initial conversion rate was barely over 1.25%.

But what it did do was start us down the path of exploring the best ways to use this technology, and of furthering our goals by offering visitors relevant, valuable content through overlays.

Overlays are modal lightboxes that launch within a webpage and focus a visitor’s attention on a single offer.

In this post, we’ll break down all the wins, losses and “holy smokes!” moments from our first 2,345,864 exit overlay viewers.

Psst: Towards the end of these experiments, Unbounce launched Convertables, and with it a whole toolbox of advanced triggers and targeting options for overlays.

Goals, tools and testing conditions

Our goal for this project was simple: Get more people to consume more Unbounce content — whether it be blog posts, ebooks, videos, you name it.

We invest a lot in our content, and we want it read by as many marketers as possible. All our research, everything we know about that elusive thing called conversion, exists in our content.

Our content also allows readers to find out whether Unbounce is a tool that can help them. We want more customers, but only if they can truly benefit from our product. Those who experience ‘lightbulb’ moments when reading our content definitely fit the bill.

As for tools, the first four experiments were conducted using Rooster (an exit-intent tool purchased by Unbounce in June 2015). It was a far less sophisticated version of what is now Unbounce Convertables, which we used in the final experiment.

Testing conditions were as follows:

All overlays were triggered on exit, meaning they launched only when an abandoning visitor was detected.

For the first three experiments, we compared sequential periods to measure results. For the final two, we ran makeshift A/B tests.

When comparing sequential periods, testing conditions were isolated by excluding new blog posts from showing any overlays.

A “conversion” was defined as either a completed form (lead gen overlay) or a click (clickthrough overlay).

All experiments were conducted between January 2015 and November 2016.
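For readers curious how an exit trigger works under the hood: the typical approach (this is a common-pattern sketch, not Rooster’s or Convertables’ actual implementation) is to watch for the cursor leaving through the top of the viewport, where the browser chrome lives.

```javascript
// Pure helper so the decision logic is testable without a DOM.
// An exit is inferred when the pointer leaves the document entirely
// (relatedTarget is null) through the top edge (clientY near 0).
function isExitIntent(clientY, relatedTarget) {
  return relatedTarget === null && clientY <= 10;
}

// Wire the helper to the page; fire the overlay at most once per view.
function armExitOverlay(showOverlay) {
  let fired = false;
  document.addEventListener('mouseout', (e) => {
    if (!fired && isExitIntent(e.clientY, e.relatedTarget)) {
      fired = true;
      showOverlay();
    }
  });
}
```

Note that this heuristic only works on desktop; mouseless touch devices need a different trigger (e.g., scroll or time delay).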

Experiment #1: Content Offer vs. Generic Signup

Our first exit overlay had a simple goal: Get more blog subscribers. It looked like this.

It was viewed by 558,488 unique visitors over 170 days, 1.27% of whom converted to new blog subscribers. A decent start, but not good enough.

To improve the conversion rate, we posed the following.

Hypothesis: Because online marketing offers typically convert better when a specific, tangible offer is made (versus a generic signup), we expect that by offering a free ebook to abandoning visitors, we will improve our conversion rate beyond the current 1.27% baseline.

Observations


Experiment #2: Four-field vs. Single-field Overlays

Data people always spoil the party.

The early success of our first experiment caught the attention of Judi, our resident marketing automation whiz, who wisely reminded us that collecting only an email address on a large-scale campaign was a missed opportunity.

For us to fully leverage this campaign, we needed to find out more about the individuals (and organizations) who were consuming our content.

Translation: We needed to add three more form fields to the overlay.

Since filling out forms is a universal bummer, we safely assumed our conversion rate would take a dive.

But something else happened that we didn’t predict. Notice a difference (besides the form fields) between the two overlays above? Yup, the new version was larger: 900x700px vs. 750x450px.

Adding three form fields made our original 750x450px design feel too cramped, so we arbitrarily increased the size — never thinking there might be consequences. More on that later.

Anyway, we launched the new version and, as expected, the results sucked.

Things weren’t looking good after 30 days.

For business reasons, we decided to end the test after 30 days, even though that meant the challenger overlay didn’t run for the same length of time as the original (96 days).

Overall, the conversion rate for the 30-day period was 48% lower than the previous 96-day period. I knew it was for good reason: Building our data warehouse is important. Still, a small part of me died that day.

Then it got worse.

It occurred to us that, for a 30-day period, the sample of viewers for the new overlay (53,460) looked awfully small.

A closer inspection revealed that our previous overlay averaged 2,792 views per day, while this new version was averaging 1,782. So basically our 48% conversion drop was served a la carte with a 36% plunge in overall views. Fun!

But why?

It turns out increasing the size of the overlay wasn’t so harmless. The new dimensions were too large for many people’s browser windows, so the overlay fired on only two out of every three visits, even when targeting rules matched.

We conceded, and redesigned the overlay in 800x500px format.

Daily views rose back to their normal numbers, and our new baseline conversion rate of 1.25% remained basically unchanged.

Large gap between “loads” and “views” on June 4th; narrower gap on June 5th.

Observations

Increasing the number of form fields in overlays can cause friction that reduces conversion rates.

Overlay sizes exceeding 800x500px can be too large for some browsers, reducing the load:view ratio (and overall impressions).
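The second observation suggests a simple guard (an illustrative sketch, not the actual Rooster/Convertables code): check whether the overlay’s fixed dimensions fit the visitor’s viewport before firing, so oversized designs fail loudly in testing instead of silently suppressing a third of your views.

```javascript
// Returns true if an overlay of the given fixed dimensions fits
// the visitor's viewport. Dimensions in CSS pixels.
function fitsViewport(overlayW, overlayH, viewportW, viewportH) {
  return overlayW <= viewportW && overlayH <= viewportH;
}

// In the browser you'd call it with the live viewport, e.g.:
//   fitsViewport(900, 700, window.innerWidth, window.innerHeight)
// Our 900x700 design failed this check on many common screens.
```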

Experiment #3: One Overlay vs. 10 Overlays

It seemed like such a great idea at the time…

Why not get hyper-relevant and build a different exit overlay for each of our blog categories?

With our new baseline conversion rate reduced to 1.25%, we needed an improvement that would help us overcome “form friction” and get us back to that healthy 2%+ range we enjoyed before.

So with little supporting data, we hypothesized that increasing “relevance” was the magic bullet we needed. It works on landing pages — why not overlays?

Hypothesis: Since “relevance” is key to driving conversions, we expect that by running a unique exit overlay on each of our blog categories — whereby the free resource is specific to the category — we will improve our conversion rate beyond the current 1.25% baseline.

We divide our blog into categories according to the marketing topics they cover (e.g., landing pages, copywriting, design, UX, conversion optimization). Each post is tagged by category.

So to increase relevance, we created a total of 10 exit overlays (each offering a different resource) and assigned each overlay to one or two categories, like this:

Creating all the new overlays would take around three hours, but since we already had a deep backlog of resources on all things online marketing, finding a relevant ebook, course or video to offer in each category wasn’t difficult.

And since our URLs contain category tags (e.g., all posts on “design” start with root domain unbounce.com/design), making sure the right overlay ran on the right post was easy.

URL Targeting rule for our Design category; the “include” rule automatically excludes the overlay from running in other categories.
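Because category lives in the URL path, the “include” rule boils down to a prefix match. Here’s a minimal sketch of that logic (the category paths and overlay names below are illustrative, not our actual configuration):

```javascript
// Map each category's URL prefix to the overlay that should run there.
// One or two categories can share an overlay, as we did.
const overlayByCategory = {
  '/design': 'design-ebook-overlay',
  '/landing-pages': 'lp-course-overlay',
  // ...one entry per targeted blog category
};

// Returns the overlay to run for a given path, or null if the post
// falls outside every targeted category (the implicit "exclude").
function overlayForPath(path) {
  for (const [prefix, overlayId] of Object.entries(overlayByCategory)) {
    if (path.startsWith(prefix)) return overlayId;
  }
  return null;
}
```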

When we were using just one overlay, capping how often a visitor saw it was easy — a simple “Frequency” setting was all we needed.

…but not so easy with 10 overlays running on the same blog.

We needed a way to exclude anyone who saw one overlay from seeing any of the other nine.

Cookies were the obvious answer, so we asked our developers to build a temporary solution that could:

Pass a cookie from an overlay to the visitor’s browser

Exclude that cookie in our targeting settings

They obliged.
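The cookie solution our developers built can be sketched like this (the cookie name and expiry here are assumptions for illustration, not the real implementation): one shared cookie means a visitor who sees any of the 10 overlays is excluded from the other nine.

```javascript
const SEEN_COOKIE = 'ub_overlay_seen'; // hypothetical name

// Check the raw cookie string (document.cookie in the browser)
// for the "seen" marker. Pure function so it's testable.
function hasSeenAnyOverlay(cookieString) {
  return cookieString
    .split('; ')
    .some((c) => c.startsWith(SEEN_COOKIE + '='));
}

// Called when any of the 10 overlays is shown; 30-day expiry.
function markOverlaySeen() {
  document.cookie =
    `${SEEN_COOKIE}=1; max-age=${30 * 24 * 3600}; path=/`;
}
```

The targeting rule then simply skips any visitor for whom `hasSeenAnyOverlay(document.cookie)` is true.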

We used “incognito mode” to repeatedly test the functionality, and after that we were go for launch.

Then this happened.

Ignore the layout… the Convertables dashboard is much prettier now :)

After 10 days of data, our conversion rate was a combined 1.36%, 8.8% higher than the baseline. It eventually crept its way to 1.42% after an additional 250,000 views. Still nowhere near what we’d hoped.

So what went wrong?

We surmised that just because an offer is “relevant” doesn’t mean it’s compelling. Admittedly, not all of the 10 resources were on par with The 23 Principles of Attention-Driven Design, the ebook we originally offered in all categories.

That said, this experiment provided an unexpected benefit: we could now see our conversion rates by category instead of just one big number for the whole blog. This would serve us well on future tests.

Observations

Just because an offer is relevant doesn’t mean it’s good.

Conversion rates vary considerably between categories.

Experiment #4: Resource vs. Resource

“Just because it’s relevant doesn’t mean it’s good.”

This lesson inspired a simple objective for our next task: Improve the offers in our underperforming categories.

We decided to test new offers across five categories that had low conversion rates and high traffic volume:

A/B Testing and CRO (0.57%)

Email (1.24%)

Lead Gen and Content Marketing (0.55%)

Note: We used the same overlay for the A/B Testing and CRO categories, as well as for the Lead Gen and Content Marketing categories.

Hypothesis: Since we believe the resources we’re offering in the categories of A/B testing, CRO, Email, Lead Gen and Content Marketing are less compelling than resources we offer in other categories, we expect to see increased conversion rates when we test new resources in these categories.

For the previous experiments in this post, we compared sequential periods. For this one, we took things a step further and jury-rigged an A/B testing system using Visual Website Optimizer and two Unbounce accounts.

And after finding what we believed to be more compelling resources to offer, the new test was launched.

We saw slightly improved results in the A/B Testing and CRO categories, although they weren’t statistically significant. For the Email category, we saw a large drop-off.

In the Lead Gen and Content Marketing categories however, there was a dramatic uptick in conversions and the results were statistically significant. Progress!

Observations

Not all content is created equal; some resources are more desirable to our audience.

Experiment #5: Clickthrough vs. Lead Gen Overlays

Although progress was made in our previous test, we still hadn’t solved the problem from our second experiment.

While having the four fields made each conversion more valuable to us, it still reduced our conversion rate a relative 48% (from 2.65% to 1.25% back in experiment #2).

We’d now worked our way up to a baseline of 1.75%, but still needed a strategy for reducing form friction.

The answer lay in a new tactic for using overlays that we dubbed traffic shaping.

Traffic Shaping: Using clickthrough overlays to incentivize visitors to move from low-converting to high-converting pages.

Here’s a quick illustration:

Converting to this format would require us to:

Redesign our exit overlays

Build a dedicated landing page for each overlay

Collect leads via the landing pages

Basically, we’d be using the overlays as a bridge to move readers from “ungated” content (a blog post) to “gated” content (a free video that required a form submission to view). Kinda like playing ‘form field hot potato’ in a modern day version of Pipe Dream.

Hypothesis: Because “form friction” reduces conversions, we expect that removing form fields from our overlays will increase engagement (enough to offset the drop-off we expect from adding an extra step). To do this, we will redesign our overlays to clickthrough (no fields), create a dedicated landing page for each overlay and add the four-field form to the landing page. We’ll measure results in Unbounce.

By this point, we were using the Unbounce Builder to create the entire campaign: the clickthrough overlays were built in Convertables using drag and drop, and the landing pages were copied over (and updated) from existing Unbounce campaigns that already offered the resources.

We decided to test this out in our A/B Testing and CRO as well as Lead Gen and Content Marketing categories.

After filling out the form, visitors would either be given a secure link for download (PDF) or taken to a resource page where their video would play.

Again, for this to be successful the conversion rate on the overlays would need to increase enough to offset the drop off we expected by adding the extra landing page step.

These were our results after 21 days.

Not surprisingly, engagement with the overlays increased significantly. I stress the word “engagement” and not “conversion,” because our goal had changed from a form submission to a clickthrough.

In order to see a conversion increase, we needed to factor in the percentage of visitors who would drop off once they reached the landing page.

A quick check in Unbounce showed us landing page drop-off rates of 57.7% (A/B Testing/CRO) and 25.33% (Lead Gen/Content Marketing). Time for some grade 6 math…

Even with significant drop-off in the landing page step, overall net leads still increased.
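The “grade 6 math” is just multiplying the overlay’s clickthrough rate by the share of visitors who complete the landing page. Only the drop-off rates (57.7% and 25.33%) are from our data; the clickthrough rates below are illustrative placeholders:

```javascript
// Net lead rate = overlay clickthrough rate x (1 - landing page drop-off).
function netLeadRate(clickthroughRate, dropOffRate) {
  return clickthroughRate * (1 - dropOffRate);
}

// A/B Testing / CRO (57.7% drop-off):
//   a hypothetical 5% clickthrough nets 5% x 42.3% ~= 2.12% leads
// Lead Gen / Content Marketing (25.33% drop-off):
//   a hypothetical 5% clickthrough nets 5% x 74.67% ~= 3.73% leads
```

As long as the net figure beats the old lead gen overlay’s conversion rate, the extra landing page step pays for itself.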

Our next step would be applying the same format to all blog categories, and then measuring overall results.

About Angus Lynch

As a freelance copywriter, Angus helped ecommerce site owners increase conversion rates. In 2014, he joined Rooster Engagement Tools, which was purchased by Unbounce in April 2015. He now serves as copywriter on the Unbounce Marketing Team.

Comments

It’s absolutely true that this is part of our philosophy on using Convertables (overlays) responsibly. Here’s our philosophy, verbatim:

“Convertables are powerful and have the ability to vastly increase conversion rates (and even open up new avenues for conversions). But with great power comes great responsibility: when used improperly, the potential exists to annoy and alienate users.

By testing Convertables on our own web properties, we’ve learned the best way to use this tool is make offers that are relevant and valuable without harming the user experience. This approach helps us “walk the line” between 1) maximizing the potential for conversion, and 2) respecting the goals of the user.

This means:

1. Promoting offers that are relevant (and complementary, when possible) to the product or service on the page; and

2. Promoting offers that are valuable to the user; not offering low quality resources, and not just restating the offer on the page;

3. Targeting to ensure the right mix of timeliness and frequency; making sure users see the right offer at the right time; not showing the same offer repeatedly to the same user; and

4. Not showing offers in a manner that is intrusive, negative or insulting.”

I’m assuming you’ll cite your poll from last year about people’s negative perceptions of “pop-ups.” We have some issues with this poll, mostly 1) that the study was done with a 2-second trigger to all visitors, which is very disruptive, and 2) that the study failed to account for the residual negativity towards old school pop-ups that had their own navigation and hijacked control of the browser.

It’s important to note that these experiments were done with exit overlays, triggered when the user was about to navigate away from the page. They do not inhibit users from leaving the page, nor do they interrupt users who are actively reading a post. And within the overlays, we offer highly valuable content that we put plenty of time and energy into — not throwaway resources.

I appreciate your comment, but it’s true that we abide by this philosophy with our own campaigns, and true that we encourage our customers to uphold the same standards. After more than 3.5 million views on our web properties, we’ve received fewer than 5 complaints.

The poll I conducted last year was, as you rightly say, not about exit pop-ups. That’s why I didn’t link to the study in my original comment.

As a result of that poll, I received a deluge of comments on social media from people expressing their utter contempt for all types of pop-ups.

If you are confident that your exit popups are valuable to the user, how about we run a new study to test public perceptions about the impact “Convertables” have on brand trust and reputation?

I predict that the findings wouldn’t be quite as conclusive as they were for entrance pop-ups, but the very fact that 98% of your visitors ignore your “valuable and timely” offers suggests the results won’t differ by much.

If we do run this poll and find that the majority of visitors feel negatively towards your “Convertables”, would you agree to stop using them and to stop persuading others to do so?

Convertables was created to meet the needs and demands of Unbounce customers — to give our users more opportunities to drive conversions. We believe there’s a way to do it that’s valuable to both the marketer and the visitor.

When it comes to overlay conversion rates, blog traffic is on the low end of the spectrum. Yes, the conversion rates cited in this post aren’t that impressive, but I still do not agree with your logic that because a user doesn’t convert it also means they find overlays (and what they offer) “of no value whatsoever.”

When we run overlays on campaign traffic (or pages outside the blog) we typically see conversion rates between 8% and 15%, and sometimes as high as 19%. If you skip ahead to 26:50 in this webinar (2 minute watch) http://webinar.unbounce.com/get-conversions-using-overlays-recording, I break down an example of a high-converting overlay we ran on campaign traffic, one that our visitors got excellent value from. By the same logic, we don’t believe the 81% of viewers who didn’t convert found the overlay annoying.

I understand there are a lot of differing opinions on how overlays affect the user experience. We definitely agree that many marketers use overlays in a less-than-delightful way, more reminiscent of pop up ads in the web’s early days.

Our goal is to equip Unbounce users with the tools to use overlays in the right way — one that allows them to drive conversions while respecting the user experience. To that end, we’ve loaded Convertables with triggers and targeting options (URL targeting, cookies, referring URLs, multiple triggers, location targeting) to help make that happen.

>>> “we don’t believe the 81% of viewers who didn’t convert found the overlay annoying.”

This is easily tested if you would like to know the answer rather than work on an assumption.

Daniel Davidson

Unless I’m missing something, which is very possible, it seems like your best performer was experiment #1 with a 2.65% conversion rate across the board. You’ve seen higher percentages for smaller subsets, but globally you’ve still fallen short of that 2.65%.

As of today, the overall campaign conversion rate sits around 2.45%, which is lower than our original of 2.65%. That said, the 2.65% was achieved with a single field overlay, and a conversion on the 4-field overlay is much more valuable to us. So yes, in a sense we have abandoned the original 2.65% test as a control group.

I should have also pointed out that blog traffic converts at a much lower rate than campaign traffic or website traffic. When we run overlays on non-blog pages we routinely get conversion rates between 5 and 15%, but the traffic volume is lower.

Ultimately our challenge here is finding ways to glean value from huge volumes of blog traffic, with the implicit understanding that we won’t be able to achieve the big conversion rates we do on other pages.

Angus

Daniel Davidson

Hi Angus,

Wow, okay, 5-15% sounds promising. 2% sounded rough, especially with you guys being pros with dedicated resources on a 2-year test. I gotta say, it took the wind out of my sails a little ;)

That clears things up considerably for me.

Your article did spawn a few additional questions.

I’m assuming your percentages are global and not segmented for a specific device in this article. Do you have any significant difference between desktop conversions vs mobile?

I was also wondering, with a 2% conversion rate, has there been a test of the same CTA offered in an overlay vs. inline (say, within an article)?

I’d be curious to see if inline performed better, if an inline CTA increased the overlay offer’s conversions (sort of like a second chance to grab it before you go), or if it’d have no effect at all.

Again, great testing. Thanks for the article.

Janine Jurji

This is so helpful! It would be great to see this done more frequently to see if there are any new takeaways.


Jayen Ashar

Have you tried a two-stage overlay? Like your clickthrough, but on the first stage, you collect the email address only and on the second stage, you ask your survey questions? Would be nice if it was all in the overlay and not another webpage.

I’ve personally never seen that on an exit pop, but I’m sure it’s possible.

Jayen Ashar

What I mean is: You should try a two-stage overlay. Should get the same conversion rates on the first stage as having just the email address, with the possibility of still getting survey questions answered.

Simba Mudonzvo

It all comes down to perception – a pop-up is like that ‘annoying’ person who butts into your conversations and usually tries to push their ‘own’ agenda. Exit overlays are also used aggressively by internet snake-oil salesmen who promise to make you rich overnight (see binary trading, bitcoin/cryptocurrency investments, etc.). Exit overlays can smell of desperation at times, because why wasn’t it a CTA in the said article/page in the first place?
Plus most pop-ups are not optimised for mobile devices: it is annoying to try to locate the ‘x’ just to get rid of them. They don’t feel natural (or native, should I say).
After 2 years, for a site like Unbounce to see 2-5% on the blog pages says a lot.

Really interesting – wow. Awesome case study. I’d love to see a parallel test done on, say, the fishing or lifestyle niche. Our niche (internet marketing) as a whole can be really skittish. We are almost trained to “x out of things”, close pop-ups and unsubscribe after being pummeled with so much info.

Either way, duly noted. Definitely going to think again about doing an exit pop now.

Also question – during these exit pop tests did you have any other popups running? E.g. was the exit pop the second popup they saw, or the only one?