This massive new Google policy change has led Google to implement upgrades inside some of its ad products, like AdSense. Tools like Auto Ads and Ad Balance experiments sound really good on paper. Less ad density, fewer ads, and the same amount of revenue! Great, right?

Unfortunately, broad solutions to ad balance generally don’t work – and these products have not received the best feedback from users. So how do publishers account for ad balance, reduce ad density in light of Google policy changes, and improve visitor experiences along the way without losing any revenue?

Fortunately, I have access to tons of data on this subject that I’ll share below. I’ll highlight the impact these things have on overall ad revenue and other important factors like SEO as well.

Why balance ads on a web page?

There are really two primary reasons why publishers typically want to explore the idea of balancing ads. Balance in this case usually refers to a general idea of finding a happy medium between earning revenue and preventing visitors from being annoyed by lots of ads on a website.

The first is to preserve the long-term viability of the web property. Publishers understand that ads could be negatively impacting visitor experiences and want to ensure the long-term health of their traffic. Additionally, they understand that visitor experiences have some bearing on SEO as well.

The second is a newer one. Google is now failing sites with ad density violations in the new Ad Experience Report. This means that all of their ads will soon be blocked by Chrome unless the violations are fixed. It’s also possible that violations like this could carry over into SERP positioning as well.

The challenge with true ad balance experiments is running them in an effective and objective fashion. This means using data and proven methodologies to reduce the number of ads on a page while still maximizing the value of each session on your website.

Objectively understanding the impact of ads on visitors

As I mentioned, using data to understand how ads are impacting visitors is massively important. Most of the bad experiences people have had with tools like AdSense Auto Ads are directly related to losing revenue as a result of broadly removing ads without understanding specifically how they affect the value of other ads and the objective user experience metrics that influence things like SEO.

Understanding these user experience metrics — and the deeper ones that specifically highlight visitor engagement — can offer an objective view of how your ads are impacting traffic and revenue over time. These can be measured inside Google Analytics and observed easily by running ad balance experiments that measure each of these metrics against different ad densities on the same landing page (as shown in the blog linked above).

Running ad balance experiments yourself

As mentioned above, running ad balance experiments centers on two things: understanding the impact of ads on visitor experiences, and implementing a solution that minimizes that impact without harming revenue.

When running these experiments, it’s important to start by isolating landing pages. Landing pages let you compare true apples to apples when deciphering how visitors’ journeys are impacted by experiments, because you can see how different users behave when they all land on the same page (giving them similar opportunities to click the same navigation and links, read the same length of content, etc.).

One of the best ways to start these landing page experiments is to see how ads impact visitors in three distinct ways:

ad density

ad location

ad type

Over time we’ve learned that these three variables govern user experiences more than any other ad factors. All three experiments can be run the same way, but they require close monitoring.

Experimenting with ad density

Let’s start with ad density. Adjust the number of ads on a page over similar date ranges (Mon–Fri for each experiment, for example). See how bounce rate, session duration, and pageviews per visit change across the different ad density experiments. Do the numbers become skewed at 3 ads, 5 ads, 7 ads?

Then, segment your audience in Analytics as you review these UX metrics in the context of these experiments. How were organic visitors impacted vs. social media visitors? It may be really important for SEO to understand specifically how organic visitors are impacted by these changes. You may learn that 7 ads work well most of the time, but not for organic visitors. This is something you’d ultimately want to account for if you get a lot of organic traffic on that particular landing page.
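As an illustration, this kind of density comparison segmented by channel can be scripted with pandas. The rows below are made-up sample data in the shape of a Google Analytics export (the column names and the example landing page are assumptions); real numbers would come from the GA interface or reporting API:

```python
import pandas as pd

# Illustrative session-level data shaped like a Google Analytics export.
# Every value here is invented for the sketch.
sessions = pd.DataFrame({
    "landing_page":     ["/example-article/"] * 8,
    "channel":          ["organic", "organic", "organic", "organic",
                         "social",  "social",  "social",  "social"],
    "ads_per_page":     [3, 3, 7, 7, 3, 3, 7, 7],
    "bounced":          [0, 1, 1, 1, 1, 0, 0, 0],
    "session_duration": [180, 40, 35, 20, 60, 150, 200, 170],  # seconds
    "pageviews":        [3, 1, 1, 1, 1, 4, 5, 3],
})

# Compare UX metrics across ad-density variants, split by traffic channel,
# on a single landing page so the sessions are apples to apples.
summary = (
    sessions.groupby(["channel", "ads_per_page"])
            .agg(bounce_rate=("bounced", "mean"),
                 avg_session_sec=("session_duration", "mean"),
                 pages_per_visit=("pageviews", "mean"))
)
print(summary)
```

Reading the resulting table row by row shows exactly the pattern described above: a density that looks fine overall may still degrade the metrics for one channel (here, the invented organic numbers worsen at 7 ads while the social ones do not).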

If you’re an Ezoic publisher, this is constantly occurring. Our system is always optimizing for SEO, UX, and revenue related to density. One of the interesting things we’ve learned from seeing thousands of these experiments is that managing publisher UX on organic visits plays a really strong role in SEO. When you’re able to influence this in a positive way, we see publishers improve their search rankings for keywords with associated landing pages.

Experimenting with ad location

If you can account for ad density parameters for all of your different users, the next step is to figure out which locations work best for different users. Again, this is best done by landing page. All visitors are different and one of the best ways to optimize for total session revenue is to segment experiments by landing page (as these visitors will have the most similar types of sessions).

Why EPMV (total session revenue)?
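Before digging into locations, it’s worth pinning down the metric. EPMV is earnings per thousand visitors: total revenue normalized per 1,000 sessions, which captures the value of a whole visit rather than any single ad impression. A minimal sketch (the dollar and session figures are made up):

```python
def epmv(total_revenue: float, visits: int) -> float:
    """Earnings per thousand visitors: total session revenue
    normalized per 1,000 visits, regardless of how many ads or
    pageviews each individual session contained."""
    if visits == 0:
        return 0.0
    return total_revenue / visits * 1000

# Example: $84.30 earned across 5,620 sessions on one landing page.
print(round(epmv(84.30, 5620), 2))  # dollars per 1,000 visits
```

Because EPMV is session-based, removing an ad that lowers per-page ad earnings can still raise EPMV if visitors respond by viewing more pages per visit.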

Often, ad locations are selected based purely on a subjective bias. Publishers will think that an ad in one location is “ugly” and may be disruptive to visitors browsing the site. If there’s one thing I’ve learned from seeing thousands of sites worth of data on this subject it is this… it is nearly impossible to predict which ad locations will be annoying to certain visitors without data.

I’ve seen publishers convinced that certain ad locations (specifically at the top of the page or mid-content) would make the website too ugly and cause visitors to think badly of their site. Then, once testing occurred, they found the opposite. The ads they thought were in very clean locations actually caused bounce rates to go up!

The best advice I can share here is this… test ad locations on different visitor segments (organic vs. social) and customize the delivery for both. They will likely prefer different locations and being able to extend the session length for these visitors will result in better UX metrics and more revenue.

I would also test more than just the ad locations that you like. One of the things we’ve learned from data is that the more ad locations are tested, the better the results when those results are tailored to different audiences. For example, Ezoic users that add the most ad placeholders see the largest increases in revenue and the biggest decreases in bounce rate.

Ezoic works by allowing users to set as many ad placement locations as they like using a Chrome extension. Ezoic then uses machine learning to test these potential locations on all kinds of different visitors. The more that publishers test, the better the results they get. The lesson here is that the more variables you can tailor and test, the more there is to gain.

Experimenting with ad types

There is actually more to consider after you have found the sweet spot for ad density and ad location per visitor. Selecting the right ad types for different pages and visitors is another variable that can impact visitor experiences and ad revenue.

For example, a text ad may load faster for visitors on mobile devices, resulting in lower bounce rates or higher earnings from that ad than from typical display ads. Additionally, you may learn that a native ad is causing much higher bounce rates among organic visitors (a trend we’ve seen from time to time).

It’s important to look at your landing pages once again and test different ad types. Understanding how different types of visitors landing on certain pages are affected by native, display, link, and text ads can have a major effect on the visitor’s session length and the publisher’s overall revenue.
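When comparing two ad-type variants this way, it helps to check whether a bounce-rate gap is real or just noise. One common approach is a two-proportion z-test; here is a stdlib-only sketch, with invented counts for a native-ad vs. display-ad split among organic visitors:

```python
from math import sqrt, erf

def two_proportion_z(bounces_a: int, visits_a: int,
                     bounces_b: int, visits_b: int):
    """Two-sided two-proportion z-test on bounce rates.
    Returns (z statistic, p-value) using the pooled standard error."""
    p_a = bounces_a / visits_a
    p_b = bounces_b / visits_b
    pooled = (bounces_a + bounces_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up example: native ads vs. display ads for organic visitors.
z, p = two_proportion_z(bounces_a=560, visits_a=1000,   # native: 56% bounce
                        bounces_b=480, visits_b=1000)   # display: 48% bounce
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented counts the gap is comfortably significant; with small traffic segments it often isn’t, which is exactly why per-segment ad-type decisions need adequate sample sizes before acting.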

What’s more, getting this wrong could have far-reaching repercussions. I talked about this before in relation to native ads. If you show them to all visitors and your organic visitors hate them, they could be causing a negative impact on SEO and you wouldn’t know until it’s too late.

Do AdSense Auto Ads & Ad Balance Tools Work?

As we’ve shown above using some of our data, running ad balance experiments is more complex than simply removing ads that don’t broadly seem to be adding a lot of value. Adding or removing even a single ad impacts the value of all other ads on the page.

Reducing or increasing the competition for bids from one ad on the page can impact the demand for and value of other ads on the page in the RTB protocol. Sometimes removing ads drives up the value of all other ads on the page. Other times it has the opposite effect, even when the removed ad had lower overall earnings on its own.

Without being able to tailor changes to each landing page or different visitor segments, these experiments are largely shooting in the dark and rarely have the ability to positively impact visitor experiences. Additionally, they offer almost no chance of increasing revenue (often being advertised as only slightly DECREASING revenue). Google’s intention here is good. They want to provide publishers with some tools to help them measure and record the impact ads have on their visitors.

Unfortunately, tools that broadly apply ad changes to large visitor segments — across all landing pages — will almost always provide disappointing results, as we have learned from thousands of sites that have performed these experiments.

Wrapping it all up

If you’re interested in improving visitor experiences on your website, there is plenty of information above (or linked in the content above) to help you get started. I recommend avoiding the Google products for AdSense publishers that broadly apply changes to ad density or location, as the results to this point are not very strong. There are some threads about these in the AdSense product forums.

The more you can tailor individual visitor experiences, the better results you will get as a whole. This is usually a good starting point for any experiment you plan to run on your website.

About The Author

Tyler is an award-winning marketer, SEO expert, successful blogger, and keynote speaker. He has composed content for some of the world's top publications and has over a decade of experience building businesses in the digital space. Tyler is the current Head of Marketing at Ezoic and serves as an SEO and marketing expert for start-up competitions across the U.S.

3 Comments

Am I understanding that graph correctly? Your bounce rate with new social visitors was LOWER when you had 7 ads instead of no ads? If that’s true and not just a made up graph, I can’t quite get my head around that one!

It’s true. This is not uncommon for some sites; it’s usually seen with social traffic. However, you actually have to look at several other metrics to determine if this is still indicative of a good experience or a bad one. For example, the ads themselves may be disrupting the browsing experience in a way that causes accidental clicks, etc. Meaning, visitors may not be bouncing away from the site, but bouncing from page to page more frequently. This would be a case where navigation bounces and total engagement time should also be monitored.

Good piece of information though. I believe running ‘ad balance’ on websites with huge traffic could give better information and understanding. Google removes the policy of three ads per page and now introduces “ad balance.” You can have more than three ads on a page, but running ad balance reduces that to fewer than three active/viewable. That’s the reason why Google tells its publishers that running ad balance won’t lose revenue…