We Tested Over 100 Different Facebook Ads in One Month – Here’s What We Learned

Getting your Facebook campaigns working like a well-oiled machine is harder than quitting gummy bears for a month. Believe me; I know what I’m talking about.

Unlike gummy bears, Facebook ads are highly unpredictable. Unless you’re an oracle, it’s impossible to forecast whether people will like your ads and which images will catch their attention.

So we decided that the best way to find out what works was to test over 100 different ad combinations over a one-month period.

It’s been a tough ride with its ups and downs, and we’ve learned our fair share of know-how.

I’ve also written about our initial experiences here if you’re eager to find out more.

Join our journey and see what we learned, test after test. Fasten your seatbelts and off we go!

Lesson #1: You really DO NEED to create more than five ads

Do you know what’s wrong with creating 1-5 ads and then sitting back and waiting for the campaign results? You might never see any results at all. Furthermore, you will never know what could have happened had you tweaked some more details. And this means that you’ll never learn what works.

There’s an old piece of wisdom from Lao Tzu: “To attain knowledge, add things every day. To attain wisdom, remove things every day.” And that’s basically what we did.

We created over 100 ad variations differentiated by headlines, ad texts, images, call-to-actions, and landing URLs.

It might be the right moment to note that we wouldn’t have been able to create such a systematic ad tracking system without AdEspresso.

[I know I’m writing for this blog, and you might just think that I’m promoting the tool, but that’s not the case. We really did save tens of hours (and hundreds of dollars) by using AdEspresso for setting up and optimizing our campaigns. So THANK YOU ADESPRESSO for being there!].

By using AdEspresso, creating all 64 ads for this particular ad campaign took me about 10 minutes. I’m truly grateful that I didn’t have to spend the best of my day doing all this work manually, and my tan looks better than ever! 😉

What did we test with all these 64 ads? Everything from images to call-to-action buttons.

Here’s a great explanatory graph from a Facebook ads A/B split-testing guide. There are so many different possibilities waiting to be explored.

By creating tens of different ad combinations, we were able to discover the most attractive ones and optimize our campaigns accordingly.

It’s up to you to decide which aspects of the ad will affect the outcome the most. Keep reading to learn what made the biggest difference in our case.

Lesson #2: Ads are like pets – they need constant attention

Out of our 105 ads, nearly 80% failed to attract a high click-through rate or deliver a low cost-per-click and cost-per-conversion.

Our job was to find out which ads didn’t belong to Team Awesome and optimize the campaigns accordingly.

Want to know how we did that? We optimized the hell out of our campaigns. And that’s what you too should do with every single Facebook ad campaign.

Here’s an example:

Look at the differences in those ads’ performance. Had we not chosen one image over the other and killed the worst-performing ads, we could have lost many valuable leads.

We also set up custom optimization rules so that whenever one ad set performed outstandingly well, its budget was increased at the expense of the other ad sets.

By using various optimization rules and suggestions from AdEspresso, we finally reached the point where our campaigns started to perform as expected. (Note: it took about five days of testing and optimizing before the campaigns really took off, so be patient!)
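To make the “winner takes more” idea concrete, here’s a minimal sketch of such a budget rule in Python. This is not AdEspresso’s actual logic; the ad set names, CPA figures, and the 20% shift rate are all made up for illustration.

```python
# Hypothetical "winner takes more" budget rule: move a fraction of every
# losing ad set's budget to the ad set with the lowest cost-per-conversion.
# All names and numbers below are illustrative, not real campaign data.

def reallocate_budget(ad_sets, shift_rate=0.2):
    """Shift `shift_rate` of each losing ad set's budget to the winner.

    ad_sets: dict mapping ad set name -> {"budget": float, "cpa": float}
    Returns a new dict of budgets; the total budget is preserved.
    """
    # The "winner" is the ad set converting at the lowest cost.
    winner = min(ad_sets, key=lambda name: ad_sets[name]["cpa"])
    new_budgets = {}
    moved = 0.0
    for name, stats in ad_sets.items():
        if name == winner:
            new_budgets[name] = stats["budget"]
        else:
            cut = stats["budget"] * shift_rate
            new_budgets[name] = stats["budget"] - cut
            moved += cut
    new_budgets[winner] += moved
    return new_budgets

ad_sets = {
    "light-image": {"budget": 50.0, "cpa": 12.0},
    "bold-image":  {"budget": 50.0, "cpa": 33.0},
}
print(reallocate_budget(ad_sets))
# The light-image ad set converts cheaper, so it gains budget.
```

In practice you would run a rule like this on a schedule (say, daily) and only after each ad set has enough conversions for its CPA to be meaningful.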

Lesson #3: What works for one might not work for another

There are so many different people in the world. Does it mean that you should also keep dozens of different ads running all the time? Yes, it does.

You need to create multiple ad campaigns and target different audiences to find out what works for each group of people.

Do not forget to test the Facebook and Instagram ad targeting just as often as you experiment with your ads! We discovered that in various countries, similar audiences preferred different ad images.

Out of these two images, the red one performed best in countries A and B, while the light image worked best in country C.

Even Facebook recommends using multiple images for your ad: “Selecting multiple images for your ads in a single ad set is an easy way to understand which ad performs best and to get the most out of the amount you spend on advertising.”

Here’s another example of successful image testing by Shopify. Notice how these images have a similar message but slightly different hues and layout.

Key takeaway: Avoid the “one size fits all” attitude. Try to predict how different people will react to various advertisements.

Now that we’ve gone through the more general rules and findings, I guess you’re eager to find out what exactly worked – which images, CTAs, and landing pages performed the best.

Lesson #4: Practical Findings

Here we are: this is the part of the guide you’ve all been waiting for.

We’ll share with you the key findings of what worked. You’ll learn if the words of the CTA made any difference, what type of ad images were the most attractive, and whether Mobile or Desktop ads topped the game.

Let’s find out!

“Learn More” or “Sign Up” Call-To-Action?

The most effective call-to-action buttons have been discussed many times before. According to our research, advertisers should stick to call-to-actions such as “Learn More”, “Sign Up”, and “Shop Now”.

We tested two types of call-to-actions – “Learn More” and “Sign Up.”

Guess which one yielded the highest click-through rate?

In our case, the “Learn More” CTA returned a 22.5% higher click-through rate than “Sign Up”.

But the problem with click-throughs to a landing page is that in most cases, it is not the end goal. Instead, marketers aim for sign-ups, purchases, or email subscriptions.

So we decided to test the conversion rate of both CTAs as well.

This time, the “Sign Up” call-to-action outperformed “Learn More” by 14.5%.

So guess which one’s the winner?

The “Sign Up” call-to-action got us more trial sign-ups, as that’s what people were prepared to do when clicking the button. They were genuinely interested in signing up for our product rather than just learning more about our offer.

Key takeaway: Always keep in mind that the ads with the highest click-through rate are not necessarily the ads with the highest conversion rate. Track both metrics to optimize your ads according to your end goal.
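The arithmetic behind this takeaway is worth spelling out. Here’s a small sketch with hypothetical numbers (not our campaign’s actual data) showing how one CTA can win on click-through rate while the other wins on conversion rate and cost per conversion:

```python
# Illustrative numbers only: how a CTA can win on CTR yet lose on
# conversion rate and cost per conversion.

def ad_metrics(impressions, clicks, conversions, spend):
    """Compute the three metrics you should track for every ad."""
    return {
        "ctr": clicks / impressions,
        "conversion_rate": conversions / clicks,
        "cost_per_conversion": spend / conversions,
    }

learn_more = ad_metrics(impressions=10_000, clicks=245, conversions=14, spend=120.0)
sign_up    = ad_metrics(impressions=10_000, clicks=200, conversions=16, spend=120.0)

# "Learn More" wins on click-through rate...
assert learn_more["ctr"] > sign_up["ctr"]
# ...but "Sign Up" converts its clicks better and costs less per conversion.
assert sign_up["conversion_rate"] > learn_more["conversion_rate"]
assert sign_up["cost_per_conversion"] < learn_more["cost_per_conversion"]
```

If your end goal is sign-ups, cost per conversion is the number to optimize for, not CTR.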

Light or Bold Images?

There are two options when it comes to the perfect Facebook ad image: be bold or be fragile.

Anything between those two extremes will likely fail to catch people’s attention. I bet you’ve experienced it yourself – after a busy day at work, the mind just wants to rest and ignores the newsfeed content that dissolves into the surrounding noise.

That’s why smart marketers test combinations of both light and bold ads that attract the eye more effectively.

Here are the two ad images we tested (in addition to slight variations of both):

Want to know which one of these images performed better?

When setting up the campaign, I bet on the bold red image. But guess what? Both visuals had the same click-through rate.

To avoid ad fatigue, we made a really simple yet effective maneuver. Instead of creating brand-new ad visuals, we used Photoshop to flip the visuals vertically and change the background hues. The new ads managed to catch more people’s eyes, resulting in more conversions.

Key takeaway: While many people prefer bold ads, some have built up the ability to ignore beaming advertisements. Create both light and bold images and test which one works best with your audiences.

Desktop or Mobile ad placement?

If you’re new to Facebook ads, all the choices in the campaign creation phase can seem overwhelming.

There are so many types of Facebook ads that one can choose from. It’s like standing in front of a candy stand and not knowing which ones to pick. (Once again: I know what I’m talking about… /me sighs).

As we created our ad campaigns, we weren’t yet sure which ad placement would work the best. I had read a lot about the effectiveness of advertising on Mobile and Audience Network. But personally, Desktop ads seemed like a logical choice.

So we headed to our personal oracle for advice. We tested the performance of both Mobile vs. Desktop ads and used AdEspresso to measure the results.

Here’s what we found:

Desktop ads had a 534% higher cost-per-click than ads placed on Mobile + Audience Network. But that’s not all this next screenshot has to say.

Notice that Mobile ads failed to return any conversions in this particular campaign, while the Desktop ad placement yielded eight conversions.

While Mobile ads got far more clicks (at a lower cost-per-click), the Desktop campaigns were the ones that attracted conversions and brought in new leads (which is usually the end goal).

In our case, Desktop worked better because we offer a fairly complex product. It’s best if people take their time to explore our web page in depth, which is unusual for mobile users (their attention gets diverted, they run out of time, etc.).

We all like to think we know our audiences like we know our children. But you should always leave room for error and test as many options as possible.

Key takeaway: I urge you to test as many ad placements as you can afford. Mobile ads work well for discount offers, awareness-raising campaigns, and subscription-based offers. Desktop ads serve as a means to sell more complex products.

The final verdict

Facebook gives you many different possibilities, all waiting to be tested. It’s our job as marketers to apply our creativity to finding new ideas for A/B tests, new ad types, and compelling offers.

AdEspresso has been an excellent companion on our journey of creating tens of different Facebook ad combinations and optimizing the campaigns according to outcomes.

Take these insights with you:

To succeed, you need to test a huge amount of different ads. Let your creativity fly!

Creating an ad campaign is 20% of the work. The other 80% of your time should be spent on optimizing and analyzing what works best.

Not all ad creatives work for all audience members. Create different ads and find out which bring in your most profitable new customers.

Trust the numbers – see which ad combinations, call-to-actions, ad placements, etc. work best and tweak your campaigns accordingly.

Alright, it’s time for everyone to get back to their Facebook ad campaigns and use these new insights to make them even more successful.

I’d also like to invite all of you to share your own Facebook ad A/B testing stories and interesting findings in the comments section below, so we can all benefit from each other’s mistakes and success stories!

If you’re paying for clicks, would it ever make sense to try strategies where you directly mention the price in the actual ad? For example, for a $3,000 website building service, would that help with conversions or hinder them?

Yes, I would definitely say that testing with the price in the ad is a great opportunity, especially if your prices are lower than the industry average.

People love to see numbers in ads as it helps them instantly evaluate the offer.
Still, if you consider your price to be higher than the industry average, it might not be the best idea to mention it in your ads.

Mentioning the price should be done with a clear objective and understanding of how it will affect the audience’s decisions.

Justin: no way, please don’t 🙂 The goal of an ad going to a cold audience isn’t to sell a high-ticket item such as a website building service. It’s to attract the click. Then it’s your landing page’s job to educate and inspire people to want to learn more about your company, or consume more valuable content. Then your landing page may or may not make an offer to sign up for something special, highly valuable, and relevant, such as downloading a “7 Worst Website Planning Mistakes That Smother Startups” cheatsheet, or a webinar where you continue to educate your prospects and prove your worth, along with demonstrating social proof and other trust indicators, so that when you make that $3,000 offer your audience knows and trusts you. It’s a process, and one must focus on one step of the process at a time instead of the end result: it’s like asking someone to marry you without even having small talk first 🙂 Hope that makes sense.

BPMedia,
It seems to me that you didn’t read the previous posts carefully enough.
The idea of mentioning the price of a product/service is to attract clicks from people who find the price acceptable or even cheap. That’s the perfect audience you’d want to attract because they’re interested in buying the product NOW.

I agree that capturing leads by using the combination of Facebook ads and content marketing is a highly efficient tactic, but you missed the point here: The idea of mentioning product prices in the ad is to get buyers in short term (to complement long-term lead nurturing efforts).

I suggest advertisers engage both in audience building and direct selling offers while working with Facebook ads.

Along with what others here have already said, I think that’s a great idea that makes a lot of sense. Whether you display the $3,000 price or break it down into months, if you’re paying per-click, then you’ll only pay for those who know the price, and are therefore much more qualified.

My guess is that the number of conversions you see will go down because you’re narrowing your audience a bit by throwing the price on the ad, but your CPA will go down a lot! Great idea.

Totally agree, we do a lot of testing on Facebook with video ads made with Shakr.com and it really matters which design style you choose. A video could use the same photos/video clips and text but if the “style” is different you’ll get a completely different result.

Let me know if you guys want to make some joint collaborations/testing/experiments!

In lesson 1, your campaign image shows 3 image campaigns, which I’m assuming are then broken down into ad sets and ads. But looking at the ads, you have them segmented by Interest #1, 2, 3, 4, 5. I can’t change the target audience for a single ad; is this something AdEspresso has the capability to do?

Another question in regards to lesson #4 about desktop vs mobile:

How can you decide that desktop is better than mobile just because it had 8 conversions compared to 0? I ask because you’ve spent far more on desktop than on mobile. Additionally, your cost per conversion is $33, and you haven’t spent that amount on mobile. So wouldn’t it be best to spend equal amounts on the two placements, or at least spend on mobile up to the amount it cost you to convert someone on desktop?

Regarding your second question: as Facebook often reports an inaccurate number of conversions, we use other tools to track conversions and then analyze the results.
In our case, desktop has always returned more conversions at a lower cost, which is why we spent more on the desktop placement. But we still left some budget on mobile for testing.

Nice article again. OK, I have a question. Say one of your ads starts doing great and you increase its budget. How long will it run well before you need to restart with a new campaign? We have to watch ad frequency, and when it nears 5 we should stop the ad because it drives up the CPC, etc. It looks like FB advertising is an ongoing testing game where you can’t get the same results every time you run a new campaign.

We kept our target audiences big as we were just getting started with Facebook campaigns. Each ad set had a wide audience of 40,000–80,000 people. We were looking to learn about the audience afterwards, so we decided to start big and create large custom audiences.

In this particular test, we didn’t use remarketing. That’s a subject for another article!

I work for a small company and we have a comparatively tiny overall ad budget (~$120). I’ve found that split-testing so many variables with such a small budget means there’s not enough engagement per ad to gain any reliable insights.

Which variables would you say were *most* crucial, out of all the ones you tested? I will probably only be able to afford split testing for one or two variables, realistically.

You’re on the right track: to get reliable A/B test results, you should have at least 5,000 impressions on each variation and a total of 500 clicks.

I suggest you start by testing ad visuals, as they’re the first thing that catches people’s attention.
Also, conduct split tests on your Facebook ad audiences, as targeting can have a huge effect on your overall ad performance.
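To go one step beyond the rule-of-thumb thresholds above, you can check whether an observed CTR difference is statistically significant with a standard two-proportion z-test. Here’s a minimal sketch using only Python’s standard library; the click and impression counts are hypothetical, not data from the post:

```python
# Two-proportion z-test for comparing the CTRs of two ad variations.
# Sample numbers are hypothetical; this is a sketch, not a stats library.
import math

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Return (z, two-sided p-value) for the difference in CTR."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled proportion under the null hypothesis of equal CTRs.
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variation A: 140 clicks / 5,000 impressions; B: 100 clicks / 5,000.
z, p = ctr_z_test(clicks_a=140, imps_a=5000, clicks_b=100, imps_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If p comes out below 0.05, the CTR difference is unlikely to be chance; if not, keep the test running (or accept that the variations perform about the same, as happened with our light vs. bold images).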