Results of our “real-world” testing to prove that this is a legitimate & widespread issue.

A quick guide to help you identify the scenarios where you’re at high risk vs. low risk of seeing major impact on your reporting data from overstated conversions.

The solution to fixing it going forward (even if you’re completely intimidated by the FB Pixel, tagging, JavaScript, etc.)

Part 1 - Why Duplicate-Conversions Occur

The primary cause of this “over-counting phenomenon” is simple: the same conversion event fires multiple times for the same user, when the underlying event should only count once per user. This is especially true for events that fire “on page-view” (including URL-based custom conversions).

The Facebook Pixel doesn’t reliably “de-duplicate” conversions, no matter whether they’re URL-based custom conversions, standard events, or event-based custom conversions. While the documentation on this is seemingly impossible to find, I recall a conversation with Facebook’s support / engineering team where they stated that the Pixel WILL de-duplicate conversions on a per-user basis, but only if the duplicate event-calls occur within a 3-minute window.

I’m not sure how accurate that is, or how FB determines which conversions are duplicates, but the fact remains: even with that built-in “safety net” on Facebook’s side, the data shows us that duplicate conversions can significantly distort the performance reporting of Facebook campaigns wherever website conversions are involved (whether used for optimization or reporting only).

A Realistic (And Common) Example Of "Conversion Duplication"

Assume that you run Facebook Ad traffic to a landing-page (squeeze-page) where the user opts-in for a lead-magnet, training, etc.

You fire a lead-event (or use a URL-based custom-conversion) on page-load of the next page.

The problem: there is often content / information on the “thank you page” that the user might want to revisit at some point in the future (download links, embedded videos, product-sales info, etc.).

Every time the user refreshes / revisits this thank-you page, you record a new conversion (“Lead”), which leads to artificially-inflated performance-metrics.

Now, let’s look at some real-world examples of this phenomenon & the particular scenarios where you need to double-check your data accuracy.

Part 2 - Real-World Testing

Due to the limited documentation around how Facebook de-duplicates conversions in their systems, we wanted to be sure that this is ACTUALLY a problem, using real-world ad accounts & reporting data.

To test this out, we built a “conversion caching” system in Google Tag Manager and applied it to a few of our key FB Ad accounts & conversion-events.

In a nutshell, this “conversion caching” system does 3 key things:

Allows us to control which Facebook “events” should only fire once per-user

On the initial conversion, we fire the event-tag like normal & then set a cookie to persist the fact that they’ve already converted for this particular-event

If the user re-triggers the “event” tag (by reloading the page, etc.), then we fire a custom-event called “cachedEvent”, which includes the name of the “original event” as one of the parameters.
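The three steps above can be sketched as a small JavaScript snippet (e.g., inside a GTM Custom HTML tag). The cachedEvent name and the original-event parameter mirror the description above; everything else (function names, the use of a plain object as the cookie store) is a simplification for illustration — in a real tag you would read/write `document.cookie` and call the page’s `fbq()` directly.

```javascript
// Sketch of the "conversion caching" idea described above.
// "pixel" stands in for the Facebook Pixel's fbq() function, and "store"
// stands in for the cookie jar; both are injected so the logic is testable.

function makeConversionCache(pixel, store) {
  return function fire(eventName) {
    if (!store['fb_cached_' + eventName]) {
      // Step 2: first conversion -> fire the real event, then persist the
      // fact that this user has already converted for this event
      pixel('track', eventName);
      store['fb_cached_' + eventName] = true;
    } else {
      // Step 3: duplicate -> fire "cachedEvent" carrying the original name
      pixel('trackCustom', 'cachedEvent', { original_event: eventName });
    }
  };
}

// Simulate a user hitting the thank-you page three times:
var calls = [];
var fire = makeConversionCache(
  function () { calls.push([].slice.call(arguments)); },
  {}
);
fire('Lead');
fire('Lead');
fire('Lead');
console.log(JSON.stringify(calls[0])); // ["track","Lead"]
console.log(JSON.stringify(calls[1])); // ["trackCustom","cachedEvent",{"original_event":"Lead"}]
```

Only the first fire records a real “Lead”; the two reloads produce “cachedEvent” fires instead, which is exactly what lets us count duplicates separately.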

#3 above is perhaps most important to our test, because it enables us to see what % of conversions are unique (per-user) vs. duplicates (and break-down the results by “original event” — i.e: Lead, Purchase, etc.).

The design, testing & publishing of the “caching system” (CACHEMONEY) took our internal team about 2 weeks. Once we had a working caching solution, we deployed it to a few key client GTM accounts in a few clicks by using the “import container” feature in GTM’s admin area.

Thus, we were able to “bolt-on” our solution to any existing GTM-account in a matter of a few minutes. From there, we only had to reconfigure a few of our Facebook “event tags” in GTM to use the new caching system, test & publish.

Of course, we needed a way to capture the occurrence of these “cachedEvent” tag-fires, so that we could pull them into our Ads Manager reports as a metric.

We created a new “custom conversion” in FB Business Manager where the settings look something like this:

For this particular test, the client’s underlying “funnel” is pretty straightforward:

Upon clicking the CTA in the Facebook Ad, the visitor is directed to a campaign-specific squeeze-page where the desired-action is: the visitor enters their name & email-address & submits a form to “soft join” the forum.

Once they opt-in on the first-form, they’re taken to a thank-you page (note: when this page loads, we fire our caching-equipped “Lead” event) that holds a longer-form that enables them to complete their forum registration (set username, password, etc.)

If they submit this longer form, they are taken to a final thank-you page, where we fire our caching-equipped “CompleteRegistration” tag on page-view.

Part 3 - Test Results & Key Takeaways

We waited about 20 days to allow the data to accrue, then pulled a few reports:

In FB Business Manager → Data Sources → Pixels → “Details”:

We changed the date-range to “Last 14 Days” then noted the total volume of the Lead, CompleteRegistration & cachedEvent rows.

Immediately, we were struck by the relative-volume of “cachedEvent” fires (vs. Leads & CompleteRegistrations).

As you can see in the image above, we see 4,100 cachedEvents (which are duplicate-conversions caught by our caching-system) vs. 4,900 legitimate “Lead” & “CompleteRegistration” events. Ouch — this indicates that we have substantial occurrences of these “duplicate-conversions”.

Now we need to dig deeper to see if the over-counting is equally distributed between the two events in the funnel. This is where our “original_event” param comes in handy.

For this, we go to Facebook Analytics (in the pixel-screen above, there’s a link in the upper-right of the screen to “View Analytics” & choose the pixel as the “event source” once inside the platform).

Once in Facebook Analytics, we go to “Breakdowns” (in the left-hand menu) and create a new report that looks something like this:

We’re breaking-down the occurrences of “cachedEvent” by the value of the “original_event” parameter, and we see that the vast-majority of the “cache-saves” were related to the Lead event … and moreover, that the count / user is just above 2!

For reference, for the same time period, we tracked 2762 total occurrences of the Lead event (where the breakdown is the “caching ID”)… notice how the “Unique Users” figure nearly matches the “Count” field in the screenshot below.

Thus, without the caching system, we would have counted 2762 (legit leads) + 3906 (duplicate conversions) = 6668 Leads, which is overstated by 141% (3906 / 2762 ≈ 1.41)!

Next, we want to see if this pattern holds-true for our visitors from Facebook Ads.

We recorded 1587 cachedEvents vs. 2133 leads… meaning, without the caching in place:

We would have overstated the number of leads-generated by 74.4%

CPA (Lead) would have been reported at $1.67 (understated by 43%)
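A quick sanity check of the numbers above (no spend figure is needed, since it cancels out of the ratios; note the article rounds 42.7% to 43%):

```javascript
// Reproduce the duplicate-conversion math from the results above.

// Pixel-wide (all traffic sources):
var allLeads = 2762, allDupes = 3906;
console.log(((allDupes / allLeads) * 100).toFixed(0) + '%');    // 141% overstatement

// Facebook Ads traffic only:
var trueLeads = 2133, dupes = 1587;
var reported = trueLeads + dupes;                               // 3720 "Leads" reported
console.log(((dupes / trueLeads) * 100).toFixed(1) + '%');      // 74.4% overstatement

// Reported CPA is spend / 3720, while true CPA is spend / 2133, so the
// reported CPA understates the true CPA by 1 - (2133 / 3720):
console.log(((1 - trueLeads / reported) * 100).toFixed(1) + '%'); // 42.7%
```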

While this is a substantial finding, imagine if this was a Purchase event (with associated “value”) and we had optimized our campaigns based on ROAS!?

We might have scaled losers, cut winners & LOST money when we thought we were making money. This is why this entire topic is so critical to performance-driven Facebook Advertisers.

Although this is a single example / small sample, we found similar patterns in other accounts as-well. Given the apparently-wide scope of this problem, the next logical question is: what should YOU do from here?

Part 4 - Your Solution Implementation

Step 1 - Determine Your Potential "Danger"

The impact of conversion-duplication will depend highly on the types of funnels / conversions that you use & the underlying content-platforms. Here are some “low danger” vs. “high danger” scenarios:

Low-Danger Scenarios

Purchases from Shopify: Shopify includes a form of conversion “de-duplication” via their liquid-tag {% if first_time_accessed %}, which allows you to designate which tags should fire once-per-order vs. always (since visitors tend to return to their purchase confirmation-page to check the order-status, etc.).

I can’t find any documentation on how the built-in “Facebook Pixel Integration” in Shopify handles the purchase-events, but I assume they’re using a “fire once per purchase” approach by default.
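For reference, a minimal sketch of how that liquid tag is typically used in the order-status page’s “Additional scripts” box. The `{% if first_time_accessed %}` tag is Shopify’s; the fbq call inside it, and the use of `total_price` / `currency`, are illustrative assumptions, not Shopify’s own integration:

```liquid
{% if first_time_accessed %}
  <script>
    // Fires only on the first view of the order-status page,
    // not on later revisits to check order status.
    fbq('track', 'Purchase', {
      value: {{ total_price | divided_by: 100.0 }},
      currency: "{{ currency }}"
    });
  </script>
{% endif %}
```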

Use PixelYourSitePro Plugin in WP / WooCommerce: for those using WordPress / WooCommerce, there’s an excellent plugin called “Pixel Your Site Pro” that is well-worth the money and offers an option to fire the purchase-event once-per-transaction:

Events That SHOULD Fire Multiple-Times: of course, some events should fire many-times per user. Examples are AddToCart and ViewContent, but ultimately, it’s up to you to decide which events or (event + param) combos should only be tracked once per-user.

Forms Using “onSuccess” Callbacks To Trigger Your Tags: this one isn’t as prevalent, but in an ideal-world, if you have a conversion that occurs based on the submission of a form, the tag should fire based on the “successful” submission of the form itself, not on the subsequent pageview. This way, you would only count duplicates if a visitor submits the form many times.

This setup typically requires the help of a developer who is experienced with form-handlers (JavaScript, in particular), and requires that you have access to ALTER the form-handling functions (not usually the case if you’re using forms from an outside platform — more common when your forms are “home-brewed”).

Note / Aside: You might be wondering why we don’t mention the use of “Form Submission” triggers in GTM. Our experience is that these built-in triggers don’t play well with iOS (for some reason), and thus, we’ve steered clear of using form-listener triggers in GTM for the past ~2 years.
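As a sketch of that “fire on success, not on pageview” pattern: the idea is to couple the Lead event to the form handler’s success callback. All names below (the `/subscribe` endpoint, `handleSubmitResult`) are hypothetical; you would wire this into your own form’s submit handler.

```javascript
// Fire the conversion only when the form handler reports success,
// instead of on the thank-you pageview. Names here are hypothetical.

function handleSubmitResult(ok, pixel) {
  if (ok) {
    pixel('track', 'Lead');  // fires once per *successful* submission
    return true;             // caller may now redirect to the thank-you page
  }
  return false;              // failed submission: no conversion recorded
}

// In the browser, the wiring might look like (illustrative only):
//   fetch('/subscribe', { method: 'POST', body: new FormData(form) })
//     .then(function (res) { handleSubmitResult(res.ok, fbq); });

// Simulated: one successful and one failed submission
var fired = [];
handleSubmitResult(true, function (type, name) { fired.push(name); });
handleSubmitResult(false, function (type, name) { fired.push(name); });
console.log(JSON.stringify(fired)); // ["Lead"]
```

With this approach, a user reloading the thank-you page fires nothing at all; you’d only see duplicates if they actually submitted the form again.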

High-Danger Scenarios

Purchases From Non-Ecommerce Platforms: if you build funnels / pages that contain purchase-forms in platforms like ClickFunnels, then this should be your starting-point. Not only do many advertisers “hard-code” the value of the purchases (instead of setting it dynamically) in page-builder-based funnels, but I’ve yet to come across a tool that offers the “fire-once” capability that Shopify does.

Further, in this situation, you tend to see a lot of one-click upsells and pages that serve dual-purposes (confirmation of previous purchase + sales info & purchase-options for subsequent product), which tend to attract the most duplicate-conversions.

Lead Acquisition Campaigns: most lead-acquisition funnels (for email-options, webinar registrations, product-demos, etc.) trigger the conversion-event upon view of the thank-you / confirmation page. As shown in our example, people tend to hit these pages over & over, especially if there’s training-content, videos, download links, etc. on the page.

Funnels With One-Click Upsells: we covered this a bit above, but one-click upsells tend to drive high “conversion duplication” because they often contain lengthy videos or sales copy (which the visitor might want to revisit several times before electing to purchase).

Step 2 - Implement a caching-system & run a test

Now that you’ve identified your high & low-danger focal-points for your funnels, the next step is to implement a caching-system “test” so that you can quantify the impact of duplicate-conversions.

Here, we offer a pre-built GTM container with our proprietary “caching” system (which we call “CACHEMONEY” bc we have street-cred) that you can get as part of our flagship training course “Facebook Pixel Blueprint V2”.

This training offers much more than just the CACHEMONEY system, but one major benefit of the course is that you get both the caching system & detailed walk-through videos on how to set it up & use it properly.

Step 3 - Communicate Findings With Intellectual-Honesty

Now the question becomes: how do you relay this information to your boss or client without getting fired or completely destroying mutual trust? This is not easy &, of course, will depend on the client’s personality & history with your agency.

The only advice I can offer here is:

Honesty is critical, above all else

The importance of “intellectual honesty” (telling what the data actually says, not what you WANT it to say) was seemingly forgotten long ago in the media-agency world… but you should NEVER try to massage the data or spin your analytics work to make things appear rosy when they are not.

In this case, your findings will have retroactive implications. Once you alert the client to the inflated results, their next questions will be: “how long has this been going on?” & “what were our TRUE results, then, based on this information?”

Unfortunately, custom events & custom conversions aren’t retroactive, so you’ll likely have to assume that whatever %-gap you see in your test period also holds true for past campaigns & re-pull the reports. Of course, this is painful, but it’s best to get it out of the way & use it to demonstrate that your interests are aligned with your clients’, regardless of the likely emotional impact of the analysis.

Focus On The Positives

Here, it’s important to communicate that this seems to be a widespread-issue (not unique to you or the client), and most importantly, that the vast majority of agencies & advertisers still aren’t even aware of it. If communicated carefully, you can use this situation as proof that your agency or team is on the “cutting-edge” & continuously seeks new ways to increase the accuracy & reliability of the reporting-data.

Conclusion & Next Steps

I believe / hope that Facebook will soon offer the ability to show performance-metrics in Ads-Manager on a “unique event” basis. They have already added this option in the “Customize Columns” UI, but it appears to be limited to SDK-driven events (mobile-app events). Naturally, I think this solution will be extended to FB Pixel-driven events (website) at some point in the future.

Since there’s no-telling when that will occur, for the time-being, I recommend you take the initiative & implement the steps outlined in this post.

This is the course I wish I had 5 years ago, when we started focusing heavily on tagging, Google Tag Manager & “technical marketing”. This unique specialization completely changed our business & allowed us to acquire large clients on retainer (who would have otherwise been unapproachable).