Tip 1: Platform metrics alone are insufficient for comparing advertising across platforms. We must also look at what happens AFTER people click on the ad.

Google Analytics, combined with platform metrics, offered full insight into ad performance. The first week, the budget was evenly allocated across four platforms: Twitter, Facebook, Google AdWords, and LinkedIn. To compare which platform yielded the best results, we looked at outcomes, not just the cost-per-click to the site. As we had previously discovered, one site could generate many click-throughs at a low price, but that does not necessarily translate to conversions (registrations, in this case).

Tip 2: Be wary of extremely high session durations that greatly skew the average session duration.

After Week 1, Twitter yielded more than double the click-throughs of any of the other three sources, despite equal budgets. But while more ad click-throughs are a good sign, they are not proof of good conversion performance. Using Google Analytics, we looked at average session duration (how long the average person spent on our site after clicking on the ad). After Week 1, the platforms ranked as follows by average session duration, greatest to least: LinkedIn, Google AdWords, Twitter, Facebook.

One Google AdWords session lasted 50:33, which drastically skewed the average session duration. Excluding that anomaly, the remaining Google AdWords sessions averaged only 0:02 per session, lower than any of the other three platforms.
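To see how a single extreme session can dominate the mean, here is a minimal sketch. The 3,033-second (50:33) anomaly comes from the text; the count of 36 remaining two-second sessions is a hypothetical illustration, not the actual data.

```python
# Illustrative numbers: 36 hypothetical 2-second sessions plus the one
# 50:33 anomaly (3,033 seconds) described in the text.
durations_s = [2] * 36 + [3033]

mean_all = sum(durations_s) / len(durations_s)

# Drop the single extreme session and recompute.
outlier = max(durations_s)
trimmed = [d for d in durations_s if d != outlier]
mean_trimmed = sum(trimmed) / len(trimmed)

print(f"mean with outlier:    {mean_all:.0f} s")   # looks respectable
print(f"mean without outlier: {mean_trimmed:.0f} s")  # the real picture
```

With these assumed numbers, the mean with the outlier is about 84 seconds, which looks healthy; without it, the mean collapses to 2 seconds. A median, or a trimmed mean like this, is far more robust to one runaway session.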

Tip 3: Know your desired visitor path and track who is following it.

Another key metric to monitor ad performance is conversion. We tracked the number of sessions that moved through the conversion funnel from the event page to the registration page. The ads directed users to a landing page with the event description and a link to the registration page at the bottom. The first week, the event page had 37 sessions from Google AdWords, yet none of those sessions yielded clicks to our registration page. Taking this into account, along with the 0:02 average session duration once we took out the anomaly session, we stopped using Google AdWords for this event.
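The funnel check above can be sketched as a simple rate calculation per source. The Google AdWords row (37 event-page sessions, 0 registration-page clicks) comes from the text; the Twitter row is a made-up placeholder purely for contrast.

```python
# Sessions at each funnel step, per traffic source.
# AdWords figures are from the post; Twitter figures are hypothetical.
funnel = {
    "Google AdWords": {"event_page": 37, "registration_page": 0},
    "Twitter":        {"event_page": 80, "registration_page": 12},
}

for source, steps in funnel.items():
    sessions = steps["event_page"]
    # Guard against division by zero for sources with no event-page traffic.
    rate = steps["registration_page"] / sessions if sessions else 0.0
    print(f"{source}: {rate:.1%} of event-page sessions reached registration")
```

A source with many sessions but a 0% step-through rate, like AdWords here, fails the funnel test no matter how cheap its clicks are.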

We spent only about 2% of our overall Annual Conference advertising budget on Google AdWords, and we are OK with that. It wasn't working for our purpose (while other sources were), so we nixed it.

Tip 4: Don't be afraid to cut a platform that isn't delivering.

Likewise, we stopped Facebook advertising after the first week due to the very low average session duration from Facebook ad clicks.

Tip 5: Do NOT expect to fully track each registration in one session on the initial device.

Now that we had narrowed our advertising efforts to two platforms, we increased the budget on each.

The average LinkedIn cost-per-click was higher than Twitter's, but so was the average session duration. During a two-week period, the average session duration for users who clicked from LinkedIn to the event page and then to the registration page was 17:33, which is incredibly long.

During that same time period, the average session duration from Twitter for users who went all the way to the registration page was 7:37, which is also very high. Given today’s often click-happy and attention span-deficient culture, we were very pleased with these numbers.
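Comparing durations that Google Analytics reports in mm:ss format is easier once they are converted to seconds. This small sketch (the helper function name is our own) uses the 17:33 and 7:37 figures from the text.

```python
def mmss_to_seconds(duration: str) -> int:
    """Convert an analytics-style mm:ss duration string to total seconds."""
    minutes, seconds = duration.split(":")
    return int(minutes) * 60 + int(seconds)

linkedin_s = mmss_to_seconds("17:33")  # 1053 seconds
twitter_s = mmss_to_seconds("7:37")    # 457 seconds

print(f"LinkedIn sessions ran {linkedin_s / twitter_s:.1f}x longer than Twitter's")
```

On these figures, LinkedIn sessions ran roughly 2.3 times longer than Twitter's, which helps justify its higher cost-per-click.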

We also tracked users who completed the whole conversion process, from clicking on the ad to completing registration. We expected these results to be extremely under-represented, as most people do not learn about an event and register in the same session. We anticipated that many people would switch devices, perhaps seeing a Twitter ad on a mobile device and registering at work or home via desktop. Despite these limitations, we tracked one full registration from a LinkedIn ad.

Tip 6: Start asking registrants right away how they found out about the event. Consider making this a mandatory question.

After running the ads for a week, we added a survey question that asked registrants how they learned of the event. Here are the counts of the potentially social media-related responses to this question:

Facebook – 3

Google search – 4

LinkedIn – 2

Twitter – 7

The Google search responses were likely not a result of the very short-lived Google AdWords test. It is also possible some people found out about the event from organic posts on the other sites. Also noteworthy is that 72 people did not indicate how they found out about the event. Ironically, the one person who completed the full conversion cycle in one session from a LinkedIn ad did not complete the survey question. Each tracking method captured only part of the picture, but together they complemented each other.
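The gap between attributed and unattributed registrants is worth quantifying. Using the survey counts above and the 72 non-responses, a quick tally shows how small the answered share really is:

```python
# Survey counts from the post; 72 registrants left the question blank.
responses = {"Facebook": 3, "Google search": 4, "LinkedIn": 2, "Twitter": 7}

answered = sum(responses.values())   # 16 attributed registrants
skipped = 72
total = answered + skipped

rate = answered / total
print(f"{answered} of {total} registrants answered ({rate:.0%})")
```

Only about 18% of registrants answered, which is exactly why the survey alone cannot replace analytics tracking (or vice versa) and why making the question mandatory is worth considering.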

Tip 7: Before running ads, ask what actionable data you want to garner from direct comparisons of ad and ad set variations.

While this post is intended to stay high-level, we did compare various other components, such as ad text, ad graphics (Mayo Clinic logo vs. event location), and audiences (states around Arizona vs. snow-bird states in the north who might relish the warm opportunity in December).

Tip 8: Continually review performance and optimize. It is not a one and done deal. It’s a one, two, three, four, keep going deal.

I regularly monitored how these ads performed. Other metrics to monitor include frequency, bounce rate, and spend. I continually reallocated the budget, as certain ads were not spending their full allotment, and I created additional ads when an existing ad's performance slowed too much. For example, after we had seemingly exhausted an audience in a certain region of the country, I created another ad set to target other states.