It Takes a Village to Measure Social Applications

Have you been lured in by “Texas Hold’em” or “Mafia Wars”? This week I was analyzing a branded application on Facebook using three different data sources: Facebook’s native analytics, Google Analytics and Atlas. You could merely use Facebook’s native analytics platform for tracking app usage, notifications, installs and uninstalls, but you can glean a lot more from the canvas page than you might think if you creatively use your web analytics tools to mine some of the data.

The “Canvas page” is the application page hosted by the application developer. You can insert a web analytics tag on the page and mine social graph data about the user (if they granted you permission when installing the app). Accessible data points include age, location, gender, interests and number of friends. For the exhaustive list (which might scare you – honestly) visit the Facebook developer wiki; it may provoke you to go change your own personal privacy settings.
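Before passing anything to your analytics tag, it is worth deciding up front which profile fields you will actually track. Here is a minimal sketch of that idea; it assumes the app has already received the user’s profile as a plain dictionary, and the field names (`friend_count`, etc.) are placeholders, not the actual Facebook API:

```python
# Fields we have decided to pass to the analytics tag.
# (Names are illustrative placeholders, not Facebook's field names.)
TRACKED_FIELDS = {"age", "location", "gender", "interests", "friend_count"}

def extract_tracked_fields(profile):
    """Keep only the whitelisted data points from a user profile dict,
    dropping everything else before it reaches the tracking URL."""
    return {k: v for k, v in profile.items() if k in TRACKED_FIELDS}
```

Anything not on the whitelist simply never leaves the app, which keeps both your report cardinality and your privacy exposure down.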

Tags that you place on the Canvas page allow you to grab these data points and insert them into the URL as either directories or query string variables that you can later mine through your web analytics tool. Keep in mind that you want to be deliberate about “bucketing” information so that the data is manageable for analysis. For example, capturing “male” or “female” adds only two elements to a list, but “city” could add hundreds if not thousands. Depending on the tool that you use, cardinality may or may not be an issue (something to keep in mind before you go data-collection crazy).
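To make the bucketing concrete, here is a short Python sketch of building a tracking URL with bucketed query string variables. The bucket boundaries, parameter names and base URL are all illustrative assumptions, not anything Facebook or your analytics vendor prescribes:

```python
from urllib.parse import urlencode

def bucket_age(age):
    """Collapse exact ages into a handful of ranges to keep cardinality low."""
    if age < 18:
        return "under18"
    if age < 25:
        return "18-24"
    if age < 35:
        return "25-34"
    if age < 50:
        return "35-49"
    return "50plus"

def bucket_friends(count):
    """Record a friend-count range rather than the raw number."""
    if count < 100:
        return "0-99"
    if count < 500:
        return "100-499"
    return "500plus"

def tracking_url(base, gender, age, friend_count):
    """Append the bucketed data points as query string variables."""
    params = {
        "gender": gender,                      # already only two values
        "age": bucket_age(age),                # five buckets, not ~80 ages
        "friends": bucket_friends(friend_count),
    }
    return base + "?" + urlencode(params)
```

A 30-year-old male with 250 friends would produce something like `...?gender=male&age=25-34&friends=100-499` – three low-cardinality dimensions your web analytics tool can segment on without drowning you in distinct values.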

Facebook provides some good tools for looking at application data: it reports the “median” for how other apps are performing across its network. While this doesn’t tell you how you might be doing within a specific segment, it at least provides a benchmark for comparison.

A few pitfalls to note in Facebook’s application analytics:

“Usage” statistics are reported in PST while “Features” are reported in GMT (these two reports are just tabs away from each other). One of the cardinal rules in any web analytics implementation is ensuring that your web servers are time synchronized. Ending up with logs (and reports) that are off by seconds, minutes or, in this case, hours can really throw off your analysis. My guess is that many application developers and analysts never notice the footnote and have no idea they are lining up data from two different time zones.
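One way to line the two reports up is to convert the Pacific-time “Usage” timestamps to GMT before comparing. A minimal Python sketch, assuming the exported timestamps arrive as `"YYYY-MM-DD HH:MM"` strings (the format is an assumption on my part):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def pacific_to_gmt(timestamp_str):
    """Reinterpret a Pacific-time report timestamp in GMT so it lines up
    with the GMT-based report. Using the Olson zone rather than a fixed
    -8:00 offset means PST vs. PDT (daylight saving) is handled for us."""
    naive = datetime.strptime(timestamp_str, "%Y-%m-%d %H:%M")
    pacific = naive.replace(tzinfo=ZoneInfo("America/Los_Angeles"))
    return pacific.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%d %H:%M")
```

For example, a winter timestamp shifts by eight hours and a summer one by seven, which is exactly the kind of subtlety that gets lost if you just eyeball the two tabs side by side.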

Metric definitions aren’t documented anywhere within Facebook (that I could find). Understanding how this data is tabulated and filtered would be very helpful when trying to analyze and interpret the results.

Regardless, Facebook applications provide the application developer with a litany of data points for analysis about those users who are installing and using the application. For brands, this is huge! Just one look at the granularity of information provided within the Facebook API and you’ll see that it is a marketer’s dream (or data overload depending on your perspective).

I’ve only scratched the surface of what’s possible here. The point is that using your existing in-house web analytics tools in conjunction with Facebook’s offerings will provide a more in-depth analysis than using either analytics package in “stand-alone” mode.