Across many industries, large and small organizations are using analytics and data science to offer greater insight and better customer service. The gaming industry is especially well placed in this regard: particularly with online and social gaming, companies already hold vast amounts of data on gamers. The challenge is to use this data in a way that offers true value for money while enhancing the customer experience.

Steve Stalzer, Director of Analytics Platform Development, Turbine / Warner Bros Games, gave an insightful talk on "Use of Telemetry Data at Warner Bros Games". The mission of his team is to provide a fast, secure, and robust analytics solution that allows studios to generate near real-time analysis and recommendations. Telemetry data from games is huge: it can easily reach hundreds of terabytes per title per year.

Growing business requirements demand that the analytics platform offer scalability, elasticity, data security, customization, and flexibility. Besides standard reporting, the platform must also support ad-hoc queries, deep analytics, and machine learning, and it should be easy to analyze data across titles, studios, and platforms (console/PC/mobile). The analytics team uses game data to answer questions such as: How many players can we market to? Which players are going to churn? What is the lifetime value of our players? How can we increase engagement? Which side missions are the most popular? Thus, the analytics platform team plays a key role in enabling data-driven decision making throughout the organization.

He shared some usage statistics from the game Shadow of Mordor and explained how they provide insight into player behavior. He described the data architecture, which includes Amazon Web Services, Apache Spark, Apache Kafka, Spark SQL, R, and MLlib; for BI services, Tableau, Vertica, and Redshift are used. Future plans include providing real-time insights and visualizations using Spark and Spark Streaming.

In conclusion he shared a few lessons from his game analytics experience: plan for elasticity, since the volume of event data is highly unpredictable at the beginning; push ETL upstream to give analysts a way to transform their own data; and strive to increase processing speed on the path toward real-time insights.

Nick Ross, Director of Analytics, Sega delivered an interesting talk on "Principles of Funnel Analytics". He defined a funnel as a visualization tool focusing on the linear progression of a well-defined population. Funnel-based visualization is 100% focused on simplicity and on conveying a point to people who are not data-savvy.
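The linear progression described above can be sketched in a few lines of code. This is a minimal, illustrative example; the step names and counts are hypothetical, not from the talk.

```python
# Hypothetical step counts for a game onboarding funnel: each step is a
# (name, users_reaching_step) pair, in the order users progress.
funnel = [
    ("install", 10000),
    ("tutorial_complete", 7200),
    ("level_5", 4100),
    ("first_purchase", 600),
]

def funnel_rates(steps):
    """For each step, return (name, count, step conversion, overall conversion)."""
    base = steps[0][1]
    prev = base
    rows = []
    for name, count in steps:
        rows.append((name, count, count / prev, count / base))
        prev = count
    return rows

for name, count, step_rate, overall in funnel_rates(funnel):
    print(f"{name:18s} {count:6d}  step {step_rate:6.1%}  overall {overall:6.1%}")
```

The step conversion shows where users drop off between adjacent stages, while the overall conversion gives the simple, non-data-savvy-friendly picture the talk emphasized.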

He emphasized that the variance generated by within-group differences should be smaller than the variance across the funnel itself. Ideally, the population represented in the funnel should be homogeneous both within product time and outside of the product.

Humans are, in general, lazy. Thus, no matter what the subsequent steps are, there will usually be some attrition throughout the linear progression. This can be termed "natural attrition". When studying a funnel, it is important to distinguish attrition due to friction points from this natural attrition, using the heuristic that the natural attrition rate is roughly proportional to the time the user spends in the app. Since friction points and the user journey are so specific to the underlying app or game, it is very hard to compare funnels across apps/games.
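The natural-attrition heuristic above can be turned into a simple flagging rule: estimate the expected drop-off at each step from the time users have spent in-app, and flag steps whose observed drop-off exceeds it. This is a sketch under assumed values; the per-second loss rate, step names, counts, and timings are all illustrative, not from the talk.

```python
# Heuristic from the talk: natural attrition is roughly proportional to
# time spent in the app. The rate below is an assumed illustration.
NATURAL_RATE = 0.0005  # assumed fraction of users lost per second in-app

# Hypothetical (step, users reaching it, median seconds in-app at that step)
steps = [
    ("install",  10000,   0),
    ("signup",    9300,  60),
    ("tutorial",  7000, 300),
    ("level_2",   6600, 600),
]

def flag_friction(steps, natural_rate=NATURAL_RATE):
    """Flag steps whose observed drop-off exceeds the natural-attrition estimate."""
    flags = []
    for (_, c0, t0), (name, c1, t1) in zip(steps, steps[1:]):
        observed = 1 - c1 / c0
        expected = natural_rate * (t1 - t0)  # proportional-to-time heuristic
        flags.append((name, observed, expected, observed > expected))
    return flags

for name, obs, exp, friction in flag_friction(steps):
    label = "possible friction point" if friction else "natural attrition"
    print(f"{name:10s} observed {obs:5.1%}  expected {exp:5.1%}  {label}")
```

With these numbers, the tutorial step loses far more users than its in-app time would predict, marking it as a candidate friction point, while the level_2 drop-off is within the natural range.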

Parsa Bakhtary, Games Product Analyst, Facebook explored the concept of customer lifetime value (LTV) in his talk "Predicting the Value of a Social Game Install". In simple terms, LTV is the dollar value of a customer relationship to a company; in other words, it is the upper bound on what could be spent to acquire new customers. Specifically, for free-to-download games with in-app purchases, LTV is the total expected revenue from an install. Social/mobile game LTV varies a lot by demographic factors such as country, age, and gender. The traditional LTV model for apps is based on average revenue per daily active user (ARPDAU), retention, and virality. This traditional model has several drawbacks: it relies heavily on forecasting (of ARPDAU and retention values), and it is tightly coupled to the app, so it varies considerably from one app to another.
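The traditional ARPDAU-based model mentioned above is commonly expressed as ARPDAU times the expected number of active days (the area under the retention curve), scaled up for viral installs. The sketch below assumes that common formulation; the ARPDAU, retention curve, and k-factor values are illustrative, not Facebook's.

```python
def traditional_ltv(arpdau, retention_curve, k_factor=0.0):
    """Traditional LTV sketch: ARPDAU x expected active days x virality uplift.

    arpdau: forecast average revenue per daily active user (USD)
    retention_curve: forecast probability a user is active on day d
                     (day 0 = install day, probability 1.0)
    k_factor: assumed extra installs generated per install (virality)
    """
    expected_active_days = sum(retention_curve)
    return arpdau * expected_active_days * (1 + k_factor)

# Illustrative power-law retention forecast over 180 days
retention = [1.0 if d == 0 else min(1.0, 0.45 * d ** -0.3) for d in range(180)]
ltv = traditional_ltv(arpdau=0.12, retention_curve=retention, k_factor=0.1)
print(f"forecast LTV per install: ${ltv:.2f}")
```

The model's fragility is visible directly in the code: every input is itself a forecast, so errors in the retention curve or ARPDAU estimate propagate straight into the LTV number.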

Facebook's goal is to find an LTV model that works for all monetizing games across the entire desktop gaming ecosystem. This model must be independent of the genre or friction-point placement of the game, and it should deliver stable predictions, i.e. LTV should not change much week over week. His team computed cumulative (30/60/90/180-day) revenue curves for weekly install cohorts of the top-grossing Facebook canvas games and explained their shapes and stability across various game genres. The revenue curves clearly showed regular weekly cycles, so it makes more sense to analyze revenue patterns weekly rather than daily. For most games, 90-day ARPI (average revenue per install) is sufficient to understand monetization behavior.
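A cumulative ARPI curve like those described above is just the running total of a cohort's revenue divided by the cohort's install count. A minimal sketch, with hypothetical daily revenue figures for a single weekly install cohort:

```python
from itertools import accumulate

# Hypothetical: one weekly install cohort and its revenue (USD) per day
installs = 5000
daily_revenue = [1200, 800, 650, 500, 450, 400, 380, 350, 330, 310]

# Cumulative ARPI: running revenue total divided by cohort size
cum_arpi = [revenue / installs for revenue in accumulate(daily_revenue)]
print([round(x, 3) for x in cum_arpi])
```

Computing one such curve per weekly cohort, then overlaying them, is what makes the shape and week-over-week stability of a game's monetization visible.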

Based on the shape of the ARPI curve, linear or logarithmic regression can be used to predict 90- or 180-day ARPI from data for the first 7 or 14 days. While this works well for puzzle and strategy games, it is much less accurate for casino and table games. This insight can be used to understand which demographic cohorts yield the most profitable installs and to identify targeting opportunities. ARPI is also an important feature in game/app ranking algorithms, particularly on app and advertising platforms.
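The logarithmic variant of the regression above can be sketched as an ordinary least-squares fit of ARPI against ln(day), extrapolated to day 90. The ARPI values below are synthetic, generated from a known logarithmic curve purely to illustrate the mechanics; real cohort data would of course be noisy.

```python
import math

def fit_log_arpi(days, arpi):
    """Ordinary least squares of ARPI on ln(day); returns (intercept, slope)."""
    xs = [math.log(d) for d in days]
    n = len(xs)
    mx, my = sum(xs) / n, sum(arpi) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, arpi))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Synthetic first-14-day ARPI following arpi(d) = 0.50 + 0.40 * ln(d)
days = list(range(1, 15))
arpi = [0.50 + 0.40 * math.log(d) for d in days]

a, b = fit_log_arpi(days, arpi)
pred_90 = a + b * math.log(90)
print(f"predicted 90-day ARPI: ${pred_90:.2f}")
```

For game genres where the curve is closer to linear, the same fit against the raw day number (rather than its logarithm) applies; the casino/table-game case is where neither simple form tracks the data well.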

In conclusion, he asserted that 90/180-day ARPI is a more stable and accurate way of predicting an LTV-like metric across many apps and cohorts simultaneously. It is also important to note that even for games whose overall ARPI is predictable and stable, narrowly segmented cohorts often exhibit wildly varying spending behavior.