We have to be honest with ourselves. There is an awful lot of stuff that we have cherished for a long time in the games business which turns out not to work. Sometimes it takes us years to shed the scales from our eyes about the fact that hoary conventions of yore are just that — conventions, mutable and open to change.

Koster has what seems to me a healthy relationship with “data” (info, metrics, whatever). He is able to see both the forest and the trees when it comes to numbers, keeping humanity and perspective intact. Maintaining that balance has always been the problem for me when discussing data with my colleagues, in my building and online.

He goes on to describe what lies outside the “field of measurement”: “Anything that unfolds over a very long period of time”, “[a]nything that lies in the realm of emotion”, “[a]nything that is a short-term loss for a long-term gain”, and “[a]nything that exists outside of the game proper”.

He concludes this list with this statement:

Most critically, measurements are excellent at telling you where to iterate. They are not so good at quantum leaps.

But this has been the problem as of late. I feel like I’ve been striving for quantum leaps, but missed small iterations. I feel like I’ve tried too hard to be transformative, but failed to see the smaller steps, the little successes necessary for growth. I feel like I’ve leapt away from “science” because I felt like it stripped away my humanity, rather than finding a way to use science to support my own humanity and that of my students.

Koster concludes his post with a bit of advice:

If you ask me, the best way to make F2P games better would be to find ways to provide provable metrics on why the things we value add to the bottom line. Community, artistry, narrative, brand, retention, customer sentiment, whatever your particular religion is — find a way to measure cause and effect and demonstrate the business value.

While we don’t run a business (I will always cringe when business folks draw this analogy in discussions of education), those of us who consider our work transformative to education in some way seek to add value to the “bottom line” (let’s say, grade-level expectations and factory-modeled education). If we are going to take the “long view” emotionally, we must understand the short view intellectually.

More human metrics can support this approach, and I solicit stories of your experience with “data” to help round out my own cuddling up to it.

Discussion

4 thoughts on “Metrics and ‘success’”

Five ‘datum’ that in the aggregate do not comprise data:

1. There are lies, goddamned lies and statistics;
2. If one sees a perfect Bell Curve, one has probably monkeyed around with exclusionary and/or inclusionary criteria, one’s assumptions, or all the foregoing to the extent the result is garbage;
3. Human beings are not the same as chemical elements;
4. A deep ‘long view’ requires much more relational datum than anyone has been able to compile — think, “The Archaeology of Knowledge”;
5. Beware the Contraires, who may purposefully violate the rules (but then, perhaps data would reveal the Contraires?).

I think data is important (it’s the only evidence we have!) but I think that people take a very narrow view of data, which is unfortunate.

– they think, for example, that data is just numbers, when in fact data can be found in the full range of perceptions, including observations of emotions, visceral reactions, likes and dislikes, and more

– they think the only way to work with data is to count things, while in fact data provide a rich range of possible interpretations – connections, patterns, flows, etc

– they think data is cumulative, suitable only for iterations, when (as Kuhn pointed out) the right sort of data shows a greater and greater need for the quantum leaps of scientific revolutions – data about anomalies, data that needs explaining, problems, unanswered questions, etc

– they think data should show you a single ‘objective’ perspective, when in fact different sets of data yield different perspectives, where these perspectives taken individually and together amount to more than the mass of data aggregated

The problem is not with the use of data to make decisions – the problem is with the simplistic one-dimensional use of data to make decisions. Instead of attacking the data – which leaves you with no ground to stand upon – it makes more sense to attack the simple-mindedness.

Change the grounds! It’s not that their approach is ‘data-driven’ or ‘evidence-based’ and yours is not, it’s that they have very carefully selected a subset of the evidence that will ‘count’, while you are using a much broader, richer, and ultimately more accurate base of evidence.

(p.s. on the term ‘data’ – sometimes I use it as a mass noun, and say things like ‘data is important’, and sometimes I use it as a plural, and say things like ‘data provide’; there isn’t a single ‘correct’ way to use the term; its conjugation travels as your usage travels).

I think we can iterate with qualitative feedback, not just quantitative. I think it’s correct to say that we have to pay attention to the present to shape our bits of the future. If we pay attention to the relationship between what we’re doing and what we hope to achieve, that, of course, helps. In so much as any kind of attention helps us critically examine convention, I dig it.