How do we measure the effect of a serious game?

February 14, 2014
Totem Learning

This post is taken, with kind permission, from our Lead Instructional Designer Helen Routledge's "Curiosity Clinic". For more blogs like this, click here. Enjoy!

There are many layers to measuring the impact of a training approach or intervention. Typically, the accepted way to prove the worth of a programme has been to look at its ROI. The ROI, or Return on Investment, gives a financial baseline: the monetary outlay versus the results gained for a company or organisation. However, ROI only gives us one piece of the puzzle when looking at effectiveness.

Personally, I don’t focus on ROI. I appreciate it is important to my clients, but I prefer to focus on ROE, or Return on Engagement: the wider organisational and personal (user-level) impacts that tell us whether a programme has been effective. I believe that if you engage someone in a topic, pique their curiosity and open their eyes to new areas, they will be motivated to learn more, explore more and communicate more, and this will have a ripple effect on the organisation.

When looking at engagement there are several stages we examine. Of course, this depends very much on the situation at the time: how much access we have to end users, what data we can capture, and so on. Below I’ve outlined the main methodologies we use.

Observation: During play, observation can teach us a great deal about a user’s engagement level. Examining their body language, for example, we can see whether they are leaning in, exploring the game world and paying attention to the information presented to them. By listening to users, especially when they are playing together in a team or discussing their actions in a debrief, we can get a real understanding of how much they have taken in. This is self-evident assessment, which, if you ask me, is pure gold when trying to judge whether someone has learnt something or altered an attitude or behaviour.

Replay Statistics: If you’re looking for harder, more concrete data, you can look at how often users revisit your game. This data is readily available on most LMSs, and when we host the games ourselves we can easily access the number of times players re-attempt a scenario or module. For example, we know our Business Game is played on average 4.3 times per player.
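To illustrate the kind of replay metric described above, here is a minimal sketch of computing average plays per player from a session log. The log, player names and numbers are invented for illustration; real figures would come from your LMS or hosting data.

```python
from collections import Counter

# Hypothetical play log: one entry per completed session, keyed by player ID.
play_log = ["anna", "ben", "anna", "chris", "anna", "ben"]

# Count how many times each player attempted the game.
plays_per_player = Counter(play_log)

# Average number of plays per unique player.
average_replays = sum(plays_per_player.values()) / len(plays_per_player)
print(round(average_replays, 1))  # prints 2.0
```

A figure like the 4.3 plays per player mentioned above is simply this average computed over real session data.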

Behaviour Change: The gold standard for knowing you have made an impact is whether the end user changes their behaviour, consciously or unconsciously. This may range from internal requests to seek out more information on a topic, to a desire to tell others what they have discovered, to implementing the lessons they have learnt in the game.

Formal assessment: The traditional approach to measuring the effect of a training programme is, of course, a formal assessment. Be it a multiple-choice quiz or a situational judgement assessment, formal standardised testing is still popular in many courses. In games we can still build this in, but we always try to approach formal assessment in a softer way. Games lend themselves naturally to situational judgement assessment, and of course we can also use multiple-choice questions and branching tree structures. The resulting data can be captured as a score in the LMS or as a detailed breakdown given to the user, highlighting their strengths and the areas they need to focus on.
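The per-topic breakdown mentioned above can be sketched in a few lines. The topics, answer data and scoring scheme here are hypothetical, just to show how a single quiz result can become both an overall score and a strengths/focus-areas report:

```python
# Hypothetical marks from a situational judgement quiz, grouped by topic.
# 1 = the player chose an effective response, 0 = an ineffective one.
answers = {
    "delegation": [1, 1, 0],
    "feedback":   [1, 0, 0],
    "planning":   [1, 1, 1],
}

# Per-topic proportion correct: the user's detailed breakdown.
breakdown = {topic: sum(marks) / len(marks) for topic, marks in answers.items()}

# A single overall score, as might be reported to an LMS.
overall = (sum(sum(m) for m in answers.values())
           / sum(len(m) for m in answers.values()))

# Lowest-scoring topics first: these are the areas to focus on.
for topic, score in sorted(breakdown.items(), key=lambda kv: kv[1]):
    print(f"{topic}: {score:.0%}")
print(f"overall: {overall:.0%}")
```

The same structure works whether the marks come from multiple-choice questions or from choices made along a branching tree.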

These are just a few examples of the areas we look into when evaluating the success of our products. Every client and every situation is different, and we always take into account their unique environment to craft an evaluation piece that is suitable. Sometimes the data is built into the game interface as a numerical score; in other instances we impart the softer consequences of choosing a particular path. The mechanics we choose depend greatly on the audience demographics, the environment and culture, and the intended outcome.

But what is important, and where I want to end, is to reiterate Return on Engagement. If you want to measure training impact, look at your training as a whole. Does it offer users opportunities to explore content freely and openly? Does it encourage them? And does the tone of the content give meaning to them as individuals as well as to the business?