Using Analytical Game Design to make datasets ‘playable’ in the classroom

The paper was presented at the symposium “Games for Learning: Moving Goal Posts in Educational Game Design”, organized by the focus area Game Research at Utrecht University and held at the Academiegebouw Utrecht on March 15, 2018. It outlines a technique based on Analytical Game Design to analyze and teach datasets through play in academic classroom contexts.

Making Datasets Playable – 03/05/18 – Slide No. 3
Preliminary Notes
• Reports on the early stages of an ongoing research project
• Our focus is on game design and media literacy considerations
⇒ We are not educational scholars!
⇒ In fact, we are looking for educational expertise related to this subject matter!
• Try out the prototype (both the card game and the digital version)! To access the digital prototype after the symposium, contact me at s.werning@uu.nl.

Making Datasets Playable
• Tapping into the ‘sense of play’ (de Koven)
• Sample dataset on the video game industry
– Combines sales and ratings data from vgchartz and Metacritic
• Translated into a collectible card game (CCG) using nanDECK
– For use in the Niveau 3 BA course Computer Games in Context at UU
• Students ‘play’ a publisher
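The data-preparation step described above can be sketched in a few lines of pandas. The column names and sample rows here are illustrative assumptions, not the actual schema of the vgchartz or Metacritic exports; the point is the join on title and platform, which yields a flat CSV that a card generator such as nanDECK can use as a linked data source.

```python
import pandas as pd

# Hypothetical excerpts of the two sources; real export columns may differ.
sales = pd.DataFrame({
    "name": ["Wii Sports", "Grand Theft Auto V"],
    "platform": ["Wii", "PS3"],
    "global_sales_m": [82.74, 21.40],
})
ratings = pd.DataFrame({
    "name": ["Wii Sports", "Grand Theft Auto V"],
    "platform": ["Wii", "PS3"],
    "critic_score": [76, 97],
    "user_score": [8.0, 8.2],
})

# Join on title and platform, keeping only games present in both sources,
# then write one row per card for the deck generator to consume.
cards = sales.merge(ratings, on=["name", "platform"], how="inner")
cards.to_csv("cards.csv", index=False)
print(cards.shape)  # (2, 5)
```

An inner join is the conservative choice here: cards only exist for games with both sales and ratings data, so no card has blank stats during play.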

Physical vs. Digital ‘Version’
• Only the physical card game was used in class
• Physical version (PRO)
– Affords more discussion, as other students sit around and advise the two players
– Tangible
– Cards in hand are more easily comparable
– Complex placement mechanics are easier to implement
– Simple rule changes can be tested on the fly
– Players can’t look at each other’s cards
• Requires online connectivity in the digital version
• Physical version (CON)
– Only two players (as cards can only be differentiated via orientation)
– Changing the card layout and dataset used is time-consuming and cumbersome
– Calculations (e.g. score) need to be kept simple

The In-Class Experiment
• Two work groups
• 15 min. introduction
• 45 min. play session
• 30 min. in-class discussion
• Online survey for more open-ended questions:
1. What did and didn’t you enjoy about the game?
2. What did you learn about the game industry?
3. What did you learn about the dataset, i.e. about the games represented by the cards?
4. Think of one or more potential changes or improvements that could be implemented in the game. How would these modifications change its procedural rhetoric?

Limitations of the In-Class Experiment
• Did not afford multiple playthroughs
– Familiarizing oneself with the data takes time
– Understanding the game-as-model requires game literacy (Bourgonjon 2014)
• Needs to be more reflectively incorporated into a curriculum (Squire 2002)
– “In a hypothetical Civilization III unit, students might spend 25 percent of their time playing the game, and the remainder of the time creating maps, historical timelines, researching game concepts, drawing parallels to historical or current events, or interacting with other media, such as books or videos.”

Bourgonjon, Jeroen. “The Meaning and Relevance of Video Game Literacy.” CLCWeb: Comparative Literature and Culture, vol. 16, no. 5, 2014. Academic OneFile. Accessed 15 Mar. 2018.
Squire, Kurt. “Cultural Framing of Computer/Video Games.” Game Studies, vol. 2, no. 1, 2002. http://www.gamestudies.org/0102/squire/.

Evaluating Player Feedback
• Q1: Criticism
– Participants expected the design to be naturalistic
• “I did not enjoy the fact that there where preset numbers like sales and community ratings because if you are a publisher of games then you can not know these numbers”
– “we did not enjoy the fact that you could compete against your own cards. This caused a divided playfield, were the teams did not compete against each other, but strategically located their own cards in order to score as much points as possible”
– “You can’t really work around a bad card”
• Q2: Learning about the industry
– “[successful] games [...] don’t have to be top notch games”
– “it can be smart to release a game which you know won’t succeed only to then release a better game after it”
• Q3: Learning about the dataset
– “how some games turned out to be successful based only on their choice of platform”
– “very high buzz but low selling rate. It made you think, and get curious about why they talked about it that much”
– Occasionally more ‘analytical’ results, e.g. “The games represented can be viewed both as entertainment media or as commodity products”
• Q4: Potential changes
– Comments often focused on usability and appeal, i.e. on ‘product’ categories
– “interesting if the games decay over time […] I think that is something that happens in the actual gaming world as well”
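The ‘decay over time’ suggestion quoted above is easy to prototype in a digital version, where the score calculation is no longer constrained by mental arithmetic. The sketch below is purely illustrative: the function name, the exponential form, and the half-life of three rounds are assumptions, not part of the classroom prototype or the student’s proposal.

```python
# Hypothetical 'decay' mechanic: a card's score contribution shrinks the
# longer it stays on the table, loosely mirroring a game's declining
# commercial relevance after release.
def decayed_score(base_score: float, rounds_in_play: int, half_life: int = 3) -> float:
    """Halve a card's score for every `half_life` rounds it remains in play."""
    return base_score * 0.5 ** (rounds_in_play / half_life)

# A freshly played card scores fully; after three rounds it is worth half.
print(decayed_score(10.0, 0))  # 10.0
print(decayed_score(10.0, 3))  # 5.0
```

A rule like this would shift the procedural rhetoric of the game: holding strong cards back becomes costly, nudging players toward the release-timing decisions actual publishers face.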