Tuesday, December 02, 2008

What comes first, theory or practice?

The first year I taught game design, I taught two game design courses: a fully practical one where students were given realistic game design problems (e.g. "Develop a core concept and one-page concept doc with brief description, genre, target audience, target platform, and feature list for a given IP") and a fully theoretical one where students read about the leading thoughts in the field and discussed them.

Because of administrative snafus, the practical course was taught first. Students loved the challenge, but once they got to the theory course the (predictable) reaction was: this is great, why couldn't you have told us this stuff before when we could have actually used it?

The next year, I vowed to teach the theory first, and then the students could use this strong foundational understanding of the field to go on and make excellent game projects in a practical class that followed. I ran into a different problem: without the practice of making real games, the students weren't nearly as interested. Sure, I can talk about MDA or flow theory or positive feedback loops all day long and not get tired, but the students had no easy way to contextualize all of this. Yes, I can give practical examples from real games, and that is part of it... but without having done a lot of design work themselves, the students had a lot more trouble seeing it.

Chicken. Egg. Gah! I can't win! Or can I?

This Winter, I'll be trying a third approach: combining the two into one course, alternating the theory with the practice so that we first go over a small bit of theory and then immediately apply it in a real design situation. I look forward to seeing how this goes.

It occurs to me there is a potential fourth approach: also a combined course, but with the practice first... then followed by discussion. This has the downside that students are less likely to produce anything of value (after all, I'm not teaching them how until after each assignment is over) but it should make the discussions much more valuable: we can talk about what went right or wrong on each project, and then comment on how current theories play into it all. I may try that in a future iteration.

Personally, I'd go with the third approach myself. The age-old rule of board games, "It'll make sense after a few rounds of playing it," I think applies here as well. That way the students have the basic theory going into the project and more hands-on experience coming out. Nothing quite like that "Oh, I get it now!" moment to truly cement it in the old brainbox.

As both a student and an educator, I believe the best way to deal with this is similar to what Chris Okasaki suggested.

First, present a problem to the class. Have them brainstorm possible solutions, and identify the pros and cons of each one.

Next, present the established body of knowledge and theory. If you can, present this in the context of the solutions that the students proposed. Did the students come up with any ideas that are similar to those that are considered best practices? If not, what were the differences, and why are those differences important? Could any of the student ideas be better than the prevailing practices? By doing this, you're making the knowledge more intimate to the student by connecting it with their own thoughts and words. This will help them to better internalize and understand the concepts.

Finally, give the students another problem to work on, now that they've been exposed to the Big Ideas related to it--and then enjoy the results!

Obviously, a major constraint here is time. If it's a class that meets twice per week for an hour or so each time, this can be divided up rather easily into two class periods (one for creative problem-solving and brainstorming, the second for lecture and integration of ideas), followed up by a homework assignment.