This is amazing for a number of reasons, and oddly game-like: a novel use of ordinary objects, a delightful cycle of build-up and release, a brain-tickling use of physics, and a nice, catchy tune that actually becomes more appealing with repetition.

This sort of ‘novel everyday’ use of physics seems to be one of the primary ways in which physics games appeal. In fact, there’s this whole weird subgenre of physics games devoted to rolling balls around and using physics in clever ways, stuff like Ballance and Hamsterball. Seems like these games could learn a lot from this video, don’t you think?

Here at Flashbang Studios, we have something called Experimental Mondays. Every Monday, you spend the day making something cool. The informal stipulations are that what you make should be topical (art- or tool-related if you’re an artist, a prototype or piece of interesting tech if you’re a designer, and so forth) and complete. At the end of the work day on Monday, we show what we’ve created to the assembled studio. The idea is both to continue to develop relevant skills and to compensate in some measure for the soul-crushing realities of game production. Very cool.

In the past, I’ve made some cool stuff on Experimental Mondays. And, of course, some not so cool stuff.

Lately, though, I find myself using Monday as a ‘soft’ day, a day I devote to activities that I haven’t found enough time for during the rest of the week. This is lame, and it is in direct violation of the purpose of Experimental Mondays. I end up doing the tedious, crappy, and otherwise unmotivating tasks I felt guilty about not having completed the week before. To that I say: shuttlecock!

As an experiment in game production, I’m undertaking a full scale project, to be completed as a series of Mondays.
I’m calling it “Tune”. The idea is basically that it’s a game about making games. It stems from my experiences with the Jumper exercise, and subsequent fiddly tests, as well as my desire to try out some of the things I learned about rapid prototyping from Chaim Gingold and Chris Hecker’s Advanced Prototyping session at GDC. And at a bunch of other Spore-related GDC sessions, I suppose. Eric Todd had some interesting things to say about his experience as a producer on Spore, though it mostly amounted to ‘rapid prototyping is awesome, here are a few hints about how to do it right.’

So, yeah, ready-fire-aim! I wrote up a detailed project plan, which I haven’t ever done before. I’m determined to try out a more detailed approach to the entire process of creating a game. I need more data on what works and what doesn’t. I’ve tried not doing a bunch of preproduction; time to try the opposite. If it ends up taking too long, I can calibrate between the two extremes.

The process, as written up, goes like this:

1. Brainstorm - Collect up all the current, crazy scrawlings I have related to the idea and examine them as a whole. Spend some time researching other, similar game ideas, asking the question ‘what’s out there already I can borrow inspiration from?’ Cast my inspiration net out there and look for things that fit with, complement, or extend my vision for the game. Music, movies, art, whatever: it’s time to watch some Miyazaki, some Kurosawa, listen to “Grace” and Faith No More’s cover of “Easy Like Sunday Morning.” Then some traditional brainstorming, stuff from von Oech or de Bono, divided into questions about mechanic, level, system, and UI design. The objective here is some churning idea fertilizer from which to create a cohesive vision, as fleshed out as I can make it.

2. Create a Vision – Something that gets mentioned over and over again in business, personal development, and game production is the power of focus. A clear vision is the key to doing something brilliant. In my limited experience, the best possible way to define focus is by framing the problem through questions. For example, the Harmonix guys let the question “does it rock?” drive the entire design of Guitar Hero. Arguably, Keita Takahashi had a similar question about rolling a ball around picking things up that led to Katamari. So – and I haven’t really done this externally before – I’m going to define my design for Tune as an overriding question and a series of smaller questions, and answer them with prototypes. This was one of Chaim’s main points at the Advanced Prototyping session: “Behind every successful prototype stands a good question, a clearly articulated problem to solve.” So, what are my questions? In addition, based on my own observations, I want to see if it’s useful to create a map of player experience (as in, the experiences the player is supposed to have while playing the game). I also want to see if there’s value in creating visual mockups for UI, control flow diagrams, and an abstract of the overall game structure with an eye for Flow. At that point, I’ll have a pretty clear vision and some cool data about these various approaches.

3. ‘Rubber, Meet Road’ - The other thing that made an impression on me at the Advanced Prototyping session was the concept of creating a map of questions to drive game design. Questions to which the prototypes provide answers. So, I’m going to create a map of interesting questions to answer with small prototypes. This is just good sense. What is the series of outcomes that must be completed for the game to be done? How am I going to go about getting those things done? What are the prototypes I’ll need, how am I going to attack and order them, and how will they all fit together in the end? What are the potential pitfalls technologically, design-wise, and art-wise, and how will I stay motivated? One way is posting to my blog here every Monday with progress updates (prototypes). The point here is to try to wind my vision string around a thumb of reality, to make some sense of it in real, ‘here’s a list of things, now do it’ terms.

4. Schedule it – Scheduling it makes it real. The trick here is to take the mindset of data collection; all I’m trying to do is provide impetus. If I miss a deadline, it was probably too aggressive. At least I’ll have some data. And, of course, I’ll put those milestones up here for all to see to encourage myself not to miss them.

So that’s pretty much it. I’m putting my other projects on hold for the time being to see if I can’t upgrade my process and methodology. Empirically speaking, my current method has gotten me some interesting, aimless little prototypes. Time for some new data!

When you want to understand the weight, mass, and texture of something, you pick it up and noodle it around in your hands. You feel its weight, imagine what would happen if you were to throw or drop it, and get a sense of its texture. Size permitting, you toss it up and catch it a couple of times. You roll it.

The same playful process occurs when controlling a game agent via input device. There is another layer of abstraction, of course – the controller, keyboard and mouse, stylus, and so on – but that layer fades to transparency the instant you pick up the input device, and you’re left with the same kind of sensory experimentation. Your inputs aren’t button presses or mouse movements; you’re touching something virtual in the same way you touch something physical. You push, pull, and throw the agent. You see how far it goes. You test the limits, boundaries, and rules under which it operates, you see how fast or slow you can make it move, how fast you can turn it, and how its mass and weight compare to other things in its virtual world. In short, you’re experiencing a feeling of touch, a purely virtual sensation.

Of course, the important difference between virtual sensation and actual sensation is the worlds in which they operate. Whereas no one can change the rules of the real world, I, as the designer, can change the rules of my game world. For one, I’m always going to make it representational. Simplifying the rules makes things easier to predict and control, which is much more conducive to a feeling of mastery. Want to be able to hit a baseball 100,000 miles? No problem. Want that baseball to float in the air so it’s easy to hit? Done. Want to be able to jump 100 miles into the air and float softly back down? Fly? Spin? Run up walls? These are decimal point changes to my game world’s rules. I can make these changes in real time. I can give you, the player, control of them to do with as you like.
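To make the “decimal point changes” idea concrete, here’s a minimal sketch of what those rule tweaks might look like in code. All of the names (WORLD_RULES, gravity, jump_impulse, and so on) are hypothetical, not drawn from any engine mentioned here; the point is just that a game world’s “rules” reduce to a handful of editable numbers:

```python
# Hypothetical tuning values for a platformer agent. Changing any one of
# these is a one-number edit, and can be done while the game is running.
WORLD_RULES = {
    "gravity": -9.8,       # weaken it and everything floats down softly
    "jump_impulse": 12.0,  # crank it up to jump absurdly high
    "max_run_speed": 6.0,
    "air_drag": 0.02,
}

def simulate_jump(rules, dt=1 / 60.0):
    """Integrate a single jump at 60fps and return the peak height reached."""
    y, vy, peak = 0.0, rules["jump_impulse"], 0.0
    while y >= 0.0:
        vy += rules["gravity"] * dt  # gravity accelerates the agent down
        y += vy * dt                 # position follows velocity
        peak = max(peak, y)
    return peak

normal = simulate_jump(WORLD_RULES)
# One decimal point change -- gravity at a tenth strength -- and the
# very same jump soars roughly ten times higher:
floaty = simulate_jump({**WORLD_RULES, "gravity": -0.98})
assert floaty > normal
```

Expose that dictionary through a debug UI and the player (or the designer, mid-playtest) can fiddle with the world’s physics in real time, exactly as described above.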

So here’s a deceptively deep question: what is the feeling of controlling something in a video game?

Clearly, it’s a very aesthetic feeling. When describing the control of a game, players often resort to a physical analogy; the control is “floaty”, “twitchy”, “smooth”, “slow”, or “loose.” It seems easiest to describe when it’s gone wrong.
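Those physical adjectives often come down to a single tunable number: how quickly the agent’s velocity responds to input. The model below is my own simplifying assumption, not taken from any particular game; it just illustrates how one parameter separates “floaty” from “twitchy”:

```python
# An assumed first-order model of control response: each frame, the
# agent's velocity chases the player's input at some rate. A low rate
# reads as "floaty" or "loose"; a high rate reads as "twitchy" or "tight".
def frames_to_full_speed(responsiveness, dt=1 / 60.0, target=1.0):
    """Count frames until velocity reaches 95% of the input direction."""
    v, frames = 0.0, 0
    while v < 0.95 * target:
        v += (target - v) * responsiveness * dt  # chase the input
        frames += 1
    return frames

floaty = frames_to_full_speed(responsiveness=2.0)    # sluggish ramp-up
twitchy = frames_to_full_speed(responsiveness=30.0)  # near-instant response
assert floaty > twitchy
```

The floaty setting takes well over a second of real time to get up to speed; the twitchy setting gets there in a handful of frames. Same code, one number apart, and a completely different feel.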

As gamers, we intuitively understand this feeling. Even as I’m typing this, I can easily imagine the feeling, as I’m sure any gamer can. Why, then, do we grope for ways to describe it? How would you describe it? It’s something so base and simple, such a common experience, it blows my mind that we don’t have a name for it.

It’s the feeling that causes you to lean left and right as you play, the feeling that you’re reaching into the game and pulling or pushing the agent with virtual hands. If it’s done right, it’s the best thing about video games, a feeling that has no real analogy in other media. So what’s a good, short name for that feeling? What can we call it?

Look at the word “animation”. Divorced from the context of Disney and Pixar, of bringing life and motion to a series of static drawings played back in a specific order, it’s just a delineation. Something is either animate, like a bunny, or inanimate, like a rock. Imagine the context of the word “animation” before cartoon animation developed. These days it’s a great word, rich with context and imagery; it describes a whole artistic medium. Before, it meant almost nothing. Now it’s flexible (it made the transition from 2D drawn animation to 3D computer animation quite happily), encompassing, and powerful. So what about that very physical, kinesthetic feeling of controlling a game agent? What do we call that? What’s a flexible, encompassing, powerful descriptor?

Overview: Using what players like most about play, this playshop offers tools and tactics for creating emotion for next-generation player experiences based on XEODesign’s close examination of players during play, and Isbister’s research at Stanford and the Rensselaer Games Research Laboratory.

Takeaway: a delightful brainstorming session, some valuable metrics of player experience and emotion, and an amazing view into the lexicon of controlling emotional projection presented by Leonard Pitt (who comes from a theater background.)

I’m going to be a bit chronologically irresponsible here, so bear with me. Starting at the end of the 8-hour day and working backwards, the following is the game concept my group brainstormed for the final design challenge. The constraints were as follows:

“Apply what you have learned today about creating emotion [in games]. Break into groups of five and…add emotion to an existing game or create a totally new PX [player experience profile] using biosensors. Choose one of three challenges, then select a game and someone to take notes and present your results. Brainstorm with your group on what you would change to create new emotions. Clear goals and big emotional shifts earn more points in the final vote. Advanced Play: Try replacing the most important feature with something that creates the opposite set of emotions. How big a shift can you get? You have 60 minutes. GO!”

At this point there was a list of games from which to choose, and Nicole added that, if the group wanted, it could create an entirely new game concept, based on a hypothetical biofeedback input device.

As it turns out, one of the members of our group, Tim Hong, actually works at a company that is relatively far along in creating such a device. I was a little fried at that point – sorry if I’m bungling this, Tim – but the way I remember him describing the device is this: “like a pair of sunglasses, wireless. It senses blinking and eye contractions. We’re running at 60hz so we can effectively capture brief facial displays of emotion like surprise, curiosity, or amusement.” Neat!

I’ve no idea why I thought it would be a good idea to move right after GDC, but the deed is now done. My new command center is secured and things are back to a new, improved, more Feng Shui version of normal. And now: business!

I’m working on a paper and continuing with move-related activities for the rest of this week so there will be a trickle of GDC posts, with the bulk flooding in next week. At last count I have over forty pages of notes to parse out, respond to, and post. Alice does a preternaturally good job directly capturing GDC sessions and Gamasutra’s coverage was excellent this year so I’ll be sticking to thoughts and impressions, letting them spin off in whatever bizarre directions they want. For example, I started typing up my notes from the first session I attended, Emotion Boot Camp, and I ended up drawing a crackheaded mockup in Photoshop and writing about why I dislike game genres. Go figure.