My views on the gamer disposition haven't really changed since last time. I see a fair amount of it in my game programming classes (as you'd probably expect). Less so in my "programming for non-programmers" and "intro to game design" classes. Strangely (or perhaps not so much) I see more "creativity" from the students who don't have much of the "Gamer Disposition". Since I like creativity more anyway, I try to encourage that instead.

A little more on my statement about "gamer disposition" vs. "creativity". I posted the link to this lecture by John Cleese in the last Game MOOC, which is well worth watching.

In the video, Cleese talks about the idea that, in order to be creative, one has to be in the "open" mindset. This open mindset is something I don't really see in people with the "gamer disposition". They're often good at the "closed" mindset, which is good for actually getting things done, but they don't really spend any time *thinking* about what they're doing. They tend to be so focused on the goal (the bottom line), that they don't stop to make connections to other things. They can't stand the bit of discomfort that comes from not having a "solution", so they grab the first solution that looks right, and then stick to it like glue...even if that later turns out not to be the best solution.

I tend to see more creativity from people who don't have the "gamer disposition". They're more willing to take the time to *just think*, without focusing solely on finding a solution.

With "gamers" I also see a strong tendency to "imitate" their favorite games. While there's nothing wrong with imitation (it's a great way to learn) it's often very hard to convince these students to try anything new. Non-gamers tend to make novel things, and be more flexible with their design. If they can't figure something out, they change their design to something they can figure out.

Now, I'm not saying that "gamers" can't be creative. I've seen some really creative things done in games, and by gamers. Their goal-oriented single-mindedness means they will often put a great deal of effort into getting something exactly right. However, I've generally found that it's the "open" mindset that needs more encouragement in my classes.

The author looks at the "born digital" as native users, not native creators. It does seem similar to your point about your students.

It might also be the same with creativity. J.R.R. Tolkien discussed sub-creation, which I interpret as creating the world and the backstory. It may be that too many students in game design and narrative writing for games are defaulting to the tried (or is that tired) and true.

Yes, that article was great. This is something that other educators I talk to have been murmuring about for several years, and I've read a number of articles on it. Even scarier, a lot of people *think* they can use a computer, but can't do simple things like connect to a wireless network, or open a zip file.

The aspect of the digital world having more consumers than creators is something I've come across many times from many different directions. I have a friend who does web design for a living, and even she was talking about this the other day.

In my game programming classes, students are always wanting to do *neat* things, but not wanting to learn the skills needed to do them. If the "tool" doesn't make it easy, then they just skip it and do something else. (Things that involve math are especially avoided...which makes me a bit concerned about what these students think they're going to be doing in the "real" world.)

While I totally understand the anxiety about just how much there is to learn (trust me, I have those moments on a regular basis), that's not an excuse for learning nothing. I have to pick and choose what things seem the most interesting/relevant to me, and study those things. I ignore other things (Minecraft, for example), because there simply aren't enough hours in the day. Maybe this is something we need to get better at teaching to students?

so true, blueAppaloosa... I've seen that a hundred times too. "Oh, this is so cool! I want to do it... wait, there's math? And I have to put in some of my own time? ... Never mind."

We are doing Google genius hour in my class, and I had a kid who wanted to build a remote control car. The second day of the project, he informed me that since he couldn't find a YouTube video explaining exactly how to do it, he wanted to change his project. Eventually I let him, since when I tried to get him to put in a bit more effort, he sat in a corner and stared at the wall, pouting.

I'm not sure what the solution is :(

Don’t do work that just exists within your classroom... do work that changes the world. -Will Richardson

Another article that I was thinking of while reading "Natives aren't Restless Enough": Kids Can't Use Computers. While I think the author goes a little bit overboard, if we're going to call someone a "digital native" they should have an excellent grasp of digital tech, not just a basic grasp.

@missrithenay Yeah, been there and seen that. I've had a lot of students change their projects because they couldn't find a tutorial that was simple enough (because what they wanted to do was somewhat complex). I've also had students put in tons of effort trying to find an "easy way" to do something, far more than they would have needed to just do it the "hard way".

My current approach is mostly just to let students figure it out on their own. I'm hoping that, eventually, they'll start to realize what they're getting themselves into...although I'll admit I have my doubts. I don't have anything better here; I think sometimes students need to come to understand certain things on their own.

That was a really interesting article. I agree with you that he goes a bit overboard... I wouldn't say someone "doesn't know how to use a computer" in all of the cases that he mentions, but I would agree that they don't know how to go about basic problem solving.

I'm one of the unofficial tech supports at my school (ie, I do the work without getting paid for it!) because I have a reputation for being good with technology. But nine times out of ten when someone asks me for help, I turn off the device and turn it back on, and then it works. I guess you could say that, in this day and age, they really shouldn't need to call me in until they've tried that :)

Regarding kids... it really is shocking how little they know about computers. Sixth graders who have iPods and iPhones and Xboxes and PCs, who have been using computers their entire lives, have meltdowns if they get an error message, try to download viruses, and give out personal information online all the time. The best was a student who used Instagram, but informed me it was totally safe because she never gave out her last name. She has 500 followers, most of whom she'd never met, and she posted her first name, pictures of herself, and pictures of our school, but thought she was fine because she hadn't told anyone her family name.

Similarly, all of my students were shocked this year when I showed them how to take a screenshot. They've been told that once you put it online, it's there forever, but they didn't really believe it (I think). They figured if you delete something, it's gone. When I showed them how quickly I could take a screenshot it blew their minds. Then they went on to try to use it to get around downloading copyrighted photographs... sigh :D

Thanks for the article. It was a very interesting read.


Hah, yeah. I've been tech support (both 'official' and 'unofficial'), and I found similar things. Close the program and re-open. Reboot. Check for updates. These things fix just about everything. If that fails, Google. I find this flowchart to be largely true. :)
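For fun, here's a minimal sketch of that fix-it checklist as code. This is just an illustration; the function name, the step names, and the `problem_fixed_after` parameter (which stands in for actually testing the device after each step) are all made up.

```python
# A tongue-in-cheek sketch of the basic troubleshooting flowchart.
# `problem_fixed_after` simulates which step would resolve the issue,
# standing in for actually checking the device after each attempt.

def basic_troubleshooting(problem_fixed_after):
    """Walk the classic fix-it checklist; return the step that worked."""
    steps = [
        "close and reopen the program",
        "reboot the device",
        "check for updates",
        "search the error message online",
    ]
    for step in steps:
        # In real life: perform the step, then see if the problem is gone.
        if step == problem_fixed_after:
            return step
    # If none of the standard steps worked, escalate.
    return "call your unofficial tech support"

print(basic_troubleshooting("reboot the device"))  # prints "reboot the device"
```

Nine times out of ten, the loop never gets past the second step.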

Kids and privacy really scare me these days. I don't know the solution here, but I'm always amazed at what people put out there online.

And yeah, screenshots. One of those things I kinda figure everyone knows how to do, but I end up teaching it to my college class (where more than half don't know). Several of them end up just taking a picture of the screen with their phones. I'm not sure whether I should praise their problem solving, or lament their apparent lack of basic computer skills.

I just finished reading the article and the comments following it, and wasn't really surprised to find that the first comment was along the lines of "that isn't going to happen."

Yes, the gamer disposition has its merits, and as many have pointed out, its flaws. As far as the workforce goes, for maximum efficiency we need a mix of those with the gamer disposition - to carry out the task and stick at it - and those with the open mindset - to come up with the task in the first place.

It has been said that imagination (open mindset) is more important than ingenuity (problem solving, gamer disposition), but it has never been said that either is undesirable. People are different, and we need to respond to that.

Yes, in theory it is possible for one person to have the best of both mindsets, but people like that are not the general population who make up the workforce or the main body of students, regardless of level. By all means value one mindset over the other where appropriate, but don't ignore either one, or all you'll achieve is solving one perceived problem by causing another.

In addition there is the unaddressed question of how the gamers act away from the keyboard. Some may well take the gamer disposition with them, others may well be of the open mindset type away from the keyboard, or even a mix of both. Jane McGonigal ring any bells? I'd say she is open mindset in her book and her lecture, but she is a gamer and probably exhibits a gamer disposition at that point.

Steven

There are no stupid questions and mistakes are opportunities to learn in disguise

Now having read Kids can't use computers, I thought I'd stick in my twopence worth.

I found the article interesting and clear, but the argument for me hinges on the word "use". A secretary uses a computer but has no need to be able to program one, and the same goes for several other people, including myself. I can't program. I've never felt the need to. I can install software for myself, find what I'm looking for online within a reasonable length of time (usually) and generally use a computer to carry out the tasks I need it to.

While I do not intend to put down the value of coding I question its value for everyone. Does everyone need to code? I remember the days of FORTRAN, COBOL et al and the people who said you needed to learn these languages to understand a computer, where are they now? Are the computer languages we use today going to be the ones our pupils or children use when they leave school?

I have threatened several times to try and learn Java; as yet I haven't. I can still use the computer for what I need it to do. Am I computer illiterate? Not in my opinion. Am I fantastic on computers? Not in my opinion. Can I use a computer? Yes I can.

I considered learning Java because of things I learned on the summer mooc, but real life, Minecraft and Warcraft got in the way - not necessarily in that order.

Steven


For me, the biggest lack I see when teaching people to be more than consumers on a computer is twofold:

1. Lack of understanding of critical thinking (typically in the form of troubleshooting).
2. Lack of flexibility in thinking.

In terms of critical thinking, students can get this in a variety of ways. The way I see this lack manifest is when something goes wrong in the course of a tutorial and students don't understand how to begin troubleshooting what went wrong. This isn't something that requires a computer to teach but the process of troubleshooting is easily taught and often used on a computer.

Flexibility of thinking comes about when the student understands that it's VERY HARD to permanently break the computer. They can poke at things and try things. That's how I learn on the computer. (Videos annoy me and books are there for backup :D) This is something I struggle to teach my students, especially the older ones. The younger ones don't want to try because they want it done for them. The older ones are worried they'll break something beyond repair.

"I found the article interesting and clear, but the argument for me hinges on the word use."

Yes and no. Word use is actually pretty important (as I'm starting to realize), but I don't like hinging arguments on it either, so I'm going to skip that. The problem is one of "digital literacy". Usually we think of literacy as referring to reading and writing. Reading and writing are almost always paired; you don't teach just reading, for example. We expect people to be able to both read (consume) and write (create). There was a time when only the privileged could read and write, and everyone else had to go to a scribe if they wanted something read or written.

These days, digital literacy is becoming more and more important. But the vast majority of people these days can only consume, and even then they're not very good at it. Imagine being able to read and write your name, and a few other words, but having to go to a scribe for anything beyond that. Maybe that's ok, because all you need on a regular basis is to write your name, and a list of groceries. But one day you want a new item, so you have to go to a scribe to write your list. And another day, you want to write a letter to your aunt, but you don't know anything about verbs, and you need more nouns than you currently know. So you have to go to the scribe again. You might get by fine in day-to-day life, but no one would call you "literate".

Given the pervasiveness of digital devices in our lives, the new generations need to be digitally literate. It's going to be a factor that separates the "haves" from the "have-nots". Maybe you only need to be able to use Word and a few other programs. That's fine, but it means you're not digitally literate, and that limits what you can do in this world. You can't create software, and you'll probably need an expert to do anything beyond your comfort zone. As we become more and more dependent on technology, this will be just as limiting as not knowing how to read and write.

Does everyone need to learn how to code? No. But it will help a person become more digitally literate. Knowing how to code empowers people to have control over their digital interactions, and to become creators instead of just consumers.

@Leedale Yes, and yes. I see exactly what you describe in my classes as well. Both in the game programming ones, and the "programming for non-programmers" one. Even my "advanced" students have a really hard time debugging and troubleshooting. A tiny little typo in the book (or one a student introduces by accident) and they are stumped for hours.

I didn't mean word use, which is important at times, but the word "use" and what is meant by it.

Leedale explained what I was thinking much more clearly: a lot of the press over here is on introducing coding earlier when basic troubleshooting would be more valuable all round. I'm comfortable enough trying a few things but I admit I'm not comfortable with opening the box and poking around.

I could (in theory) create a best seller on a computer without knowing anything other than how to word process. No, I can't write software, nor do I have to. As technology becomes a bigger part of everyday life it is made suitable for its role, or it won't sell.

I take your point about why coding is valuable, and I agree with it fully, but it is possible to use a computer without coding ability. Most problems have more than one path to the solution; I can't code, but I know people who can. Most people do not fully understand the technology around them, but can still use it. How many people can repair a watch? Or a microwave?

Perhaps we need to reevaluate the language used, for example are computer literate and digitally literate the same thing?

Steven


Just read "Natives aren't Restless Enough", and especially taken with the previous articles and the posts on here it does cast an interesting light on things.

People don't want to learn if it's hard - any teacher could've told us that, but there is always the idea that computing is different. Kids like computers, so they want to learn about them. Or so it's believed. It isn't entirely true; the truth is more:

Kids like (playing on) computers and want to learn (how to level up faster and use more cheats) about them.

Again, not entirely true because it's a generalisation.

I agree with the article that people aren't as good on computers as they think they are (in general) and don't want to learn because it's too hard. I like learning, but I don't want to write software, and I find it easy to start something but, alas, not always to follow through. But maybe it's time I did. I started Minecraft and later Warcraft to see what all the fuss was about and got hooked. Twice. Now if only I could apply that to improving my computer skills...

Steven


"It has been said that imagination (open mindset) is more important than ingenuity (problem solving, gamer disposition), but it has never been said that either is undesirable. People are different, and we need to respond to that."

I don't think either mindset is more important than the other, both are important. While some people may tend towards one more than the other, I think everyone needs to cultivate both. If you're good at open, you need to actively cultivate closed, and vice versa.

I myself tend to get "stuck" in one mindset or the other. If I'm in the open mindset, I get stuck there and don't want to leave. I dread actually starting anything. When I'm in the closed mindset, I get stuck there as well. I don't want to stop what I'm doing, I just want to keep working. I'm so consumed by it I often can't sleep or even think about anything else. I have no idea how common this is, or if it's just me (not a discussion I've had before).

"I didn't mean word use, which is important at times, but the word 'use' and what is meant by it."

Ah, I see what you meant now. And yes, I do think that a lot of that article is based on what exactly one means by "use". Perhaps this is where we should differentiate a bit between "use" and "understand"?

As an analogy, I know how to "use" my car, but I don't understand how it all works. While I can troubleshoot simple problems on my own (needs gas, dead battery, etc), I have to take it to an expert if something more complicated goes wrong.

I think that the problem with "using" computers is two-fold.

1) A (largish) number of people can't even do basic troubleshooting on their computers. Continuing with the car analogy, it seems like a number of people would have to get an expert to tell them their car is out of gas, and then fill it for them.

2) The number of people who actually "understand" computers is decreasing, while the number of "users" is increasing. I honestly don't know if this is actually the case (not something I've actually studied), but if true, it does present a problem. This could very quickly lead to a "have" vs. "have not" split between those who do understand computers (or can pay someone who does), and those who don't (and can't afford to pay someone). Only those who understand computers can fix them and write software, which gives them a huge amount of power...and imbalances of power like this are usually not a good thing.