Posted by samzenpus on Wednesday April 10, 2013 @07:36PM
from the get-in-upstairs-and-play-your-games dept.

CyberSlugGump writes "Computer scientists at UC San Diego have developed a 3D first-person video game designed to teach young students Java programming. In CodeSpells, a wizard must help a land of gnomes by writing spells in Java. Simple quests teach main Java components such as conditional and loop statements. Research presented March 8 at the 2013 SIGCSE Technical Symposium indicates that a test group of 40 girls aged 10-12 mastered many programming concepts in just one hour of playing."
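The summary names conditionals and loops as the main concepts taught. As a rough illustration only (the actual CodeSpells API is not shown anywhere in the article, so the gnome-themed names below are invented), a "spell" exercising both constructs might look something like:

```java
// Hypothetical sketch: gnomesNeedingHelp and the gnome theme are invented
// for illustration, not taken from the real CodeSpells game.
public class SpellSketch {
    // Counts how many gnomes still need help, using the loop and
    // conditional constructs the game is said to teach.
    public static int gnomesNeedingHelp(int[] gnomeHealth) {
        int count = 0;
        for (int health : gnomeHealth) {   // loop statement
            if (health < 50) {             // conditional statement
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(gnomesNeedingHelp(new int[] {30, 80, 10}));
    }
}
```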

The entire game looks pretty basic - and who the heck cares? Watch a two-year-old and see what happens when you give 'em a present. They are as likely to play in the big box as with the toy.

Graphics might be important for the latest 3D shooter, but a good game doesn't HAVE to have cutting edge graphics. A game with amazing graphics can still be crap.

If the idea is to teach kids how to code, and they enjoy playing the game enough to at least learn a little coding - then it is a GREAT product. If I were ten and wanted to learn Java and had a choice of following tutorials/reading books/etc or playing a game that taught me the concepts, then I certainly know how I would have learned Java. Sure, all my projects might also include a random "Save the GNOMES!!" routine, but you know...

I got almost my entire introduction to programming in a C-like language through being a Wizard on an LP-Mud back in my student days in London. And that had no graphical UI at all - just text only. Until that point I had no prior programming experience, but it taught me the fundamentals that have now served me for over 20 years as a developer.

Go home, put your tin-foil hat on, and protect your lawn. Java's not a great language, but it's an OK language for certain uses, and it's also one of the most frequently taught languages in universities. It's a decent way to teach the fundamentals of programming and object-oriented design, while also having the benefit of a C-like syntax, which means it's easier for someone who knows Java to adapt to C or C++. "Creating the illusion" that anyone can develop computer software won't do a damn thing to help...

I've seen Java programs actually run *faster* than native code under certain circumstances. In particular, object allocation with the 'new' keyword in Java is often far faster than dynamic allocation with the 'new' keyword in C++, even when you factor in the costs of garbage collection compared to manually invoking delete in C++.

I really can't tell if you're trolling. If you're not trolling, then you should know that the reason heap allocation is faster in Java is that Java allocates a sizeable chunk of the system's memory at runtime startup to use as a pool allocator. In C++ you're expected to come up with your own allocation strategies, because you know more than the compiler about what the high-level behavior should be. What you were actually benchmarking was the overhead of the system calls/context switches that come with opera...

Of course it would... but in C++, you have to manually write the code to be optimized for whatever types of objects you were pooling, whereas the Java runtime environment is smart enough to figure out how to do that on the fly, without having to spend any effort writing custom allocators for each type of object.
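For what it's worth, the claim under dispute can at least be sketched. The micro-benchmark shape below is invented here (it comes from neither poster): it allocates a million small objects with `new`, where a warmed-up HotSpot JVM typically serves each allocation as a cheap thread-local pointer bump. Absolute numbers depend entirely on the JVM, heap settings and warm-up, so treat this as a sketch of the test, not a verdict:

```java
// Rough sketch of the kind of allocation micro-benchmark being argued
// about. Timings vary enormously by JVM and settings; the checksum is
// returned so the allocated objects cannot be optimized away entirely.
public class AllocBench {
    static final class Node {
        int value;
        Node(int v) { value = v; }
    }

    public static long allocate(int n) {
        long checksum = 0;
        for (int i = 0; i < n; i++) {
            Node node = new Node(i);   // typically a TLAB bump-pointer allocation
            checksum += node.value;    // use the object so it is not elided
        }
        return checksum;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        long sum = allocate(1_000_000);
        long elapsed = System.nanoTime() - start;
        System.out.println("sum=" + sum + " elapsed_ns=" + elapsed);
    }
}
```

A fair C++ comparison would need an equivalent loop over `new`/`delete`, and, as the reply notes, a hand-written pool allocator on the C++ side changes the result completely.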

Funny thing is... well, not funny, it's actually kind of sad... most common high-end languages today are scripting languages or interpreted with JIT. The traditional "compile it first and use a minimal runtime" method is considered something that only ancient dinosaurs did before they died off. So in that sense Java is probably one of the fastest languages in that set.

Yeah, that's fairly outdated thinking. Speed isn't determined by the language anymore; it's the execution that counts. Java compiles down to bytecode, which is run in the JVM. The JVM has decades' worth of run-time optimizations. The majority of large-scale web sites are written in Java.

Hey, ever heard of Hadoop? You know, the large-scale MapReduce framework based on Google's technology that sorts terabytes and petabytes of data? Java.

Yes, I've heard of Hadoop - the framework that fixes Java performance by splitting its execution up across a couple hundred machines. :-)

The framework language isn't the performance-critical part, though - it just acts as a manager to send data to a group of workers and aggregate the results back; it's the workers that are important. If they are slow, the whole thing is slow. So it's best to write these in a native language. It's not anything special to Java either - Google's MapReduce implementation is written in Python [google.com]
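For readers unfamiliar with the model being argued over: the map/reduce shape itself is simple, and can be sketched in plain Java. This is deliberately not Hadoop's actual API, just the idea of mapping records to key/value pairs and reducing by key; the framework's real contribution is doing this across many machines:

```java
// Word count, the canonical map/reduce example, sketched with plain Java
// streams rather than Hadoop's Mapper/Reducer classes. "Map" turns each
// word into a (word, 1) pair; "reduce" sums the counts per key.
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCount {
    public static Map<String, Long> count(String text) {
        return Arrays.stream(text.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())                    // drop empty tokens
                     .collect(Collectors.groupingBy(w -> w,        // group by word
                                                    Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(count("java is slow java is everywhere"));
    }
}
```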

When Java first took off, and the web was made of Java content executed via plugin, Java was written by idiots who concatenated strings instead of using string builders, and similar abuses of common sense through ignorance and teaching materials that focused on results rather than good practice. Executables outside of plugins suffered the same deficiencies, although they were probably attempting loftier goals, and the performance was... what is the opposite of magnified, because it was slower than a sloth taking a crap?

This lasted a number of years, even as the Java interpreter became stable and work was done to increase its performance. Idiot coders learned or abandoned Java, and the runtime made even the remaining idiots look better, if not "good".

If you don't find this comment amusing, you either lack historical perspective, are a Java programmer, or should consult a medical professional to be diagnosed for your deficiency in some manner or other.

Security problems these days seem to be focused on the browser plugin, rather than locally executing native apps, so the security comments mostly don't apply. Visiting a random internet web page and allowing it to execute poorly sand-boxed arbitrary code is a bit like licking random strangers' genitals. In case that interests you, let me state that it should not be done as a general practice, and you should consult a medical professional.

I have read Java for over a decade, and I have coded in Java for 3 years or so. Having experience with x86 ASM (AT&T and MASM), K&R C, ANSI C, GWBasic, Turbo Pascal, C++ (VC 5-2010, gcc 2.x - 3.x, mingw), VB 5-6, C#, VB.NET, Python, Powershell, JavaScript (advanced, not your normal getElementById().Blink() shit) and several other introductions, I can say this:

Java examples in the real world and in most printed books are the most incestuous, groupthink-y, overly-architected piles of verbosity I have ever had the displeasure to read. I completely understand the need for default parameters, dependency injection, constructor and method chaining, and all kinds of modern best practice.

But I have never seen another language embrace the overbearance of best-practice teachings without implementing some balance of solution soundness. Java examples and implementations (open source, of course, because I have read them) seem to abound with overloaded methods under 5 lines of code, which initialize another parameter to call another overload. Now you have multiple functions to unit test, multiple code paths, multiple exception sources. And unless you are brainwashed in the spirit of Java, comprehension of the complete workings is complicated by scrolling off-screen past essentially purpose-free function declarations, whitespace between functions, and an essentially functional programming paradigm split over several different methods to give the appearance of flexibility, OOP, and conscious design.

It reads to me like someone wrote that no method should ever take more than one additional parameter that you were not already given, and coherence be damned. I would much rather see a single method with 5 non-optional parameters than 5 overloads which calculate and pass one new parameter each time.
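A small invented example of the overload-chaining style being criticized, next to the single explicit method the poster says they would prefer:

```java
// Both styles produce the same string; the names here are made up purely
// to illustrate the structural complaint, not taken from any real library.
public class OverloadChain {
    // Style under criticism: each overload computes one default and
    // delegates, producing three call paths and three test targets
    // for a single behavior.
    public static String greet(String name) {
        return greet(name, "Hello");
    }
    public static String greet(String name, String salutation) {
        return greet(name, salutation, "!");
    }
    public static String greet(String name, String salutation, String punct) {
        return salutation + ", " + name + punct;
    }

    // Preferred alternative: one method, all parameters explicit,
    // one code path to read and test.
    public static String greetExplicit(String name, String salutation, String punct) {
        return salutation + ", " + name + punct;
    }
}
```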

The Java paradigm seems to be that calculating things within the overloaded methods is preferable to factoring these calculations out into unrelated functions. In a truly sane OOP world, those calculations would be part of the object, or if sufficiently general would be part of the object's base object.

In fact, the Java approach seems to be the Builder design pattern, which I have not seen adopted as frequently as it should be. Obligatory link here. [stackoverflow.com]
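For readers who don't follow the link, a minimal sketch of the Builder pattern (the Spell class and its fields are invented for illustration): required parameters go in the builder's constructor, optional ones get defaults and fluent setters, and build() produces an immutable object:

```java
// Builder pattern sketch with invented names; contrast with a pile of
// constructor overloads covering every combination of optional fields.
public class SpellBuilderDemo {
    public static final class Spell {
        public final String name;
        public final int power;
        public final boolean areaEffect;

        private Spell(Builder b) {
            name = b.name;
            power = b.power;
            areaEffect = b.areaEffect;
        }

        public static final class Builder {
            private final String name;          // required parameter
            private int power = 1;              // optional, with a default
            private boolean areaEffect = false; // optional, with a default

            public Builder(String name) { this.name = name; }
            public Builder power(int p) { power = p; return this; }
            public Builder areaEffect(boolean a) { areaEffect = a; return this; }
            public Spell build() { return new Spell(this); }
        }
    }

    public static void main(String[] args) {
        Spell s = new Spell.Builder("levitate").power(3).build();
        System.out.println(s.name + " power=" + s.power + " area=" + s.areaEffect);
    }
}
```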

As sensible as the Builder pattern seems to be, I think it would still require a number of extra Set/Get property methods, which are function calls. Maybe Java has optimized this, but if you don't adopt it, optimization can't help...

One of the issues that has always rankled me hard was the "cookie cutter" nature of the world events in those games, as well as the limiting gameplay options, so I had this idea for "obfuscated and sigilized" programming syntax as the basis for a game's magic system. Rather than presenting a loop as a nested block of instructions, it would depict it as a "container", with subcomponents inside. Kind of a mix of flowcharting and stylized syntax.

The idea was that the layout of the "enchantment" could be moved and teased to make clever images out of the interconnected containers and symbolic representations, to make the programmatic nature of the system less banal and much more aesthetically attractive, while simultaneously making the kinds of magic and counter-magic highly diverse and dynamic.

I never really did much with the idea (ideas aren't worth much, despite what the USPTO and several shell corps may claim. Implementations are far more valuable.), and all the "on paper" mental models I tried kept having non-trivial problems.

I like seeing that somebody had a similar idea, and made a working implementation.

Battlegrounds in World of Warcraft were kind of awesome back before they (quite correctly, I guess) put all sorts of restrictions on what kinds of things could be scripted. I used to *own* the level-19 battlegrounds with a warlock and an addon I wrote to keep track of enemy targets and optimally distribute my various curses and afflictions. I just ran around mashing the spacebar like crazy, because among the few restrictions was that every action had to be tied to a hardware event.

My nearly-six-year-old is doing great things (for a kindergartner) with KTurtle -- which is really a pretty cool environment (I was surprised to find). He also spends much time hacking crazy stuff with redstone in Minecraft. The next logical step toward a real programming language, it seems to me, keeping it fun and relevant to his interests, is to introduce some JavaScript (as much as I dislike it) so he can mess up web pages with little effort. From there, Python seems the friendliest, easiest and most resource-rich multi-purpose playground.

Maybe CodeSpells will be something to check out eventually, though the Java example on their blog doesn't look all that fun to me. I hope it's fun. If it gets to the point where I'm teaching the kid OOP and all the verbose Java syntax requirements, he'll probably only want to make Minecraft mods. That's what CodeSpells is up against in this house.

I would think any language that's fairly simple and can produce instant results would be a good language to introduce a child to. I say this because I had BASIC at the age of 5, and I could type out a few lines of code, hit run and see the results (almost) instantly. Better yet was having a ton of software written in BASIC that I could load up, tinker with and then try. I'm not really in touch with most modern languages, so I don't know what's out there that would have that same kind of feel, but I know...

There have been games that have done this before, even well before "The Diamond Age" was an idea in Stephenson's head. The problem is these kinds of games are so few and far between that it's fairly notable when one comes up.

While I certainly played my fair share of standard games as a kid, I also had quite a few educational ones as well (granted, some of them were below me; my parents bought me a math game based on my age and not my ability). As much as I hate coding now (mostly due to syntax crap in lang...

Java is an OK language, but it's kind of bureaucratic and boring. I can't think of a better way to suck all the magic out of a fantasy game than to have the spells written in Java---except maybe having the kids produce an ER diagram and a set of tables in Boyce-Codd normal form.

At the very least, they could do without the pointless punctuation. Does a spell really have to have semicolons and empty parentheses to denote that the spell is imperative?

While I agree with your sentiment about "why use Java" for something like this, I also really applaud this kind of thing. Yeah, a different language, or a language invented specifically for this app, probably would have been better, but introducing kids to programming at an early age is a win overall.

From the dictionary: "to become thoroughly proficient"
I think I need an hour of CodeSpells and I can add Java proficiency to my CV; I've only spent a hundred hours coding in it, so I've set my skill as "exposed to" instead.

Try Tekkit for Minecraft; it will give you a mod called ComputerCraft, which will allow you to place computers with consoles on the map and even hook up wireless modems and a disk drive to them. Using Lua, you can then program these computers to do whatever you want, basically. My brother and I made 3 train stations which would handle carts and track switches with the computers.
You can even program "bots" with Lua and have them build structures and whatnot. They can even defend your area if you want. All this is done with Lua inside the Minecraft game. You can of course import larger scripts from outside the game, since typing them in the console Minecraft provides can take a while.

How about JavaScript, run in the browser or on the cloud instead? There's no comment on why Java was chosen, but it seems a very surprising decision to come out of a computer science department... or maybe not. Are academics really keeping pace with technology, or with the public's interaction with technology?

I was just visiting with the good folks at the local Python users group. Nice folks, but when I dug into where the actual jobs are, it was clear Python was not a breadwinner by a long shot. Most of them were using Python only when they were bidding out the work and the client had no input into the language to use. That tended to be side gigs. The 9-to-5 work was usually Java or .NET.

Python certainly has its great points, but so do a dozen or so others: Groovy, Clojure, Ruby and Scala. I know a lot of f...

At the elementary level I don't think the choice of language with reference to the business environment is that important. Teaching kids that they can make their computer/tablet/whatever DO STUFF, and presenting it in an easy-to-digest format, is much more valuable than what is big in the industry now. Keep in mind, elementary age... at least 20 years (on average) until they start rolling into the job market. Not all of them will be programmers, and the languages we use now may be dying by that point. Some may be scientists using more focused languages in the vein of Matlab, some may be homemakers, some may be athletes, some may be artists. But they will all have an appreciation for technology and what it can do. They will all get introduced to logic and algorithmic structure at a much earlier age than is normal right now. Those things easily apply to other aspects of life. Hell, if they keep at programming strictly on a hobby basis, they may even catch on to when the developers at their company/organization/whatever are BS'ing them about what can and cannot be accomplished.

While the mods may not agree with you very strongly, I've seen a wealth of evidence that says Java is a bad introductory language. The CS department at my alma mater switched from an all-Java curriculum to one with a Python intro, and the student attrition rate dropped by a significant margin. A friend of mine—the daughter of two CS profs—was dead-set on avoiding programming as a teenager until I introduced her to languages other than Java.

Indeed. Seriously, fuck Java. I decided to try to learn Java after quite a few years of not coding (not that I ever got that advanced with it anyway). There are not enough derogatory words in all languages combined to describe my hatred of Java. It seems simple and straightforward at first, but this is deception! (To be fair, some of my issue is with the buzzword-like jargon associated with it and OOP in general; fucking insanity.) Sure, basic math to advanced math (which is generally what I use code f...

Irony of ironies, C# is almost exactly like Java at the language level, only with a totally different object hierarchy, which is why it's easier for UI development. The .NET hierarchy is somewhat influenced by classic VB, which was a very well-developed and efficient (if sometimes limiting) format for expressing common UI needs.

Java's popularity, sadly, has to do exactly with that OOP evangelism. In the late eighties and early nineties, academic software engineers were absolutely convinced OOP was the silver-bullet software development paradigm for all ills, since encapsulation (hiding methods) made code re-use practical. They also believed it was the end to all programming practices that inhibited re-use, particularly global variables. Unfortunately they made the mistake of conflating these practices with "laziness," and very mistakenly believed in a bizarrely Victorian fashion that all beginners should be forced to use only best practices, as though we should be teaching infants proper manners straight out of the crib.

It's stupid enough that I sometimes wonder if it was a massive conspiracy by Sun's marketing department, but to be honest computing has always been full of fads like this. In the early eighties, logic programming was The Way Of The Future; everyone thought that Prolog and constraint-satisfaction-based expert systems (basically, fancy predicate logic expression solvers) would dominate computing for the rest of time. Today, there are only a few niches where new Prolog code is considered desirable.

Oh, trust me, I get that about C#; it's just that much of a step up above Java (and both have roots in C, so the transition was fairly easy). Granted, I'm old enough that when I started programming, OOP hadn't even become a fad yet (to be fair, I started at a much earlier age than most of my peers - PCs weren't common in the home at the time - mid 80's). Though I still tend to think procedurally, I like the concepts of OOP; I just hate the evangelism and the needless jargon introduced to separate it fro...

Just like in math, CS is riddled with context-specific names for refinements of the same thing. You wouldn't want to conflate a ring with a field, right? For what it's worth, though, the first OOP language, Simula, just called them "procedures" at the syntax level. And in the case of encapsulation, there really isn't a good, compact term for "writing all of your code properly so that nothing inappropriate is publicly accessible," so that was kind of a new concept that needed a new name.

As a Math major I totally get that, especially at the more advanced level where you start using perpendicular, orthogonal and normal pretty much interchangeably. In programming, I've used "procedure", "subroutine", "function" and "method" to describe what is conceptually the same thing. I like function best because of the easy tie-in with math, though given the mathematical definition of a function, I can see where a function with no return value seems bizarre.

In my opinion the worst OOP offences are what sane people call "setters" and "getters", but which are formally called "mutators" and "accessors". To most people they're a necessary evil when you want to limit the range of values a variable can take on (although C# does this transparently with properties, which Microsoft (confusingly) recommends starting with an upper-case letter... what the hell, Microsoft?), but in Java, students are often taught to write them even when they're completely transparent and...
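A small invented example of the contrast being drawn: a fully transparent getter/setter pair that adds nothing but ceremony, versus the one case where a setter arguably earns its keep by limiting the range of values:

```java
// Both classes are made up for illustration. Plain shows the "completely
// transparent" accessor pair; Bounded shows the range-limiting case that
// even the critics concede is a necessary evil.
public class Accessors {
    public static class Plain {
        private int x;
        public int getX() { return x; }
        public void setX(int x) { this.x = x; }   // constrains nothing
    }

    public static class Bounded {
        private int health;
        public int getHealth() { return health; }
        public void setHealth(int h) {
            health = Math.max(0, Math.min(100, h));  // clamp to 0..100
        }
    }
}
```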

I still have enough of the old procedural mindset in me to avoid setters, getters, mutators and accessors all at once! What is even better is that /.'s apparently recently implemented spelling/grammar check doesn't like "mutator" or "accessor". +1 for the sane people, I suppose? I don't write code to be rolled out into a large package to be reused by someone else elsewhere in an obtuse and poorly implemented fashion. I will set my variables to what I want, how I want, when I want.

That's elitist bullshit. 90% of the population of the world could easily learn to program and learn to do it proficiently. If we taught binary and boolean logic earlier in life, programming would be second nature. And yes, I interview crappy engineers regularly. It's not lack of capability holding them back, but rather piss-poor education.

The world could use more elitism, instead of dumbing kids down by teaching them that we should value mediocrity. Everyone gets a trophy, woo-hoo! But a job? Gee, kid - Sorry we didn't prepare you to actually compete when you get to the real world...

90% of the population of the world could easily learn to program and learn to do it proficiently.

Aahahahahaha... Oh, man, stop, ya killin' me here!

With a lot of effort, you could teach most people to use cookie-cutter VBS sni...

Perhaps to someone who has been trained in C but not Java. The biggest problems with C when compared against Java are the limited extent of its standard library; sorting through the plethora of poorly documented non-standard libraries that are available (vs. Java, where if there isn't a standard for it, then the next obvious stop is apache.org); and the fact that you need to understand the hardware architecture of the system you are developing for in a lot of cases, as well as distinctions between stack and heap and a bunch of low-level gotchas in the language that are far from obvious to the newbie, or even to experienced developers sometimes.

The only obtuseness I've encountered in Java is the hoops you have to jump through to interoperate with an API from another language that uses unsigned types.

C tutorials, on the other hand, are full of examples where *(x+1) is used interchangeably with x[1], which ends up becoming a habit for years until one day you hit one of the edge cases where those statements are not equivalent and are left scratching your head as to why your program is crashing. Endianness is never hidden behind an API, since the whole...

In C, you don't have to create a separate namespace with specifically named folders and files, and a separate class, just to be able to say "hello world". Kids lose interest very quickly if they don't get results; having to learn about OOP before they write their first program isn't going to work.
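Concretely, what the poster is pointing at: even "hello world" in Java requires a class declaration and the full public static void main ceremony before anything runs (the message() helper below is only there so the string can be checked; the ceremony around it is the point):

```java
// Minimal Java program: a class and a fully qualified main signature are
// mandatory, where C needs only a function.
public class Hello {
    public static String message() {
        return "hello world";
    }

    public static void main(String[] args) {
        System.out.println(message());
    }
}
```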

No, but it might not hurt to try and find a way to introduce those concepts earlier. The same goes for programming: logic is a segment of mathematics that is essential to programming, yet doesn't require concepts such as addition and its offshoots (almost every other operation commonly used in math). Boolean and binary logic are the foundation of programming and are simple enough that a 5-year-old can grasp them. Sure, they may not be able to design complex circuits at that age (though you might get the o...

So we're agreed, using C as the first introduction to programming is a bad idea? Not that I'm advocating Java, or even Python as the best solution, both are only marginally better. Scratch is more like it - as simple to learn as Logo, but less boring (Logo held my interest for about half an hour when I was young, by that time I'd about reached the limitations in what could be done with a pen and 4 directional commands).

Pointers are not complicated, I'm sorry. Maybe for 8-year-olds, but that's why they should learn Python. It's actually really easy, it's a very popular language, and it teaches good coding practices as well as jack-off object-oriented concepts.

No, pointers are complicated in any large software project, because they point to memory which you have to handle allocating and freeing yourself when it is no longer needed and no longer referenced. When you have pointers to pointers to pointers, deciding what code has the responsibility of freeing which pointer's memory - which is not always handled by the same code - makes it easy to make a mistake. Garbage-collected programming handles all this low-level, tedious memory management automatically.

White space works to denote code blocks only if everyone agrees on what readable code looks like. But since K&R taught everyone the wrong way, you might find it hard to get people to agree. With white space being insignificant, each reader of the code can format it however they want. One might argue that we should not even be storing formatted code, but leave the format up to the IDE/editor.

With white space being insignificant, each reader of the code can format it however they want.

And that's a bad thing. I mean, you don't have people choosing what they want to call the commands in a programming language. That would be chaos. Some people have tried... remember "#define BEGIN {" and "#define END }"?

With indenting as with commands, it's far better if the language defines a common standard, with deviations being a warning or an error. That way, when people are working collaboratively, their code is guaranteed to be in the same indentation style.

Code should be readable by the person reading it. If you find one form of indentation to be readable and your partner another, then you should each be able to format it however you want. This is, after all, why we use high-level languages and not machine code. So why not take it one step further and leave the format up to the reader? Store the code in an easily compilable format and let the editor format it to each reader's preference.

This is 2013; we shouldn't have to indent manually still. If you want to cut/paste a few lines of code from one section to another and the indentation doesn't match, it can be seriously annoying in Python to get it all correct. Compare this to Java, where as long as it is between the curly brackets I know it will be OK. Press the shortcut for auto-indent, and I can tell immediately if it is in the right place.

The reason that Java isn't as fun to program is the same reason that it's good for businesses. The language is very restrictive and prescriptive of how you should do things. For programmers that want flexibility and power, the constraints and extra typing (dual-meaning intended) chafe. But when you're using it as part of a large group, those same constraints become the things you can depend on. Where is a certain class located? Java requires it to be in a certain directory. What methods are available on a class? Java's static type system was designed to make tooling easy, so your IDE will tell you. And even talented programmers can mess up manual memory management...the less-talented wouldn't stand a chance without Java's memory management. The list of things that Java prevents you from screwing up is quite long.

Basically, for my home coding projects and projects where I work with a small team of talented developers, Java is one of my last choices. But for my boring 9-5 job where I'm working with 30 knuckle-draggers who don't understand the purpose of an interface, let alone how to write functional code that's easy to read, I want them writing Java and I'm willing to pay the Java price to get that.

I think Java is fun to program in for exactly those reasons. For me the fun in programming is getting cool results, and Java allows one to create complex stuff without having to constantly worry about shooting oneself in the foot. It allows me to use my full brain capacity for the actual algorithms I want to create and doesn't add lots of cognitive load, especially when using a powerful IDE like NetBeans.

Right, you know everything. Playing with something, engaging in it, and relating it to fun and excitement like Harry Potter and "expelliarmus" (and knowing some of the roots of words and such) can teach you more than pure rote memorization could.
Rote memorization could maybe get them to pass a multiple-choice exam where they can pick out which is "a conditional statement" out of the choices, or perhaps which is a valid beginning or end of a loop construct, but play-acting and engaging the mind int...

I mean we can apply classical conditioning, operant conditioning, and other methods all day long. Providing the material in a way that engages, interests and challenges an individual is always best, but the method for doing that may vary from individual to individual.

Yeah, you go ahead and explain loops and conditional statements to 40 10-year-olds. They'll learn it in 5, master it in 10, forget all about it in 15. They'll probably be bored, too.

Or you can use software like this, which will engage them, encourage them, and help them remember it when they go home that night. It sure would be a shame if they were excited to learn more the next day and had a platform that was there to teach them and give you time to grade their math tests.

One of the best things I ever had as a kid was a TRS-80 (CoCo - and not a true TRS-80 either, even though that was stamped on it) that booted to a BASIC interpreter. The code for any games I loaded directly off disk could be tinkered with easily, no need to compile. This was awesome as a curious 5 year old.

Even better about it were the games "Rocky's Boots" and "Robot Odyssey". These games taught me the basics of digital electronics, lessons which have actually helped in my current career as a technician (with no formal training in digital logic). Seeing this kind of software being produced in a modern setting is awesome, I wish there was more of it.

I really think algorithm structure and design (from a math perspective) is more important for a beginning coder than things like OOP and memory management. Yes, those are important, especially with how prevalent OOP has become, but OOP is just a wrapper around the math, and the memory management will flow from sound and logical structures. Pseudocode is probably the best first step, aside from it lacking the ability to be executed.

Yes, hence me despising Apple in their current state (pretty much since Jobs came back, though they had been rocky for a few years before that). I will, however, give credit where credit is due: many icons in the code world got their start on an Apple machine. Many of them still have a lot of sentiment for the platform, so why try to completely marginalize an entire group? Especially when some of the younger ones may not know any better?