Science and technology

Internet games

Immobile gaming

IN "WRECK-IT Ralph", an animated film about the off-duty life of video game characters, the titular bad guy winds up in a racing program called Sugar Rush, and ultimately befriends an unpopular character named Vanellope von Schweetz. Her flaw? She is a "glitch", and therefore a hazard to herself and the game's future, as she regularly fritzes into clumps of code disrupting the function of those around her.

In real life, the multiplayer online game "Glitch" has faced its own popularity contest and lost. A year ago Tiny Speck, the game's developer, opted to pull back from a full release into beta testing. Instead, its programmers tinkered with the game's mechanics to make it more rewarding for long-term play. The firm's founder noted then that "Glitch" could attract people through the first four of the six stages he defined for game engagement. But it could not make them "fall in love" (stage five) and "get married" (stage six).

Now, with too few wedding bells ringing, Tiny Speck says the game will be shuttered on December 9th at 8pm Pacific Time. An FAQ provides some grief counselling, as the death of a game means the permanent disappearance of a world, its characters and all the relationships forged within it. The company will refund in-game purchases back to November 2011 (but not subscription fees). A site has sprung up to catalogue goodbye and thank-you notes from players.

The firm puts part of the blame on its decision to use Flash, a technology that once provided the only sensible way to create interactive content that could be accessed via a webpage on many operating systems. Flash's star has dimmed in recent years after it failed to make a successful leap to mobile; its maker, Adobe, said last November that it would give up development of Flash for Android and other mobile systems. (Steve Jobs may have had harsh words for Flash, but Apple did not kill it; blame the user-friendly, non-proprietary portability of HTML5 websites and the speed of mobile apps programmed for individual platforms.)

It is hard to fault Tiny Speck for its decision to use Flash. Work on "Glitch" began in 2009, before truly powerful mobile devices took off. Flash provided access to most Macintosh and Windows users, and it seemed likely at the time that it would spread to Apple's iOS, as well as thrive on Android, RIM's BlackBerry, Microsoft's Windows Mobile (later Windows Phone), HP's WebOS and other, now mostly forgotten platforms. It didn't.

The challenge for online games is to reach the point at which the massive cost of personnel and of running or renting servers can be offset by subscriptions and purchases of virtual goods. Few games can boast the accolades showered on "Glitch" for its aesthetic appeal and the richness of its world. But without ready access to the hundreds of millions of mobile users, "Glitch" was an evolutionary dead end.

I've played Glitch for a year, and I've loved every minute of it. But it never could've handled a transition to higher popularity like the founder envisioned. While the community was unusually civil and kind, especially for an MMO, it had a dark side too. People were bizarrely concerned with gender identity and sexual preferences, there was a weird level of immaturity among supposed adults, and worst of all, no one could ever tolerate dissent. Yes, games have thrived with subpar user bases where such obnoxiousness is par for the course, but I think the current community's stifling attitude would actively turn away most potential users.

In 2009 it was still vaguely possible to think you could write a single codebase and take it to multiple platforms (like folks do in console gaming).

Apple did not approve. They lost in the desktop space to a brutal 400lb gorilla who absolutely obliterated them in the development tools arena. They decided to use, essentially, their own in-house programming language.

Google decided to use Java.

In 2012, if I want to make a game that runs on webtop and mobile, I'm writing 3 separate client apps plus a server.
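
To make that arithmetic concrete, here is a minimal sketch (in TypeScript, with entirely hypothetical message names) of where the split falls: the wire protocol and the server logic can be written once, while the rendering-and-input shell must be rebuilt natively for each of the three clients.

```typescript
// Hypothetical wire protocol, shared by the server and all clients.
interface MoveMsg { kind: "move"; playerId: string; x: number; y: number; }
interface ChatMsg { kind: "chat"; playerId: string; text: string; }
type ClientMsg = MoveMsg | ChatMsg;

interface WorldState { players: Record<string, { x: number; y: number }>; }

// Server-side logic: written once, no matter how many clients exist.
function applyMessage(state: WorldState, msg: ClientMsg): WorldState {
  switch (msg.kind) {
    case "move":
      return { players: { ...state.players, [msg.playerId]: { x: msg.x, y: msg.y } } };
    case "chat":
      return state; // chat is broadcast elsewhere; world state is unchanged
  }
}

// The part that cannot be shared: each platform reimplements this shell
// natively (canvas/JS on the web, ObjC on iOS, Java on Android).
interface ClientShell {
  send(msg: ClientMsg): void;
  render(state: WorldState): void;
  onInput(handler: (msg: ClientMsg) => void): void;
}
```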

" But without ready access to the hundreds of millions of mobile users, "Glitch" was an evolutionary dead end."

Well, I don't know about that. If I needed to waste time at work on my phone, I'd rather comment on the Economist, and that's all mobile online games are good for. Apart from that, I only play one game online - World of Tanks - and it seems to be doing pretty well. And I don't think you can play it on the phone. So I don't think PC/mouse games will be a dead end any time soon.

In 2009, it was unclear whether Apple was being recalcitrant about iOS or whether Adobe would be unable to deliver. Same with Android: the first Android phone had shipped only months earlier, and Google and Adobe appeared to have a commitment to make Flash work.

"it seemed likely at the time that it would spread to Apple's iOS, as well as thrive on Android"

It never seemed likely or even feasible that Flash would appear on iOS, and it was never really viable on Android either: very few vendors pre-installed it, as it performed so badly.

I cannot speak for Glitch or their development team, and hindsight makes it easy to have insightful-looking opinions, but the iPad was launched in early 2010, by which time the iPhone had been around for three years. The "dead end" should really have been in sight around then.

That said, developing games in HTML5 is very different to developing them in Flash and needs different skills, and they may have felt they could not build the game they wanted using the HTML5 of 2010, which, although only two years ago, was still evolving and a different prospect from today's, especially on Android devices stuck in the 2.x "rut".
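
For a sense of what "still evolving" meant in practice, here is a rough sketch of the capability-probing a team would have had to do before betting on HTML5. The API checks are real browser calls; the particular set of requirements and the all-or-nothing decision are illustrative.

```typescript
// Probe for the capabilities a Flash-style game needs before committing
// to HTML5.
function canRunHtml5Game(): boolean {
  // 2D canvas for sprite rendering.
  const canvas = document.createElement("canvas");
  const has2d = !!canvas.getContext("2d");

  // Audio support was notoriously patchy on Android 2.x browsers.
  const audio = document.createElement("audio");
  const hasAudio = !!(audio.canPlayType && audio.canPlayType("audio/mpeg"));

  // Real-time multiplayer wants WebSockets, absent from many 2010 browsers.
  const hasSockets = typeof WebSocket !== "undefined";

  return has2d && hasAudio && hasSockets;
}
```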

There are still plenty of well-funded and motivated "immobile" customers, and thus potential for commercial success for the potentially more immersive PC platform games. What is causing them quite a bit more difficulty are two factors:

1) The transition from subscription to free-to-play business models is causing difficulties for developers that do not understand it, and for customers that resent the often coercive ways it is being deployed.

2) Last year 57% of interactive-media investment went into Facebook games, and 30% into mobile games. This has left "immobile" games weakly funded and often cancelled. However, it also leaves a competition-light environment for those titles that do make it to market.

With Facebook games imploding, mostly for reason #1, both of these factors should get sorted out once less coercive styles of free-to-play monetization hit the market and start getting emulated.

"Apple did all they could to kill flash as it benefitted the iOS eco system...Flash failed to become an industry standard on portable devices because the most powerful platform with the best conversion rates rejected it outright."

That's an interesting retroactive interpretation, but I don't believe history supports it precisely in that fashion.

Adobe never, to my knowledge, demonstrated an effective working version of Flash in iOS during the period that they were attempting to convince Apple to commit to it. They did offer a cross-compilation solution at one point, which I understand had issues, but could produce native iOS apps.

If you recall, Adobe didn't ship a decently functioning version of Flash for Android on comparably equipped mobile hardware until 2010, at which point the game was already lost, and Google and Microsoft had essentially decided (even though neither firm stated this bluntly until late 2010 and early 2011) that Flash was a dead end for their purposes, too.

I simply don't believe Adobe could overcome the limitations of an interpreted system with its Flash codebase on iOS until iOS devices became much more powerful. Flash ultimately was middling to good on Android, but even when Flash was available in a rich release in early 2010, Android developers opted for native apps.

"I think that you are fixating on allowing developers to make as much money with the least effort, and disregarding that in the process future innovation would be undermined."

Well, having worked at and started up numerous small software companies, yes. I agree, completely. Financial efficiency is awesome and necessary for any small business -- even if we make our money by wiggling our fingers.

Rovio, the company that made "Angry Birds", is in no way concerned that its success is going to have any kind of negative impact on the long-term provision of technology. Neither am I.

Angry Birds isn't any different on Android or iPhone (or BlackBerry, or Symbian, etc.) because they can create a very compelling experience with a lowest-common-denominator user experience (nothing more than sprite blitting and a touch-screen).
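
A minimal sketch of what that lowest common denominator amounts to in code, assuming an HTML5 canvas target; the sprite path, element id and coordinates are made up for illustration.

```typescript
// Blit one sprite to a canvas and move it on tap or click.
const canvas = document.getElementById("game") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

const sprite = new Image();
sprite.src = "bird.png"; // hypothetical asset
let pos = { x: 20, y: 200 };

// Mouse on the desktop, touch on phones: the whole input model.
canvas.addEventListener("mousedown", (e) => {
  pos = { x: e.offsetX, y: e.offsetY };
});
canvas.addEventListener("touchstart", (e) => {
  const t = e.touches[0];
  const r = canvas.getBoundingClientRect();
  pos = { x: t.clientX - r.left, y: t.clientY - r.top };
});

function frame(): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.drawImage(sprite, pos.x, pos.y); // the blit
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```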

Angry Birds' success is a strong proof of concept for the idea that "lowest-common-denominator user experience" can still be wildly successful.

10 years from now, competition among platform developers will have pushed that "LCDUX" to be something completely different from what it is now, but it will always be financially attractive to be able to write your awesome design just once -- and make it show up everywhere that makes sense.

Respectfully, all you just said here was "different platforms have different capabilities and therefore should all have different apps."

If you're talking about a single app that you already know you want to write (as in the original post), then you could cheerfully wish to be able to write that app once and get it everywhere.

But you cannot. Not because that's a "good thing" but because platform developers try to lock software developers in, forcing them to overcome heavy API inertia.

HTML5 was what Apple pretended would match Flash -- but they know as well as anyone that HTML5 won't match a native ObjC app. Nor a native Android Java app. Steve Jobs was blowing his normal smoke into everyone's collective orifices.

But neither the iPhone nor the Android platform is different enough from the other to validate any "need" for their API stacks to be utterly different. I'm sure Apple and Google would want you to believe so... but really they're not.

So if you want to go write the next "Angry Birds" you get to hire 3x as many developers as you would if there were a proper metaware for the standardized platforms. And you're not going to give one fig for any of the specialized features on whatever particular platform you happen to be developing upon.

Or you wait until you hit it big on one platform and then spend the money to port it later. That's certainly what Apple wants, since now they're the gorilla of the marketplace. That's what they learned from Microsoft in the 90s.
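
To illustrate the "metaware" being wished for here, a hypothetical sketch: the game is written once against a tiny platform interface, and only a thin backend differs per platform. Every name in it is invented.

```typescript
// The game talks only to this interface.
interface Platform {
  drawSprite(name: string, x: number, y: number): void;
  onTap(handler: (x: number, y: number) => void): void;
}

// The "awesome design", written exactly once.
function runGame(p: Platform): void {
  let x = 20;
  p.onTap((tapX) => { x = tapX; });
  setInterval(() => p.drawSprite("bird", x, 100), 16);
}

// One thin backend per platform is all that would need rewriting.
const webBackend: Platform = {
  drawSprite: (name, x, y) => { /* canvas drawImage(...) goes here */ },
  onTap: (h) => document.addEventListener("mousedown", (e) => h(e.clientX, e.clientY)),
};
runGame(webBackend);
// An iOS or Android backend would implement the same two calls in
// Objective-C or Java -- separate code, same design.
```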

As an iOS developer now responsible for rolling out cross-platform HTML5 solutions, I can confirm that Apple did all they could to kill Flash, as doing so benefited the iOS ecosystem.

Flash failed to become an industry standard on portable devices because the most powerful platform with the best conversion rates rejected it outright. Where once developers could have used Flash as a means to unify the costs of developing across a broad range of devices, as they did with browsers, the popularity of iOS and its lack of Flash simply created a niche market unto itself.

Developers have been forced to write native applications - a costly and time-consuming move - that 'fit' the Apple standard suite, extending the perceived depth of Jobs's walled garden via familiarity and strict adherence to human/user-interaction guidelines.

HTML5 was not, and still isn't, a viable replacement for Flash. We're still a year or two away from mature frameworks that are easy to use, performant on the majority of end-user hardware and, most importantly, cross-platform. With HTML5 becoming a 'living spec', there may never come a time when a developer can target a feature with a good degree of confidence that the market will support it at release.

When IE was the lone dominant browser I wrote one 'IE-only' implementation per feature. Now I have to write IE, Firefox, Opera, WebKit (iOS) and WebKit (Android) implementations/fixes, plus complicated abstractions that permit some code reuse between them. A Flash-less world is fast becoming a costly joke rather than the tyranny-free evolutionary programming world of the future.
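
As one concrete example of those per-engine fixes, here is roughly what the abstraction layer looked like for a single feature circa 2012: requestAnimationFrame shipped under a different vendor prefix in each engine, so shared code needed a small wrapper per feature before it could even draw a frame. The prefixed names below are the ones period shims actually probed.

```typescript
// Pick whichever prefixed version this engine ships, falling back to a timer.
const rafImpl: (cb: FrameRequestCallback) => number =
  window.requestAnimationFrame ||
  (window as any).webkitRequestAnimationFrame ||  // Safari, Chrome, Android
  (window as any).mozRequestAnimationFrame ||     // Firefox
  (window as any).oRequestAnimationFrame ||       // Opera (probed by shims)
  (window as any).msRequestAnimationFrame ||      // IE10
  ((cb: FrameRequestCallback) => window.setTimeout(() => cb(Date.now()), 16));
const raf = rafImpl.bind(window);

// Shared game code can now schedule frames without caring which engine
// it is running on.
raf(function loop() {
  // ...draw one frame...
  raf(loop);
});
```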

"It implies that there is a way to have a platform that can excel without the competition from other platforms, which I don't believe we've seen in the modern era of computing."

No, sir, and I apologise for giving you any reason to make such an inference.

More specifically, the needs of platform developers differ strongly from those of software developers. Platform developers are certainly going to maximize value by standing out from their competitors on the feature tree.

Software developers can maximize value either by diving completely into a single platform and using every last competitive feature of that platform, **or** by writing to the lowest common denominator of the available platforms and competing on the basis of a strong internal design.

Wii or Kinect programming would be the former. "Angry Birds" would be the latter. Hell, they even ported Angry Birds to the oxygen-starved Symbian platform.

I spent most of my money-making efforts in the '90s on the software side of computer games, and back when you could write everything in C++, I helped curate a cross-platform engine that would let us write an app once and run it on both Windows and Mac.

My experience (which may be different these days) is that most games software development is still the latter -- design-driven -- rather than the former -- platform-driven.

Even though you have to write that design bespoke to each platform upon which you wish to compete.

I think that you are fixating on allowing developers to make as much money with the least effort, and disregarding that in the process future innovation would be undermined.

Granted that a lot of the internals (and externals) of systems are arbitrary and not necessarily the most efficient or sensible, but an externally imposed constraint to a common base would force innovative ideas to conform to a base that hadn’t foreseen the new ways of doing things. If the BlackBerry way of doing things had become entrenched then the iPhone couldn’t have radically shifted to a new paradigm. Nor could experiments in different interfaces undermine the dominance of the keyboard.

Red tape and bureaucracy are the cornerstones of entrenching the status quo, and committees inherently lag innovation, not promote it. Flash itself was a response to the inherent constraints in HTML, and it *forced* change in HTML, and by your own admission HTML *still* hasn’t caught up. Granted that if internet applications had been restricted to HTML then porting to other platforms would be trivial, but those applications would be far less sophisticated than they are today.

"HTML5 was what Apple pretended would match flash": For less than a year, I believe. By 2009, many tens of thousands of iPhone apps were available, but it remained unclear whether Flash would remain a valid multi-platform tool and whether Jobs would cave on the issue (which, at the time, didn't seem as much Adobe's responsibility as it did later).

Adobe also offered a way to compile Flash into standalone native iOS apps (which Apple at first banned and later permitted), an approach that I presume the Glitch developers found unacceptable.

"neither the iPhone nor the Android platforms are different enough from each other to validate any "need" for their API stacks": That is a fascinating contention. It implies that there is a way to have a platform that can excel without the competition from other platforms, which I don't believe we've seen in the modern era of computing.

I can't envision a market environment (including the presence of no-cost alternatives like Linux) in which a single platform could meet the needs of competing interests, even if those were entirely about control rather than about user experience.

Programming for multiple platforms and languages is a given for software developers.

Console gaming platforms are no different. There’s a radical difference between programming for a Wii and for an Xbox with Kinect. They have different mechanics and the type of gaming is different. As such, a common codebase is something you *don’t* want. Even if a game can be ported from one to the next, if it is done without understanding or accommodating those differences then the port will be useless.

For PC applications you can program for Unix, for Windows or for Mac OS, plus a host of real-time OSes and languages. Not only does each have a different look and feel, but the expectations in terms of performance, security and application footprint can be radically different. Again, a common codebase is not sensible here. Any business model must know which platforms and approaches are key for its client base and focus on those.

Mobile platforms have the same issues. Touch screens are becoming commonplace, but not all platforms have them. Some platforms are specifically designed for economy and focus on texting and email, while others are full-featured with sophisticated virtual interfaces. You can’t have a common codebase here either. Granted that different development languages just add to the problem, but as you noted, those languages are tied to specific platform lines, and the developer must choose which is best for their business model and client base.

All of these differences reflect differences in client needs and the continual advances due to innovation. And this is a good thing. A static, common base would stifle innovation and force customers to make do with bloated or distorted messes that don’t really meet their needs. It is clearly harder for developers to create applications across all platforms, but the flip side is that it allows developers to capture niche markets and for new entrants to compete with long-established applications.

In sum, accommodating the needs and specifics of any given platform is the core job of developers. They shouldn’t be complaining about it; they should be embracing it, for it allows them to differentiate themselves in the marketplace.

Your analysis is spot on. Glitch's problem was perhaps one of timing, rather than a sign that all games lacking a mobile component must fail. For the sort of game Glitch is, it needed continuous availability across platforms, and Flash was expected to be the glue.