I must agree it seemed to work somewhat nicely, but people are taking this cloud computing thing way too far. I want to run my stuff on my own box and not somewhere else. Aside from privacy concerns, it doesn't make much sense to run games over an internet line. There are lag issues, bandwidth issues, connectivity issues and latency issues. Sometimes the old model works better than the new 'cool' model.

However, the video doesn't tell us anything about the lag when used over the internet. The server might be sitting right next to him. I don't really think it can stream stable, fullscreen 1680x1050 video at 30-60 fps over the internet and still be as responsive to mouse and keyboard as playing on your own computer. Hell, that's practically impossible even on a LAN; basic remote desktop or X sessions are somewhat laggy over a LAN.

No one says you can't still run things locally if you want to. Why do people see something that isn't a good fit for them, and immediately think it's an either/or thing? This whole system is for people who want to play on the go, who don't like to install stuff, who like the convenience, or who are new to a game and want to try it out. It is an expansion to the modern gaming experience, not a replacement, and it's a very positive thing.

If they release it exclusively, then choose another game that plays locally. The market will provide, if there is demand. The real complicating factor is piracy, which is the main reason publishers pulled back from the PC in the first place.

As for the rest of your arguments, I share your concerns for the present, but ultimately it boils down to "640k should be enough for anybody" - stuff will only get faster, pipes bigger, etc. These are early days.

If they release it exclusively, then choose another game that plays locally.

Do you have some tips on how I can convince a family member who has specified a specific title on a wish list to want a different game instead? If one really wants to play Halo 3, for instance, Metroid Prime 3 isn't a close enough substitute.

ultimately it boils down to "640k should be enough for anybody" - stuff will only get faster

Not necessarily. If I'm in Chicago, and their servers are in Virginia and California, the speed of light establishes a lower bound on the latency between a keypress and its reflection on the client.

pipes bigger, etc. These are early days.

The reality of the non-market for Internet access in the United States i

Well, the speed of light is pretty fast...like, for all practical purposes instantaneous for the distances we're talking about.

Twitch games, such as Tetris Shirase [youtube.com], can be totally ruined with even 33 ms (two frames at 60 Hz) of additional lag. A round trip from Chicago to LA and back is 5600 km. At the speed of light, that takes 18.7 ms, not even counting router delays.
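The arithmetic above is easy to check in a few lines. Note that the vacuum figure is the absolute floor; light in optical fiber travels roughly a third slower (the 1.47 refractive index here is a typical assumed value, not something from the video):

```python
# Back-of-envelope check of the latency floor quoted above.
C_VACUUM_KM_PER_S = 299_792  # speed of light in vacuum, km/s
FIBER_INDEX = 1.47           # typical refractive index of optical fiber (assumption)

def round_trip_ms(km, medium_speed=C_VACUUM_KM_PER_S):
    """Minimum travel time over a given distance, ignoring router delays."""
    return km / medium_speed * 1000

chicago_la = 5600  # round-trip distance in km, as quoted above
print(f"vacuum: {round_trip_ms(chicago_la):.1f} ms")  # 18.7 ms
print(f"fiber:  {round_trip_ms(chicago_la, C_VACUUM_KM_PER_S / FIBER_INDEX):.1f} ms")
```

So even before counting a single router hop, the fiber round trip alone is most of the two-frame budget the parent describes.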

It's the gateway to DRM and to consumers having less power over the items they buy.

That is in no way a positive thing, unless you're a gamemaker.

The day I am unable to buy a box and own a local copy of the game software is the day I stop gaming, period.

The greed of these people knows no bounds. I'm sure book publishers would love to stop second-hand sales with some bullshit cloud-computing solution too (a solution, mind you, to a problem that doesn't exist).

I don't think lag is the problem per se. You're already exposed to it when playing multiplayer games, and it is not the end of the world.

Just think of the required bandwidth, though. I play my games at 1900x1200, so using 30 fps and 3 bytes per pixel (granted, an arbitrary assumption) comes out to 205,200,000 bytes/second. I don't know about you guys, but that is slightly faster than my current internet connection. You could use video compression, but the requirements for that both for the cloud and f
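The figure above checks out; here is the same estimate spelled out, using the parent's own assumptions (1900x1200, 30 fps, 24-bit color, no compression):

```python
# Rough uncompressed-video bandwidth estimate, matching the assumptions above.
width, height = 1900, 1200   # resolution quoted above
fps = 30
bytes_per_pixel = 3          # 24-bit color, as assumed above

bytes_per_second = width * height * bytes_per_pixel * fps
megabits_per_second = bytes_per_second * 8 / 1_000_000

print(bytes_per_second)      # 205,200,000 bytes/s
print(megabits_per_second)   # ~1641.6 Mbit/s uncompressed
```

In other words, without compression you would need a pipe three orders of magnitude fatter than the sub-1 Mb/s stream the demo claims to use.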

You're forgetting here that multiplayer games transfer significantly less data: pretty much only player movements/positions and some other small pieces of information. Besides that, pings usually range from 50 ms (own country) to 200-400 ms (some other country near you). If you live in the USA, replace country with state, though interstate connections are probably better than international ones. Now imagine moving your mouse in the game to look around. How is that 50-400 ms lag working for you now? Sure, bandwidth and speeds can get better

... pings usually range from 50 ms (own country) to 200-400 ms (some other country near you). If you live in the USA, replace country with state, though interstate connections are probably better than international ones. Now imagine moving your mouse in the game to look around. How is that 50-400 ms lag working for you now? Sure, bandwidth and speeds can get better, but latency is harder.

I thought about that as well after the fact, and I agree with you. You can add that to the list of issues. It only strengthens my argument that high-performance 3D gaming in the cloud is not an option for the foreseeable future (i.e., the next 10-20+ years). Other types of gaming, e.g. Civ-like 2D games, might be a different story.

Perry alludes to the distance, indicating that he's further from the servers than he's ever been. We don't know if that's 300 miles or 10 feet. He goes on and on about the importance of latency in gaming, so I think they are at least properly sensitive to getting that aspect right.

Even more significantly, he's looking for closed beta testers, meaning he's actually ready to show the tech to the public in some form. Fingers crossed!

And just imagine the bill when your kids go over the bandwidth cap! Seriously, unless all these "cloud computing" outfits are going to start buying out the telcos and running fiber everywhere, these kinds of things will be non-starters.

The places where you have a telco/cable duopoly (which is most places in the USA) simply haven't been building out the backbone they need or investing in infrastructure, and instead took all those tax breaks we gave them for nationwide broadband and all the money we paid for serv

They'll partner with ISPs, so you're paying a subscription to access the games on a server located at your ISP's premises. The data you use will be part and parcel of your subscription to the service. It won't be viable any other way. Think more in terms of GameTap, with increased fees for access to newer games. It's not going to put a big dent in PC gaming whatever they do. This is for the millions of set-top TV boxes that are going to be produced to take advantage of the features, with the compression

He needs to get back to making games. The dude ported some great games back in the day: Smash TV, Cool Spot, Aladdin. Then he started Shiny, which put out some real classics: Earthworm Jim, Wild 9, MDK. Then Shiny made two crappy Matrix games and did nothing else of note until getting bought out.

Perry, quit dicking around on the business side of the industry and make some fucking games again!

Well, considering that the tier-1 internet servers would still have to be able to run the games, meaning Windows, and to render the graphics, meaning probably Nvidia workstation hardware, and that neither company is moronic in its licensing, I'd say it'd be a profit bonanza for those two companies. Microsoft would probably gladly give up a portion of the desktop market to get the enterprise licensing this sort of monstrosity would require.

What exactly is the point of having games run in The Cloud, other than the wish to remain buzzword compliant? It seems like such a waste of network resources, and a pointless centralization of computing resources as well.

Frankly, I can't wait for the "cloud computing" bubble to finally burst.

You forgot the "get off my lawn". I don't see "cloud" computing leaving anytime soon, or ever, because it makes the most sense from both a business and a consumer perspective.

1. Company X develops "product" Y, which remains on their servers.
2. Charge people for an account to gain access.
3. ???
4. Profit!

From the consumer perspective: cheaper PCs and what "appears" to be cheaper software (pay-as-you-go, "low monthly fee", etc.), access to all your shit from anywhere (via some universal wireless not yet develope

I heartily agree. But only on the grounds that it's not actually possible for something to leave before it arrives.

We've been bombarded with hype about the "cloud" for a while now, but where's the beef? So far the only application that used to be run on desktops and has really taken off online is webmail. And that's ancient technology. I'd been using webmail for years before the first dotcom bubble.

The point is that you don't have to spend lots of money on expensive hardware; a simple web browser with internet connectivity (over a fat pipe!) is all it takes. That saves you a large amount of cash.

It also opens up the possibility of your games and applications staying with you wherever you go. You could bring up a Gaikai iPhone app and play WoW during lunch, or on the train, or wherever. Once you get home, you can fire it up on your big-screen TV via the media center's web browser.

What exactly is the point of having games run in The Cloud, other than the wish to remain buzzword compliant?

Gamers spend serious cash on hardware. AIUI, a typical gamer will put down an average of $500-1000 per annum on hardware, just so that they can continue running the latest and greatest games. That hardware sits unused most of the time.

The idea of cloud gaming is to put some fraction of that money into shared hardware instead. You'll spend maybe $180 per annum renting access to the hardware, but y

Casual gamers can try the game instantly, at work, in the library, anywhere.

Except they won't actually be able to do that at all. Workplaces will block these services. Libraries will block these services. And if you can afford to regularly stream HD video to a mobile device, you can probably afford a Playstation.

Try Before You Buy is a really nice model.

Except this isn't "try before you buy". It's "pay per play". Remember arcades? Well, this won't be "insert coin", it'll be "insert credit card". And

- Why no mention of what connection he is on? Or, for that matter, of where the server is located? (besides the vague "Oh, I've never been as far from a server as I am at the moment!")
- Where's the fullscreen? I can see how it would be quite hard to properly stream current screen sizes (such as 1680x1050, or even 1280x1024)

Other than that, I noticed a few odd things, such as:
- When playing Mario Kart 64, at the end he suddenly crashes into a wall, which he says is because "he hasn't been playing the game for quite some time". That seems quite odd, and looked more like the actual command not coming through properly.
- How come he's allowed to have MK64 running on an emulator anyway? I thought that was illegal (even if you own the game yourself), though I might be wrong on that.

While I like the idea (though I can't see myself using it in the coming decades), I think the price has to be really low for people to actually use it. I can definitely see a use for it in some new sort of console (Phantom, anyone? :D), where one would use a subscription service to get access to a big library.
Still, I've always wondered how this would scale if it got really popular: I can't imagine a single computer being able to stream multiple high-end games to multiple clients.

I remain skeptical also. The resolution is small, for one thing. Is there a way to get a larger resolution for these games? I'm used to games taking up my entire screen.

Also, what will this service cost? I mean, is it aimed at people who can't get proper hardware for these games? If you can't afford the hardware, why would you be able to afford this service? And what happens when the service goes down?

(a) Making of Additional Copy or Adaptation by Owner of Copy. -- Notwithstanding the provisions of section 106, it is not an infringement for the owner of a copy of a computer program to make or authorize the making of another copy or adaptation of that computer program provided:

I'm not sure the law requires that you use your own ROM dumper. As I read it it requires you to own "a copy" which authorizes you to make "another copy" of "that computer program". So by my reading if you own "a copy (of Mario Kart 64)" the law authorizes you to make "another copy (of Mario Kart 64)". A ROM of Mario Kart 64 downloaded from the internet is "another copy (of Mario Kart 64)".

Of course, any copy made of the ROM may only be used for running the program on a computer. Since making further copi

Indeed. The person distributing the ROM online is likely violating the law, but the downloader probably is not. Some theories hold that downloading is technically legal even if you do not own the game. If you do own the game, then it is very likely not an issue.

Making the ROM yourself, or having somebody make it for you (paying them to dump your game cart), is virtually indisputably legal. (Although for some systems, like the DS, this requires circumventing a copy-control mechanism.) Nintendo has some sor

During the video he says he's using less than 1 Mb/s of connection speed, though he doesn't say how much he actually has available. On the FAQ page he says the server is 800 miles away, but that he gets a 21 ms ping.

It also seems to me that fullscreen means more bandwidth. All that's going his way is the video, but streaming full-screen video is obviously more bandwidth-intensive than streaming at a lower resolution.

He crashes into the wall because he's stopped playing, as far as I can tell. I suspect he's using a gamepad or something, because playing Mario Kart with a mouse just isn't feasible. I assume he set the gamepad down, and a second or so later we see the mouse going to the "close" button. And I believe it's only downloading ROMs that's illegal; the emulator itself and using your own ROMs should both be fair use.

Honestly, I like the idea of remote-running programs; I'd assumed that's where things would go as soon as I heard of people actually buying netbooks. I think it's something I'll use extensively eventually. Of course, I completely reject the idea of letting someone else host them for me; I suspect people will eventually have home servers plus netbooks or something like that. So I won't be using *this* service, but I don't doubt that I'll eventually be running something like it. Also, I'm certain I won't be running Photoshop inside Flash inside Firefox. If this sort of thing gets popular, there will be a custom application for it.

(3) Data travel distance is around 800 miles (round trip) on this demo as that's where the server is. I get a 21 millisecond ping on that route. My final delay will be 10 milliseconds as I just added a server in Irvine California yesterday, but it's not added to our grid yet. (So this demo is twice the delay I personally would get, the good news is I don't notice it anyway.)

Regarding the bandwidth: take a look at the left side of the video, it shows the needed info.

One thing many people missed: almost at the end of the demo, he shows Photoshop CS4 and then moves the windows. Take a good look at the cursor, especially when it is outside the window: it shows something that might be familiar to any Linux user. :)

I'm certainly not signing up for anything that absolutely requires an active high bandwidth connection to play single player offline games until companies like Comcast have been brought to heel.

They're already complaining about those pesky high-bandwidth users, they aren't upgrading their infrastructure, and they're charging fees for just about anything they can think of. Now wait until their metered plans really take off, and then tell me about gaming in the cloud. Any savings in hardware cost from this setup will be eaten by increasing ISP charges.

Besides, really, aren't we reaching the point where mandatory PC upgrades for games are much farther apart, really mitigating that factor?

Comcast won't be brought to heel. US internet consumers will be forced to accept bandwidth caps, which means Comcast won't need to be assholes to make a profit. All-you-can-eat doesn't work anymore, but US consumers won't let ISPs end it, so they have to be even more evil than usual.

This could be helped by making deals with the ISPs, where the gaming datacenter is peered directly with the ISP's core network, and in exchange the ISP doesn't meter (or gives a higher limit for) the data going across this peering.

So I look at this and think: some machine, somewhere, has to be running the code. When you play Flash games, all the work is done on your local machine. When I play WoW, it pegs a 2 GHz processor to the extent that it noticeably slows other things running in the background. When you do complex work in Photoshop, your limitation is often the amount of memory in the machine running it. While this is awesome for streaming content from remote servers, I really question the ability to provide t

If standard multiplayer gaming, which involves far less data being sent back and forth between players, still doesn't work properly and often ends in disconnections, lag and desync issues, I seriously doubt they'll fare any better, barring servers in every major city in the world with fiber links between them. At that point, the price tag would probably be so high that subscribing would become pointless. Sure, right now it's looking good with a handful of players all set up in perfect conditi

While some claim that bandwidth isn't up to snuff, that maintaining enough servers to support a massive number of gamers is not feasible, etc., this will all change in a short amount of time. This type of service is on the cusp of being a reality, and it will change computing forever. If a video game can be served through a remote terminal, then ANY application could be.

Eventually more and more apps will be available only in the cloud, and hardware costs will go down; 20 years later, you'll find that only dumb terminals exist in the hands of the average consumer. As wireless connectivity matures, even phones will just be dumb terminals. You'll never need to upgrade your phone unless you want a bigger screen or a different input method. The applications you RENT will be unpiratable, because there is no publicly available platform to run them on, and you can use the same app across your desktop and phone, but with modified interfaces, as the device would report its capabilities to the cloud and the cloud would adjust the interface appropriately.

How does Linux fit into all of this? Will there be a new ideological movement to keep processing power in the hands of consumers? How do you install Linux on a dumb terminal?

I think Linux will be the favoured OS of the dumb terminals. Since all OSes are going to do exactly the same thing (run clients for other stuff), users won't care about interface, familiarity or whatever else is holding back the "Linux revolution". They are going to care about price, and Linux is a clear winner there.

That may be true in the sense that the dumb terminals will be running an embeddable form of Linux with little capability. When virtually all computing power is in the cloud, in the hands of private corporations, who's going to be coding open-source apps for the cloud platform? There will probably be several platforms out there, but they will all probably work on something like the App Store model, where your rights are severely limited. Is that where we want the open-source community to flourish?

Optical routers will bring router delays down, eventually, so pings will be closer to their theoretical limits.

I don't think it's meant to be all-encompassing, but that's a typical internet misunderstanding. A ping of 50 ms or less would make enough games playable for those who don't want the dedicated hardware in their house (say, for that one game on another platform you want to try). This will be a nice niche service, just not a replacement.

The same technology underlying remote desktop and similar systems is what's at work here: send a frame buffer (raw pixel rectangles) down the wire after some compression.

Most of the compression relies on the idea that the picture doesn't change too much from frame to frame, so only data about what did change is sent. When that isn't true, as in the case of gaming, VNC clients stop working well. Certainly, they stop working well on a 1 Mbit connection.

What worries me is the 3D, as it would presumably have to be done in software. But then again:

It's fairly easy to get a 3D card to render to an off-screen buffer and then grab that buffer for encoding. My suspicion is that he has the graphics card piping data to a custom hardware video encoder (probably based on an off-the-shelf FPGA PCIe card, which are available for about $300-400 each) in order to reduce latency. Realtime video compression is hard, and he'll have to be doing everything he can to minimize latency.
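To see why every stage matters, here is a hypothetical end-to-end latency budget for a streamed frame. Every figure below is an assumption for illustration (only the 21 ms ping comes from the FAQ), not a measurement from the demo:

```python
# Hypothetical end-to-end latency budget for cloud gaming. All figures are
# illustrative assumptions, except the ping, which is split from the 21 ms
# quoted in the Gaikai FAQ.
budget_ms = {
    "input to server (network)":  10,  # roughly half the quoted 21 ms ping
    "game renders frame":         16,  # one frame at 60 Hz
    "capture + encode":           10,  # hardware encoder, assumed
    "server to client (network)": 10,
    "decode + display":           10,  # client decode plus a display refresh
}
total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:28s} {ms:3d} ms")
print(f"{'total':28s} {total:3d} ms")
```

Even with these generous assumptions the total lands around 56 ms, well past the 33 ms (two frames at 60 Hz) that an earlier comment says is enough to ruin a twitch game, which is why shaving encode latency with dedicated hardware would make sense.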

The ping requirements make this unfeasible. As more "casual" people get connections that trade latency away for bandwidth (such as wireless broadband, or just the plain crappy ISPs that 80% of the US has to deal with), I don't see how this could ever be usable except for a select few.

How about being able to stream your own games like this when you're on the go? Say I'm at a friend's place with a good net connection and a nice screen, but some crippled onboard video (not to mention I'm not going to bring my games and do the whole install/update/patch/configure thing). It would be nice to log into your own 'Gaikai' server and play your collection. Sell me this as a product instead of a service and I'm on board.

First off, from what I saw both these models (OnLive and Gaikai) seem to be based on you buying the games through them.

If so, what about games I already own? I own games on physical media (like Spore) that are already in the Gaikai catalog. I also own digital copies of games bought through Steam. Am I completely out of luck trying to use anything I already own with such a service?

What if I want to mod my game? It's incredibly common for people like WoW players to use addons for enhancing functions and f