Cloud-streamed gaming services like OnLive and PlayStation Now have yet to be made widely available outside North America and Europe, and high latency has made such services all but unplayable from far-flung locations like Australia, so they're geolocked anyway.

However, Microsoft Research seems to think that tyranny of distance might be a solvable problem, having recently outlined findings from a project they've called "DeLorean". It's essentially an advanced form of client prediction that, instead of just guessing a single most likely outcome for each user movement, calculates several.

In this paper, we present DeLorean, a speculative execution system for mobile cloud gaming that is able to mask up to 250ms of network latency. DeLorean produces speculative rendered frames of future possible outcomes, delivering them to the client one entire RTT ahead of time; clients perceive no latency. To achieve this, DeLorean combines: 1) future input prediction; 2) state space subsampling and time shifting; 3) misprediction compensation; and 4) bandwidth compression.
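The core idea from the abstract can be illustrated with a toy sketch. This is purely hypothetical code, not the paper's implementation; the predictor, function names, and the "repeat / stop / reverse" heuristic are all assumptions for illustration:

```python
# Toy sketch of speculative frame delivery. The server predicts the
# k most likely next inputs, renders a frame for each, and ships all
# of them one RTT early; the client displays the frame whose
# predicted input matches the real one, hiding the network delay.

def predict_inputs(history, k=3):
    # Hypothetical predictor: assume the player repeats their last
    # input, stops, or reverses. A real system would use a learned model.
    last = history[-1] if history else "idle"
    reverse = {"forward": "back", "back": "forward",
               "left": "right", "right": "left", "idle": "idle"}
    candidates = [last, "idle", reverse[last]]
    # De-duplicate while preserving order, then truncate to k.
    seen, out = set(), []
    for c in candidates:
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out[:k]

def render(state, inp):
    # Stand-in for the real renderer: just describe the frame.
    return f"frame(state={state}, input={inp})"

def server_tick(state, history):
    # Render one speculative frame per predicted input.
    return {inp: render(state, inp) for inp in predict_inputs(history)}

def client_display(speculative_frames, actual_input):
    # Hit: show the pre-rendered frame immediately (no perceived lag).
    # Miss: fall back and wait a full RTT (misprediction penalty).
    return speculative_frames.get(actual_input, "MISS: wait one RTT")

frames = server_tick(state=42, history=["forward"])
print(client_display(frames, "forward"))  # predicted input: instant frame
print(client_display(frames, "jump"))     # mispredicted: client stalls
```

The "bandwidth compression" item in the abstract exists precisely because shipping several rendered frames per tick instead of one multiplies the stream size.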

To evaluate the prediction and speculation techniques in DeLorean, we use two high quality, commercially-released games: a twitch-based first person shooter, Doom 3, and an action role playing game, Fable 3. Through user studies and performance benchmarks, we find that players overwhelmingly prefer DeLorean to traditional thin-client gaming where the network RTT is fully visible, and that DeLorean successfully mimics playing across a low-latency network.

Masking up to 250ms of latency? Surely, like usual client prediction, it won't be without side effects, but if it pans out and makes high-latency cloud gaming feasible, that would be particularly good news for Australian gamers.

They do mention that it becomes untenable when accounting for multiple players, which pretty much rules multiplayer out. Sounds like it would only really work in games where the software has a direct hook into the AI calculations and only has to do predictions for a single player.


Can someone explain to me when cloud gaming would ever be better than running it locally? I can understand why it is 100x better for the publishers, for obvious reasons. Cloud gaming as a consumer, though? I see no advantages over the existing client -> server model for multiplayer, and for single player? Why would I ever want cloud processing for single player?

Can someone explain to me when cloud gaming would ever be better than running it locally? I can understand why it is 100x better for the publishers, for obvious reasons. Cloud gaming as a consumer, though? I see no advantages over the existing client -> server model for multiplayer, and for single player? Why would I ever want cloud processing for single player?

Assuming you have 0 latency, you could have a barebones computer that probably never requires upgrading, and via cloud computing you could play any game you like.

Can someone explain to me when cloud gaming would ever be better than running it locally? I can understand why it is 100x better for the publishers, for obvious reasons. Cloud gaming as a consumer, though?

From a technical aspect, none. Clearly f*****g us over is worth enough money that these companies are willing to even BUY the gaming hardware for you to play on, so they can turn it off whenever, etc. Imagine if TF2, CS, Doom, or Quake ran that way. Those games would be pure history and the years of user content wouldn't even exist. A more aggressive example would be EA closing off BF3 24 hours after they release BF4. Why? Because f*** you, give us money.

It must be worth a lot of money for a second reason: why else would anyone at MS spend (most likely) millions on developing software to support this type of architecture? If anything, as a gamer, I'm alarmed by this. One day we mightn't even have games that run on our PCs anymore. Imagine it: now you need to download "EA Origin-Play" to stream BF5 to your computer, and there is no other way to play the game. "Her durr yeah but it's BF5" - OK, CS3 then, or Trine 3, or UT2020, insert your game here.

Assuming you have 0 latency, you could have a barebones computer that probably never requires upgrading, and via cloud computing you could play any game you like.

"Never needs upgrading" is a misnomer, and having 0 latency is also a misnomer. This would be a perfect line for a cloud gaming company to use, really.

It's a perfect (or nightmare? heh) world scenario, sure, but there are advantages and possibilities. I would love to be able to game anywhere with my Mac; wouldn't that be amazing?

You're taking the very pessimistic viewpoint as usual. Valve wouldn't really change TF2 (or change itself); you would still get the opportunity to make custom items (which you don't even use the TF2 game to create anyway).

A 0-latency cloud computer (f*** you and your misnomer, everything has latency, blah blah, yeah I get it) would be amazing, as a desktop could potentially have the capabilities of a supercomputer. The possibilities for non-gaming uses are endless, from Folding@home-style computing to running algorithms straight from your desktop.

You're taking the very pessimistic viewpoint as usual. Valve wouldn't really change TF2 (or change itself); you would still get the opportunity to make custom items (which you don't even use the TF2 game to create anyway).

You're taking the very simplistic view as usual. I'm saying that if we went back in time and this stuff was available back then, we most likely wouldn't have TF2 (and many other classics) now. This is a boiling frog situation; some of us are much more sensitive to the change in water temperature and are seeing these industry issues early on. I also don't see why you would want to make custom items for a game which, in the example, wouldn't exist anymore. This is telling me that you haven't even comprehended what I originally said.

A 0-latency cloud computer (f*** you and your misnomer, everything has latency, blah blah, yeah I get it) would be amazing, as a desktop could potentially have the capabilities of a supercomputer. The possibilities for non-gaming uses are endless, from Folding@home-style computing to running algorithms straight from your desktop.

Pipe dream. Light and electricity take time to get from one point to another, and unless you can break the laws of physics, you have nothing to add to that besides "wouldn't it be cool" pipe dreams. At least Microsoft are smart enough to realise this before investing money into getting around an issue which doesn't (wouldn't) exist in your perfect world. As for supercomputing at home, we already have this using streaming and remote access. We can already run Folding@home on a server, and we can already do heavy calculations remotely. This is all about pre-caching a ton of 3D information that needs to be sent in real time. It's something nobody has done before, and a bold claim.

Stop being self centered, man.

Just keeping it real, man. "I" didn't create physics, "I'm" not the physics police officer, so what part of my post is self-centered when it's based around pure facts?

We should revisit this in 12 months' time when cloud gaming is no more revolutionised than it currently is, just for laughs, you know. What Microsoft is (saying they are) going to do will revolutionise not only gaming, but the entire remote/cloud computing industry as a whole. It's a bold claim they've made and it'll be interesting to see what they come out with. Thinking it's easier to just avoid talking tech on AG sometimes, lel.

It works astoundingly well: In testing with Doom 3 and Fable 3, DeLorean was able to mask a round-trip time of up to 250 milliseconds, a lag time that would usually make for an utterly unplayable game. Players weren’t able to discern a difference between local gameplay and the DeLorean-powered cloud system.

There’s a drawback, though: DeLorean’s data-heavy setup can send nearly five times as much information as a simple real-time gaming stream. That kind of load would require a seriously beefy connection.

How much bandwidth does a bit-perfect stream cost at 1080p? Around 10 Mbps? So ~50 Mbps there, and that's still compressed. What about 1440p? 4K? It raises more questions than it answers, and I think it's valid to be wary of the claims until it's available to the public and gets reviewed by trusted reviewers. Look at Mantle, look at FreeSync: there's a ton of overhyped crap that simply doesn't do as claimed out there these days.
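The back-of-envelope numbers above can be sketched out. Note the 10 Mbps 1080p baseline is an illustrative guess, and scaling bitrate linearly with pixel count is a crude approximation; real codec bitrates vary widely:

```python
# Rough bandwidth estimate for DeLorean's reported ~5x stream overhead.
BASELINE_1080P_MBPS = 10  # assumed bitrate for a plain 1080p game stream
OVERHEAD = 5              # ~5x data cost of shipping speculative frames

pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, px in pixels.items():
    # Scale the baseline bitrate by pixel count, then apply the overhead.
    scale = px / pixels["1080p"]
    mbps = BASELINE_1080P_MBPS * scale * OVERHEAD
    print(f"{name}: ~{mbps:.0f} Mbps")
```

Under these assumptions, 1080p lands at ~50 Mbps, 1440p at roughly 89 Mbps, and 4K at around 200 Mbps, which is well beyond typical residential connections of the time.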

Please though, indulge me with more pipe dreams. You might just convince me that history has it wrong. Your original suggestion that the latency on a stream can be fixed by adding 5 minutes more latency tells me that you aren't on the same page at all.

Argumentative debate would be a better choice of words, and whilst ph33x points out the most extreme "what if" scenario, it is something that could be an option for publishers. EA are doing their subscription service, and who would have thought? The word 'expansion' was a thing in the past; DLC is all the rage now. Can you imagine 10 years ago if all the Half-Life mods were paid DLC? Would CS, TFC, etc. be where they are today?

The only benefit I can think of for cloud computing / streaming would be being able to stream games to my Surface tablet while I'm at university.

On another note, the same thing could also be said for Steam. I have 177 games on Steam; what happens if Valve one day decide to terminate the service? Would cloud computing be any different from how Steam is being used today? It just means I don't need to download and install.

Weakling technical dunces these days. So hard to have any kind of robust gaming tech discussions in this forum sadly. Oh well, gg for what it's worth.

I did like your post though, Dan, as mentioning no MP support does indicate the rest of your post is most likely on the ball in regards to the game being segmented across the network instead of merely manipulating the inputs and streaming the outputs. Maybe it has something to do with that high-speed/timelapse 3D stuff MS was showing off just a few weeks ago. I wouldn't think of Doom as being the most third-party-friendly game to test with either.

Still, I'm hyped to see the outcome of this. Here's also hoping that MS gives it all it's got and it doesn't become a failure, mainly so it doesn't scare off someone else who thinks they are close to breaking the ice on cloud gaming latency issues. It's mainly interesting to see if it can be done fully and completely, because I doubt I'll reap most of the advantages of cloud gaming anyway.

So it's as I thought: there are really no advantages for the consumer. I wouldn't say that having dumb terminals is a huge advantage; hardware to run games these days is so cheap, and we're fast approaching the point where it's not hardware but rather the cost of development that is limiting graphical fidelity.

Also, I don't think ph33x is overly cynical about the possible direction they'd take with it. You only have to look at what has happened in the last 5-10 years. The evidence is all there that, given the opportunity, they will fleece you at every turn. If they had the ability to stop or shut down "old" titles and force you to upgrade or pay more money for the latest and greatest, you can bet they'd find a way to spin it and do it.

Cloud computing is the next mobile-app-like service. It isn't supposed to compete with locally rendered hardcore FPSes.

It will, however, be competing with mobile-based games: instead of requiring your mobile to smash out super 4K HD 3D Farmville, it can get it from the cloud server. Those games don't need high-precision multiplayer actors either; they can just grab that info from the server too.
You could imagine the battery time saved by not having to process and render all that stuff.