
In practically every one of its major press conferences since last year's GPU Technology Conference, Nvidia has reminded us that it wants to virtualize the graphics processor. The company wants to take it out of the computer on your lap or on your desk and put it into a server somewhere without you noticing the difference. It introduced the concept at GTC 2012. Then, over the course of the next year, Nvidia unveiled the actual graphics cards that would enable this tech, started selling them to partners, and stuck them in Nvidia Grid-branded servers aimed at both gamers and businesses.

The difference between Nvidia's initiatives and more traditional virtualization is that the company's products support relatively few users for the hardware they require. The Grid gaming server supports 24 users per server box and the Visual Computing Appliance (VCA) only supports eight or 16 depending on the model. Most virtualization is all about dynamically allocating resources like CPU cycles and RAM to give as many users as possible the bare minimum amount of power they need. Instead, Nvidia's is about providing a fixed number of users with a pretty specific amount of computing power, thus attempting to recreate the experience of using a regular old computer.

There are situations where this makes sense. Given the cost of buying and maintaining workstation hardware, Nvidia's argument for the VCA seems more or less convincing. But I'm slightly less optimistic about the prospects of the Grid gaming server, or any cloud gaming service, really—call it leftover skepticism from OnLive's meltdown last year.

That doesn't mean the technology behind cloud gaming servers is bad, or that the problem they want to solve—that not all devices have high-end graphics hardware in them—doesn't exist. In fact, after logging some extended playtime with an Nvidia Project Shield console at this year's GTC, I'm convinced cloud gaming has a future. I just think it looks less like OnLive or the Grid gaming server and more like what the Shield does—streaming the games you already own from a gaming PC to your other, less powerful devices. Let's walk through the biggest issues with a service like OnLive or something based on Grid, then consider how moving the same technology onto your home network solves them.

Game library

Most of the problems we'll talk about here are going to be technical, but this one's all about business. No matter what technology you're using, a company like OnLive or one of Nvidia's Grid customers is going to have to cut deals with publishers and independent game developers to make those games available on the service. While the way Grid is implemented suggests that developers won't have to do any extra work to make games compatible, that won't be the case for all implementations. This whole process takes time and money, and in the end you're not guaranteed to get all of the games you want in any one service.

Streaming from your local Steam library neatly circumvents this limitation. Not only are all the games you own already paid for and available, but future games from most developers and publishers are practically guaranteed to come to Steam. There are, of course, notable exceptions—but they're exceptions and not the rule. Any new cloud gaming service would need a critical mass of customers to support a big game library, but there's no easy way to reach that critical mass without a big game library in the first place.

Latency and responsiveness

No matter how fast your Internet connection is or how near you are to the server room actually rendering your game, Grid just won't be 100 percent as smooth as local rendering all of the time. There are too many variables in play—your own home network, your ISP's equipment and wiring, the load on your ISP's network, your distance from the server, and any number of other issues that could impact your experience negatively. Unlike something like Netflix, which can buffer video to mask these issues, cloud gaming needs to be streaming in real time all the time.

Now, move the computer doing the rendering from someone else's server room into your own home. The only limiting factor is going to be your own home network. The most unpredictable elements have been removed, and you'll have a much faster connection between your PC and your tablet than you'd be able to get through most Internet service providers.
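
If you want to put numbers to that, a quick and unscientific way is to time small packets around your network yourself. Here's a minimal sketch in Python; the address, port, and sample count are placeholders, and a UDP echo is a crude stand-in for a real game stream:

```python
# Minimal UDP round-trip timer: run echo_server() on the gaming PC,
# then measure_rtt() from the client device. Addresses are placeholders.
import socket
import statistics
import time

def echo_server(port=9999):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, addr = sock.recvfrom(64)
        sock.sendto(data, addr)  # bounce the packet straight back

def measure_rtt(host, port=9999, samples=50):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for _ in range(samples):
        start = time.perf_counter()
        sock.sendto(b"ping", (host, port))
        try:
            sock.recvfrom(64)
        except socket.timeout:
            continue  # dropped packet; skip this sample
        rtts.append((time.perf_counter() - start) * 1000.0)
    return rtts

if __name__ == "__main__":
    rtts = measure_rtt("192.168.1.50")  # placeholder: your gaming PC's LAN address
    if not rtts:
        raise SystemExit("no replies; is the echo server running?")
    print(f"median RTT: {statistics.median(rtts):.1f} ms over {len(rtts)} samples")
```

On a wired LAN, the median round trip is typically a millisecond or two, plus a few more over Wi-Fi; a round trip through an ISP to a remote data center is usually tens of milliseconds before any rendering or encoding happens at all.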

Rendering performance

Each one of Nvidia's Grid servers can support up to two dozen users at a time, and each one of those users can only have so much GPU horsepower assigned to them at once. Measured in FLOPS, the amount is roughly equivalent to one of the company's mid-range GeForce GT 640 cards. Especially as games get better-looking and this technology moves beyond the 1366×768 display of the Shield console (and it seems reasonable that it will), you're going to need more graphics horsepower to make these games look as good as they can.
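
For a rough sense of scale, the usual back-of-the-envelope shader math (CUDA cores × 2 ops per clock for fused multiply-add × clock speed, using public spec-sheet figures rather than Nvidia's actual provisioning numbers) puts a per-user Grid slice well below a contemporary high-end card:

```python
# Rough single-precision throughput: CUDA cores x 2 ops/clock (FMA) x clock.
def gflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

per_user = gflops(384, 0.90)    # GT 640 class (GK107): ~691 GFLOPS
high_end = gflops(1536, 1.006)  # GTX 680 (GK104): ~3090 GFLOPS
print(f"per Grid user: ~{per_user:.0f} GFLOPS")
print(f"local high end: ~{high_end:.0f} GFLOPS ({high_end / per_user:.1f}x)")
```

In other words, by this rough measure a Grid user gets something like a quarter of the shader throughput of a high-end desktop card from the same generation.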

Granted, turning all of the settings in a given game all the way up is more of a "nice to have" than a "need to have." But there are plenty of gamers who game on a PC precisely because they want to crank the settings up beyond what a laptop or game console is capable of.

Cost

The main problem with the Grid gaming server from the perspective of a potential service provider is that it just doesn't seem like it would scale particularly well. It might be difficult to turn a profit between the initial hardware investment (made all the larger by the relatively low number of simultaneous connections each box supports), the need to cut and maintain software licensing deals with publishers, and the space and power required to keep all of the servers up and running. Keeping streaming local passes that cost on to gamers, of course, but if your Grid-powered gaming service can't turn a profit and has to shut down, its customers are going to be left out in the cold anyway.

Not a perfect solution, but a more realistic one

Despite all the potential upside, there are still drawbacks to streaming stuff locally as opposed to through a server like Grid. The first is that you have to own and maintain your own gaming PC. This probably isn't too onerous a task for the people interested in this sort of thing, but you'll have to pay for the cost of the hardware, the games, and the power usage rather than offloading it to some third party.

The second issue is that it becomes more difficult to leave the home with your PC game library. Even if you run your own VPN server at home (or if you otherwise modify your network connection to allow access to your game "server" from outside of your network), you're likely to encounter some of the same connectivity and bandwidth issues you'd get with a Grid server.

What we need now is a chance to spend a few days streaming games from our PC to the Shield or a Shield-like tablet. The streaming functionality seemed to work well enough in Nvidia's demo, but real-world testing is always warranted. If it works as intended, you can keep your cloud gaming service to yourself—my "private cloud" will be more than good enough.


Andrew Cunningham
Andrew wrote and edited tech news and reviews at Ars Technica from 2012 to 2017, where he still occasionally freelances; he is currently a lead editor at Wirecutter. He also records a weekly book podcast called Overdue. Twitter: @AndrewWrites

64 Reader Comments

The second issue is that it becomes more difficult to leave the home with your PC game library.

I don't see that as much of an issue. Currently, I can't leave the house with my PC game library *at all* unless I feel like '90s LAN-partying it up. Maybe with this kind of a solution and a decent internet connection (i.e., a non-US ISP) that might be possible.

Interesting idea, and I look forward to seeing how this develops. Thin clients do have their uses, and this seems like a good application. The real kicker will be how much new nVidia-specific hardware I'll have to buy, especially if this requires a special tablet.

The thing I don't understand is why anyone thinks I would want to play PC games on a handheld controller with tiny screen instead of on my 24" eIPS display. Or play FPSes without a keyboard+mouse, for that matter.

I actually tried this a year ago. You don't need anything special to stream your viewport and feed back the controls; a basic RDP connection will do if you have enough bandwidth. If not, you can compress the image with x264, which will eat a little into your framerates, but that's not a problem if you have, say, six cores.

Anyway, it's not so great. Most games are not optimised for playing on a tablet, and as others have said, why would I play on a tablet if I can get a better experience on a TV, for example?

From a technical point of view, it really is stupid to stream the bandwidth-hungry video and have the controls local. The other way around, keeping that graphical horsepower plugged in and streaming just the input, makes more sense, so I can definitely see smartphones and tablets being used as controllers more and more (there's even a Chrome experiment for that, but that's too high-latency IMO).

Details? Would love to be able to do this from my desktop to my HTPC. I was trying with VLC for video but it wasn't working.
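
For those after the details requested above, here's one possible starting point with stock tools, offered as a sketch rather than a recommendation. It handles only the video half (input would need a separate channel, e.g. a Synergy-style tool or RDP alongside), it assumes a reasonably recent ffmpeg build, and the address, port, and frame rate are placeholders to tune:

```python
# Grab the desktop, encode with x264 tuned for low latency, and push an
# MPEG-TS stream over UDP to a client on the LAN. Windows capture shown;
# on Linux, use "-f", "x11grab", "-i", ":0.0" instead of gdigrab/desktop.
import subprocess

subprocess.run([
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "60", "-i", "desktop",   # screen capture
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-f", "mpegts", "udp://192.168.1.20:5000",              # placeholder client address
])
# On the receiving machine: ffplay -fflags nobuffer udp://0.0.0.0:5000
```

UDP with MPEG-TS keeps the pipeline from buffering the way an HTTP stream would, which is most of the battle for latency.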

Measured in FLOPS, the amount is roughly equivalent to one of the company's mid-range GeForce GT 640 cards.

So... why not just get one of those mid-range cards? I don't see the advantage. The space required by a graphics processor is dropping just like every other component. And while graphics require power, so does internet connectivity. This seems like it has almost -- but not quite -- reached the point of solving the problem caused by tablets which are almost -- but not quite -- achieving the power necessary for high-end games. At this rate, NVidia should have a perfect solution just in time for it to be totally unnecessary.

Neither technology offers any value or useful function. I don't see any way latency can be solved any time soon, so this whole ridiculous notion of streaming games is moot. The only way any of this is relevant is in the misbegotten marketing world where "value conscious" consumers can subscribe to a system that comes with a vast variety of games: a cable-TV subscription of gaming, where quantity shares an equal place with quality to justify its cost.

The comment about streaming to a tablet was hilarious. I'm in my home, where I can house as many large-screen displays as I like, but instead I choose to use a device in no way optimized for gaming and with the smallest display possible. And I'm gonna watch movies on the little thing too. Sweet.

The one issue I rarely see mentioned in these articles, or ones about the Steam Box or alternatives, is controllers. In this case, sure, you could stream to your desktop, but who in their right mind is going to put their most powerful computer in a closet to game an inch from a monitor on a less capable machine? But the general goal of these projects seems to be to bring gaming to every device, and most of those won't be using a mouse and keyboard.

I have 80 or so games in my Steam library. I think five qualify for Big Picture mode. Sure, a dozen or so aren't playable with a controller, but loads of them work with a controller-to-keyboard mapping program, which means they could easily have controller support baked in. Yet they don't. And many are "indie" titles from the last few years. Now, I'm in a different position from most, I'm sure: I prefer consoles, all the "big" titles have controller support, and I'm playing those on one of the consoles (long live physical media!!!), so I'm mostly just irritated by the little platformer and shooter games where I have to add controller support through a third-party utility.

But in these scenarios, and with the Steam Box, what are they planning to do? Is that what the client software is adding: its own controller-mapping software plus the display software? I haven't heard any talk about this issue.

I'm always going to want to play games in the best possible situation (and graphical fidelity is hardly the be-all aspect for PC-centric folks). When I first started gaming back in the '70s, a small screen wasn't much of an issue; that's all we had. Today the notion of playing a game on a handheld or phone or tablet makes little sense to me when I could be playing on an 83" TV with 7.1 speakers. I am never so desperate to game that I'd give up the best possible environment I have access to. The same applies to streaming. Why would I lower the quality of the experience for the imperceptible advantages listed here? I see nothing being brought to the table but a step back from the gaming available since the turn of the century.

This implies you'd actually want to play such games on your tablet in the first place. I can honestly say I never looked at my Nexus 7 and thought, "Gee, if only I could be playing Dishonored on here, that would be the best!"

I don't think streaming services are the future. My reason is essentially Moore's law: GPUs have become so powerful. In the next gen of consoles (PS4/Xbox 720), games will look as good as you might ever want them to look. The PS3/360 already look as good as is necessary for most situations; Uncharted 2 and God of War 3/4 are almost perfect. Provide enough hardware in the next gen that all games look like that, and what would you need a streaming service for? Games will be restricted by production (artists, animators...), not by technology.

Streaming services will die for the same reason that thin clients never became popular. They have too many annoying little quirks that negate their advantages, and the need is not there because end-user devices are more than powerful enough for everything you might want to throw at them.

Now, that doesn't mean there won't be niches where they do well. Playing game demos, as Sony plans to do, sounds like an awesome idea: instead of downloading a couple GB of data, just stream a couple minutes. Thin clients have their limited uses as well, in training centers etc.

I have no issue using a keyboard and mouse at my big screen (a laptop tray works nicely) and would love to move a game from one device to another as I play it, particularly in a "live game" situation.

Scenario one: crap, the wife wants to watch her Twilight movie. Hang on, dear, let me save and stop playing... Oh, what's that? I should watch this dreck now?

Scenario two: I'm playing my FPS on my PC, and the wife hates me being off in my dungeon instead of snuggling on the couch, being visible, and able to "talk"... but like hell I'm going to quit my epic 45-minute match of NS2 for that.

In my opinion, even 5ms of latency is unacceptable and I would much rather they drop the GFX down to something low-end hardware can support.

So I'm guessing you don't play any video games at all currently?

In a typical PC gaming setup, you'll see at least 3 frames at 60fps (3*1/60 = 50ms) of latency caused by the Mouse/KB->GameEngine->GPU->Monitor chain alone. I suspect this limit is hard to break because the latency incurred just to move the mouse on the Win7 desktop is 3 frames on a good 60Hz monitor. Consoles have traditionally been worse off at 66ms-100ms, even for the rare titles that actually manage to run at a stable 60fps on existing hardware. Crappy old 30fps is much more common on consoles.

Considering that 5ms isn't even one frame interval at 120fps, it's very unlikely that you'd be able to tell the difference if I added 5ms delay to your setup. Of course, an extra 50ms-100ms for 'cloud gaming' round trips is a whole different ballpark. An aggregate input lag of >100ms is going to give a terrible experience for any game.
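
To put those figures in one place, here's the arithmetic spelled out; the pipeline depths are the commenter's estimates, not measurements:

```python
# Input lag contributed by an n-frame-deep pipeline at a given refresh rate.
def pipeline_lag_ms(frames, fps):
    return frames * 1000.0 / fps

print(pipeline_lag_ms(3, 60))   # 50.0 ms: the typical PC chain cited above
print(pipeline_lag_ms(4, 60))   # 66.7 ms: the low end of the console range
print(1000.0 / 120)             # 8.3 ms: one frame at 120 fps, so an extra
                                # 5 ms is well under a single frame interval
```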

Not sure I see this. Maybe for graphic artists having a local rendering server makes sense, but I doubt home users are going to want this. They'll still have to maintain the box and in the end it's more complex and less optimal than just getting a gaming computer and playing on that directly.

I think NVIDIA's tech will do more for movie special effects than it will for games. The only things they have left to affect the gaming market are PCs, the Ouya, the Shield, and cell phone/tablet games.

With hardware becoming ever more powerful (heck, even cell phones jumping into 4+ cores), why in the world does anybody think there's a big demand for compute power to be offloaded to servers?

Hard enough to find something that will max out my own PC, don't see why all of my (mostly unused) compute power needs to be dragged off to a server.

Quote:

or that the problem they want to solve—that not all devices have high-end graphics hardware in them—doesn't exist.

Even without cloud gaming, however, I think the problem will eventually be solved. They're putting GPUs into cell phones these days! My iPhone has a GPU! Technology is only becoming better for local GPU power, not worse. Even if cloud gaming never takes off, I see a bright future for all types of gaming, even mobile.

I think people are getting too hung up on playing PC games on tablets (which IS definitely stupid, but also demonstrates a lack of imagination if that's all you can think of to do with this). I would love to replace all the expensive media and game devices in my home with little ARM thin clients and have a single powerhouse PC tower stowed someplace convenient. Definitely keeping the gaming PC, but what do I do for the TV? PS4? Steambox? An HTPC, for video only (cheap) or for video and games (expensive)? What about extra TVs? Dad likes games and so does his son. Why buy two expensive gaming machines?

There are a lot of things this kind of setup can solve or improve. Spending 2 grand on a gaming PC is definitely a huge expense, but non-gamer spouses or parents won't mind so much if that same hardware powers the entire home's TV, on-demand, and web streaming video content. This is exactly the kind of functionality that cable companies have been charging out the ass for with extremely limited functionality. Kick those jerks out of your house, run one internet connection and one cable/antenna TV connection into your graphics server, and let your network do the rest. I don't see how this is anything other than a GREAT idea.

In my opinion, even 5ms of latency is unacceptable and I would much rather they drop the GFX down to something low-end hardware can support.

So I'm guessing you don't play any video games at all currently?

In a typical PC gaming setup, you'll see at least 3 frames at 60fps (3*1/60 = 50ms) of latency caused by the Mouse/KB->GameEngine->GPU->Monitor chain alone. I suspect this limit is hard to break because the latency incurred just to move the mouse on the Win7 desktop is 3 frames on a good 60Hz monitor. Consoles have traditionally been worse off at 66ms-100ms, even for the rare titles that actually manage to run at a stable 60fps on existing hardware. Crappy old 30fps is much more common on consoles.

Considering that 5ms isn't even one frame interval at 120fps, it's very unlikely that you'd be able to tell the difference if I added 5ms delay to your setup. Of course, an extra 50ms-100ms for 'cloud gaming' round trips is a whole different ballpark. An aggregate input lag of >100ms is going to give a terrible experience for any game.

The importance of lag really depends on the game. I would rank the importance by genre as follows:

1) Turn-based games: if it works for chat, it's fast enough for these games; 500ms-2000ms.
2) Puzzle games: most are slow-paced; 250ms should be passable.
3) Real-time strategy: twitch reflexes are mostly irrelevant; 150ms can work.
4) FPS, fighting, racing, and action games: twitch reflexes are critical; 80ms can work, but every 10ms is going to be felt by the user.

#4 is where I spend most of my game money, which is why I could never go for OnLive or any similar service. They cannot guarantee a consistent latency of under 80ms.
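
Treated as data, those tolerances make a handy sanity check against a measured round trip; the thresholds below are the commenter's estimates, not lab figures:

```python
# The comment's rough lag budgets per genre, as upper bounds in ms.
LAG_BUDGET_MS = {
    "turn-based": 2000,
    "puzzle": 250,
    "real-time strategy": 150,
    "FPS/fighting/racing/action": 80,
}

def playable(measured_lag_ms):
    """Return the genres whose budget a measured end-to-end lag fits under."""
    return [g for g, budget in LAG_BUDGET_MS.items() if measured_lag_ms <= budget]

print(playable(120))  # ['turn-based', 'puzzle', 'real-time strategy']
```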

The thing I don't understand is why anyone thinks I would want to play PC games on a handheld controller with tiny screen instead of on my 24" eIPS display. Or play FPSes without a keyboard+mouse, for that matter.

It's a nice option for when you don't want to sit in front of your PC, or if you've got an HTPC and someone actually wants to watch TV or a movie. The game can run on your PC while you play on the handheld unit.

Though, please continue talking about things like this. If attention comes back to a local client-server model as the best way to do things, maybe it'll kill off brain-dead ideas like 'The Personal Cloud' and mandatory isolated storage. My PDA and mouse have been infuriating the crap out of me lately.

I'd just like it to stream to my laptop. That way I can sit on the couch with my wife while she watches telly, and take advantage of the powerful graphics card and CPU in the desktop machine next door.

There's a service called StreamMyGame that claims to do it, but it sucks. With proper support from an outfit like Nvidia, I'd upgrade my desktop graphics card for that!

Cloud gaming is the golden calf for developers. From their side there are pretty much only upsides.

No physical copies means you cut out huge costs for freight, quality control, printing of boxes, physical media, etc. You are also cutting out those pesky middlemen with their shopfront costs, and you won't have to worry about second-hand games "eroding" your profit margins or physical stores destroying your games' perceived value by putting them on sale to clear out stock. And all those "lost sales" due to piracy will be but a memory too. Licensing games rather than selling them is the pot of gold at the end of the rainbow for publishers.

For the consumer, however, I don't see the flip side of the coin. Sure, the equipment to receive and decode video and send user input is cheaper; however, once you've bought the equipment, digital sales will be more expensive. Just use the PSN store as an example: pretty much every AAA game sold on the PSN store is more expensive as a digital download than a physical copy, new, from eBay.

I am also not convinced that the gameplay will be good enough either. Input lag is going to be a huge issue. Where I am in the world, I get about a 37-42ms ping to the closest Google server, and that is very fast for me; hardly any server beats that (Melbourne, Aus). So in the best-case scenario:

Send input: 37-42ms
Render a "game segment" of, say, 10 frames (333ms worth of video): 50-100ms (plucked out of my ass, but this feels like a realistic time frame for a fast render cluster to produce 10 720p frames)
Encode 10 frames into an h.264 video packet: 50-100ms (my old Core 2 Duo does pretty much 1-for-1 encoding, and I don't think a server farm would give any more FLOPS than this to each customer, but I could be wrong)
Packet delivery: 37-42ms

So under ideal conditions, I would get 174-284ms of input lag (not counting the device decoding the video and putting it on the telly/monitor). This may well be good enough for a mumorpeger, but it's certainly not good enough for a shooter or any other type of "twitch" game; counter-based fighting games and reflex-based action-adventure games with our beloved quicktime events come to mind (sarcasm delivered free of charge). However, most MMOs nowadays have already offset the need for cloud gaming with their business model, with most MMOs being free to play and already requiring an always-on internet connection. It's the nature of the beast, after all. So applying high-latency gaming where it's acceptable doesn't fix anything from the publisher's perspective.
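
Summing the commenter's admittedly rough stage estimates does reproduce that range; this is a sketch of the arithmetic, nothing more:

```python
# Best- and worst-case totals for the guessed pipeline stages above, in ms.
stages_ms = {
    "send input":              (37, 42),
    "render 10-frame segment": (50, 100),
    "encode to h.264":         (50, 100),
    "packet delivery":         (37, 42),
}
best = sum(lo for lo, _ in stages_ms.values())
worst = sum(hi for _, hi in stages_ms.values())
print(f"{best}-{worst} ms")  # 174-284 ms, before client-side decode and display
```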

So apart from no longer owning your game but only renting a license for it (valid for as long as you keep paying your monthly fee), your gaming experience will be severely eroded and left in the hands of your ISP. And you trust those guys, right?

I've played a fair bit of Mass Effect 3 MP, and it's very entertaining when it works (which it usually does). However, every once in a while, when I want to play my game (which I actually bought the discs for) and my internet connection is a bit dodgy (which it is from time to time; thanks, Optus/TPG), I can't play. At all. I can't even play the SP unless I hack the game. I'm sure some Civ gamers are feeling similar love from EA atm.

So, in short, for the consumer:
1. Less ownership, which will quite possibly lead to less attachment.
2. A worse gaming experience due to latency.
3. 100% at your ISP's mercy. You can't even crack your store-bought game to play local SP any longer.
4. 100% at your ISP's mercy. Data caps are present here in Australia, and the way things are looking, they will only get more prevalent everywhere. Streaming video is very data-intensive, so you might need a bigger/faster data cap, which will offset your savings from going with cloud-based gaming in the first place.
5. Video quality will most probably suffer, at least compared to a midrange PC gaming rig, though it will probably be on par with the PS3 at the moment. However, can anyone say that the PS3 looks good by today's standards?

Personally, I hope Gaikai and the others go the way of OnLive and die off quickly. However, I do think that publishers will continue to promote this business model, to our (the consumers') detriment, to achieve the goal of total control over our experience.

Even without cloud gaming, however, I think the problem will eventually be solved. They're putting GPUs into cell phones these days! My iPhone has a GPU! Technology is only becoming better for local GPU power, not worse. Even if cloud gaming never takes off, I see a bright future for all types of gaming, even mobile.

They have GPU cores inside the SoC, which is nothing particularly new. What is new is that development of those cores has stepped up considerably.

But they're still many years behind the GPUs going into PCs. The really big change is that gaming's improvements are becoming more subtle. A lot of the cell phone games around simply take advantage of being on rails: details are much smaller, and you'll never walk up close to that rock over there to see that it's just a blurry pile of pixels that never improves in detail. Designing around the hardware limitations is getting better, really. I wouldn't want a cell phone powering my gaming rig within the next decade, though.

I do really like the idea of a single GPU in my house, though. I have an HTPC, a Vita, and a desktop; having one big chunky GPU powering all of that would be amazing. Additionally, it might expand mobile development. It'd be great if PC developers were developing with something like the Vita in mind rather than just a touchscreen phone.

The importance of lag really depends on the game. I would rank the importance by genre as follows:

1) Turn-based games: if it works for chat, it's fast enough for these games; 500ms-2000ms.
2) Puzzle games: most are slow-paced; 250ms should be passable.
3) Real-time strategy: twitch reflexes are mostly irrelevant; 150ms can work.
4) FPS, fighting, racing, and action games: twitch reflexes are critical; 80ms can work, but every 10ms is going to be felt by the user.

#4 is where I spend most of my game money, which is why I could never go for OnLive or any similar service. They cannot guarantee a consistent latency of under 80ms.

I agree with the gist of what you're saying. It's just that once you've played on a good setup, it's very irritating to play on a laggy one even if the gameplay is slow-paced enough to tolerate it. The controls feel totally unresponsive. If the client side lag approaches the 200ms range, even the most casual gamers are starting to notice.

The cloud is a cute idea that will not take off for another 10 years... I can do more, legally, with a bit of effort than any cloud-based operation can. Steam (cloud saves aside) is about as good as it will get for a while.

With a wireless Xbox 360 dongle ($50) and an IOGEAR wireless HDMI device ($200), I have already completed this project! Far Cry 3 on a 57" TV in glorious PC quality!

You are aware that such things as HDMI cables exist? They can stream both video and audio. If you dislike visible cables, there are kinda-flat ones and really flat ones (you can just put them on the wall and paint over them) if your house doesn't have cabling ducts.

Lossy image quality FTL.

I put double runs of HDMI wherever there is an HDMI output device, all connected to an HDMI switch. It enables me to play on the big-screen TV and switch back to the 27" monitor if the wife wants to watch TV.

I dunno, Andrew, that seems rather silly to me. Why would anyone take the time to buy a gaming PC, and then not use it directly? It's sitting right there! Why would you go through all the pain to virtualize your game to a tablet?

I suppose it could be of some use for multiplayer games, but that's a horribly inefficient architecture for multiplayer. You'd need enough bandwidth for a full resolution stream for every client. Even a good 11n connection isn't going to be able to support more than three or four of those. It's horrifically inefficient to stream millions of bytes for multiple screens, instead of hundreds of bytes for the client to tell the multiplayer server what it's doing, and then thousands of bytes for it to describe to your client what everyone else is up to. It should be something like three orders of magnitude more bandwidth-efficient to architect it that way.
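
For a rough version of that comparison, with a bitrate and packet sizes that are illustrative assumptions rather than measurements:

```python
# Per-client bandwidth: streaming rendered frames vs. exchanging game state.
# All numbers are illustrative assumptions.
video_bps = 10_000_000        # ~10 Mbps for a watchable 720p60 H.264 stream
state_bps = 20 * 500 * 8      # 20 state snapshots/s at ~500 bytes: 80 kbps
print(video_bps / state_bps)  # 125x with these fairly generous state sizes
```

With these generous state-update sizes the gap is about two orders of magnitude; leaner update protocols in the Quake lineage push it toward three.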

Even something like a shared D&D game, with everyone having tablets networked with the DM, can be done as well or better without virtualized graphics. It just makes no sense to swap video data instead of doing a regular client/server multiplayer architecture. And the singleplayer scenario seems even less likely, since you can just plunk your butt down in the computer chair.

I don't see much market there.

The strongest use case I see for virtualizable graphics would be the ability to run Windows games at full speed under Mac OS and Linux. If they had cards that would truly do that, I'd be very interested in buying one. I just bought a GTX 680, so I'm not likely to upgrade again for probably at least three years, maybe four. But a card that could be virtualized might get me off the fence much sooner.

-- #4 is where I spend most of my game money, which is why I could never go for OnLive or any similar service. They cannot guarantee a consistent latency of under 80ms.

- I agree with the gist of what you're saying. It's just that once you've played on a good setup, it's very irritating to play on a laggy one even if the gameplay is slow-paced enough to tolerate it. The controls feel totally unresponsive. If the client side lag approaches the 200ms range, even the most casual gamers are starting to notice.

I am doing some tests on my Windows PC with NoMachine installed. The computer is available on my own home network, and I connect to it from my laptop (for mobility reasons more than anything) when I'm downstairs. When playing games, it shows less lag than the thresholds you're describing. I have tested it from outside the network too, and I have to say, it's showing good results.

Considering the probable latency and connection issues (*cough* SimCity *cough*) I really fail to see how this is going to work well at all (and I wouldn't be buying it if it did).

But maybe I'm underestimating nVidia; maybe they've solved the latency problems by developing subspace communications. Although if that's the case, I'm disappointed that this is the best use for it they could come up with.