AMD's new graphics cards want to power your cloud gaming services.

AMD has had a few noteworthy successes in gaming lately. Sony's PlayStation 4 will be using both CPUs and GPUs from the company, and Microsoft's next-generation Xbox is widely expected to do the same. And now, the company has made several announcements at this year's Game Developers Conference that look to maintain that momentum.

AMD announced a couple of new desktop graphics cards, but far more interesting is their entry into a market that Nvidia has been pushing for a little over a year now: cloud gaming. AMD has announced three new "Radeon Sky" server graphics cards that are in many ways similar to the Nvidia Grid cards that its competitor announced late last year. Here's all we know about them, based on both AMD's GDC press conference and the Radeon Sky product pages.

The cards

The Radeon Sky lineup. It's worth noting that all three cards are passively cooled.

Three cards will be available at launch. The Radeon Sky 900 is a dual-slot card with two 825MHz GPUs and 6GB of GDDR5 RAM on a 384-bit interface (each GPU can access 3GB of this RAM). The card includes 3584 of AMD's stream processors across its two GPUs and consumes 300W of power under load.

The Radeon Sky 700 also includes 6GB of GDDR5 RAM on a 384-bit bus, but it has a single 900MHz GPU with 1792 stream processors—the same GPU as in the Sky 900, just one of them at a slightly higher clock. This dual-slot card consumes 225W of power under load.

Finally, the Radeon Sky 500 is the weakest card of the bunch. It has a single 950MHz GPU with 1280 stream processors and 4GB of GDDR5 RAM on a 256-bit bus. The benefit is that the card is both physically smaller (it's a single-slot card rather than a dual-slot) and consumes less power (150W).

Like Nvidia, AMD is using the same basic "Graphics Core Next" architecture across most of its product line at this point—the GPUs in the Sky 900 and Sky 700 have the same number of stream processors as a Radeon HD 7950 desktop card, while the Sky 500's GPU matches the Radeon HD 7870 GHz Edition. As a result, the Sky cards offer the same API and feature support as those Radeon cards: Direct3D 11.1, OpenGL 4.2, and OpenCL 1.2 are supported on all of them.

The software: AMD RapidFire

We don't know many details, but AMD's RapidFire promises to address at least some of the most common problems with cloud gaming.

As with the Nvidia Grid cards, there are two important problems the Sky cards will need to compensate for: they'll need to reduce latency however possible, and (to be cost-effective) they'll need to support multiple users. AMD is dealing with both problems through something it calls RapidFire, which the company describes as the Sky cards' "secret sauce." We don't have all the details we'd like, but AMD's slide deck from GDC and the Sky product pages provide enough to get started.

First, latency: as with Grid, the Sky cards reduce latency in part by encoding the video stream being sent to your gaming device directly on the GPU, rather than handing it off to another server dedicated to the task of video encoding. The same Video Codec Engine (VCE) that speeds up encoding and transcoding on laptops and desktops with AMD GPUs is used here to reduce the number of steps between the Sky cards rendering a scene and you seeing that scene on your device.
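To see why on-GPU encoding matters, here's a toy latency budget in Python. Every per-stage number here is invented for illustration; AMD has not published figures like these. The structural point is simply that encoding on the GPU removes one hop from the pipeline.

```python
# Toy latency budget (all numbers are hypothetical, for illustration only).
RENDER_MS = 16.7   # one frame rendered at 60fps
ENCODE_MS = 5.0    # H.264 encode of the finished frame
NETWORK_MS = 30.0  # one-way trip to the player's device

# Pipeline A: the frame is copied to a separate encoding server first.
COPY_TO_ENCODER_MS = 10.0
separate_encoder = RENDER_MS + COPY_TO_ENCODER_MS + ENCODE_MS + NETWORK_MS

# Pipeline B: the GPU's own VCE encodes the frame in place.
on_gpu_vce = RENDER_MS + ENCODE_MS + NETWORK_MS

print(round(separate_encoder, 1), round(on_gpu_vce, 1))  # 61.7 51.7
```

Whatever the real numbers are, the encode-on-GPU path can only be as fast as or faster than a pipeline that adds a copy to a dedicated encoder box.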

RapidFire also enables the Sky cards to stream "up to" six games at once. The only example AMD provides is for the Sky 700 card, which in AMD's testing could stream three games at once at 60 frames per second or six games at once at 30 frames per second. What we don't know is how the other two cards perform—if they support more or fewer streams, or simply allow the graphics quality on those six streams to be turned up and down. We've reached out to AMD for comment on this, and will update when we get a response.
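AMD's two data points for the Sky 700 are consistent with a fixed total render/encode budget of 180 frames per second shared across streams. That fixed-budget model is our assumption, not something AMD has confirmed, but it makes the arithmetic easy to check:

```python
def streams_at(fps_per_stream: int, total_fps_budget: int = 180) -> int:
    """Streams a card can serve if its total render/encode budget is fixed."""
    return total_fps_budget // fps_per_stream

# AMD's quoted Sky 700 figures: 3 streams at 60fps, or 6 streams at 30fps.
print(streams_at(60), streams_at(30))  # 3 6
```

If the model holds, the open question about the other two cards reduces to what their total budgets are.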

Finally, AMD is working with companies like VMware and Citrix to allow virtual machines based on those companies' technology to access these GPUs directly, which, according to AMD, will provide "greater density and more simultaneous game streams from a single server." While AMD's focus for these cards is set squarely on cloud gaming, direct access from virtual machines also opens the door to workstation applications—drafting, video, and photo editing software that can make extensive use of the GPU should be able to benefit from direct access to the card just as games can.

Not first, but still competitive

AMD's list of partners so far consists mostly of small or foreign companies, but Nvidia has that problem too.

In most respects, AMD is following Nvidia here. The Radeon Sky cards are aiming for the same general market as Nvidia's Grid (née VGX) cards, and they accomplish most of the same tasks in the same (or, at least, similar) ways. Still, this market is young, and AMD's cards appear to be competitive.

Where AMD's high-end cards appear to beat Nvidia's, at least at the moment, is in the number of simultaneous users supported. Even if the Sky 900 card can only support six simultaneous users (something we aren't yet sure of), it still beats the high-end Grid card, which can only support one user on each of its two GPUs (though Nvidia has assured us several times that multiple-user support will be added later via a software update). Cloud gaming servers need to be as dense as possible to scale, so the more users that can get a good experience from a single GPU, the better.

One area where Nvidia is definitely ahead of AMD is hardware: AMD will be reliant on hardware partners to put its cards in servers and then sell them to companies like CiiNow, Otoy, Ubitus, and G-cluster Global (the four companies AMD said would be adopting its technology), while Nvidia is offering neatly packaged hardware directly to customers in the form of both the Nvidia Grid server (for gaming) and the Grid VCA server (for workstation applications). AMD could well jump into this end of the market itself if it proves lucrative—it already sells servers through its SeaMicro arm—but as of now it's only selling the cards themselves.

We're still a bit skeptical about cloud gaming, but with both of the major graphics hardware makers in the game, it's going to be easier than ever to buy the hardware you need and get a service up and running. Whether those services succeed is something else again, but Nvidia's Grid and AMD's Sky cards remove one more barrier to entry.

Promoted Comments

It's not so much that the boards are passively cooled, but rather that there is a lack of onboard active cooling. They'll be in servers with incredibly noisy chassis fans, which in turn reside in a DC with industrial cooling in place.

PXE has been around for over 13 years. "Thin" clients have had plenty of time to take things by storm. What we're going to have with all this "cloud/sky" is a fatter "thin" than we used to have.

PXE is a different kettle of fish though. PXE is basically just making your computer diskless and treating the server as your local drive. It tends to have much higher bandwidth requirements and is near-useless over Wi-Fi.

Here we're talking about having all the computation happen on the server. Your computer is basically just a VNC/RDP client, and everything takes place on the server. This offers a ton of advantages over a PXE thin-client system. You don't have to update end-user hardware if they need more performance. Crank up the RAM/CPU count on their VM and suddenly they have a faster computer. Gaming aside, the stream from the server to the client is a lot lighter, since it only sends data describing what changed on the screen. PXE sends over the entire application for you to use it.

Need to suddenly switch computers? Log out of your current thin client, hop to another computer, and everything you had running before is still there. This is *huge* for some fields.

Remote desktop computing works well even over a WAN, too. A tiny branch office can still have all their apps running at the HQ's server, without worrying about backups or having a local server.

Remote desktops are not perfect technology, but I'd never deploy diskless PXE-style thin clients now. They were amazing when they first came out though.

Quote:

There are three cards that will be available to start with: the Radeon Sky 900 is a dual-slot card with two GPUs running at 825MHz and 6GB of GDDR5 RAM on a 384-bit interface [...]

It's because they aren't for cloud gaming services. They're more for large-scale enterprises that are slowly moving back to the old thin-client paradigm, where everything you do on your "local" terminal is actually running on a big ol' server back in the data center. It makes supporting people a lot easier and moves costs around.

Somebody's thin client having issues? Replace it with a spare in five minutes; no need to move profiles or anything. Having issues with a desktop instead? Ugh, this is going to take a while.

New person coming in on Monday? Throw a thin client out there, and let them connect to a new base VDI, and go from there.

This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

Basically, they (1) want it to be possible to play PC games that look good on devices without the computing power to render that stuff in the first place (i.e. phones, tablets, Ultrabooks), and (2) want to build something for gaming that works kind of like Netflix streaming does for movies.

I'm not sure what the market for this stuff is, myself, for reasons I've laid out elsewhere. IMO (as another poster pointed out) it makes a little more sense in the context of giving virtual machines access to GPU power, but that's going to be more of a thing in enterprises than in the consumer market. Talking about the potential gaming applications first is "sexier" or something but I'm sure we'll be hearing more about potential enterprise use soon.

Due to latency issues, I am more than slightly skeptical about cloud gaming. Still, it will be interesting to see how this evolves, and as others have pointed out, it does have applications in virtualization or server support for workstations that might be more applicable. Latency both matters less there than it does in gaming and can be minimized by locating the servers relatively close to the clients.

Quote:

It's not so much that the boards are passively cooled, but rather that there is a lack of onboard active cooling. They'll be in servers with incredibly noisy chassis fans, which in turn reside in a DC with industrial cooling in place.

I've used both houses for video cards and have nVidia GPUs in most of my machines currently.

While nVidia may have some sort of lead on AMD, I'll mention that in work settings, I've used the AMD FirePro cards in 4 display setups for years. Primary use case is call centers, stock traders, and the like that need the screen real estate. Zero use for gaming, but the 2450s are cheap and draw < 30w.

AMD may actually have something here if they can connect all the dots. The VDI bit is interesting since it implies multi screen and multi user capability, and the business segment is larger than the gaming segment. That's where the potential $$ are...if either of them are smart.


Quote:

I'm not sure what the market for this stuff is, myself, for reasons I've laid out elsewhere. IMO (as another poster pointed out) it makes a little more sense in the context of giving virtual machines access to GPU power, but that's going to be more of a thing in enterprises than in the consumer market.

Well, for the consumer, virtualization could mean easier "backwards compatibility" without all the baggage, and greater security with the sandboxing.

Edit: It may also help with those "security" issues mentioned in the WebGL story.


Quote:

It's because they aren't for cloud gaming services. They're more for large scale enterprises [...] This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

They said that years ago, and here we are, still with local machines at every one of our desks at my office. We might be the exception, as we do a lot of design and load-intensive work (logic design).

Quote:

Due to latency issues, I am more than slightly skeptical about cloud gaming. Still, it will be interesting to see how this evolves, and as others have pointed out it does have applications in virtualization or server support for workstations that might be more applicable. Latency both matters less there than it does in gaming and can be minimized by locating the servers relatively close to the clients.

I agree: if you have a game running at 60fps, you'd need less than 20ms of latency for it to be sufficiently responsive (depending on the type of game, but certainly for twitchier, action-y games).

That's fine for gaming at home on a LAN (e.g.: you have a tablet and you're running a game remotely from your beefier desktop), but that's going to be mighty problematic over the internet.
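The 20ms figure above lines up with simple frame-time arithmetic; here's a sketch using the standard milliseconds-per-frame conversion:

```python
def frame_time_ms(fps: float) -> float:
    """Time between displayed frames, in milliseconds."""
    return 1000.0 / fps

# At 60fps a new frame is shown every ~16.7ms, so round-trip latency
# much above that adds at least one full frame of visible input lag.
print(round(frame_time_ms(60), 1))  # 16.7
```

At 30fps the window roughly doubles (~33.3ms), which is one reason cloud services may prefer lower frame rates per stream.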

Over the Internet, cloud gaming isn't terribly interesting to me. However, I'd love to set up my own cloud gaming server in my house that would let me do PC gaming from any device (even a tablet) within range of my Wi-Fi network. I've previously looked into what it would take to do this, and it would require thousands of dollars in software alone. With AMD and Nvidia putting this out at a more mainstream level, maybe the barrier to setting something up will come down.

My ideal is to have a central home server that'll serve up everything, from data storage to gaming, all over a "personal cloud" kind of platform, and let me access it anywhere from any PC, laptop, tablet, or even phone, in my house. And potentially over the Internet on the web. One can dream.

Man, I get all drooly every time I think about Nvidia's game server. If I had the cash, I'd buy one maxed out to the teeth and throw LAN parties where everyone brings their tablet and Bluetooth gamepad/keyboard+mouse of choice.

Quote:

It's because they aren't for cloud gaming services. They're more for large scale enterprises [...] This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

This explanation makes far more sense than the ridiculous cloud gaming idea.

I suppose another similar sort of application would be hotels. You put a cheap thin client in the TV, then you could offer high-end games (for a charge) with the work done from a server.

Quote:

It's because they aren't for cloud gaming services. They're more for large scale enterprises [...] This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

This... this actually makes a ton of sense.

As for "cloud gaming"... honestly, I have real trouble seeing it, at least at this point in time, mainly due to the Internet itself. MAYBE if my local broadband ISP were providing an affordable gigabit connection... or, say, Google brought their service here... MAYBE then. That, to me, is still the limiting factor. Well, that and physics. Maybe in 5-10 years?

Quote:

PXE has been around for over 13 years. "Thin" clients have had plenty of time to take things by storm. What we're going to have with all this "cloud/sky" is a fatter "thin" than we used to have.

Oddly enough, I remember using Sun3 computers made in the late 1980s. No hard drives in those things; everything was stored on the server. Of course, as SunOS got bigger and bigger, it eventually took up too much memory (I forget how many minutes a Sun3/50 took to boot towards the end), and they were all converted to X terminals (the thin clients of their day).

I would think that CPU cycles would be essentially free today, at least compared to bandwidth. I'm guessing that caching issues are the real culprit. While a rotating drive might be one of the biggest hardware expenses in such a design, a small SSD could hold either a good sized cache or the basic boot files and applications and leave bulk storage to the network.

Getting back to the idea of gaming in the cloud: it looks like an idea for those who thought SimCity and Diablo III didn't enforce enough connectivity.

Yes. And every one of those computers is tied to the network. Most of the "important" software I use runs on a server, and I simply access it through a client. For most of the software others use, the important part of the program resides on the network. Disconnect my computer?..... Grats. I now have a word processor.

I have a "desktop," but it's not being utilized as a stand-alone. Hell, one programmer here was still using his desktop from 2006-2007. Sure, it ran a little slow..... but he knew if he could run stuff on his rig, then the other PCs would be fine.

Doesn't sound like AMD is in much of a position to push low-latency cloud gaming, given their problems with micro-stuttering. Adding network latency to an already unacceptable stutter will make it massively more annoying and obvious. Nvidia is much better positioned for this right now, since for cloud gaming, minimum frame-rate is more important than average or max.

I'm surprised there's enough of a cloud gaming market already for both Nvidia and AMD to develop products dedicated to it.

It is more that people believe there will be a cloud gaming market. Foolishly, really, as a lot of the issues are unsolvable.

Sadly no one is paying attention to the lessons taught by SimCity and Diablo III. Cloud gaming will be even worse.

Though of course, it doesn't matter to Nvidia or AMD if the cloud gaming market fails, as long as providers have already bought their products.

Quote:

It's because they aren't for cloud gaming services. They're more for large scale enterprises [...] This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

Quote:

I'm surprised there's enough of a cloud gaming market already for both Nvidia and AMD to develop products dedicated to it.

I don't think it's that there's enough of a market yet; rather, AMD sees that the market could be big enough that it can't let NVIDIA get much of a head start. Companies have suffered just for waiting too long and ignoring potential markets.

Quote:

It's because they aren't for cloud gaming services. They're more for large scale enterprises [...] This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

Licensing fees and the fact that nobody outside the datacenter actually wants that mean it will never happen. You might notice that it's tablets that are slowly replacing laptops and PCs, not thin clients. People like being able to do things when off the network, for whatever reason. It's actually more resilient and useful that way.

I remember as far back as 2004, when my company at the time was heavily invested in RDP and preparing to move to Citrix, we decided to look into true thin clients. Then as now, cheap thin clients cost more than full-fledged desktops! Most were integrated with the worst monitors on the market as well, and they don't save nearly enough power to justify the difference. Plus, back then you had to get a whole rack of servers, one blade per user, because VMware didn't support desktop OSes.

Flash forward to 2008-9, when XenDesktop was trumpeted as the salvation (and VMware put out its own version shortly after), and I started looking into it again. It turns out you still have to buy the OS licenses twice, plus VDI licenses; some software activations still don't work, though at least you can consolidate it all into a few servers (which has its own issues, because servers do go down now and then, even if by accident or faulty wiring). The administration isn't even that much reduced: you've always been able to image physical machines, and you still have to manage them as if they're regular desktops, via group policy and such.

Quote:

It's because they aren't for cloud gaming services. They're more for large scale enterprises [...] This is basically the future of enterprise computing. Desktops are slowly going to die in the enterprise.

Quote:

Licensing fees and the fact that nobody outside the datacenter actually wants that means it will never happen. [...] Then as now, cheap thin clients cost more than full-fledged desktops!

I've never been able to determine what the thin-client vendors are smoking when it comes to prices. Historically, most of them have been brutally cut-down x86s (HP used Transmeta while they were alive and switched to VIA thereafter; it also ships some low-end AMD- and Intel Atom-based units, along with, I think, one ARM offering of indeterminate breed) and shipped with either WinCE feature-free edition or a dysfunctional embedded Linux, yet cost as much as or more than a boring business desktop...

The 'desktop' as a lovingly coddled machine that actually stores a lot of user state locally is totally dead, but the economics of 'buy cheap machines by the pallet-load from Dell, PXE boot to automation environment, dump standard image, ready to roll' still compare awfully well with the thin-client world. Microsoft makes sure it gets its pound of flesh either way, so no savings there; terminal servers aren't exactly free (or maintenance-free; may XenApp rot in hell...), and keeping a few spare SFF desktops on the shelf, pre-imaged, isn't rocket science, nor is swapping them.

It would be really cool if some iteration of this technology finally enabled VMs to access the might of GPU processing instead of an emulated 1999 card.

With the right hardware (which, in practice, means basically anything sold as a 'server' in the present or recent past, and not infrequently workstation and even some desktop gear: Intel tends to be a bit more aggressive about trimming the IOMMU from the cheap seats for market segmentation, AMD a bit less, but the cheaper flavor of AMD-supporting motherboards can be pretty dodgy about having features that work), you can actually attach a PCIe device directly to a VM. It can be very handy if you need the guest OS to have 'real' access to some weirdo peripheral (a line interface card for a soft-PBX, say) that has no standard virtualized abstraction, or something like 10GbE, which has an abstraction but is fast enough that the abstraction eats nontrivial time.

What hasn't been possible is slicing up a single big video card and allocating chunks of it to multiple VMs. Nor has it generally been possible to route the video card's work through the RDP/ICA/PCoIP/whatever link, rather than directly to the physical video-out connector. These two are the areas that Nvidia and AMD seem to be gunning for now.

Quote:

My ideal is to have a central home server that'll serve up everything, from data storage to gaming, all over a "personal cloud" kind of platform, and let me access it anywhere from any PC, laptop, tablet, or even phone, in my house. And potentially over the Internet on the web. One can dream.

I think that is the general idea for both Nvidia and AMD. In order to sell many cards, they will peddle home GPU cloud gaming cards that you install, which will link game data over the Internet to a multi-user database. Your home rig runs the game served to you via Wi-Fi, so a single hop with the least latency. You might be able to "team up" multiple cards via SLI to scale to more users from your local server. It offers predictable gaming for all users on your local Wi-Fi LAN. Genius! In the enterprise, they use VCA servers to handle rendering and run compute-intensive programs. The issue there is also local LAN (switched GigE) for best performance on the client side. The client runs a thin layer of stream plus synchronous command/response channels to the server. It took 25 years of client-server computing to get to this "client-server-cloud" model!

Andrew Cunningham / Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue.