For a dying platform, the technical innovations for the PC aren’t half coming thick and fast. For starters, Alec and Graham have been dabbling with Steam’s new streaming capability. It all looks bloody clever to me and has the knock-on effect of rebooting interest in some previously pretty pedestrian kit. A £40 mini-ITX board with an embedded Atom chip as the basis for a client streaming box (running a free OS)? As if that wasn’t enough, AMD’s Mantle API has gone live with beta driver support, promising a brave new age of high-performance gaming for all. Well, kinda.

At the moment, I’ve the nagging feeling that some parts of the PC industry must be feeling a little queasy about game streaming technology. After all, it allows you to play bells-and-whistles, uber-res (well, ish) gaming on multiple boxes and screens without buying multiple high-end boxes. Not good for CPU and GPU sales, surely.

Nvidia’s efforts to lock streaming tech down to its own GPUs and mobile devices are symptomatic of this. But frankly, Steam streaming makes Nvidia Shield’s limitations look like the cynical anti-customer ruse they so surely are. But then Valve are probably best positioned of anyone to make streaming – over home networks, at least – a reality. Anything Nvidia did was likely to end up marginalised.

Home-streaming hardware
Anyway, with home streaming suddenly looking very plausible, the next question is what hardware to use. You’ve got your gaming rig; what’s the client device?

Actually, there are still some question marks over suitable host hardware in terms of the video encoding part of the puzzle. As Alec discovered, the current beta build encodes the video stream in pure software on the CPU cores, which is a painfully inefficient way of going about things.

Valve has said hardware encode is coming, which is a blessed relief. But it’s not yet clear which hardware acceleration solutions will be supported, or which will be best. It’s easy to imagine all those dormant Intel HD Graphics integrated cores and their QuickSync tech suddenly waking up and doing something useful.

Likewise, what about an AMD Kaveri chip with its high-tech integrated video core doing the streaming, its CPU cores unleashed by Mantle…but more on Mantle in a moment. Back to boring old client boxes.

Client boxes
If you’ve got a crusty old PC or knackered laptop that’s capable of decoding 1080p H264 video and with suitable outputs to drive whatever display you favour, I suspect that will get the job done as a client box. That’s the beauty of streaming.

55 bucks: Biostar’s embedded-Atom board

But streaming also makes it possible to buy a dedicated PC for the living room that’s up to the job but seriously cheap. Something like Intel’s NUC, just not nearly as expensive. Something like, say, Asus’s new NUC-like Chromebox (pictured up top), complete with a dual-core Intel processor and 16GB of flash storage. Yours from $179 (I’m hoping under £150) for a very slick-looking little client.

Simply whack on SteamOS and get streaming? Here’s hoping (Valve’s minimum requirements only speak of 500GB for SteamOS, which can’t actually be the space required to install it). Or maybe build your own teensy client for even less money.

Intel’s new Bay Trail Atom is miles better than previous iterations of its ultramobile CPU in terms of performance. And a few board makers have announced mini-ITX efforts complete with quad-core Bay Trail chips for under $60 (maybe £40?). Add a mini-ITX case with PSU for about £50. Yippee.

The options are pretty much infinite and it’s all so much more interesting with streaming. That said, it does make me begin to wonder if the real cost will be sorting out a reliable wireless networking solution. I’m confident my current wireless ‘g’-based home network is unlikely to be up to snuff and I don’t fancy running cables about the place.

More about Mantle
Meanwhile, AMD’s Mantle has gone beta and early looks at the new API and what it can do for games performance have splurged across the web.

It’s very early doors. I haven’t done any real testing myself, just seen it running briefly. And frankly, I’d be here until late 2015 mixing up the hardware variations to get a really good idea of what even this beta offers, which is currently limited to a single game (Battlefield 4) plus a tech demo.

So a sort of poll of polls makes sense. Far too many sites, for my liking, have focused solely on high-end Radeon R9 290X boards as the basis for testing Mantle. I appreciate Mantle is predominantly about optimising CPU performance. But the fact is, relatively few people will ever own a 290X, and for me Mantle is only interesting if it boosts performance for real-world rigs.

Anyway, Mantle certainly makes a difference for mid-range CPUs. I’ve seen a lowly quad-core Core i5 running next to a mighty six-core Core i7. And it was the Core i5 rendering 5,000-plus objects with buttery-smooth frame rates thanks to Mantle while the six-core beast spluttered chronically in DirectX.

OK, that was in the purpose-built Star Swarm tech demo, not a game you can play. But I’ll give the demo the benefit of the doubt and assume the DX codepath hasn’t been intentionally borked to make Mantle look good. In other words, with Mantle you can do things that previously haven’t been possible.

Thou shalt count to 5,000. And the number of the counting shall be 5,000

TechReport have a nice summary of how Mantle scales with various CPUs, albeit in concert with a 290X. LegitReviews gave the lowly Radeon R7 260X a run with the Mantle’d-up iteration of Battlefield 4 and came up with a 15 per cent boost in performance, which isn’t too shabby.

I think the real beneficiaries of Mantle are likely to be CPU-bound RTS titles, so it’s a bit frustrating to have BF4 as the only test game. Total War: Rome II is due a Mantle makeover and strikes me as a much more interesting test case.

Also on the downside, there are signs Mantle as it currently exists can actually reduce minimum frame rates. Not good. Microstuttering of this sort is, of course, the kind of thing that could very well evaporate following some driver optimisations during the transition from beta to prime time. But suffice to say the overall picture is mixed.

For me what’s really interesting is that m’colleague Dave James on PC Format tells me AMD is pitching Mantle as an open standard and would like or at least be happy to see it or something like it included in future iterations of DirectX.

That sounds like the best possible solution. It would mean all DirectX GPUs would be compatible (well, all that meet the standard of this notional new version of DX) and the feature would actually get used. As a standalone API that only works on one vendor’s technology, I struggle with it a bit.

Whatever, it’s always exciting to see the prospect of more performance for little or no money. Oh, and if you want to try Mantle out for yourself, you’ll need a graphics card with AMD’s GCN technology, which means a Radeon HD 7700 series or better and the new R7 and R9 boards.

I read your first words and thought “a bot already on the first post?” but then you turned out to be an actual person (at least I think you are).

Back on topic: as was stated in the article, if it can decode 1080p video, you should be good.
Hell, I’ve managed to stream games on my craptastic Asus netbook with an E350 APU. It wasn’t great, but it worked.
This whole streaming thing is starting to look pretty damn promising.

My uncle’s brother-in-law’s sister’s grandson’s former roommate makes $80 per hour on the internet by posting spam comments on video game blogs. He’s been out of work for all his life but last month the paycheck was $132873098706 and he has bought a second-hand Lada (poor bastard!).

Quickest way would be to transfer a 1080p mkv remux onto your main drive, like your gaming computer’s SSD. Install VLC on the old/test computer and tell it to play the Blu-ray from a network location, i.e. your SSD. Use gigabit ports.

I think they are sending out invites right now, so if you haven’t joined the group yet, now is a good time. I just got my invite yesterday and I was pretty excited by how well the streaming worked to my Surface Pro. I didn’t have any lag issues at all. So far, so good.

Really exciting stuff. I cracked open the old PC and it’s got a PCI slot rather than PCIe – my GTX 220 won’t fit. May buy a cheap mini board + CPU, 1GB of RAM and recycle the case and power supply from this thing and put it in the living room.

Even if Microsoft did include Mantle elements in a future DX version I doubt it would do much good, because knowing Microsoft they would require people to buy a whole new OS to use it. Not to mention that quite a lot of games coming out nowadays are still built on DX9, which suggests to me that developers wouldn’t exactly jump at it.

The J1800 boards are extremely appealing for industrial and low power systems. I’ve been trying to find one stateside to test out at work.

And I really wish Newegg would get serious with their embedded solutions. They carry a paltry number of boards with embedded SoCs, and don’t even have any of the J1800 boards yet. And their selection of mini-ITX cases is horrendous and way overpriced. All I want is a tiny enclosure with a few USB ports. Is that so much to ask for?

Can you wipe Chromeboxes and put your own OS on there? I’ve been thinking about snagging a thin client for my house (wife and guests use VMs, not a physical computer) but they’re so damn expensive it’s not worth it. But there are various Chrome OS devices out and coming out that have much better form factors, and are cheap, assuming I can get something on there to run RDP.

Chromeboxes/books come with a custom (locked-down) BIOS that, as far as I know, only allows you to install a dual boot while in ‘developer’ mode, with constant warnings that you are voiding all right to professional help by going against Google’s will.
So no, probably not. Only Linux at best.

Some of them don’t even allow for dual boot in developer mode. Trying to get anything else to run instead of Chrome OS is a real pain in the ass and I really can’t recommend anyone try it. Unless they have plenty of time, technical know-how and a lot of hair they want to turn grey asap.

It’s easy; Google left them wide open to hacking. Flash any Linux distro you want and boot from USB. Basically Google sticking it to Microsoft. Apple has something similar that a lot of people use as custom media centre hubs. Very powerful network toys, and Asus cranking them out at $150 with Google is awesome – it pretty much allows you to slave entire server networks into a happy, near-silent hyper cube outputting to your 50″ TV or multiple HD projectors.

A little bit of further reading on the Chromebook/Chromebox as a thin client concept.

Firstly, it’s important to establish that most (but not all) are running ARM-based chipsets. This is a major sticking point as it completely rules out Windows (even if you got hold of a copy of Windows RT OEM, it wouldn’t run). It also rules out the possibility of running SteamOS.

With that said, there are a handful of x86 devices, and there are plenty of Linux distros you could run on an ARM device (and a lot of info out there on getting various Ubuntu spin-offs running on a variety of Chrome devices).

Actually getting something other than ChromeOS running on one can, though, just about ruin your day. On a few devices it’s just not sanely possible; on most it’s doable but not always straightforward, and getting Windows running on one of the x86-based Chromebooks seems to be an exercise in outright insanity.

Given the mention of RDP, I’m going to assume you have a Windows setup. In which case, installing some form of Linux and turning it into a vague impression of a thin client probably isn’t feasible – if nothing else, for the ease of use of your family and guests.

There is, however, another option which is often overlooked. Chromebooks have the option of running an RDP client through the browser (link to chrome.google.com). This, combined with something along the lines of an Acer C720, might meet your requirements.

The streaming is the entire reason I’ll be buying a Steam machine (if it works well). I don’t think hardware companies are as worried as you imagine they are either, Jeremy. The alternative to a cheap streaming device for my already costly gaming machine is having nothing at all, rather than buying more GPUs etc. I would imagine it’s the same for most without vast amounts of money to burn.

“I’m confident my current wireless ‘g’-based home network is unlikely to be up to snuff and I don’t fancy running cables about the place.”

Homeplug-standard powerline LAN adaptors are the answer to this right here, surely? I have my PC, router, Mac and an Ouya running Limelight (which makes GeForce Experience think my Ouya is a Shield, so I get hardware encoding from my GPU and hardware decoding on the Ouya’s Tegra 3) all on 200Mbps powerline adaptors. The latency is around 100ms or so – not perfect, but far better than the 200ms I get on whatever the fastest 802.11 standard was before the very latest one, and extremely playable even if the actual data rate is much, much slower than advertised. Steam streaming’s stats thing estimates 35Mbps of usable bandwidth (of which 1080p30 uses 40%), and that’s with some IPTV boxes and a laptop or two also using the same adaptors in my house.

Maybe that new Wi-Fi standard will be good, maybe it won’t (data rate and latency are different things, after all), but right now you can get a pair of TP-Link PA211s for £16 on Amazon and they’re great for my streaming needs.
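Since those figures get quoted a lot, here’s a quick back-of-envelope check in Python. The bitrates are the illustrative numbers from the comment above (Steam’s ~35Mbps estimate, with 1080p30 using ~40% of it) plus an assumed real-world 802.11g throughput, not measurements:

```python
# Sanity-checking the streaming bandwidth figures quoted above.
# All numbers are illustrative assumptions, not measurements.

def link_share(stream_mbps: float, link_mbps: float) -> float:
    """Fraction of usable link capacity a stream would consume."""
    return stream_mbps / link_mbps

# Steam's stats overlay reportedly estimated ~35 Mbps usable on the
# powerline link, with a 1080p30 H.264 stream using about 40% of it.
usable_mbps = 35.0
stream_mbps = usable_mbps * 0.40      # ~14 Mbps for the video stream

# 802.11g is nominally 54 Mbps, but real-world throughput is often
# 20 Mbps or less -- hence the worry about an old 'g' network coping.
realistic_g_mbps = 20.0

print(f"stream bitrate: ~{stream_mbps:.0f} Mbps")
print(f"share of a realistic 802.11g link: {link_share(stream_mbps, realistic_g_mbps):.0%}")
```

On those assumptions the stream alone would eat roughly 70% of a realistic ‘g’ link before anything else on the network gets a look-in, which squares with the instinct that 802.11g won’t be up to snuff.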

I use an 802.11ac router and usb wireless adapter for my computer and it is quite good. I haven’t played a lot of multiplayer fps games recently, but thus far I have been unable to measure any significant increase in latency vs wired. Personally I think wireless has reached a point where someone should not feel reluctant to try it simply based on worries over speed or latency. Range and congestion can be an issue depending on the set-up and location, but certainly it can be the most convenient way to connect a device to a network.

Powerline adapters are pretty neat, but I think they can be a bit risky. They themselves may be capable of very high speeds, but in practice the wiring in my home has never allowed me to even take full advantage of my Internet connection. That said, it sounds pretty economical. A new ac router can easily be several hundred US dollars, and a wireless adapter can be pretty expensive as well.

Well hey, Mantle does pretty much exactly what it might reasonably have been expected to do. On the other hand, actually achieving what you set out to achieve is pretty impressive, in software.

I still think shenanigans are happening with that Star Swarm demo, though. I just don’t believe you would need a huge number of draw calls to display 5,000 similar-looking ships. I’m not even a proper programmer and I’ve written code which works out orientation matrices on the fly and then draws 10,000 instanced objects in a single call; the CPU hit was negligible. As in 0.1%.
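For anyone wondering what that looks like, here’s a toy sketch of the same idea in Python with NumPy. There’s no GPU involved; the point is purely the difference between building 10,000 orientation matrices one object at a time versus in one batched operation, which is the essence of the draw-call saving that instancing gives you:

```python
# Toy illustration of batching 10,000 orientation matrices.
# Pure NumPy, no GPU: the point is per-call overhead, not rendering.
import numpy as np

rng = np.random.default_rng(0)
angles = rng.uniform(0.0, 2.0 * np.pi, 10_000)

def per_object(angles):
    # One small matrix built per object -- analogous to issuing a
    # separate draw call for each of the 10,000 ships.
    return [np.array([[np.cos(a), -np.sin(a)],
                      [np.sin(a),  np.cos(a)]]) for a in angles]

def batched(angles):
    # All orientation matrices in one vectorised operation --
    # analogous to a single instanced draw call.
    c, s = np.cos(angles), np.sin(angles)
    m = np.empty((len(angles), 2, 2))
    m[:, 0, 0], m[:, 0, 1] = c, -s
    m[:, 1, 0], m[:, 1, 1] = s, c
    return m

# Both routes produce identical matrices; the batched one just
# avoids 10,000 rounds of per-object overhead.
assert np.allclose(np.stack(per_object(angles)), batched(angles))
```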

You can run it as a VM if that’s what you mean. As for running it entirely in memory, I guess you could if you had loads of memory for a RAM drive (32GB would probably do it), but why would you want to? A 32GB SSD is way cheaper than 32GB of RAM and will involve less faffing around.

Are you over-thinking this, by the way? You just need the current Steam beta (and Valve to have let you into that beta) to use the streaming. I’m doing it between two Windows devices – SteamOS is just to avoid you having to run a copy of Windows, so putting it in a VM is kind of overkill right now.

Nvidia’s efforts to lock streaming tech down to its own GPUs and mobile devices are symptomatic of this. But frankly, Steam streaming makes Nvidia Shield’s limitations look like the cynical anti-customer ruse they so surely are.
Yes and no. Nvidia really has added very smart capabilities to its hardware to make local streaming better.
The main one is even more basic than encoding: frame capture.
Here is an expert explaining it: link to forums.guru3d.com
Basically, to read back a frame, DirectX will flush its pipeline, delaying the next one. Nvidia has added commands to read the buffer more efficiently. This is what enabled its ShadowPlay feature and why Shield streaming was limited to recent models.
So we may need those types of capabilities to allow smooth streaming and lower latencies, and at the moment Nvidia is the only one offering the hardware for it.
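A back-of-envelope model makes the cost of that flush concrete. The frame and copy times below are made-up illustrative figures, but the structure is the point: a synchronous readback is serialised with rendering and taxes every frame, while a capture path that overlaps the copy with the next frame’s work largely hides it:

```python
# Toy model of frame-capture cost; timings are illustrative only.
render_ms = 10.0   # assumed GPU time to render one frame
copy_ms = 4.0      # assumed time to read the frame back for encoding

# Synchronous readback: the pipeline flushes, so the copy is
# serialised with rendering and every frame pays for both stages.
sync_fps = 1000.0 / (render_ms + copy_ms)

# Overlapped capture (what efficient buffer-read commands enable):
# the copy runs alongside the next frame's rendering, so throughput
# is limited only by the slower of the two stages.
overlap_fps = 1000.0 / max(render_ms, copy_ms)

print(f"synchronous capture: ~{sync_fps:.0f} fps")
print(f"overlapped capture:  ~{overlap_fps:.0f} fps")
```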

Interesting, thanks. But I see nothing there to suggest you need Nvidia hardware at the client end. It’s all about capture on the host, which was my point – Nvidia tying streaming to its client hardware. If they have superior hardware for capturing on the host, that’s great!

Indeed – when I got into the Steam streaming beta I really hoped that Nvidia would let Steam use the same stuff it uses to keep the overhead for ShadowPlay so low, to avoid the pretty heavy hit of encoding 1920×1080 in software. But it’s not to be (yet).

Correct. Anyone can implement those commands on supported Nvidia GPUs for lower latencies. Steam just needs to do it in its implementation, though that would easily favour Nvidia as the server.

For decoding the stream, most recent CPUs have hardware decoders (Intel has QuickSync, which includes both a hardware decoder and a hardware encoder, and AMD has just added a new hardware decoder block in its Kaveri CPUs). Any decent graphics card has also had those for ages. Even mobile processors do that very efficiently and would be enough to play those videos. The people reporting success with an Ouya are on the right track…

> I’ve seen a lowly quad-core Core i5 running next to a mighty six-core Core i7. And it was the Core i5 rendering 5,000-plus objects with buttery-smooth frame rates thanks to Mantle while the six-core beast spluttered chronically in DirectX.

I thought Mantle was for AMD CPUs? How are you getting performance gains on an i5?

No, Mantle is for AMD *GPUs* (graphics cards).
It will work whether that card is plugged into an Intel-CPU-based computer or an AMD-CPU-based one.
The biggest gains are recorded on AMD CPUs, but that is mostly because Mantle is lighter on the CPU and removes CPU bottlenecks. As AMD CPUs have been weaker than Intel’s lately, the effect is more visible there.

I stream from my gaming PC up the stairs in my home to an Ouya running Limelight using nVidia’s streaming protocol. It works perfectly and it’s cheap as chips. There’s even a Raspberry Pi port in the works

Using an x86 box just to decode a stream and upstream controls seems like a ridiculously inefficient way of doing it and much more expensive than using cheap ARM hardware. Especially because lots of ARM devices have hardware decoders which will help keep the latency down compared to CPU decoding

I think it’s worth mentioning that home streaming does not need SteamOS to run; it just needs to have the Steam client installed on both computers…

I have a pretty old PC in my living room which is incapable of playing virtually any of my Steam games (I even tried Mark of the Ninja, which I thought would have fairly minimal requirements, but it was a slideshow), yet it streams from my gaming PC with no problems…

I even tried adding a non-Steam game (Startopia) and it streamed that with no issue…

I look forward to VitaTV coming to the west and trying out some PS4 Streaming to TV.

My last-gen Core 2 Duo struggles at 1080p with Steam streaming, but 720p seems to work well. I still need to try a wireless connection to the laptop and see how that works. Wish it came with ac, but n will do (I have a new ac router).

You just need two computers that are able to run Steam in order to stream. Obviously, you’d want the faster computer to run the game and then stream it to the other (probably slower) computer. You don’t need SteamOS for anything; any OS able to run Steam will do. What I think is the most obvious use case is a SteamOS client connected to the TV, together with a Windows gaming rig located anywhere else in the house.

They’re running after a niche market. A very very niche market: The console gamer that knows how to play games on a PC.

Not being a “PCMASTERRACE” douche, but you’d have better luck keeping it at a desk. Why the hell do you even want to play on a couch in front of other people? Very few console and PC games make use of local multiplayer at all anymore.

You really want to game in the living room without lag from having to pump down sampled VIDEO through your home’s networking and wifi? You already have several options. 1) Get your PC and move it. 2) Get a long HDMI cable. They’re not expensive at all. 3) Get a lap desk. (They’re cheap as hell compared to getting another machine and use no electricity. Try lapdesk.com)

Also, fuck those awful trackpad controllers. You will never have fun playing FPS games without aim assist and if you’re swiping your fingers like mad across those concave bowls. A mouse is still much better than even regular joysticks.

Stop buying into this bullshit hype machine. Just because it’s a Valve product doesn’t make this the best thing ever and doesn’t make them infallible gods to be worshiped.

Yeah, a mouse is better for FPS and RTS. Just like a controller is better for 3rd person brawlers like Batman: Arkham Barkham. And a joystick is better for flightsims/spacesims.

Look, I think the thing is a bit overhyped (I don’t see a viable use case for myself, not owning a TV and stuff), but if you have a big ass TV, and a gaming PC in some other room (and *somebody* doesn’t want the big, noisy, blinkenlights gamemonster in the living room), I can see the advantage. Just relaxing on the couch, streaming the game you want to a low-profile Mediacenter+Steambox. Not everybody cares that much about the lag (which was only around 60ms in my limited testing over WiFi anyway, not too bad). Yeah, if you play competitive Counterstrike, this is not for you. If, however, you own lots of PC games that play well on a controller and have a good TV, then this might just work fine.

If only they’d let us stream games and do something else on the streaming computer at the same time….

“They’re running after a niche market. A very very niche market: The console gamer that knows how to play games on a PC.”

Or you know, people who play different kinds of games, and *gasp* sometimes with other people.

“Not being a “PCMASTERRACE” douche, but you’d have better luck keeping it at a desk. Why the hell do you even want to play on a couch in front of other people? Very few console and PC games make use of local multiplayer at all anymore.”

Speaking for myself, those “other people” happen to be my wife/children/friends, and believe it or not, sometimes they like to watch or participate. Usually it is platformers and racing games. Other genres I keep to the desktop.

“You really want to game in the living room without lag from having to pump downsampled VIDEO through your home’s networking and wifi? You already have several options. 1) Get your PC and move it. 2) Get a long HDMI cable. They’re not expensive at all. 3) Get a lap desk. (They’re cheap as hell compared to getting another machine and use no electricity. Try lapdesk.com)”

1) Moving the main PC is not really a practical solution, so that was just a stupid answer.
2) You obviously live in a rather small apartment and don’t have children. Less cables = less clutter = less accidents.
3) Sure, if you want to. But some games play better with a controller.

“Also, fuck those awful trackpad controllers. You will never have fun playing FPS games without aim assist and if you’re swiping your fingers like mad across those concave bowls. A mouse is still much better than even regular joysticks.”

I assume you mean the Steam controller here? Since it is still a work in progress, I think it is too early to judge, but see my earlier answers. Wanting the option to stream to your TV, does not mean you want to play ALL your games/genres on it. Why is that so hard to grasp?

“Stop buying into this bullshit hype machine. Just because it’s a Valve product doesn’t make this the best thing ever and doesn’t make them infallible gods to be worshiped.”

I fail to see how this has been overhyped, really. It is a handy option, not the second coming of Jesus. What I REALLY fail to understand, though, is the constant negativity from narrow-minded people who fail to grasp that different people have different gaming preferences.

Logged in to reply almost exactly the same thing. Why so self-centred? This is actually a great solution for playing a lot of games I don’t feel like hunching over a monitor for – and there are a whole bunch of them.

Exactly – sometimes I want to play games while sat in the living room. I’ve got a 360 that does that, but (a) it sounds like a jet engine taking off, and (b) it’s a bit crap compared to what my PC does (not just graphically, but in terms of which games I have for it). I don’t want to move my big PC from the home office it sits in, because it’s connected up to a pair of monitors for doing Lightroom stuff and so on. But I don’t want to go sit in there when what I’m actually after is absent-mindedly playing Civ V while the telly’s showing something that doesn’t deserve my full attention.

Both of you hit it on the head. For example, I like to play FPS games at my desk with a keyboard and mouse, but I like to play games like Batman: Arkham Origins or Metal Gear Rising on my TV with a controller if I can. I already have a media PC with low gaming capability; streaming without having to move my desktop from where I like to play at my desk is just a great new feature to have! Up till now I have been moving my desktop back and forth when I feel like it every few weeks.

“For me what’s really interesting is that m’colleague Dave James on PC Format tells me AMD is pitching Mantle as an open standard and would like or at least be happy to see it or something like it included in future iterations of DirectX.”

Did Nvidia ever pitch PhysX the same way? Seems to me that Mantle is just going to complicate things, and I can’t see Microsoft letting anyone dabble with DirectX. But what do I know? Have they ever incorporated someone else’s technology into DirectX?

Do Valve, Nvidia and AMD already contribute towards OpenGL? If we’re aiming to ditch Windows as our gaming OS – and I think we should, regardless of what OS you run your spreadsheets on – doesn’t OpenGL need quite a bit of love to bring it up to what DirectX already offers? Why entrench yourself ever more into Microsoft, who don’t seem all that interested (to the point of being actively hostile with the closing of GFWL) in PC gaming, which hurts Steam, Nvidia, AMD and arguably Intel as well?

“Have they ever incorporated someone else’s technology into DirectX?”
– Yeah, they did, but mostly smaller things. I heard somewhere that tessellation was implemented with cooperation from Nvidia – but I have no idea how much truth is in that.

“If we’re aiming to ditch Windows as our gaming OS”
– We’re not.

“I think we should”
– I think we should stop living in the endless dream that “Linux will become a superpower on desktops” – that never happened and never will.

“doesn’t OpenGL need quite a bit of love to bring it up to what DirectX already offers?”
– Perhaps it does, but each year the gap becomes more and more obvious, so I’d prefer progress over getting stuck for the next few years in order to sort out an inferior library.

“Why entrench yourself ever more into Microsoft, who don’t seem all that interested (to the point of being actively hostile with the closing of GFWL) in PC gaming, which hurts Steam, Nvidia, AMD and arguably Intel as well.”
– I don’t know, PC gaming is doing just fine as far as I can see. Besides, MS did promise to look after PC gaming more in the coming years, so I don’t see as huge a problem as you do.

You could be right. And thanks for the answers, by the way. I’d take a robust, openly developed gaming OS over Windows in a heartbeat – if there was one available, but as you point out, there is not.

“doesn’t OpenGL need quite a bit of love to bring it up to what DirectX already offers?”
– Perhaps it does, but each year the gap becomes more and more obvious, so I’d prefer progress over getting stuck for the next few years in order to sort out an inferior library.

No, it doesn’t. What is OpenGL lacking, again? Could you tell me? OpenGL is capable of everything DirectX is capable of (except for multi-threaded rendering, which is bollocks anyway) – without any platform limitations. What makes you think OpenGL is an “inferior library”? It clearly can’t be, because OpenGL is an API standard; the quality of a specific implementation may vary, but that is the driver developer’s fault – for example, GL on OS X is a mess because Apple strictly controls what the implementation is supposed to look like.

“Keep on dreaming then…in Peasant console land i can Stream any PS4 game to my Vita (and soon to VITA TV).
My Wii U which i bought for less then 200 euro Streams any game to the controller.”

Those are rather dumb comparisons, you know. Those streaming functions are nice but:

1) Proprietary hardware, with all that it means for older games.
2) The streaming examples you refer to stream from a games console to a MOBILE device, with lower resolution. I fail to see why that should be compared with streaming from a powerful computer to a less powerful client connected to a TV.

Dumb comparison? PS4 and Wii U streaming already works flawlessly – no hassle needed. Like stated before, Vita TV will allow PS4 games to be streamed to another TV (but again, it isn’t out yet). Stupid comparison? No, not at all! It works great in comparison to e.g. Splashtop or other streaming services on the PC.

Thank god I DON’T have an inferiority complex! Do you have one? Because you seem quite defensive about my sarcastic remarks…

Streaming from console to mobile device != streaming from a powerful computer to a lesser computer connected to a TV. Different resolutions, different input options, non-proprietary hardware, different use case.

The only real comparison would be streaming to the Vita TV, but as you said, it isn’t out yet and so cannot be compared. Comparing them to Splashtop, for example, is one thing, but this article wasn’t about streaming to mobile devices.

But incidentally, it would be interesting to compare different streaming-to-mobile programs/devices.