Back in the day, when PCs were first moving into households, they came in big, clunky desktop form factors: beige, and built like a brick. Later on, for some inexplicable reason, the world decided to move to tower configurations - more stuff could be crammed inside, yes, but I considered them impractical and always in the way. These days, people just buy laptops and are done with it. That has a few disadvantages, one of them being the lack of graphical grunt in many laptops, combined with the inability to upgrade the graphics hardware. AMD believes it has a solution.

This looks interesting, but adoption will undoubtedly be slow, as mentioned. The fact that you would have to buy a new laptop to use it is a killer. And it would have to be significantly cheaper than a high-end gaming laptop to be at all attractive, especially after you've already spent a boatload on the original laptop. The four-screen support is nice, but will it be worth the cost? I imagine it's going to cost more than the same GPU would for a desktop, so it's hard to see the benefit, other than the cool factor, or for the "I really don't want more than one computer and have extra to spend" crowd.

Indeed. I think the implementation (the new graphics port) is going to be a bit of an issue as far as adoption is concerned. But I guess existing solutions simply don't have the bandwidth required for 'high end' video cards.

I've read about similar solutions using the ExpressCard port as the connector. The system was basically a little hub that you plug your full-on desktop PCI Express card into.

The one I read about was the Asus XG, but it doesn't look like they have released it at this point. Their website says Q2 2007, so it looks like they are running a tad late.

Slow? OK, the brief article didn't give all the details as to just how conformant the connector is to the PCIe 2.0 spec, but the fact of the matter is that whether it's 8 lanes or 16, that (by itself) is hardly a reason it'd be slow. It isn't like they're using USB 2.0, which would simply be inadequate for the amount of data sent back and forth for 3D and other data-heavy work: this basically does for PCIe 2.0 what was done with SATA - make an external connector that doesn't perform any worse.
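For a rough sense of scale, here's a back-of-envelope sketch using the nominal spec rates (real-world throughput is lower; the PCIe 2.0 figure accounts for 8b/10b encoding):

```python
# Back-of-envelope comparison of nominal link rates (spec numbers, not
# measured throughput). PCIe 2.0 runs 5 GT/s per lane with 8b/10b
# encoding, which works out to 500 MB/s of payload bandwidth per lane.

def pcie2_bandwidth_mb_s(lanes):
    """Nominal PCIe 2.0 payload bandwidth in MB/s for a given lane count."""
    return 500 * lanes

USB2_MB_S = 480 // 8  # USB 2.0: 480 Mbit/s signalling -> 60 MB/s, before protocol overhead

for lanes in (1, 8, 16):
    print(f"PCIe 2.0 x{lanes}: {pcie2_bandwidth_mb_s(lanes)} MB/s")
print(f"USB 2.0: {USB2_MB_S} MB/s")
```

Even an x8 link has well over fifty times the raw bandwidth of USB 2.0, so the lane count alone isn't going to be what makes this slow.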

Having developed 3D CAD software on a desktop-replacement notebook (the power brick had to be replaced, and I had a brief discussion with customer support where they emphasized, "It's not a laptop! Don't use it on your lap!"), I can say that besides the cost issue, a great reason to have this available as some sort of docking solution is that the GPU was probably as bad a battery eater (and heat-provided birth control) as the main CPU (a 3.4 GHz P4 at the time). At about 10 pounds, it wasn't even possible to watch a DVD without the darn thing overheating and shutting off.

Of course, there's still that pesky issue that people will have to start buying laptops with the external PCIe 2.0 connector, but I don't see that being too big of a deal: all the electrical stuff is likely already present in chipsets, or at least it can be readily adapted and placed on whichever bridge is required, because it's a known-good, working standard that's already deployed. The only thing that'll suck is to be one of the people who want it but can't get it without buying a new laptop.

True, this external video box is made for the desk. But wouldn't this technology be more feasible integrated with a docking station? This would solve the issue of heat, power, and space at the business desk.

And it would make the docking station's graphics card upgradeable. Instead of a box you'd need to replace every time you want to upgrade, you could just swap the video card in the docking station.

I'm sure that PC vendors have offered this sort of thing in their docks, but it kinda reminds me of a limited version of the old Macintosh Duo Dock. You would have an ultra-light laptop (for the day), but you could have a full desktop PC as soon as you plugged it into a docking station at home. That included two or three standard (for Macs of the day) expansion slots which you could use for video cards, among other things.

And that's why this strikes me as a light version of the Duo Dock. It sounds like something that allows you to plug in one high performance peripheral card. Depending upon the flexibility of the standard, it may only be a video card. That's too restrictive IMHO.

Perhaps a standard high-performance expansion bus would be a better idea. Something that would allow you to have a box like the ATI one described above, or a larger box with hard drives and a few PCI/PCIe expansion slots -- so that you can design your own docking station.

I have always liked the idea of splitting the bigger desktop PC into two parts connected by a high-speed bridge, which today would be PCIe. In effect, you'd buy a processor/memory/PSU module, where all the heat and noise is managed, and upgrade it on the latest processor schedule.

But the majority of interfaces would hang off the second box, which is likely to be almost silent, cool, and right in front of you. This comes as close as possible to that, but only for the video signal. If it were combined with a docking station and boatloads of ports (USB, FireWire, eSATA, sound, etc.), this would be it. On the downside, such powerful graphics are bound to bring some heat and noise with them, and are competing with the CPU for the top-dog slot. Gee, maybe you could run the OS on the GPU as well.

AMD would really have to persuade the entire industry to make external PCIe available on desktops as well as laptops to make this fly.

You have been able to buy docking stations for laptops for years. One of the features of a dock is that you can add in cards for whatever you need, and current docks have PCIe slots. I don't see what makes this drastically different. You can even buy laptops with high-end graphics built in, capable enough for the most discerning gamer or graphics enthusiast. So, why reinvent the wheel?

That's the XG Station I mentioned in my post. Asus have been kicking it around trade shows for a couple of years now, and - as you found - they 'announced' it a week or so back, but they still haven't actually set a release date.

There are already several companies who have announced external graphics adapters which connect via ExpressCard, and so will work with any recent laptop. The most prominent is the Asus XG Station. There's actually already one company which *sells* empty ExpressCard -> PCI Express adapters into which you can stick whatever graphics card you like. (The only problem is the low, low price of...$499 and up.) The reason ATI decided to go with a proprietary interface is that ExpressCard is only as fast as PCI Express x1, so it'll be something of a bottleneck for faster cards; they'll still be faster than most laptops' onboard chips, but not as fast as they could theoretically go.
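To put a rough number on that bottleneck (nominal PCIe 1.x spec rates; ExpressCard carries a single PCIe 1.x lane):

```python
# ExpressCard exposes one PCIe 1.x lane; a desktop graphics slot is x16.
# PCIe 1.x runs 2.5 GT/s with 8b/10b encoding -> ~250 MB/s payload per lane.

PCIE1_PER_LANE_MB_S = 250

expresscard_x1 = PCIE1_PER_LANE_MB_S * 1   # what the external adapters get
desktop_x16 = PCIE1_PER_LANE_MB_S * 16     # what a desktop card is designed for

print(f"ExpressCard (x1): {expresscard_x1} MB/s")
print(f"Desktop slot (x16): {desktop_x16} MB/s")
print(f"The external card sees 1/{desktop_x16 // expresscard_x1} of its designed bandwidth")
```

A sixteenth of the designed bus bandwidth is why a fast card in one of these boxes can't reach its full potential, even though it still beats most integrated chips.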

I just wish *someone* would get *something* on the market already; I don't care about or need whizzy 3D performance, I just want a decent 2D display on my laptop's second monitor. My only current option is a generic (not DisplayLink) USB VGA adapter, using the sisusb driver, which is slow as hell. DisplayLink apparently works rather well, but has no Linux drivers yet.

If you just want to hook up more monitors to your laptop you can use a DualHead2Go from Matrox. It uses your system's own graphics card, but performance is consistent throughout (even 3D). There's also a triple version, to hook up three external monitors.

Phil08: oh, yeah, forgot to mention that one. Two problems - it doesn't do high enough resolutions (it's listed as only going up to 1280x1024 per monitor, IIRC - my monitors are 1680x1050), and I'm not entirely sure there's any Linux support. I'm fairly sure what it does is just tell your display driver to output at 2560x1024 - assuming it can manage that - and then do a bit of internal trickery to split that into two separate signals. Not a bad technique, but no good for 1680x1050 displays.
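If that guess about how it works is right, the resolution limit follows directly from a cap on the combined width. A quick sketch (the 2560-pixel cap here is my assumption, inferred from the listed 1280x1024 limit, not a published spec):

```python
# Sketch of a DualHead2Go-style splitter: the GPU outputs one extra-wide
# mode, and the box slices the signal in half for the two monitors.
ASSUMED_MAX_COMBINED_WIDTH = 2560  # assumption inferred from the 1280x1024-per-monitor limit

def combined_mode(per_monitor_width, height, monitors=2):
    """Return the single wide mode the GPU must output, or None if over the cap."""
    combined = per_monitor_width * monitors
    if combined > ASSUMED_MAX_COMBINED_WIDTH:
        return None
    return (combined, height)

print(combined_mode(1280, 1024))  # two 1280x1024 panels -> GPU drives 2560x1024
print(combined_mode(1680, 1050))  # two 1680x1050 panels -> 3360 wide, over the cap: None
```

Which is exactly why it works fine for 1280x1024 panels and falls over on a pair of 1680x1050 ones.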

Not with my graphics adapter, it doesn't (max res 2048x768, according to the hardware compatibility checker). And it still leaves the issue of Linux compatibility; I suspect even if it works at all, it wouldn't work *well*, as the OS doesn't know it's not one really big (and strangely wide) monitor: maximise an app and it spreads across both monitors, new windows pop up split across the two, etc., etc.

Talk about a long shot. If this actually comes out, and people buy it, on new laptops, with a new port that has yet to be adopted, and the whole shebang ends up profitable enough to survive... I think I'll eat my shorts. Interesting tech, though - then again, people were whining about the MacBook Air needing an external CD drive.