Posted by Soulskill on Saturday August 15, 2009 @12:48PM from the more-and-better dept.

An anonymous reader writes "AMD invited 100 people up to their private suite in the hotel that Quakecon 2009 is being hosted at for a first look at gaming on one of their upcoming DirectX 11 graphics cards. This card has not been officially named yet, but it has the internal code name of 'Evergreen,' and was first shown to the media back at Computex over in Taiwan earlier this year. The guys from Legit Reviews were shown two different systems running DX11 hardware. One system was set up running a bunch of DX11 SDKs and the other was running a demo for the upcoming shooter Wolfenstein. The video card appears to be on schedule for its launch next month."

Bigger problem: it probably runs worse than DirectX 9, with its only "advantages" being one or two minor shader effects (geometry shaders...) and a lot of games that arbitrarily lock things to DX11 mode when they could run just fine in DX9 mode.

You are dead wrong. Direct3D 11 and Shader Model 5.0 are quite the step up from Direct3D 9 and SM 3.0. If you were a graphics developer you would know this. From Wikipedia:

Tessellation to increase at runtime the number of visible polygons from a low detail polygonal model.

Multithreaded rendering to render to the same Direct3D device object from different threads for multi core CPUs.

Compute shaders which exposes the shader pipeline for non-graphical tasks such as stream processing and physics acceleration, similar in spirit to what NVIDIA CUDA achieves, and HLSL Shader Model 5 among others.
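
To put the multithreaded rendering bullet in concrete terms, here is a minimal sketch (mine, not Wikipedia's or the article's) of the Direct3D 11 deferred-context model: worker threads record command lists, and only the immediate context submits them to the GPU. Device creation, state setup, and the actual draw calls are omitted.

    #include <d3d11.h>

    // Worker thread: record draw calls through a deferred context.
    ID3D11CommandList* RecordScenePart(ID3D11Device* device)
    {
        ID3D11DeviceContext* deferred = NULL;
        device->CreateDeferredContext(0, &deferred);

        // ... bind state and issue Draw*() calls on 'deferred' here ...

        ID3D11CommandList* commandList = NULL;
        deferred->FinishCommandList(FALSE, &commandList);
        deferred->Release();
        return commandList;
    }

    // Render thread: only the immediate context actually talks to the GPU.
    void Submit(ID3D11DeviceContext* immediate, ID3D11CommandList* commandList)
    {
        immediate->ExecuteCommandList(commandList, FALSE);
        commandList->Release();
    }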

It also has a lot of awesome smaller features that make what are known as deferred shading/lighting pipelines more feasible. This is a good thing because it reduces the amount of work needed to implement a game's material system while offering great performance, at the cost of more GPU memory being used.
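
For what it's worth, the "more GPU memory" trade-off is easy to see: a deferred renderer writes several full-screen render targets (a G-buffer) in the geometry pass and reads them back in the lighting pass. The layout below is just one plausible example I made up for illustration, not anything ATI or Microsoft has specified:

    #include <d3d11.h>

    // One possible G-buffer layout: three full-screen render targets.
    static const DXGI_FORMAT kGBufferFormats[] = {
        DXGI_FORMAT_R8G8B8A8_UNORM,      // albedo + specular intensity
        DXGI_FORMAT_R16G16B16A16_FLOAT,  // world-space normal + roughness
        DXGI_FORMAT_R32_FLOAT,           // linear depth
    };

    HRESULT CreateGBuffer(ID3D11Device* device, UINT width, UINT height,
                          ID3D11Texture2D* targets[3])
    {
        for (int i = 0; i < 3; ++i)
        {
            D3D11_TEXTURE2D_DESC desc = {};
            desc.Width = width;
            desc.Height = height;
            desc.MipLevels = 1;
            desc.ArraySize = 1;
            desc.Format = kGBufferFormats[i];
            desc.SampleDesc.Count = 1;
            desc.Usage = D3D11_USAGE_DEFAULT;
            // Written as a render target, then sampled in the lighting pass.
            desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

            HRESULT hr = device->CreateTexture2D(&desc, NULL, &targets[i]);
            if (FAILED(hr))
                return hr;
        }
        return S_OK;
    }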

Most game developers are skipping Direct3D 10 because it's explicitly tied to Vista, which has poor market share compared to Windows XP/Direct3D 9.0c. The hope is that most current gamers on Windows XP will eventually move to Windows 7, and that Direct3D 11 enjoys the same long life span as Direct3D 9.0, ending up in the next console from Microsoft the way Direct3D 9 did.

From my viewpoint, the changes made in recent releases of DirectX are no small matter; labeling them "minor shader effects" shows your ignorance of the subject. The DirectX SDK [microsoft.com] contains wonderful documentation if you are so inclined to learn.

I'm not arguing for or against Microsoft platforms, but the fact remains DirectX is currently the de facto standard for creating games. And even though it's a COM-based technology, it's still kinda fun to play with.

I'd still use Win 2k if MS pumped out more security updates for it. To me that OS was the best MS ever produced. However, if I want to use an MS OS then it is XP now. How long until MS starts pushing for 7 and XP users are forced to upgrade for security reasons, just like Win2k users were?

What?!? The standard always comes before the hardware; DX11 is an API and a (de facto) standard. We could go back to the OpenGL model with ARB extensions for new features that are implemented differently by each party until the standard catches up, but that was tough on everyone. It was tough on the hardware guys because they inevitably implemented features that didn't make it into the standard, and it was hard on the standards body because they had to arbitrate between the different implementations to pick a winner.

I agree with what you say. I'd just like to add that Khronos has finally gotten the hint and is moving quickly to catch up to Direct3D. They might always be behind, but at least they won't be that far behind. The recently released OpenGL 3.2 implements all of the Direct3D 10 features, at least as far as I know. We can probably expect another OpenGL revision in Q1 2010 with some Direct3D 11 features, but probably not all of them. I really don't know what Khronos is going to do when it comes to trying to implement the rest.

He already explained it. It was a headache for everyone involved doing it that way. It's why the OpenGL ARB was disbanded a few years ago and replaced with the Khronos Group, which now more or less follows Direct3D features. Pretty soon, though, we'll see a return to fully programmable graphics hardware after Intel's Larrabee, and we'll probably see some software developers moving away from OpenGL and Direct3D and implementing their own low-level rendering stacks. It's hard to say whether Microsoft or the Khronos Group will still matter at that point.

Traditionally, it was very much the case that the spec preceded hardware support. OpenGL was around for years before it was ever (basically) fully implemented on consumer hardware. There are still some corner cases where some cards will have to fall back to partial software support, and some cards (like my cheap laptop chip) are specifically designed to run some things in software (my chip lacks vertex shaders).

More recently, it's become a bit more complicated, because the spec designers and the hardware vendors now work hand in hand.

Problem being, every hardware company does its own 'set of features' to differentiate itself from its competitors, which means the software guys pull their hair out trying to support the fucking mess. DirectX may not be the best solution possible, but it is a helluva lot easier to support.

Perhaps he has misstated the case a bit, but the fact is that video cards these days have functions that correspond to DirectX functions, the way they used to have functions corresponding to OpenGL functions. Some of this is of course implemented in software; the idea, however, is to always implement as much as possible on the card itself, leaving the CPU free to do the other stuff. That's why we're seeing physics functions creeping into GPUs... they can sell us more transistors if they do more of the work.

>>Since when did we build hardware around APIs, rather than the other way around?

Always.

There's always a dialogue between software and hardware people on what needs to be implemented, and whether it should be done in hardware or software. The RISC/CISC days were full of stories like that in the CPU design world.

Actually, it hasn't been this way since around 2003/2004. Essentially nVidia, ATI/AMD, Intel and a few other lesser known vendors sit down in league with Microsoft and decide what kind of features they will be able to implement in the next graphics hardware cycle. They then come up with the API and get feedback from the hardware vendors and work towards a final workable API. This is what we saw with Direct3D 9.0c, Direct3D 10, and Direct3D 11.
OpenGL and the ARB have lagged way behind Microsoft and its partners, which is why the ARB was eventually disbanded and replaced by the Khronos Group. The Khronos Group kind of messed up OpenGL 3.0; they didn't implement half of the things they said they were going to do. As such, OpenGL 3.0 lagged quite a ways behind Direct3D 10. Fortunately, they've caught up, and OpenGL 3.2 is on par with Direct3D 10, but still a big step behind the new stuff in Direct3D 11.
As such, Microsoft and its partners are leading the pack here, and Khronos (most of Microsoft's Direct3D partners are also Khronos Group members) is now playing the role of follower. You can be guaranteed that the next major revision of OpenGL will match Direct3D 11 almost exactly in features, as nVidia, ATI/AMD, et al. don't want to deviate radically in their underlying hardware.

Maybe you should also mention in your rant that it doesn't matter whether OpenGL 3.x implements a feature, because every hardware vendor can just expose that feature through an extension. This means that new features usually get into the standard after they have already been deployed in new hardware.
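
That's how it works in practice: a renderer probes the driver at run time and takes the extension path if it's there. A trivial sketch of my own (using GLEW for the string handling; the extension name is just the geometry-shader example from this thread, and it assumes a GL context is current and glewInit() has already been called):

    #include <GL/glew.h>
    #include <cstdio>

    // Check whether the driver advertises the vendor extension.
    bool HasExtGeometryShaders()
    {
        return glewIsSupported("GL_EXT_geometry_shader4") == GL_TRUE;
    }

    void ChooseGeometryPath()
    {
        if (HasExtGeometryShaders())
            std::printf("Using EXT geometry shaders ahead of the core spec.\n");
        else
            std::printf("Falling back to a path that doesn't need them.\n");
    }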

This is not possible in Direct3D, so in that case the new versions have to be developed before the hardware for them gets deployed. That's why it always appears that OpenGL is lagging behind, when in reality the features are often already available through vendor extensions.

Just because Windows XP can't run Direct3D 10/11 doesn't mean Direct3D didn't support geometry shaders before OpenGL did. Direct3D 10 had geometry shader support back in 2006, and it's what spurred the development of actual hardware supporting that feature set. It's true that nVidia had their GL_EXT_geometry_shader4 extension working back in 2007, but ATI/AMD NEVER supported it. It wasn't until OpenGL 3.2 was announced in August of this year that we actually got standardized support for geometry shaders.

What you are suggesting is that ATI and NVIDIA compete on features in such a way that their hardware isn't interchangeable. Further, that software makers themselves would need to pick one or the other. That consumers would then need to be mindful of which hardware the software targets, and so on and on and on...

The fact is that when things are done as you suggest, it sucks big time for the companies making the hardware. When you have multiple large competitors in the market, neither can count on developers targeting its exclusive features.

That's called "competition". If FOO is worth creating, implementing, and using because it gives a significant advantage over the status quo, then it will be an advantage for the company that implemented it, and a loss for the others. Once programmers begin using FOO in games, you can market card A as having something that isn't available on card B.

The reality is that programmers *DO NOT* implement it in games if it's not available on card B.

That's the error in your logic. A game company cannot afford to give the finger to half of the market. This leads right back to the hardware company being fucked for spending money on arbitrary innovation.

What you are proposing, and what is happening, is that graphics become a bland sameness across all cards where everything looks horribly generic and nothing exciting or revolutionary can occur.

Maybe you don't know this, but graphics cards render what the programmers tell them to render. They look exactly like what the programmers expect. If the programmer wanted it to look different, it would look different.

How about they fix their Win7 drivers for not-so-old but still great performing cards like the X1800? Nvidia customers are having a great time with Win7 at the moment, and even Intel integrated graphics are performing better, but I've got several friends with less than two year old ATI cards that perform great yet have no real driver support, just trashy, even BSOD-inducing, drivers from ATI for Win7.

I haven't received any updates for my ATI X800 recently. Sure, they release a new Catalyst driver pack every now and then, but it doesn't actually improve performance or fix the bug I keep hitting when trying to run Doom 3 full screen in Windows Vista.
It's also why their R400 series hardware documentation wasn't released to the community like it was for the R200/R300 and R500/R600: probably because they didn't have it well documented and decided it wasn't worth paying to release specs for it after it's already end-of-life.

Contractual obligations to partners (e.g. a Microsoft game doesn't run) or to customers (e.g. anyone who got a bunch of those GPUs with the die bonding problem, sent them out to customers, and hasn't stopped hearing about it since; say, HP).

This Sarbanes-Oxley is a pain. My company told me that once an employee is fired or resigns, under Sarbanes-Oxley there's no way to pay them termination pay because that would be an illegal expense, as the company is no longer receiving any benefit from that employee. Since they are worried someone who doesn't understand SOX might sue them, they have told the security guards to just beat people who are leaving to death with a monitor and bury them under the flowerbeds out back.

Funny you should say that; I had my ATI HD 2600 cards BSOD me last night for the first time on Win7. It was the ati2dvag bug, which is also prevalent on XP and Linux. First time I've had a BSOD with the ATI cards; usually the ATI VPU Recovery driver kicks in and saves me. There is no VPU recovery on Linux though.

If you do not have the hardware to support Windows 7 then don't upgrade. If your graphics card is the only thing holding you back then take a stroll over to Newegg [newegg.com] and start upgrading.

Complaining about hardware that was designed for Windows XP not working in Vista/Win7 is really akin to complaining about hardware that worked fine in Win95/98/ME not working, or not working well, in XP. Eventually you have to upgrade hardware to run modern software. If you think ATI is choosing to end support for a legacy product out of spite, it's more likely they just decided it isn't worth the cost.

I'll buy an ATI card when they make usable Linux drivers with accelerated video like VDPAU. Right now the nvidia blob is so much better, and I don't really care that it is closed source. I have a couple of friends that use ATI, and the only reason they still have Windows on the computer is the crappy ATI Linux support.

Well I read TFA and besides the new capabilities of DirectX 11 (which look nice, but not exactly earth-shattering to me and also will need some time to get implemented into games anyway), what I found interesting was what ATI actually did with the display output connectors.

The demo system they set up had one of those new DirectX 11 cards, and that card is a dual-slot solution like all the high-end graphics cards are now. But ATI did use the space from those two slots quite nicely by including dual DVI ports AND an HDMI AND a DisplayPort connector, meaning you have all the different types of digital display connectors available on a single card, which would be a first, I think.

No word yet whether you can use all four ports simultaneously, but if you could, it looks like a nice new way of hooking up multiple displays:)

But ATI did use the space from those two slots quite nicely by including dual DVI ports AND an HDMI AND a DisplayPort connector, meaning you have all the different types of digital display connectors available on a single card, which would be a first, I think.

I would like to see multiple HDMI outputs. The one-cable, one-connector solution for audio and video.

See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio. It's not inherently higher-quality; does it have a sufficiently higher bandwidth capacity than DVI + TOSLINK that it makes an impact in real-world environments (24fps 1080p video/5.1 surround sound)? And how is having your video card double as a sound card a good idea? Isn't that just asking for aural interference from the video components?

See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio.

It's become far more than that:

HDMI 1.4 was released on May 28, 2009. HDMI 1.4 increases the maximum resolution to 4K × 2K (3840×2160p at 24Hz/25Hz/30Hz and 4096×2160p at 24Hz, a resolution used in digital theaters); adds an HDMI Ethernet Channel, which allows for a 100 Mb/s Ethernet connection between the two HDMI-connected devices; and introduces an Audio Return Channel.

See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio. It's not inherently higher-quality; does it have a sufficiently higher bandwidth capacity than DVI + TOSLINK that it makes an impact in real-world environments (24fps 1080p video/5.1 surround sound)? And how is having your video card double as a sound card a good idea? Isn't that just asking for aural interference from the video components?

First point: HDMI is all-digital, so you don't get "aural interference from the video components". It's actually a pretty cool feature of the current batch of HD 4xx0 cards that you can run the output of an HTPC on one cable.

Second point: HDMI, in the later revisions of the spec (1.3+ or so), actually does have improved features over DVI, like deeper color support and higher bandwidth to support higher resolution displays. (It also supports 7.1 sound, not merely 5.1. Not that you actually need any of this for plain 1080p with 5.1 sound.)
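
To put rough numbers on the bandwidth question (nominal spec figures quoted from memory, so treat them as approximate):

    single-link DVI:  165 MHz pixel clock x 24 bit/pixel   ~= 3.96 Gbit/s
    1080p at 60 Hz:   148.5 MHz pixel clock x 24 bit/pixel ~= 3.56 Gbit/s
    HDMI 1.3:         340 MHz TMDS clock x 24 bit/pixel    ~= 8.16 Gbit/s

So plain 1080p already fits (barely) in single-link DVI; the extra headroom in HDMI 1.3 is what makes deep color and higher-than-1080p resolutions possible over one cable.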

Since most of you other fucks just make some sort of quip with no facts (yeah yeah, I know, it's Slashdot), here is the Wikipedia entry for DX11.

"Microsoft unveiled Direct3D 11 at the Gamefest 08 event in Seattle, with the major scheduled features including GPGPU support, tessellation[11][12] support, and improved multi-threading support to assist video game developers in developing games that better utilize multi-core processors.[13] Direct3D 11 will run on Windows Vista, Windows 7, and all future Windows operating systems. Parts of the new API such as multi-threaded resource handling can be supported on Direct3D 9/10/10.1-class hardware. Hardware tessellation and Shader Model 5.0 will require Direct3D 11 supporting hardware.[14] Microsoft has since released the Direct3D 11 Technical Preview.[15] Direct3D 11 is a strict superset of Direct3D 10.1 - all hardware and API features of version 10.1 are retained, and new features are added only when necessary for exposing new functionality. Microsoft have stated that Direct3D 11 is scheduled to be released to manufacturing in July 2009,[16] with the retail release coming in October '09"

Seems pretty big to me. The thing I see being the biggest is the work on improving multithreading/multicore support, and the whole GPGPU thing. Not to mention that the API will be very compatible with older cards (read: no real need to upgrade cards just yet).
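
On the GPGPU point, the Direct3D 11 side of it mostly boils down to binding a compute shader and dispatching thread groups. A minimal sketch of my own (resource creation and the HLSL source are omitted; it assumes the shader was compiled with [numthreads(64,1,1)]):

    #include <d3d11.h>

    // Dispatch a compute shader over 'elementCount' items, one thread each.
    void RunComputePass(ID3D11DeviceContext* context,
                        ID3D11ComputeShader* shader,
                        ID3D11UnorderedAccessView* outputUAV,
                        UINT elementCount)
    {
        context->CSSetShader(shader, NULL, 0);
        context->CSSetUnorderedAccessViews(0, 1, &outputUAV, NULL);

        // One thread group per 64 elements, matching [numthreads(64,1,1)].
        context->Dispatch((elementCount + 63) / 64, 1, 1);

        // Unbind the UAV so the result can be read elsewhere as a shader resource.
        ID3D11UnorderedAccessView* nullUAV = NULL;
        context->CSSetUnorderedAccessViews(0, 1, &nullUAV, NULL);
    }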

Yeah, there's a lot of idiots here who still think OpenGL is better than Direct3D. I doubt they'll ever change their opinions despite the fact that some of us are trying to force the facts down their throats.
I'm by no means a Microsoft fanboy, I also use OS X and a couple of Linux distros at home and work, but you just can't argue with the fact that Direct3D 11 is better than anything else out there. Hands down. It's just a better API all around. I'm looking forward to moving towards implementing a Direct3D 11 renderer.

One of OpenGL's advantages was that the code would work on a number of platforms. It was originally on IRIX; IBM licensed it so it worked on AIX machines. Then it moved to other platforms, surpassing 3dfx's Glide interface. OpenGL is still being worked on; 3.2 was released not so long ago.

DirectX 11 offers GPGPU support, but it also offers multithreading (some games chew up CPU cores like they're going out of style, so having threads split up among multiple cores will help performance).

They do support OpenGL, in fact ATI's Direct3D 11 cards will support the latest version, OpenGL 3.2. However, it should be noted that the OpenGL 3.2 feature set is the same as Direct3D 10, which doesn't really bring anything new to the table. Direct3D 11 is where all of the new features are.

Let's see.... It has hardware tessellation, which ATI cards have had... forever. Oh wait, Microsoft has made it specifically so that ATI's proven implementation is incompatible. What a surprise! Now what's this... They're implementing nVidia's current shader model? It must be incompatible. Wait, it isn't?
Microsoft spat in NVidia's eyes when they went with ATI for the Xbox 360, and now they're spitting in ATI's eyes by introducing an incompatible standard. This is just great.

DX10 was a flop because Vista was a flop. If MS had let XP users grab DX10, DX10 would have caught on in games, but it was Vista-only, and no game makers were about to (or are about to) invest a ton of money in a game that's Vista-only, or in the work to make a DX9 game actually take full advantage of DX10 features. DX10 was just irrelevant because a great majority of the market wasn't able to use it.

Might be a part of it, but I think the real issue here is that the kind of high-end games that used to push the envelope hardware-wise now more often than not end up on the consoles instead. Since the PC gaming platform is now something like three hardware generations ahead of the consoles, console games act as a cushion on PC gaming... I was going to say progress, but let's be specific and say graphics progress. We'll get the occasional (late) port with DX10.1 or, in the future, DX11 support added, developer time permitting.

DX10 was more about altering the hardware to remove variation. DX10 had little to do with how the user sees the game visually and more to do with how the cards are kept up to a standard and how the games are programmed. Honestly, I feel that DX11 will be somewhat that way as well, with the GPGPU support, threads, and so on.

Neat! The idea of drawing a 2D picture and then having an engine that auto-adds wireframe and all that fun stuff seems to remove a lot of work for the developer.

I honestly thought DX11 would be more of a DX10, where most of the alterations would not be noticed by the gamer (like threading), so I'm glad they are adding something visual to help people want to push to use DX11.

I'm an OS X user, so don't get me wrong. I'm not exactly a fan of DirectX per se, but any type of innovation towards pushing the market forward is welcome.