Posted
by
Zonk on Thursday October 04, 2007 @05:07PM
from the far-if-you-look-at-crysis dept.

MojoKid writes "When DirectX 10 was first introduced to the market by graphics
manufacturers and subsequently supported by Windows Vista, it was generally
understood that adoption by game developers was going to be more of a slow
migration than a quick flip of a switch. That said, nearly a year later, the
question is how far have we come? An article at the
HotHardware site showcases many of the most popular DX10-capable game
engines, like
Bioshock,
World In Conflict, Call of Juarez, Lost Planet, and
Company of Heroes, and features current image quality comparisons versus DX9
modes with each. The article
also details
performance levels across many of the more popular graphics
cards, from both the mid-range and high-end." PC Perspective has a similar look at DX10 performance.

The real joke is that neither DX9 nor DX10 is inherently "better" any more than the original Glide API was inherently "better" than DirectX or OpenGL. Hardware has changed constantly to give "better" responses to this call or that call, but inevitably you have to write a driver that converts the OpenGL or DX9/DX10 or whatever into something your card understands.

In the really old days, you had people actually coding for the card on hand. This is why there's a gazillion different releases of Mechwarrior 2, each of which varies greatly in image quality and features - each had to be hand tuned to the card.

If Bioshock had been intended for DX9, it would probably look the same as that DX10 shot on DX9. They'd have figured out what they needed to do, perhaps coded a few "If ATI, do this; if NVidia, do this; if Intel Extreme, fail with 'your video card is too crappy to play this game'" decisions for specific hardware, and that would have been that. Since it was backported (and MS would have thrown a fit if there were "no difference"), they had to do a sloppier job of it.

Then again, if not for the emphasis on ridiculous graphics, think about how many games would be able to use their processing power for some seriously wicked AI. Even Bioshock only has half-decent AI that can be twigged to and predicted fairly easily - you know that a wrench guy is going to rush you, you know that the spider slicers will hang from the ceiling and lob stuff all day till you burn or freeze them, you know where the houdinis are going to land long before the animation starts merely because you can figure out what the AI tree says for them to do in what radius... it's sad.

Hell, you can predict the precise spot on the health bar where they'll run for the health station, and if you're smart you trapped that thing half an hour ago. Now you get to watch as four of them all kill themselves on the same damn one, never paying attention to the 3 dead bodies on the floor that obviously just gassed themselves on a booby-trapped station.

But never mind. I know the reason they want graphics over AI - the same fucking morons that could never defeat a decently programmed AI (hell, they have trouble getting through Halo on NORMAL), drool over thinking that they can see the shoelaces on Madden's boot.

Hold on for a second here. The graphics are usually very GPU intensive, but the CPU is generally not overworked by them at all. If they wanted to write good AI, they could do so without sacrificing graphics quality at all.

From what I can tell, the primary limiting factor in game AI isn't even hardware related; it's designer manhours. A good AI is one which makes use of a lot of different behaviors and has good rules for applying them. It's not really difficult for the computer to handle that, but it does take a lot of time for the designers to plan and implement all of the behaviors and rules. And if they really want to trick players into thinking the game is intelligent, they can incorporate scripted behavior in certain situations.
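The "behaviors plus rules for applying them" structure described above can be sketched in a few lines; the machinery is trivial, and the real cost is authoring enough rules to cover every situation. A toy sketch (all names and thresholds here are invented for illustration):

```python
# Toy sketch of rule-based behavior selection: each rule pairs a
# condition with a behavior, and the first matching rule wins.
# Real games use far larger rule sets or behavior trees, but the
# selection logic itself is this simple.

def select_behavior(player_distance, npc_health):
    rules = [
        (npc_health < 0.25,      "flee_to_health_station"),
        (player_distance < 2.0,  "melee_attack"),
        (player_distance < 10.0, "ranged_attack"),
        (True,                   "patrol"),  # fallback behavior
    ]
    for condition, behavior in rules:
        if condition:
            return behavior

print(select_behavior(player_distance=1.5, npc_health=0.8))  # melee_attack
```

Note how the "run for the health station at a precise spot on the health bar" predictability complained about above falls straight out of a fixed threshold like the 0.25 here.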

As someone who writes AI for text-based games, let me clear up some misconceptions.

First, the goal of "AI" isn't always to be as smart as possible. Often, the goal is to make something believable and/or of the appropriate difficulty level. It's possible that Bioshock missed the mark there, but I haven't played Bioshock yet, so I don't know.

I can write "AI" that will kick your ass every time, even without cheating. (Mobs have the advantage of being on home turf, and they outnumber you.) But that's not fun for the player, so I don't do it. Instead, I'll write something with a pattern you have to figure out. Once you learn one of the ways to beat it, the mob will be easy for you, and it's time to move on to the next area. Very few mobs get the full "try to survive at all cost" treatment, and even fewer are programmed to actually learn from your behavior.
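The "pattern you have to figure out" approach above can be sketched as a mob that cycles a fixed move sequence; once the player learns the cycle, every move is predictable, which is exactly the intended difficulty curve. A minimal sketch (names invented for illustration):

```python
import itertools

# A "figure out the pattern" mob: it cycles a fixed attack sequence,
# so a player who learns the cycle can predict every move. That
# predictability is a deliberate design choice, not a failure.

class PatternMob:
    def __init__(self, pattern):
        self._moves = itertools.cycle(pattern)

    def next_move(self):
        return next(self._moves)

mob = PatternMob(["lunge", "lunge", "retreat", "throw"])
print([mob.next_move() for _ in range(6)])
# ['lunge', 'lunge', 'retreat', 'throw', 'lunge', 'lunge']
```

A "learn from your behavior" mob would instead update its pattern based on what the player does, which very few games bother to implement, as the comment notes.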

You're describing the classic "I wish this mob would keep getting harder" remorse, but think about it: would it really make sense for those mobs to learn from your new tactics? Are they supposed to be smart, or are they just supposed to be an obstacle?

As for your dead bodies example: would you really prefer to have an infinite standoff as the mobs decide it's not worth getting killed, so they go hide somewhere with their own traps and wait for you to attack? Right... so get over it. If games were realistic, you would really die on level 1.

> I can write "AI" that will kick your ass every time, even without cheating.
> (Mobs have the advantage of being on home turf, and they outnumber you.)

You are assuming that the mob would just sit there and wait for the player, like it usually does in pretty much every game. In reality, a "level" would not necessarily know that Gordon Freeman is on his way. Neither would they have the patience to sit in their assigned ambush places, waiting for him all day long. A better AI would actually "live" in the environment where it is placed, so that it would react to the player instead of waiting for him. It would also be fun to watch. In Half-Life I really enjoyed watching those occasional scenes where monsters are wandering around doing things, like when the bullsquids feed on the headcrabs. I wish there were more things like that, things worth watching.

> would it really make sense for those mobs to learn from your new tactics?
> Are they supposed to be smart, or are they just supposed to be an obstacle?

If the AI was smart, you wouldn't need a mob. You would only need a few individuals. It would be like a multiplayer deathmatch, and, judging from the popularity of those, would likely be more fun than the current mob situation.

> As for your dead bodies example: would you really prefer to have an infinite standoff
> as the mobs decide it's not worth getting killed, so they go hide somewhere with their
> own traps and wait for you to attack?

An infinite standoff will only happen if the game designer makes you kill off the entire mob before setting off some stupid trigger to open some stupid door. Don't program artificial obstacles and the player will be able to ignore the hiding mob and go on, just like in real life.

> In Half-Life I really enjoyed watching those occasional scenes where monsters are wandering around doing things, like when the bullsquids feed on the headcrabs. I wish there were more things like that, things worth watching.

For all the guff that Tabula Rasa is getting, this is one of the things that (to me) made the world seem more dynamic and lived-in. The worlds you play on are active battlefields with reasonably intelligent good and evil mobs that are jockeying for tactical and strategic advantages. The "bad" mobs arrive in dropships in actual squads of various types, and will patrol/hunt through areas for "good" mobs (including the player). A lot of people don't seem to like TR very much, but this was a great idea, to me.

I've seen that in the tutorial part of the game, yes. However, after that it was back to the beaten-to-death formula: an elevation-map based terrain with the very occasional cave, and mobs spawning at random and standing there, attacking you when you get within some fixed radius of them.

The only slightly original thing is that instead of just popping into existence out of thin air, they have this (bad-looking) animation where a ship shows up and the guys drop from it.


> First, the goal of "AI" isn't always to be as smart as possible. Often, the goal is to make something believable and/or of the appropriate difficulty level.

This depends a lot on the kind of game you're talking about. For FPS, you're right, but that's not nearly the most demanding kind of game for AI. So far, no company has been able to write a turn-based strategy game where the AI comes even close to being a challenge for a good player. There, the goal is still to make the AI as strong as possible.

I don't think that they did an intentionally crappy job of the backport; I suspect that it was more of a "We need it to run under DX9, but this is your deadline for making it happen" deal, where the programmers weren't given the time to write software versions of the effects that DX10 does directly but DX9 doesn't do as readily.

I don't think you get it. There's a reason people aren't writing assembly any more, and there's also a reason they wrote in assembly instead of 1's and 0's. Yes, it's technically possible to write everything you write in C++ in binary, heck, the compiler and linker pump that out for you, but the point is to be more productive and make fewer errors so that you can get more done in the limited time you have. In that sense, it's way better to have modern programming tools, since you can finish up the graphics work that much sooner.

> If Bioshock had been intended for DX9, it would probably look the same as that DX10 shot on DX9.

I can tell you for a fact that Bioshock was coded for DX9 and forward ported for DX10 in the last months of development. Features weren't sacrificed to go to DX9, they were simply added when the DX10 rendering path was made. Do check your facts before making disparaging comments...

If you look at "Company of Heroes - Image quality", the first "grass effects" comparison shows an octagonal wheel.

To be fair, the viewport is VERY zoomed-in for the purpose of that screenshot.

Keep in mind that this is an RTS, not an FPS. There are rarely any reasons to zoom in that close when you are actually playing. Take a look at the first and the third screenshots above, the "green" and the "brown" ones; that's what you look at in-game. Notice the overall amount of detail. Now, is a perfect wheel on a jeep important? You won't see it... unless you play at a crazy resolution with 1600 vertical pixels, and even then you'd have to go looking for it.

When you consider the FPS differences of the tests, it's not a fair comparison of the abilities of each API. On one test the DX9 version was running at 110fps and the DX10 version at 30fps. The DX10 version damn well better have higher image quality if it takes nearly 4 times as long to render a scene. Push the DX9 version further by throwing more polygons and more complex shaders at it until you reach the performance of the DX10 version, THEN do a comparison. You'll find that there is precious little difference.

In some cases, DX9 versions are better; in some, DX10 versions are. Overall, the difference is minimal, except for "Call of Juarez", which uses a completely different set of textures and settings, so it's an apples-to-oranges comparison. Image quality: about the same, slightly different in both cases. Performance: usually twice as good for DX9, in some cases over 5x better.

I would call neither version "more appealing" in general, though I admit that in a couple of cases DX10 had fewer artifacts. Yet that's hardly decisive.

Do you have a reason to believe DX9 wouldn't work well on Vista? Trains are specifically designed to work with dedicated tracks rather than the general-purpose roads cars use. There's no such constraint with DX9.

Am I the only one who finds the DX9 versions of the pictures more appealing? With the exception of the Bioshock fog examples (which had sharp boundaries in DX9), they just look more "natural" to me.

Some did, some didn't.

You gotta understand that DX10 can do absolutely everything DX9 can, so if the DX10 image looks less natural, it's more of a human flaw than a technological one: it's a new area, and people are only starting to discover what works best, both devs and designers.

Also, I imagine they fine-tuned the DX9 version more, since the majority of people out there have DX9 cards. DX10 cards are barely out there; devs probably don't even have a good selection of them yet to test everything thoroughly.

The only thing that worries me is that DX10 shows up slower in the benchmarks. DX10 was promised to have better performance than DX9, but don't forget that all of the reviewed games use different code paths for DX10, and thus load more effects and use higher-precision buffers/processes in the DX10 versions. So while DX10 may be faster, it's not a fair comparison when the DX10 path is loaded with twice the texture sizes and effects of the DX9 version.

We'll need a more objective test written to use the same elements in DX9 and 10 and compare that.

One way or the other, DX10 is the future. Even if the first few generations suck, the new features show lots of promise that will come to fruition in the coming years. DX10 has no choice but to become great. If you don't want to get burned, just don't buy a DX10 card YET; it's the worst moment to do so.

Wait at least until there's a DX 10.1 card out there with a good price and good reviews (DX 10.1 will come with Vista SP1). I don't expect this before Q3-Q4 2008 (which is great, since Microsoft will have fixed lots of things in Vista by then, and third parties will have better drivers and hardware for Vista).

For the performance part, it's easy to use DX9's past as an example: remember the GeForce FX line of cards? That was totally ridiculous. The same damn thing is happening with DX10: I bet you it's the video card manufacturers that are messing up again.

Clearly you must not be the only one, since you were modded insightful. But I really don't know what you guys are looking at. In every head-to-head picture the DX10 shot looks far superior. Maybe hatred of DX10 and Vista is causing people to have selective sight or something.

Graphics are still not realistic enough to suspend disbelief. There are minor differences between DX9 and DX10 games, as shown by the screenshots in the article, but they're not big enough to make me think I should run DX10 any time soon.

I second that; the positive differences between DX9 and 10 are not significant enough to outweigh the negative differences between XP and Vista.

I'd much rather see game developers expend their man-hours on making PC games creative and better to play (and in a perfect world, not restrictively ready-to-port-to-console), rather than focus on making them graphically unique to DX10.

A surprising number of people I encounter in my work have decided to forgo Vista, no matter what Microsoft does to it. There are some people who have decided not to just bow to the dictates of corporations, who expect us to buy what they offer, to give them profits no matter how poorly they perform.

Just as organized labor had to bring rapacious corporations into line in the second two-thirds of the twentieth century.

I'm with you though. They are selling low-end laptops that are totally ruined by Vista but work perfectly fine with XP, or better yet, Ubuntu. I'm surprised Microsoft would allow this to happen because it makes a decent machine unusably slow, which makes Microsoft look bad (of course, they are bad, but you'd think they wouldn't want you to know).

Anyhow, no Vista for me. It's bad enough I've paid the MS tax twice by virtue of wanting a new machine.

Alky compatibility libraries for Microsoft DirectX 10 enabled games. These libraries allow the use of DirectX 10 games on platforms other than Windows Vista, and increase hardware compatibility even on Vista, by compiling Geometry Shaders down to native machine code for execution where hardware isn't capable of running it.

Does Wine support DirectX? Can I install Microsoft's DirectX under Wine?
Wine itself provides a DirectX implementation that, although it has a few bugs left, should run fine. Wine supports DirectX 9.0c at this time. Plans for DirectX 10 are underway.
If you attempt to install Microsoft's DirectX, you'll run into some problems.

I wonder how many of these differences would be more apparent with some motion and several sequential frames. I know there are texture effects that look OK when the user isn't moving but terrible when he is, although DX9 already has enhancements for that.

Still, nothing there makes me want to jump out and buy a $600 graphics card. Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.

> Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.

I (well, my boss actually) just bought an Apple MacBook Pro. I just wanted to point out that your list doesn't have to mean Vista & DirectX, as it sounds a lot like my new laptop. A bit off-topic maybe, but it will be interesting to see how Apple compares to DX10 & Vista when OS X 10.5 is out in a month or so.

That GeForce 8400 only has 16 stream processors (the basis of the unified architecture that makes up current-gen graphics cards). The 8600s suffer a great deal even with double that (32), as seen in their framerate tests (apart from BioShock, most games were almost unplayable at 1280x1024, which has become the "new 1024x768" baseline).

The minimum card you want for the new crop of DirectX 10 games (to actually get the "eye candy" at anything over 800x600) is the 8800 GTS with 96 stream processors.


Well, the articles missed the most important part of DX 10. Gaming/hardware review sites sometimes touch on the issue, but rarely give it as much import as it deserves. It's not 9 vs 10 that's interesting; it's that for the first time in history DX 10 output is the same regardless of hardware vendor. Long term it will pay off in spades for customers, as doctored drivers and "cheats" are no longer part of the equation when trying to evaluate hardware. This is pretty much essential for moving window compositing forward.

I think the real problem is with the article. Yes, some of these games have tiny features which "require" DX10, but not a single one of them is a "DX10 game", which is the language used by the article throughout. The real potential of DX10 (or shader model 4 if you prefer, which doesn't require DX10 anyway) is the geometry shader, and *NO* game developer will be using that for the things that matter (i.e., radical, gameplay-changing elements) until DX10 hardware is ubiquitous. So to date, there hasn't been a single true DX10 game.

I know you mean it as a joke, but the sad part is that Team Fortress 2 players are finding that "downgrading" the game's DirectX level to 8.1 gives a significant performance increase with negligible visual degradation.

DirectX will make just the advancements it needs to keep programmers from going to SDL and OpenGL. That's what it is for. The question is not how far DirectX has come; it's how far SDL and OpenGL have to go.

I think that SDL and OpenGL need to try to get ahead of DX, as that will make more game developers consider using them. (And usability: I understand that it's not only in features that DX has an edge, but also in ease of use and in its level of abstraction over all hardware, not just the graphics card.)

Then the answer is going to have to be "not very far". I can't see game developers getting that excited about something supported only on a version of the operating system that people are specifically NOT migrating to in droves.

Actually, that's not entirely true. I've found the best indicator of what hardware and software people are currently using is the Steam hardware survey. Vista has been steadily moving up every month; it's up to 7.9% penetration, which is quite good considering how many people are supposedly not adopting it. The interesting fact is that of the 89,000 people running it as their OS, only 18,000 actually have a video card installed that is capable of running DX10. That says to me that a fairly large percentage of Vista users can't even take advantage of DX10.

These numbers validate my suspicion that DX10 was nothing more than a cheap angle to sell Vista. The performance isn't a tremendous improvement, and the resulting graphics aren't enough of an improvement that I'm going to let Vista suck down that much of my hardware.

I see enough problems getting them to adopt Vista, period. And it's not just game developers. Hardware vendors don't seem to do much better. I have a computer that I built almost exactly two years ago. When I built it, all of the parts used had been released within the previous 6 months. So everything on there is younger than 3 years, at the oldest. As of September, the chipset driver hadn't been updated since Vista was in beta, and the sound driver offered "limited support." All of the games that I tried ran into problems.

"shadows in DX10 are crisper and more accurate than in DX9. In the image below, the shadow in DX9 has blurry edges while the same shadow in DX10 has sharp and crisp edges"

That's great, except for the fact that shadows don't have crisp edges in the real world. Unless it's illuminated by a point-source (which immediately excludes the sun, lamps, flashlights, and pretty much every other light source you're likely to encounter), there will be a penumbra. The DX9 image here: http://www.hothardware.com/articleimages/item1031/big_stateofdx10_wic_shad.jpg [hothardware.com] is more realistic.

It's especially amusing given that one of the common features touted by modern game engines is often "soft shadows," where the shadow is given a false penumbra to approximate the effects of light reflected from a multitude of surfaces. Even if the softer versions were faked, I fail to see how a hard shadow is in any way technically impressive or new.
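The faked penumbra mentioned above is usually produced by averaging several occlusion tests with jittered light positions, so that points near a shadow edge come out partially lit. A CPU-side toy sketch of the idea (real engines do this in shaders against a shadow map; the 1-D geometry here, an area light spanning x in [-1, 1] at height 10 and a wall edge at x = 0 blocking everything at height 5, is invented for illustration):

```python
import random

# Average many occlusion tests against jittered positions on an area
# light. A fractional result is the "soft" part of the shadow: 0.0 is
# fully shadowed (umbra), 1.0 is fully lit, in between is penumbra.

def shadow_factor(p, samples=256, rng=None):
    rng = rng or random.Random(1)
    lit = 0
    for _ in range(samples):
        lx = rng.uniform(-1.0, 1.0)   # jittered sample on the area light
        x_mid = (lx + p) / 2.0        # where the ray crosses height 5
        if x_mid >= 0.0:              # misses the wall edge at x = 0
            lit += 1
    return lit / samples

# Deep in shadow, inside the penumbra, and fully lit:
print(shadow_factor(-3.0), shadow_factor(0.5), shadow_factor(3.0))
```

A single point-light test is the degenerate one-sample case, which is exactly the hard-edged shadow the review praised.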

> That's great, except for the fact that shadows don't have crisp edges in the real world. Unless it's illuminated by a point-source (which immediately excludes the sun, lamps, flashlights, and pretty much every other light source you're likely to encounter), there will be a penumbra. The DX9 image here: http://www.hothardware.com/articleimages/item1031/big_stateofdx10_wic_shad.jpg [hothardware.com] is more realistic.

Not sure how this got confused by either bioshock or the reviewers...

DirectX 10 allows for both 'crisp' or 'soft' shadowing, as some games demonstrate, the DirectX 10 shadows are 'softer' and more realistic.

The 'difference' with DirectX 10 is that shadows are done on the GPU; in DirectX 9, shadows are done on the CPU. This is the 'main' difference between DX9 and DX10.

The 'crisp' choice by bioshock is NOT what DX10 is about, this is a game developer choice. PERIOD.

I know reviews like this can lead people down wrong paths, but it doesn't hurt to look up this type of information before making fun of a 'fact' that is incorrect in the first place.

It is strange that any site 'reviewing' DX10 in comparison to DX9 would not even know the basic 'consumer' terminology for the differences, so they would know what they were looking at... Maybe someday we can get a review posted on Slashdot that is actually done by gaming professionals... (gasp)

Here is a quick list from the MS Consumer Info site on DirectX 10; notice the reference to shadows specifically.

I would also just like to add to this comment that soft shadows are available in DX9, but mostly on Nvidia cards at first, and at certain resolutions. I think, as we saw with DX 10.1, that MS is now forcing card makers to provide all features? This would be good, since I wouldn't have to compare cards just to find out which features of DX are not supported.

Is it really that DX10 gives you the ability to stuff more complex code into shaders?

No, it means the cards must SUPPORT these GPU operations, unlike the previous generation, where NVidia or ATI did not have GPU support for many mainstream features. (i.e., making it easier on developers: when they call for shadows, they don't have to care what card is in the user's machine.)

This is the same as DX10 requiring GPUs to support pre-emptive scheduling handled by the OS (Vista), among the other capabilities DX10 requires GPUs to support.

For people who think there is 'little' difference between DX10 and DX9 for that 'precious 1-2fps lost', or that soft shadows are not a part of DX10, just look at this simple HD video that shows the difference. DX9 looks great, but DX10 looks almost real, with far more 'action' going on in the same scene.

Ray tracing traces rays from each pixel back to the light source. Unfortunately, global illumination and soft shadows etc. are still issues. Ray tracing is elegant and simple, but like its predecessor, we still have to approximate "real-ness" and use fake effects everywhere.

This is ludicrous. Rays can be traced forward from lights as well and cached onto surfaces, as in photon mapping, or combined with various other methods to provide physically correct global illumination.
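The "trace forward from the light, cache on surfaces" idea can be sketched in miniature: shoot photons from a light down onto a floor plane, bin the hit points into grid cells, and estimate brightness at a point from its cell's photon count. This is a toy sketch only; the Gaussian spread is a crude stand-in for proper direction sampling, and real photon maps use kd-trees and nearest-neighbor density estimates rather than a fixed grid:

```python
import random
from collections import defaultdict

# Toy forward photon tracing: hit points on the floor are binned into
# 1x1 grid cells, forming a crude photon map.

def shoot_photons(n, spread=2.0, rng=None):
    rng = rng or random.Random(42)
    photon_map = defaultdict(int)
    for _ in range(n):
        hx = rng.gauss(0.0, spread)   # hit point on the floor plane
        hy = rng.gauss(0.0, spread)
        photon_map[(int(hx // 1), int(hy // 1))] += 1
    return photon_map

def radiance_estimate(photon_map, x, y, total):
    # Brightness estimate: photon density of the cell containing (x, y).
    return photon_map[(int(x // 1), int(y // 1))] / total

pm = shoot_photons(100_000)
# Photon density is highest under the light and falls off away from it:
print(radiance_estimate(pm, 0.0, 0.0, 100_000) >
      radiance_estimate(pm, 5.0, 5.0, 100_000))  # True
```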

By the way, UBC > SFU, and prof. Heidrich is the top graphics researcher in Canada:P

Yet again RT is seen as the magic wand that will make everything look nice. Hint: CGI studios do NOT use RT exclusively. In fact they use *rasterizers*, and resort to RT for stuff that is hard to fake with rasterizing (shadows, translucency, refraction, reflection, ...).

Why? Because rasterizers are cheaper. Forget about the triangle throughput benchmarks; they are useless, especially for games. As Carmack said, game developers don't want _more_ triangles, they want _pretty_ triangles, which means that fill rate is what matters.

What's this "we" business? DX10 is only available with Vista, and Vista sales are abysmal. And with this being a *nix-oriented site, it's falling on deaf ears.

The summary states that DirectX 10 was "introduced" by the hardware manufacturers and Windows adopted it. I have always understood it to be the other way around. If it is the hardware makers, then why are they actively supporting two different 3D APIs (DX, OpenGL)? Does this mean that DirectX could be adopted by another OS, say Linux?

> What's this "we" business? DX10 is only available with Vista, and Vista sales are abysmal. And with this being a *nix-oriented site, it's falling on deaf ears.

Stories posted to the Game section of Slashdot rarely see more than fifty responses.

The Slashdot Geek isn't really a driving force in PC gaming and anything said here about Microsoft and Vista tends to be tainted by wishful thinking. It isn't retail-boxed Vista that sells to the home market, it is the OEM system bundle.

Well, technically the hardware makers support Shader Model 4, which has the main and most promising feature that DirectX 10 supports: geometry shaders. It is a fairly big distinction, but this is a more accurate way of saying what they actually meant: "Shader Model 4 was introduced by the hardware manufacturers, and Microsoft supported it in DirectX 10." Using OGL extensions, you don't *need* DX10 or Vista to make use of the geometry shader. Now, granted, there are a number of other changes in DX10 that are nice as well.
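What makes the geometry shader stage new is that, for each input primitive, it can emit additional primitives on the GPU, something the older vertex/pixel-only pipeline could not do. A CPU-side sketch of the classic example, midpoint subdivision of one triangle into four (illustrative only; an actual geometry shader would be written in HLSL or GLSL):

```python
# One primitive in, four primitives out: the amplification pattern a
# geometry shader enables, sketched on the CPU for illustration.

def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def emit_subdivided(tri):
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    # corner triangles plus the center triangle
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(len(emit_subdivided(tri)))  # 4
```

Doing this per-primitive on the GPU, without a CPU round trip, is what opens the door to the "radical, gameplay-changing" geometry effects mentioned elsewhere in this thread.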

I don't think DirectX 10 will achieve any kind of market acceptance until DirectX 11 is released. Then everyone will bitch about DirectX 11's high-end hardware requirements, DRM lockdowns, and poor performance, and they'll start clamoring for the good old days of DirectX 10.

A lot of the "improvements" are things that the game is doing differently in the DX9 and DX10 versions. Some of them, like the "litter objects" in one of the games, or gun movement effects in another, have nothing to do with DX10... it's like the game developers simply put more polish in the DX10 versions because they wanted the punters to "get their money's worth".

Are We There Yet?
The DX10 exclusive effects available in the five games we looked at were usually too subtle to be noticed in the middle of heated gameplay. The only exception is Call of Juarez, which boasts greatly improved graphics in DX10. Unfortunately these image quality improvements can't entirely be attributed to DX10 since the North American version of the game -- the only version that supports DX10 -- had the benefit of a full nine months of extra development time. And much of the image quality improvements in Call of Juarez when using DX10 rendering were due to significantly improved textures rather than better rendering effects.
Our test results also suggest that currently available DX10 hardware struggles with today's DX10 enhanced gaming titles. While high-end hardware has enough power to grind out enough frames in DX10 to keep them playable, mid-range hardware simply can't afford the performance hit of DX10. With currently available DX10 hardware and games, you have two choices if you want to play games at a decent frame rate; play the game in DX9 and miss out on a handful of DX10 exclusive image quality enhancements, or play the game in DX10 but be forced to lower image quality settings to offset the performance hit. In the end, it's practically the same result either way.
While the new DX10 image quality enhancements are nice, when we finally pulled our noses off the monitor, sat back and considered the overall gameplay experience, DirectX 10 enhancements just didn't amount to enough of an image quality improvement to justify the associated performance hit. However, we aren't saying you should avoid DX10 hardware or wait to upgrade. On the contrary, the current generation of graphics cards from both ATI and NVIDIA offer many tangible improvements over the previous generation, especially in the high-end of the product lines. With the possible exception of some mid-range offerings, which actually perform below last generation's similarly priced cards, the current generation of graphics hardware has a nice leg-up in performance and features that is worth the upgrade. But if your only reason for upgrading is to get hardware support for DX10, then you might want to hold out for as long as possible to see how things play out.