rcht148 (2872453) writes "Rich Geldreich (game/graphics programmer) has written a blog post on the quality of different OpenGL drivers. Using anonymized names (Vendor A: Nvidia; Vendor B: AMD; Vendor C: Intel), he surveys the landscape of game development with OpenGL. Vendor A, jovially known as the 'Graphics Mafia', concentrates heavily on performance but won't share its specifications, blocking open source driver implementations as much as possible. Vendor B has the flakiest drivers: they have good technical know-how on OpenGL, but due to an extremely small team (money woes), their drivers are shoddy. Vendor C is extremely rich but did not take graphics seriously until a few years ago; they support open source specifications/drivers wholeheartedly, but it will be a few years before their drivers are on par with the rest of the market. He concludes that using OpenGL is extremely difficult and that without the blessing of these vendors, it's nearly impossible to ship a major gaming title."

I wanted an ATI card: better performance and image quality for less money. But I just don't have time to be screwing around with making games work :(. I miss the heyday of my 1650: $90, rock solid stable and fast. It just couldn't keep up.

I just bought a GTX 750 Ti, upgrading from a Radeon HD 3870, so quite a bump.

I had CUDA working with the nVidia drivers straight from their site, but was getting odd audio feedback in games. Switched to the xorg-edgers PPA, and now games work perfectly, but I can't get CUDA to work.

If AMD had actually released the "only in userspace" driver, I think I would have gone with one of theirs...

ATI blows equally. Intel is known to have somewhat better drivers, but they have software warts that push work onto the CPU, for obvious reasons. Or at least that was the case six years ago, when I worked for a famous game company.

Windows 8/8.1 blows on Nvidia with the latest drivers if you do not have the latest cards. Ask any owner; the majority of the 8.1 Update 1 failures were Nvidia related.

My ATI 7850 also craps out, requiring a re-image, with any of the x.4 drivers; I avoid 12.4 and 13.4 even though they are WHQL certified.

The situation with the graphics makers is like the ISPs with broadband, or the major telecoms when picking a cell phone: not a monopoly, but an oligopoly run by a few. Boy, I miss PowerVR, S3, 3dfx Voodoo, and Matrox.

You can bet that if they were still around, competing toe to toe with Nvidia and ATI, everyone would benefit regardless of which side you pick. As it stands, I view the choice as picking AOL vs. RealPlayer. Yuck.

For the record, I was an Nvidia fanboy at one time too, before owning ATI cards.

I don't; having to code for OpenGL, Direct3D, Glide, Rendition, and MSI to optimally support all the different vendors on the market was a huge PITA. Though I do agree that the competition was so fierce that technology was bounding forward at a brilliant pace! ...and that part I do miss.

ATI is catching up and is competitive. Nvidia lowered prices and turned their Quadro line into the Titan series to counter the ATI 290X.

Good for consumers. However, their drivers are shit. ATI's drivers improved, then had issues again with frame pacing and Mantle on older AMD chipsets. Nvidia has had some questionable hardware and now worse drivers, which are unstable and which Windows 8/8.1 hates. They do not even support all of DirectX 11.1, which is the cause of the crashes.

Part of me feels ATI and Nvidia are doing this on purpose so they can sell the same gamer cards, rebadged as FirePros and Quadros for "real professional work", at an expensive price. I mean, if it is so bad that even for 2D Adobe apps you need a $2,000 card just so video artifacts do not pop up, you know you have trouble.

Or maybe I am just cynical, imagining a conspiracy to sell more professional-grade cards, with real OpenGL of course.

A small correction, Nvidia Quadro has not "turned into the Titan". Quadro cards are largely the same hardware as the consumer cards, but with minor changes to enable certain features. The main difference is in the drivers. Consumer drivers err on the side of speed, whereas Quadro drivers will typically have lower performance in a game type situation, but be better suited for CAD / 3D work.

Not necessarily true anymore for all Quadros. The Titan is a different beast from the other high-end cards that Nvidia makes: it has double precision and other on-demand hardware features. It is true the drivers crippled double-precision floats on it, but as cheap engineering cards they are great.

But you are taking a crap-shoot with the drivers.

Five years ago, ATI was the suckiest, hands down! They have improved slightly and Nvidia has gone downhill, to where they both have their good and bad driver versions with bugs.

If the article is correct (I didn't follow the link), ATI is in a perfect spot to go full open source. They would have a ton of people helping to build stable and open drivers and would be able to compete with Nvidia without spending a ton of money. Too bad it'll never happen.

That's not what I meant. Currently, it's not unusual for AAA games to have 2 or 3 different paths (or more, if they explicitly support different generations of hardware) even when using a single API for performance reasons. I was saying that having to support 3 different APIs instead of 3 different code paths using the same API isn't that much more work.

You had the same thing when targeting specific features of the Voodoo 2 as opposed to the Voodoo 1, even when using Glide. Having multiple code paths for a single API to support multiple generations of hardware is not new; it was done back when we were supporting a dozen graphics APIs as well.

So what is your point? Previously we had to write for many APIs (OpenGL, Direct3D, Glide, Rendition, MSI, etc...), now we have to write for fewer APIs (only OpenGL and Direct3D). In both cases we always had multiple code paths per API so what point are you trying to make?

I really DO NOT miss D3D execute buffers. Glide was awesome, and OpenGL 1.2 on IRIX was joyful (if the OS didn't crash on you...)

I remember coming into work one day and my dev manager saying the equivalent of "sorry about your office, but NASA is having trouble with their IR2 at Moffett, so we got SGI to lend us one for a few weeks..." and lo and behold, next to my desk was a brand spanking new, still-had-packing-materials-stuck-to-it Onyx IR2 sitting there in all its purple glory. That was my favorite.

Definitely! The 3D accelerator days were cool; a TNT paired with a Voodoo 2 was such a cool combo! But then being able to work on an InfiniteReality the size of a pair of refrigerators, with (back then) an enormous amount of computing power, was just astounding (from a nerd point of view). I do miss that, and having such a buzzing development community being pulled in all different directions by the latest innovation from one of the many vendors :)

The situation with the graphics makers is like the ISPs with broadband, or the major telecoms when picking a cell phone: not a monopoly, but an oligopoly run by a few. Boy, I miss PowerVR, S3, 3dfx Voodoo, and Matrox.

Ask the people stuck with Poulsbo how they feel about PowerVR graphics; they are one of the few who suck worse than nVidia for driver support. 3dfx with their Glide API was the king of proprietary solutions. S3 was the patent champion; even today their patented S3 Texture Compression causes trouble for open source. And Matrox made Intel's 3D performance look stellar. YMMV, but I feel the competition in the graphics market is still working fairly well, at least a lot better than on the CPU side.

This is correct. We had more choice of graphics cards in the past, but that didn't mean the vendors were actually competitive, or even tried not to do a completely shit job that didn't help anyone in the long term.

Intel doesn't give a shit even today as far as graphics goes. Good luck getting any launch game to run on any integrated graphics platform at a resolution above 1024x768, where even a $50 card from literally anyone else will do better than the extra $50 Intel is charging people to have an IGP.

Really? Considering the quality of drivers out of Nvidia over the last year, I'm glad I switched to ATI. I think it started around the Nvidia 302.xx series, when the mass lockups began and the Nvidia forums (before they were hacked) had the 480k-post thread with 1M+ views for TDRs. [nvidia.com] Then there was the crashing with Firefox, which lasted from the 302s right up to the 320s. It only got worse around the time the 310s or 315s rolled around and the drivers were causing hardlocks across all 400, 500, and 600 series cards. And I think it was right around the 308s that the complaints got so bad that Nvidia was willing to pay shipping for anyone in the continental US to send their rig to California so they could try to find out why the TDR problem was so rampant.

I haven't heard anything good on the state of Nvidia drivers. If I have a complaint about ATI drivers, it's that some programs are a bit more sluggish than on my Nvidia card, but I'll take the stability over the TDR, TDR, TDR, TDR, TDR, TDR. And sadly it wasn't one card (I had a 400 and two 560 series cards), one configuration, one power supply, or a particular CPU in my case. It was across AMD, Intel, various RAM speeds, paired, non-paired, different PSUs, and machines in more than one physical location.

My general policy has been to flip-flop every generation: go Nvidia to ATI and back again. But the last series of drivers pissed me off to no end, so I dumped them for ATI. And Matrox didn't go anywhere; they're still making video cards, only on the business end. The problem of course is much like the CPU business, right? Remember the days of Cyrix, AMD, and Intel? It was a case of hardware pushing forward so fast that not all of the companies could keep up. The same thing happened in the video card market.

It goes to a "black screen of death" (due to PowerMizer settings, I've determined). I also get the TDR issue, where the display "loses connection to the video card", but that DOES seem to be able to be offset properly using:
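The poster's exact setting is cut off above. The workaround most commonly cited for TDR errors is raising Windows' GPU hang-detection timeout via the documented TdrDelay registry value; whether this is what the poster actually used is a guess on my part:

```
Windows Registry Editor Version 5.00

; Raise the GPU hang-detection timeout from the default 2 seconds.
; Documented under HKLM\...\GraphicsDrivers; a reboot is required.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```

This only gives a slow driver more time before Windows resets it; it does not fix the underlying hang.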

"Windows 8/8.1 blows on Nvidia with the latest drivers if you do not have the latest cards. Ask any owner as the majority of the 8.1 update 1 failures were NVidia related."

I know it's an anecdote, but you said to ask anyone, so hey: I have 8.1 Update 1 and saw no failures with a not-quite-latest card using the latest drivers. Not had the slightest problem; everything worked fine and smooth (well, apart from it generally just being Windows 8, but hey, I like to try before I judge).

I'm not so sure ATI does blow equally. I have two 660 Tis in SLI to power three monitors. Anything beyond the ~2-year-old 327 series drivers does not work for me. Without surround mode, I at least get three screens with newer drivers, but generally I get a stupid amount of slowdown, to the point where I feel like I'm running Windows 7 on a 486. If I somehow manage to turn on surround mode, all three screens are recognized, but only two of them display anything, and the slowdown gets much, much worse.

Try doing anything with Radeon cards that the installed drivers were not 'optimized' for (i.e., hacked together to get working) and watch your $500 graphics card fail horribly. E.g.:

1. Older games (not just ancient ones, but titles from only a few years ago).
2. Demoscene: most demos have trouble with Radeon, or ship with Radeon-specific binaries.
3. GPU-accelerated desktop applications, 3D design, video editors, CAD, etc.

You could argue that one should only use these programs with the 'professional' model cards, but those models share the same driver code with few modifications. The only difference is that they hide the bugs with stupid certification statements like "Only use driver 4.0.456.456.22 with AutoCAD 15.4." While Nvidia drivers have issues too, by and large it's possible to run these applications quite acceptably on the 'gamer' class cards (which are software-restricted in the driver) anyway. This is great for the gamer who wants to dabble in other things.

Maybe OpenGL needs a reworking; the whole point of an API is to insulate the programmer from the differences in the hardware.

Ditto for my 7950, on anything from indie games that the developers will never have heard of to really weird legacy games that run like absolute shit on Nvidia for some reason. For example, a DirectX 7 game that ran better on a 2008-era Intel integrated GPU tied to an ultra-low-voltage C2Duo clocked at 1.2GHz than it did on a GeForce 9600M with a C2Duo at 2.8GHz, even when both boxes had 4GB of RAM and ran Win7; but it ran better than either on a single-core 1.8GHz AMD chip with a low-end 2006 mobile graphics chip, with Vista on 2GB of RAM. (It also runs great on my current beast of a gaming box, with higher specs than all three of those put together and then doubled, which has the 7950 card I mentioned before.)

For example, a DirectX 7 game that ran better on a 2008-era Intel integrated GPU tied to an ultra-low-voltage C2Duo clocked at 1.2GHz than it did on a GeForce 9600M with a C2Duo at 2.8GHz, even when both boxes had 4GB of RAM and ran Win7; but ran better than either on single-core 1.8GHz AMD chip with a low-end 2006 mobile graphics chip with Vista on 2GB of RAM.

That sounds very strange; what was the game? It certainly sounds more like an outlier than a general rule.

Malfador Machinations' Space Empires 5. It gets unplayably bad framerates on NVidia WDDM drivers (Vista or later), no matter what settings you tweak or compatibility modes you set. It's playable (though slow) on Intel chips from the same era and quite acceptable (if still lower than it should be) on AMD.

I've got a 7990 and have had no trouble playing any games (on Windows). I think they're all D3D9 and D3D10/11 games however, so I can't comment on OpenGL games. I develop OpenGL visualisation software (debug GL 4.3 drivers loaded at the moment) with an ATI card and I've had no problems there either.

Actually, it's Microsoft that doesn't play well with much else. Since driver development consists of both closed and open teams, the openness of the code isn't the issue here (though it would be nice to have).

Regardless of driver development, if you ask actual developers they'll tell you they prefer DirectX to OpenGL by a mile, mostly because of tool chain support. And why wouldn't they? They're not dicking about with open source projects, they've got actual mortgages to pay and deadlines to meet.

Go thank nVidia for keeping the specs secret for so long. Open drivers for current-generation AMD hardware beat the proprietary driver hands down in 2D performance and stability; they're a little behind in 3D performance, but close to catching up.

I also find it very comforting to know that we'll actually have a working driver for current-generation graphics hardware AT ALL even after so long.

NVIDIA definitely write their own OSX drivers [nvidia.com]. I'm pretty sure AMD/ATI and Intel write their own OSX drivers too but these days GPU drivers are usually delivered with operating system updates (in a similar way that you can get driver updates through Windows update). Given how squeezing out GPU hardware documentation for Linux has been tough I don't think NVIDIA/AMD would be keen to help someone else write drivers that unlocked full functionality...

The article seems to only mention Windows and Linux. What about OpenGL/GLES drivers on other platforms, such as Mac OS X, Android, or iOS?

On OS X and iOS, the drivers work, I believe, but can be slow. The reason is that Apple pretty much wrote the drivers for AMD, nVidia, Intel, and Imagination Technologies. There was probably a lot of cooperation with the respective companies, but Apple pretty much wrote them itself, as the others do not have the time, money, or resources to write drivers for Apple.

Apple is not writing the drivers for AMD and nVidia. I'm not sure about Intel. At one time Apple wrote the Nvidia drivers (over a decade ago), but they never wrote the AMD drivers. AMD and nVidia definitely have internal teams writing their drivers these days.

Apple is responsible for the OpenGL stack and the driver ABI, which is where they work closely with the GPU vendors. But they're taking drops of the drivers and pre-bundling them with the OS. It can make submitting bugs a problem, because Apple are the ones shipping the updates.

Intel last released a PowerVR based IGP with the CloverTrail+ chips, Baytrail and onward will be using their internal GPU designs. Apple does use them, but I suspect that, being such a huge customer, they probably get complete documentation and wrote their own driver that doesn't suck. Everyone else gets Android-only userspace blobs of bad to shit quality.

Well, I use GCC and MinGW, so that would be GNU/Linux vs GNU/Windows... Apple's OS is illegal to install on my hardware so I don't know about GNU/OSX.

OpenGL ES is used on mobile platforms, and there are some tight CPU-related issues (on crappier devices, what would strictly be GPU hardware is often emulated on the CPU side). Smartphones and tablets don't really compare to the desktop. I'd be more interested in the difference between Nvidia and ATI drivers on consoles: for instance, the PS4's AMD/ATI Radeon vs. the Xbone's.

I believe Apple are up to OpenGL 3.3? Well, GL is at 4.4 right now, which is more or less feature parity with the D3D 11 that was released three or four years ago. So you see how throwing money at the problem (Microsoft) results in far better, more robust, and more feature-complete solutions than waiting for some greasy students to finish their open source coursework.

Yeah, this Vendor A/B stuff fooled no one. As such, I wonder how much point there is (legally speaking) in doing that, as opposed to just saying "Nvidia's drivers are closed source and they optimise for benchmarks and popular games, ATI's drivers are open but crap, and Intel gives no fucks and just wants to build SoCs."

I didn't want to comment, because I've had a love-hate relationship with some of these drivers. Two video cards ago, when I was building my current computer, I wanted ATI because I was tired of the Nvidia post-kernel-install ritual. Nvidia got lucky because the ATI card was DOA, so I went with a 9600GT. And I kept using it till a few months ago, when I replaced it with a 630 (about the same speed, but it doesn't have the 'single monitor only' problem when using the Nouveau drivers). But the Nouveau drivers proved to have problems of their own.

Not being a gamer, I'm all okay with waiting for the rich kids' hardware to catch up. Besides, too much graphics capability seems to do nothing but encourage developers to force all sorts of ridiculous visual doodads on software that really needs to do nothing but serve as a point-and-click program launcher.

If I had a time machine and could visit myself in a past life, but the channel was even more hemmed in than Twitter, say Morse code at one millibaud, my message to my mid-to-late-1990s self would be this: screw games.

Yes, I had a blast playing those games. But then I started making "mixed" decisions in how I set up my system, to balance the games I liked to play against the development tools I needed to use. In hindsight, that was nothing but bad mojo. The difficulty of achieving a perfect stack is exponential.

I used to write drivers for hardware a looong time ago (disc drives, UARTs, that kind of thing). I realize that these graphics cards are way more complicated, and trying to squeeze every last ounce of performance out of them can be a lot of effort. (I can remember spending a day trying to save a single instruction inside a device interrupt handler, and those were relatively simple devices.)

Even so, eventually you can't just keep adding people to a project. If the concepts are well known, then you get some decent progress.

It protects you against copyright infringement, but not patent infringement. The famous reverse engineering of the IBM PC BIOS happened back in the days before software patents were considered valid; if exactly the same thing happened today, IBM could certainly have sued and won an injunction and massive damages.

It's about time for someone to host a Github clone as a Tor hidden service for the explicit purpose of allowing people to share source code without having to worry about being punished by the imaginary property police.

Slashdot has become an advertising site. Intel is always the best.
Any article which compares Intel with AMD or Nvidia is a piece of crap. Intel is 20 years behind in graphics.

They really aren't that far behind anymore for an enormous range of uses, some pretty graphics-intensive. From the HD integrated graphics onwards, Intel has been making great strides with every iX generation, catching up in most ways that matter in all but the most demanding areas (high-end games and GPGPU). And I'm not saying that because I love these guys; I spent a decade telling customer after customer that Intel just straight up lied (as did the driver) about the graphics capabilities of the 9XX series.

People still seem not to grok exponential improvements in hardware. Intel GPUs are state-of-the-art for integrated graphics, and only about 2-3 years behind the top-end discrete graphics cards at most. They are good enough for most things now, except demanding recent top-end games.

I would happily already be gaming under Linux if the desktops in general weren't so buggy. I will never move to Linux as my main OS until the quality assurance of the desktop reaches a professional level.

Just today I installed Fedora 20 (KDE spin) on an HP 2230s laptop. After the initial installation of all system updates, I restarted the computer. Now every time I log in to my desktop, I'm greeted with "KWin crashed unexpectedly". I cannot start KWin at all and have no desktop effects. Please help.

At the same time, I'm personally working with the Intel guys on an issue of backlight flickering [freedesktop.org] on this same laptop under Linux.

I have to deal with problems like this all the time. Open source is garbage!!!

As if there were nothing to work around in either Windows or OSX! Typically, Win7 or Win8 attention-seeking dialogs drive me up the wall; they block the whole screen even for tiny, minor issues. OSX very often has a frozen Dock that doesn't work, and lately I've been unable to add attachments to emails by drag-and-drop in Mail.app.

Everyone finds workarounds in every system. I find that the Linux experience has improved immensely in the last couple of years.

I would still like to see OpenGL largely punted in favour of OpenGL ES. As you point out, it is a much smaller API and it reflects the realities of the hardware, as it actually evolved (shader programs and GPU-memory vertex buffers), instead of how the software initially wanted to see it (immediate mode and imperative matrix manipulation).

Hopefully, the inclusion of OpenGL ES as a subset of OpenGL within the 4.x versions will make this more of a reality, as will the use of things like WebGL.