
zendal writes "The danger with pixel shaders and vertex shaders is that there is no standard for programmability of graphics hardware. A schism has formed within DirectX between the competing demands of GPU makers Nvidia and ATI. Noted analyst Jon Peddie gives THG an exclusive first look at a White Paper on how OpenGL 2.0 is trying to bring stability and open standards to programmable graphics and GPUs."

In every emerging technology, there will always be a delay between the first appearance and the emergence of an almighty standard. It was the same with SuperVGA (it took about two years), Internet protocols (still ongoing; the W3C is struggling with standards), and now OpenGL and DirectX. OpenGL 2.0 seems pretty much like the definitive solution...

gounthar said this: OpenGL 2.0 seems pretty much like the definitive solution...

Somebody better tell all those Windows developers who keep using DirectX to do their graphics. It seems that very few developers (id, Epic) still use GL in Windows, and they probably use it just because they plan to port their games. IIRC, Croteam (makers of Serious Sam) uses D3D with simply amazing results and speed.

"The choice of APIs used was guided by the criterion of simplicity, stability and portability. In that spirit, OpenGL is used for 3D rendering what enables high compatibility with all OpenGL compliant 3D cards (every usable 3D accelerator nowadays) and high stability due to OpenGL's long tradition of quality."

As long as id Software uses OpenGL, there will be life left for it in the games industry.

Rarely do you see something on Slashdot that contains as much truth as that statement. Microsoft focuses their best development efforts into free products designed to crush other people's standards. OpenGL has been a continuing thorn in their side, and their ferocious work on Direct3D is aimed at obtaining the complete dominance they're used to in the gaming market. John Carmack has (almost singlehandedly) prevented them from doing this, and the ensuing competition has left consumers and game developers with... two really good standards. I could almost feel good about this, if it weren't for the fact that id is competing with a monopoly, and is succeeding only because they remain privately owned and hold their market presence through sheer programming prowess.

If only we had someone like Carmack to write Office software for Linux.

I could almost feel good about this, if it weren't for the fact that iD is competing with a monopoly, and is succeeding only because they remain privately owned and hold their market presence through sheer programming prowess.

Makes you wonder if it's just a matter of time before there are extensions to D3D that make something like the Quake engine redundant as a licensed technology. The quality of the Quake games has been less and less important compared to the Quake-engined games. Look at Half-Life/Counter-Strike, Medal of Honor, Day of Defeat, etc. They are the real games, not Quake III Team Arena. Doom 3 might be a great seller, but I don't see id veering from their pattern of game design...

Hopefully the federal government will be able to keep enough control on MS to keep such a thing from happening.. who knows.

Even if Carmack dropped OpenGL, there is still the matter of professional applications. Several of them still use OpenGL as their main graphics API, and it would be detrimental to Windows if it lost OpenGL: most professionals would have to move to other OSes (which could possibly include Linux), or not upgrade their OS at all.

> There is no professional graphics application available that uses Direct3D for rendering.

I guess it depends on what you mean by professional. I was also referring to Max, which uses both OpenGL and Direct3D, although it also includes its own custom Heidi driver. And last time I checked (which was quite a long time ago), Martin Hash's 3D applications had switched to Direct3D, but again I don't know if you consider those professional either. Maya, Softimage, Lightwave, and Houdini, of course, are OpenGL based. There are also a lot of smaller applications I wouldn't label consumer (they are expensive and not useful for consumers) but more professional that use Direct3D, but they are almost not worth mentioning.

> And they don't much care about OpenGL 2.0, because they have little use for pixel and vertex shaders.

Actually, I'm seeing more game developers moving to Maya, and if that is the case, you will see a lot more interest in using the 2.0 features for developing game content. It's not so important for feature-film work, but for game development it is important. Newtek may be interested in it; in some of their newer versions of Lightwave (6.x and up) they were demonstrating a lot more OpenGL features and doing tricks that game developers use in their editors, more so than I have seen from other packages. It's nice to see lens flares and fog when you're animating a scene, instead of having to render a few frames to see what it looks like.

Last I heard, the Wildcat line of cards was pretty dominant in this area. But at the same time it depends on what you are doing: game development, TV, or film. For film they usually want the best quality and will want something that can push the detail levels really high. TV and game cinematics are a little lower, although some push as hard as some of the low-end films. With the real-time aspects of games, you're dealing with 800 polys for low-end and 80k for high-end characters (depending on the platform and/or hardware), and just about any consumer card can handle that.

Croteam uses OpenGL as their primary renderer. They do include support for a DX8 renderer in 1.05. IIRC, the OpenGL renderer was faster than the DX8 renderer on most boards (I can't seem to find the link, though).

It's especially problematic for graphics standards. GKS, which was hideously based on ideas from pen plotters, still dominated much of the 1980s. OpenGL has been pretty good, but it's stuck in some 1980s ideas. For example, the strict ordering of primitives makes sense in a world of bit blasters where double-buffering and Z-buffering are expensive, but it makes little sense on modern hardware, and even worse, it makes it impossible to implement some of the better modern techniques (such as hierarchical global scanline algorithms). Some day, we're going to have systems that cost less than a million dollars that can do real-time ray tracing, radiosity, and other solutions of the Rendering Equation.

In every emerging technology, there will always be a delay between the first appearance and the outcome of an almighty standard.

Actually, OpenGL (which was very much an almighty standard) was famous for being very forward-looking. Rather than ratifying commonly-available functionality, it defined a standard for hardware companies to aspire to and work towards. The actual hardware came later; for a while, many OpenGL core features were available only on extremely expensive SGI big-iron.

Incidentally, OpenGL's role in defining the long-term goal for hardware has now been usurped, not by OpenGL 2.0 or D3D but by RenderMan.

khuber may be referring to the work going on at Stanford to develop a RenderMan -> OpenGL translator (it takes a RenderMan shader as input and outputs code for a sequence of OpenGL rendering passes that produces the same result). This is an attractive alternative to hand-coding calls to OpenGL extensions, which are becoming more fragmented and clumsy to use. (See John Carmack's recent .plan entry: Doom III will use a different shading back end for each family of OpenGL cards. So much for a consistent standard!)

In fact, the tables have turned. Less burdened by the OpenGL committee-based design approach, DirectX is providing a number of examples for what OpenGL might someday offer. The prime innovation is the promise of programmability included in an API, which, though quite a ways from being widely put to work by developers, is attractive to ISVs and IHVs alike. It means that graphics development can be taken as far as any creative developer wants to take it, and all the effects, looks and features will be accessible by mainstream hardware and software. It also offers that long-sought and oft-promised goal of "off-loading the host CPU." The members of the OpenGL ARB, for the most part companies also developing products for DirectX, clearly recognize the desirability of furthering OpenGL: the ability to deliver their products across hardware platforms.

God, I think that's worse. Try this

You'd be hard pressed to call this surprising. I can't recall a standard piece of computer hardware (like a processor or so) that hasn't had this kind of not-too-significant non-standardness at some point in its history.

Is it me or is this a standard marketing technique by a company that wants domination?

One of the key points stressed by the ARB is that the "open" needs to go back into OpenGL. The group has pledged that all ideas submitted for OpenGL, if adopted, are then open for use and not licensable as IP.

all ideas submitted for OpenGL, if adopted, are then open for use and not licensable

And the next sentence is an important explanation:

what we're seeing is a recognition...that graphics... is pretty much the whole point, and a commitment to that is a commitment to success in graphics.

W3C take note! The same goes for internet standards. If it can't be used by everyone, then it's not a standard, it's proprietary. Anyone who wants to make/use something proprietary is free to do so, but standards bodies should never impose them.

IMO, the only licensing issues that need to exist should be about compliance with a standard rather than IP: that is, whether an implementation complies with the standard or breaks it. That is more important for a standard than having to license several different IPs just to implement it.

...has always been that driver support is buggy. nVidia is notoriously bad at this; their DirectX drivers are quite stable, but OpenGL blue screens left and right (especially with a lot of detail in the scene graph). I always wondered why they even bothered to include OpenGL support in their drivers, although I suppose with such a major standard they have pretty much no choice.

Now, with OpenGL 2.0, if they have to support three different APIs, isn't driver quality going to suffer even more? Oh well, ATI has been getting a lot better recently, I guess we can always switch to them. :-)

I'd guess the hope is that once OpenGL 2.0 stabilizes, OpenGL 1.3 compatibility can be dropped and we're back to two standards.

THEN, maybe Microsoft and the OpenGL group can try to come to terms and maybe bring DirectX compatibility closer to OpenGL (or vice versa) and have a single standard.

It sounded like Microsoft wanted to come into compliance with OpenGL before, but dropped it because the OpenGL group moved too slowly. (Insert your own M$ conspiracy theory here, but I suspect they really honestly tried, and if a good opportunity arose, would come back and try again.)

"Blue Screens" are caused by a fault in the kernel, or by something writing to memory it's not meant to be writing to. Assuming normal user processes can only write to their own memory space, then it is a fault of the kernel. Sure, OpenGL might be buggy, but it's your Windows kernel that's causing the blue screen. A remotely stable kernel would just complain that the game or whatever was using OpenGL did something nasty, then continue with its work as usual. If an OpenGL app on Linux crashes (yes, it can happen) it could mess up a display driver (take NVidia's ones before the 23.13 GLX one, for example) and cause X to restart. Notice that it doesn't cause a kernel panic.

always wondered why they even bothered to include OpenGL support in their drivers

If I were a video card driver writer, I'd be concentrating on making the existing one more stable instead of contemplating why we even bother producing it. Doesn't the PS2 (PlayStation 2) use OpenGL? Macs use OpenGL. What do SGI machines use? DirectX may be most popular among Windows games programmers, but it's not the only industry standard worth following. If software developers for PCs had concentrated purely on DirectX, we would never have had some of the amazing games that originated in the console market, as consoles never had DirectX support to start with.

"If I were a video card driver writer, I'd be concentrating on making the existing one more stable instead of contemplating why we even bother producing it."

I've been under the impression that video cards come out so frequently now that almost no time is put into the drivers. After the driver is released, some minor bug fixes may happen, but that's all. The majority of the work now begins on the new card that will be just around the corner...

There is also the slight problem that, should the GFX vendors allow DirectX to be the only standard available, they open themselves up to having hardware advancement terms dictated to them by MS. Like so many companies in the industry, they have learned that letting Microsoft control you doesn't leave you in a very profitable place.

Look, OpenGL support under Windows is always going to be hamstrung by Microsoft, because of MS' rabid NIH syndrome. It's just as well MS don't standardise on OGL, coz the day they do is the day the O becomes MS, and we're all fucked again. There's no point using MS and then complaining about it. Just say no. DON'T run Windows. DON'T buy an X-Box. Just say NO.

"Blue Screens" are caused by a fault in the Kernel or something writing to memory it's not meant to be writing to.

This is almost correct. Blue screens are caused by a fault in *kernel mode* (Ring 0 on Intel architecture), which is not equivalent to "in the kernel." WDM drivers [amazon.com] (like the nVidia graphics drivers), as well as all NT drivers [amazon.com] and in fact the entire USER and GDI subsystem [amazon.com] (since NT4), all run in kernel mode. None of these components are technically the kernel. Btw, wild pointer writes are a kind of "fault in kernel mode."

Assuming normal user processes can only write to their own memory space, then it is a fault of the kernel.

No argument there. See also this page [zappadoodle.com]. But as I already pointed out, the nVidia driver runs in kernel mode, not user mode, so this argument is not relevant.

Sure, Open GL might be buggy,

OpenGL can't be buggy, it's just a specification. nVidia's implementation is buggy, like I said. This is especially apparent considering that the blue screen errors have the name of nVidia's kernel mode driver in them.

but it's your Windows kernel that's causing the blue screen.

Again, confusing "the Kernel" with "kernel mode." Hey, I hate Windows as much as the next guy, but that's no reason to post incorrect technical information about it and hope nobody will realize you're blowing smoke out your ass. Next time, do a little more research [amazon.com] first.

That's true, but I already admitted I was wrong about that completely irrelevant (to the original post) detail in this reply [slashdot.org] to this helpful comment [slashdot.org]. Thanks for pointing it out again though, it *was* stupid of me to post that and then bitch about someone else doing the exact same thing later in the thread.

No, it's straight to the metal. You set registers on the GS, and DMA packets to the GIF.

Now, there is a high-level library called ps2gl, and it is OpenGL-like (similar to Mesa), i.e. most OpenGL source code will compile without any problems.

The problems with ps2gl are two-fold:
a) some OpenGL features will *always* be dog slow on the PS2, so they aren't supported (and probably never will be), e.g. the lack of a stencil buffer
b) only the basic OpenGL commands are implemented / supported.

I'm not aware of any shipping games using ps2gl. It's much faster (performance wise) to just write custom VU1 TnL microcode.

Summary: if your game runs exclusively on PCs (Windows, Mac, Linux), then OpenGL is great, since you don't need to abstract the rendering layer.

If your game runs on PCs and one (or more) consoles, OpenGL, sadly, isn't any advantage, since there is terrible OpenGL support on consoles. (OpenGL on Xbox? Yeah, right ;-(((

-=-=- Posting as an Arrogant Coward* so I don't potentially invalidate any NDAs -=-=-

* Yes I know AC really means Anonymous Coward, but from half of the posts around here, it sure doesn't seem like it.

Nvidia's OpenGL drivers are my "gold standard", and it has been quite a while since I have had to report a problem to them, and even their brand new extensions work as documented the first time I try them. When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault. This has turned out correct almost all the time. I have heard more anecdotal reports of instability on some systems with Nvidia drivers recently, but I track stability separately from correctness, because it can be influenced by so many outside factors.

Read the rest of the plan for yourself to see his comments on ATI's drivers. Note the comment about anecdotal reports of instability.

I admit, I haven't played my "blows up every time" test case (Descent 3) for a couple months, so it's possible the newest drivers actually won't crash. The DirectX component of the nVidia drivers is definitely rock solid. Looks like it's time to download the newest ones and try again:-) Thanks for the link btw.

I was pretty sure that OpenGL 2.0 is supposed to be backwards compatible with 1.3. Everything in 1.3 can be implemented in 2.0. Though that refers to programming; doing the drivers might be a completely different beast.

Could you please be more specific? I'm not wrong about it bluescreening, that's for sure.:-)

OpenGL has no scene graph. I think you don't know what you're talking about.

You got me there. I'm not a graphics programmer by any stretch of the imagination, I must have confused it with something else. This isn't relevant to my point about the drivers crashing though. I apologize for misusing a technical term, all I meant was that it tended to crash more when the scenes got more complex.

Perhaps something is wrong with your hardware.

Based on the number of comments in this thread insisting there must be something wrong with my hardware, I think I'll have to go with the majority opinion. There's probably something wrong with my hardware. Probably the power supply, actually, I just upgraded it, I'll have to try those nVidia OpenGL drivers again.:-)

I've read so many comments on the high quality of nVidia's OpenGL drivers over the years - from people I tend to believe, like John Carmack & Brian Hook. Things like "it just works", "best in the industry", "better than any other [consumer?] vendor's", etc.

What exactly leads you to say otherwise? Presumably personal experience, rather than just a desire to trash nVidia, but compared to what? Given that 3D game luminaries have repeatedly stated they prefer nVidia's OpenGL drivers to those from ATI or (shudder) Matrox, that really only leaves the few remaining "professional space" vendors (sgi, 3DLabs), and I can't imagine they're universally perfect either.

Perhaps your perspective needs widening? Or perhaps you're running into the same bug over & over and have not bothered to notify nVidia about it? (or perhaps they just think it too isolated a case to get a high priority)

I have heard bad things about nVidia on Linux. Part of the problem was a bug in certain AMD motherboards that got fixed in the kernel two months ago. (AMD motherboards in the sense that they work with AMD CPUs; AMD doesn't make motherboards itself.) I think the problem was probably publicized more because people don't like the closed-source driver.

NVIDIA's 21.83 drivers were very nice IMHO. However, the 23.11 drivers are awful. Many, many people have complained of infinite loop errors and blue screens. The 27.xx series are rather poor at the moment, but they are beta. I tried them and promptly removed them when Dark Age of Camelot would refuse to load with them.

I'm still using the 21.83 drivers and will continue to do so until I hit a problem in a game that is fixed by a later driver.

The most immediate step to creating an advanced OpenGL standard will be to combine backward compatibility with OpenGL 1.3, while maintaining parity with DirectX by incorporating required features such as vertex and pixel processing, and memory management.

So developers really wouldn't be supporting three different APIs as you would suspect.

nVidia is notoriously bad at this; their DirectX drivers are quite stable, but OpenGL blue screens left and right (especially with a lot of detail in the scene graph). I always wondered why they even bothered to include OpenGL support in their drivers, although I suppose with such a major standard they have pretty much no choice.

This puzzles me. Aren't there a large number of engineers originally from SGI at nVidia (e.g. Mark Kilgard and the like) who used to be the original OpenGL "gurus"? You would think they'd know OpenGL cold?

nVidia is notoriously bad at this; their DirectX drivers are quite stable, but OpenGL blue screens left and right (especially with a lot of detail in the scene graph).

Total FUD. NVIDIA has been quite good about OpenGL support. I've played several of the first-person shooters (Quake, SoF etc.) using NVIDIA hardware and haven't had a single OpenGL related problem. Also, a person I have a lot of contact with at work uses SolidWorks (a nice 3D CAD package for Windows) which is exclusively OpenGL display with both consumer and professional level NVIDIA cards under Win2K. He hasn't had a single card-related crash, and he has some very complex models.

You should also read some of Carmack's recent .plan updates [webdog.org]. If anyone knows how to stress an OpenGL implementation, it's him. A direct quote: 'Nvidia's OpenGL drivers are my "gold standard", and it has been quite a while since I have had to report a problem to them'. That is stellar praise from an ISV. Also, you should give NVIDIA credit for its great OpenGL support under Linux and MacOS X.

In short, if you're seeing blue screens using the NVIDIA OpenGL drivers, you very likely have a hardware problem. Otherwise, they are very good.

Now, with OpenGL 2.0, if they have to support three different APIs, isn't driver quality going to suffer even more?

How so three different APIs? There will be OpenGL and Direct3D. What other one do you count?

Oh well, ATI has been getting a lot better recently, I guess we can always switch to them.:-)

We use opengl at my work, and our base platform is built around the geforce2. We've had some sticky issues from time to time where the nv drivers crash - but every time they've been tracked to our own bad data being sent to the card. In a better world, their drivers wouldn't crash on bad data. At least the crash only kills the app - not the system.

sounds really great, but I don't see it happening... nVidia, ATI, Voodoo, whoever will always wanna do the next cool great thing, and that's why the extensions are available...

And we all know MS wants DirectX to rule them all. OpenGL works, and is an open standard by definition. Extensions in there make life interesting, certainly, but you pretty much know what you're getting into when you try NV_texture_rectangle or NV_texture_shader. (Hint: the NV stands for NVidia.) Sure, you can find out in DirectX if the hardware supports XYZ before you call it, but I find the naming convention of OpenGL a bit more coder-friendly. It's readily obvious if you're trying something that's not supported across the specification.

I can't speak much to DirectX, but you can also query which extensions the hardware (or at the very least the ICD) supports in OpenGL. But even if you don't, an unsupported op is treated like a no-op anyway, so even if you call an NV_ extension on an ATI board, it shouldn't do any harm (or am I incorrect?).
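For what it's worth, the extension query mentioned above is just string matching on the result of glGetString(GL_EXTENSIONS). A minimal C sketch of a boundary-safe check (the helper name is my own; a naive strstr() alone would false-match when one extension name is a prefix of another):

```c
#include <string.h>

/* Check for an extension token in a GL_EXTENSIONS-style string
 * (space-separated names). strstr() alone is not enough: the name
 * must also start and end on a token boundary. */
int has_gl_extension(const char *ext_string, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_string;

    while ((p = strstr(p, name)) != NULL) {
        /* token must start at the beginning or after a space... */
        int starts = (p == ext_string) || (p[-1] == ' ');
        /* ...and end at a space or the end of the string */
        int ends = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

In a real program, the first argument would be the string returned by glGetString(GL_EXTENSIONS) after creating a GL context.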

Regardless, the extensibility of OGL is a double edged sword. Really, it does make features board-specific, which is not the point of OGL. On the other hand, allowing these extensions does drive the future of the standard. It lets everyone throw what they got out in the public and see what sticks.

Personally, I'm glad that OGL has huge gaps in standards updates, unlike DX. After all, it *is* a standard, and should be relatively static. Each new version of the standard should be absolutely positively a good standard. Anything missing can be used as an extension in the meantime and added to the next version 4-6 years down the road. This is the strongest point of OpenGL vs DirectX, there is a controlling body of many companies, rather than one main controller (MS). Bringing together the experience of co's like MS, SGI, NVIDIA, ATI, etc not only makes the standard better but adds a level of comfort to the users of the standard.

Basically, my point is that OGL standards *should* take a long time to finalize. Everyone seems to forget that standards should be able to last a long time. OGL is now on v1.2/1.3 after an evolution of around 10 years, while DX is nearing DX9 since, what, 1995 or so?

You really should know better than to use version numbers as a sign of progression. First, Direct3D only appeared in DirectX 3. Second, there was no DirectX 4 (God knows why). Third, the first few iterations of Direct3D were awful.

sounds really great, but i don't see it happening... nVidia, ATI, Voodoo, whomever will alway wanna do the next cool great thing and that's why the extensions are available...

As an OpenGL developer, I can truly say: Extensions SUCK ASS. I've been keeping up with extensions with my DemoGL library for some time now, but it's a battle you can't win. There is no consistency, no one clear API, but there are a few: nVidiaGL (with the NV extensions) and ATIGL (with the ATI(X) extensions). Oh, and some drivers support OpenGL 1.2, others support OpenGL 1.3... Yeah, nice and all. (Not.)

And we all know MS wants DirectX to rule them all. OpenGL works, and is an open standard by definition.

OpenGL is a standard, but not 'open'. You don't have anything to say about what OpenGL will be in the next version. The ARB does, but they also are limited, since nVidia and ATI are always ahead of them, and because what they offer in proprietary extensions is _THE_ stuff to use in bleeding-edge 3D graphics, using an 'older' ARB standard is not the way to go to stay ahead of the pack of competitors.

Extensions in there make life interesting certainly, but you pretty much know what you're getting into when you try NV_texture_rectangle or NV_texture_shader.

Oh, do you? What if I have an ATI Radeon 8500? These extensions are not available; I have to use ATI's syntax. But what's worse: you have to rely on the card manufacturer's documentation for these extensions. I don't know if you've tried to figure out how to do cubemapping and compressed textures using nVidia's docs, but it's a pain, to say the least.

With DirectX, there is one clear manual, one clear API and no zillion codepaths to code to support OpenGL 1.1, 1.2, 1.3, nVidia's extensions, ATI's extensions etc.

I think the OpenGL API is exceptionally clean, and generally stable. Yes, there are core code paths in OpenGL that are not accelerated by everyone, nor every driver. However, I find the general implementation far better for developing real 3D applications. 99% of the time I am not developing for other hardware; I'm working to get a concept rolling at 15fps on whatever I'm developing on. Same concept that John Carmack uses to develop his next-generation game engines. Just push the boundary till it crawls, because the timeframe it takes to refine the product into a shippable solution will unfailingly show a shift in feature support. This means you make it WORK at 15fps or even slower, then, over the next year, design code paths that optimize for existing hardware.

Both Direct3D and OpenGL require 'special' code paths to handle the tons of strange hardware out there; there is no escaping that fact. Therefore the various application code paths must be designed to switch over to alternative paths, primarily for optimization purposes.

Otis, you know from DemoGL how hard it is to handle all the tons of alternative hardware out there through OpenGL, but have you done the same with Direct3D? I find the CAPS structures in Direct3D far more difficult to write support code for, because I end up dealing with the problem of supporting various BITS of Direct3D which may kill my code dead, instead of reliably knowing OpenGL's core code path always works, albeit slowly. Another interesting consideration: once your application is demanded to work correctly by even a hundred users because it looks cool, expect the next driver iteration to support the code path better. ATI has been highly responsive to my requests in the past, and so has nVidia.
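To illustrate the CAPS-bits pain: in Direct3D, support comes back as bitmask fields you AND against, and every missing bit forces another branch in your renderer. A simplified stand-in in C (the flag names and the three code paths here are invented for illustration, not the real D3DCAPS8 fields):

```c
/* Illustrative capability flags, modeled on the Direct3D pattern of
 * bitmask caps fields (not the real D3DCAPS8 definitions). */
enum {
    CAPS_CUBEMAP = 1u << 0,
    CAPS_DOT3    = 1u << 1,
};

/* Pick a bump-mapping code path from a caps bitmask:
 * 2 = dot3 + cubemap, 1 = dot3 only, 0 = emboss fallback.
 * Each missing bit forces the renderer down a lesser path. */
int pick_bump_path(unsigned texture_caps)
{
    if ((texture_caps & (CAPS_CUBEMAP | CAPS_DOT3)) ==
        (CAPS_CUBEMAP | CAPS_DOT3))
        return 2;
    if (texture_caps & CAPS_DOT3)
        return 1;
    return 0;
}
```

The contrast with OpenGL is that a core GL code path is required to work everywhere (if only in software), whereas an absent caps bit simply kills that Direct3D feature on that card.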

Extensions are there mainly to optimize code paths for specific purposes. The ability to transfer frame buffer data back into texture data allows nearly everything to be accomplished in multiple passes, even if extremely slowly. Only some very specialized pixel operations absolutely require cube mapping, generalized shaders, or bump support. However, they are generally cosmetic and should be treated as dress-up accelerated features. E.g., grab a Toy Story DVD and watch the 'behind the scenes' stuff; it's amazing to see how the visual process works: animate simple blocky characters, apply lighting and moderate texture detail, drop in finer meshes, dress up the scene, and polish. That's how programming game engines works too.

I don't think so. The 2.0 proposal was brought up at the September 2001 OpenGL ARB meeting, about five months ago. And the OpenGL 2.0 White Paper has been around since at least November. While this stuff is important, there's nothing new about it. (Good thing, too; good standards take time.)

> how OpenGL 2.0 is trying to bring stability and open standards to programmable graphics and GPUs

The problem is that there are quite a few companies out there that do not want open standards because it gives them a competitive edge over other companies, end users, and organizations; even if it actually helps them as well. (And you know who you are).

Vertex processing can (and often does) do lighting and coordinate transformations, but it can do far more than that. It can be used for anisotropic lighting schemes, matrix blending/skinning, keyframe interpolation, surface deformation, various procedural lighting & texturing approaches, even pseudo-motion-blur & complete particle systems. Take a look at this pdf [nvidia.com] for some of the many uses found so far.
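To make one item on that list concrete: two-bone matrix blending is just "transform the position by each bone matrix, then blend by the skinning weight," done once per vertex. A CPU-side C sketch of the arithmetic a vertex program would perform (3x3 matrices only to keep it short, and my own toy version, not code from the linked pdf):

```c
/* CPU sketch of two-bone "matrix blending" (skinning).
 * Matrices are row-major 3x3 (rotation/scale only); real vertex
 * programs use 4x4 matrices including translation. */
typedef struct { float x, y, z; } vec3;

static vec3 mat3_mul(const float m[9], vec3 v)
{
    vec3 r;
    r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z;
    r.y = m[3]*v.x + m[4]*v.y + m[5]*v.z;
    r.z = m[6]*v.x + m[7]*v.y + m[8]*v.z;
    return r;
}

/* Transform the vertex by both bones, then lerp by the weight
 * assigned to bone0. */
vec3 skin_vertex(const float bone0[9], const float bone1[9],
                 vec3 pos, float weight)
{
    vec3 a = mat3_mul(bone0, pos);
    vec3 b = mat3_mul(bone1, pos);
    vec3 r;
    r.x = weight*a.x + (1.0f - weight)*b.x;
    r.y = weight*a.y + (1.0f - weight)*b.y;
    r.z = weight*a.z + (1.0f - weight)*b.z;
    return r;
}
```

The appeal of programmable vertex hardware is simply that this per-vertex work runs on the GPU instead of the host CPU.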

Fragment processing is more for new ways of combining textures, fancy bump/reflection mapping, or creating procedural textures from scratch. You can do relatively complex mathematical operations [stanford.edu] (think "massively parallel SIMD") - even cool stuff like using textures as multidimensional lookup tables for further texturing. More than just "better access to texture memory".
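The "texture as lookup table" trick mentioned above boils down to encoding a function as a small array and letting the texture fetch evaluate it per fragment. A C sketch of the idea on the CPU (nearest-sample with clamping; illustrative only, not any particular API):

```c
/* Fragment-style lookup: treat a small 1D "texture" as a function
 * table. A coordinate in [0,1] maps to a table index (nearest
 * sample, no filtering), clamped at the edges. */
unsigned char lut_sample(const unsigned char *table, int size, float coord)
{
    if (coord < 0.0f) coord = 0.0f;  /* clamp, like GL_CLAMP */
    if (coord > 1.0f) coord = 1.0f;
    int i = (int)(coord * (float)(size - 1) + 0.5f);
    return table[i];
}
```

On real hardware the same fetch happens once per fragment, which is what makes it act like a massively parallel table lookup.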

The pack/unpack processors are actually pixel pack/unpack processors, and AFAIK unrelated to vertices. They're used for encoding/decoding the myriad of possible formats of image data from the host (e.g. textures). OpenGL currently copes with a large number of possibilities (RGBA, BGRA, 32 bit, 48 bit, floating point data, different numbers of channels, etc), but a programmable processor will simplify all this, and also allow for more exotic encoding schemes like the various 2D and 3D texture compression formats around. One of the stated goals of OpenGL 2.0 is to reduce the huge number of vendor-specific extensions. A lot of those extensions deal with accessing textures.

All this can be done in software where hardware support is missing (although fragment processing is notoriously slow in software), but hardware already exists that does programmable vertex & fragment processing. It wouldn't surprise me if programmable pack/unpack hardware also existed on modern GPUs, just waiting for an API to expose it.

X11R6 is old enough now, so how about an X12 that has OpenGL (and lots of other improvements) built in? Xlib would then incorporate it directly, and it would be much faster than the current approach of building it on top of Xlib and extensions.

My bet is on the string-based ATI solution. The Nvidia one you had to licence from them, and the ARB did not want to get into that ugly mess. Only after ATI was given time and incentive (from MS, for DirectX 8.1) did Nvidia change their terms, and even then it's a kludge.

I quite like the ATI solution, and right from the start they made it so that anyone could implement it. Yay for standards (-;

Actually, the nVidia extension is the string-based one; the ATI extension uses commands. And I wouldn't call either approach a "kludge" - they're both quite usable, & in fact so similar it's almost possible to automatically translate between them.

I thought the ATI command method looked cleaner, but Carmack says it was "massively more painful" [bluesnews.com], and prefers nVidia's string-based approach.

It's true that nVidia did want a licence (protecting their IP, yadda yadda), which may have slowed adoption. But come to think of it, there isn't even ONE NV_* extension supported by ATI anyway, even useful ones like NV_texture_rectangle (and yes, the ATI hardware does support it; they just refuse to expose it under Windows until it becomes "officially" supported). So I kinda doubt it really slowed down anything except ARB adoption (which is happening now in OpenGL 2.0, the way it should be).

I think this is a cunning move by 3Dlabs. Their business is being threatened by nVidia and their Quadro range, so it'll be interesting to see how unbiased they can manage to be when generating the spec.

Otherwise, I think it's a good idea. It'd be nice to see OpenGL keeping up with (or even outshining) DirectX...

Nope. DivX is a hack of a codec MS PROPOSED for MPEG4 and had REJECTED. So not only was the original codec NOT MPEG4, the derivative isn't either. Both are obsolete anyway compared with Real 8, Windows Media 7/8, Sorenson 3.1, Zygo Video and On2 VP4. Wait for the MPEG4 standard, then use that. In the meantime use QuickTime for its flexibility and stability.

So I've heard... mind you, I actually do video compression for a living and NO client has EVER asked for their material to be compressed to DivX. Not a single one! We did get a DivX movie in once to be output to DigiBeta -- it looked absolutely awful. I've played with both DivX and 3ivX encoders and, take it from me, they don't come close to WM7 or 8, Real 8, Sorenson 3.1 in QuickTime, Zygo, or VP3 or 4. Not even close. And no decent compressors for QT?? Eh? I wouldn't know about ripping DVDs really. I've done it a couple of times, but what's the point? I can save a disk image of a DVD on an HD if I want to, but why recompress? Just to steal it?

Modern hardware support is critical; interfacing with what seems to be a diminishing number of compatible drivers is critical. Keeping the spec out of MS control is critical.

There are MANY alternatives to a ground-up rewrite of OpenGL supporting CURRENT hardware; working with the hardware vendors directly is the key.

I understand this is not an undertaking to be taken lightly. I have been working on options and looking for cross-platform alternatives for a couple of months now; there are several promising candidates, and I hope to present them shortly.

OpenGL's time is over, it seems. MS is working with vendors explicitly to limit their support, there remain major differences of opinion in its developer base, and schisms seem to be forming in its goals.

If you take the time to read the spec, you'll realize that a lot of what you think is old OpenGL HAS been scrapped. Most of the more complex and kludgy functionality (such as texture management, loading, and mapping) has either been completely revamped, or drawn into the new "programmable" interface (via pixel shaders). Furthermore, they've cleaned up many of the more annoying things (such as the many ways to create objects).

What they're planning to do is have a set of "compatibility" functions for people still using the OpenGL 1.3 standard, but make a "Pure OpenGL 2.0" subset of these commands, plus some new commands, to completely replace the functionality of OpenGL 1.3.

So, in the end, they've done almost what you've recommended in this message, but still with the OpenGL name on it. IIRC NVidia and other graphics corps are behind this as well.

Your arguments are valid, but don't be so quick to scrap the car when all it needs is a tire change and a few engine block repairs :-D

I agree that this is what is necessary. It should also incorporate text drawing and fonts and 2-D image drawing and various other things used by GUI toolkits, so that we don't have to go through the pain of making totally separate windows and using totally separate code for different parts of the application.

I don't see many signs of this happening, however. Huge amounts of drawing code have been built into toolkits (like Qt and KDE and the Gnome libs, and on Windows into MFC and various DLLs) that should really be in the graphics interface, and the people who know how they work are not willing to cooperate and move all this work to a more sensible place.

OpenGL 2.0 seems to be in the right place at the right time. If the 2.0 standard can sanely standardize the programming of pixel and vertex shaders, OpenGL might be given the boost it needs to start overtaking Direct3D's hold on the gaming market. And once development moves to OpenGL, it's one step closer to widespread releases on (insert favorite OpenGL friendly operating system here) and a loosening of the MS stranglehold on computer games.

On the other hand, if this OpenGL extension craziness continues on as it has been, the project might collapse into a tangled and unsalvageable mess. The OpenGL standards people have one shot at doing this right, and whether or not they pull it off will determine their long term success or failure.

The "Specification Overview" pdf from the 3dlabs white paper page is pretty interesting. It has a list of over 250 opengl extensions and what happens to them in opengl 2.

Basically they all disappear.

Some have already become part of the standard. Some are added to the standard in opengl 2. Some just disappear altogether.

But the large majority of them are simply not needed anymore once you have programmability, memory management, opengl objects, etc.

To me that means that opengl 2 is way more flexible. Flexible enough so that we won't need as many extensions in the future.

And that's pretty cool.

(BTW: Brian Paul is a member of the ARB. He wrote on the mesa list that he hasn't been following the opengl 2 process very closely but that he expected that they would probably want him to write a free implementation).

Programmability in the graphics board creates some problems. Remember, you can run more than one OpenGL program at a time, or just have more than one OpenGL window open, which the graphics board sees as the same thing. The graphics board now has to have per-thread state, and has to be able to do a context switch. The graphics board now needs more of an operating system. For example, it has to be able to kill a vertex shader that's in an infinite loop. There's also a possibility of security holes in the graphics board.

There's the artistic question of whether textures should be drawn or programmed. In the film industry, everybody except Pixar mostly draws, while Pixar writes RenderMan shaders for everything. (The whole OpenGL vertex/pixel shader thing is basically a lightweight version of RenderMan). Artists would rather have more texture memory than programmability.

A whole language for writing data pack and unpack functions is overkill. Pack/unpack doesn't do that much. It's not like the graphics board is going to unpack a JPEG image. It's just converting bitplane order, depth, and such. OpenGL already has quite a number of modes for texture storage; this just handles the people who wanted their favorite storage mode supported. A more declarative mechanism would have been more appropriate.

They are used extensively in film graphics. All the other major renderers, not just RenderMan, have shader languages: e.g. VMantra, Maya, LightWave, etc.

Shaders do not "replace" texture maps. One of the most-used functions in a shader is to look up a given uv coordinate in a texture map and use the resulting color to control the shader. In fact most of the shaders we write involve manipulating texture maps, which were (as you said) painted by hand. We can do much more interesting things with textures other than just using them to color the surface!
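
To make that concrete, here's a tiny CPU-side sketch (the "texture" is a 2x2 array, all names invented) of the pattern: the shader samples a hand-painted map at (u,v), then manipulates the result procedurally instead of just pasting it on:

```c
#include <assert.h>
#include <math.h>

/* Sketch of a shader that uses, rather than replaces, a texture map:
 * look up the painted texel at (u,v), then darken it with a procedural
 * stripe factor. A 2x2 grey "texture" stands in for a real map. */
static float sample_2x2(const float tex[2][2], float u, float v)
{
    int x = u < 0.5f ? 0 : 1;   /* nearest-neighbour texture lookup */
    int y = v < 0.5f ? 0 : 1;
    return tex[y][x];
}

static float shade(const float tex[2][2], float u, float v)
{
    float base   = sample_2x2(tex, u, v);
    float stripe = ((int)(u * 8.0f) % 2 == 0) ? 1.0f : 0.5f; /* procedural */
    return base * stripe;
}
```

The interesting work is in the "stripe" half; swap it for noise, a Fresnel term, or a second texture lookup and you have most of the bread-and-butter production shaders.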

I agree about the pack/unpack mess. I think all useful image formats could be described by these items: number of bits per sample (limited to powers of 2), number of samples per pixel, delta between each pixel (so they can be further apart than the number of samples or you can trivially mirror it with negative numbers), delta between each line (allows a "window" to be cut out of a larger image, allows flipping upside-down, and allows 90 degree rotations by adjusting both deltas). There is no need to describe what the samples are, that can be determined from the count and what function you are calling, we can insist on RGBA order for normal images.
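
A sketch of that declarative description (field names are made up for illustration): a base pointer plus two deltas is enough to get windowing and flipping without touching the pixels.

```c
#include <assert.h>

/* Declarative image description: strides ("deltas") in bytes between
 * adjacent pixels and adjacent lines. A negative line delta plus an
 * adjusted start pointer gives a vertical flip with no copying;
 * swapping the two deltas would give a 90-degree rotation. */
typedef struct {
    const unsigned char *base; /* first pixel of the view       */
    int samples_per_pixel;
    int pixel_delta;           /* bytes between adjacent pixels */
    int line_delta;            /* bytes between adjacent lines  */
} ImageView;

static unsigned char view_sample(const ImageView *v, int x, int y, int s)
{
    return v->base[y * v->line_delta + x * v->pixel_delta + s];
}

/* Flip an existing view upside-down: start at the last line and
 * negate the line delta. The underlying buffer is untouched. */
static ImageView view_flipped(ImageView v, int height)
{
    v.base += (height - 1) * v.line_delta;
    v.line_delta = -v.line_delta;
    return v;
}
```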

First, OpenGL is far more stable than DX on NVidia cards. It is only outside that market that this is sometimes not true. However, you have to keep your drivers updated; graphics programmers typically push the newest extensions, and that will break old drivers. In practice, I virtually never lock up my computer using GL. However, it is trivial to lock up a system in DX. Of course, we won't even go into the compatibility of non-Windows systems with DX because that's, well, obvious, yes?

Now don't take this as a diatribe against DX. I use it frequently, and for some projects it's the thing to use. It's easy to develop quick apps, prototypes in particular, and obviously with the Xbox it's the thing to do.

As for whether a generic shader language should go into OpenGL 2, well, everyone has their opinion on that. Personally, I think it's a bad, bad, bad idea. Any pixel or vertex shader language that is implemented now will be out of date in short order. Anyone who has used these shader languages knows how crippled they are. Ergo, you'd implement these standards in OpenGl 2 and everyone would use the vendor supplied extensions instead anyway. How does that improve OpenGL?

DX has made many mistakes with regard to implementing these kind of features. They're barely used in one version of D3D and become white elephants in subsequent versions.

Fundamentally, the OpenGL standard is an API core. It only supports a minimal core level of functionality that can reasonably be expected to persist over time. Everything else should be an extension. It's not like it's hard to figure extensions out, after all. The vendors do supply documentation, and the Opengl repository maintains a list of all extension documentation. Then you can freely recognize which extensions are garbage and not use them, rather than be saddled with them for a decade.

OpenGL is just all-around better than DirectX. It's more portable, it's faster, etc. Its only problem is that it's not as popular. Here, where the only book store in town is Barnes & Noble (not that that's bad) and there are maybe 2 people interested in programming in a 50-mile radius, there are a limited number of books on OpenGL available at any given moment (no programming books at my library, already checked). Other than that, I have yet to find anything not good about OpenGL. It has a higher learning curve than DirectX (so I hear), but it's worth it.

My thoughts on DirectX are: first, it's Microsoft's. That's not bad in itself; Bill Gates isn't stupid. But it tries to do way too much. It's like a high-level language, like BASIC: it does so much per command that it's nearly impossible to optimize. A smart programmer could do what DirectX does, only better.

I first started programming for TI calculators; my expertise was the TI-89 and TI-92+. There was one major problem with them: games required a kernel that came with "libraries" everyone used, with basic routines like printing sprites on the screen, printing words on the screen (I kid you not, even though TI provided a ROM call to do that), even pausing the calculator (even though _ROM_CALL_051 was exactly that). The code was choppy and very unoptimized. If the programmer of a game wrote his own routines, he could design them to fit exactly what he needed, without other junk that's useless to him.

I don't know DirectX in detail, but here's the general idea along those lines. Say your game needs networking code that sends 10 bytes of data to the computer at the other end: your own code does exactly what you want, fast and efficient. But using DirectX's networking code might require sending another 100 bytes of junk that is useless to either side. Like I said, DirectX is easier to learn, but that just leads further into the calculator problem I mention above. So why bloat your code any more than you need to? Granted, you don't have to worry about it as much since all computers have both, but it still helps with optimization to write your own code as much as possible.

That should sum up my views on both of them. Of course, OpenGL might have some of the same problems that I mentioned with DirectX but, I don't know of any. Anyways, have fun with OpenGL!

Personally I see this OpenGL 2.0 proposal as a wishful me-too effort by a 3D vendor-wannabe. Last time I checked 3DLabs wasn't in shooting range of NVIDIA or even ATI. Why would either of the two leading vendors stop and do a bunch of boring standardization work - initiated by a potential competitor - when they could be developing proprietary extensions to sell more products?</cynic>

I actually think standardizing stuff like vertex and pixel shaders is the wrong thing to do right now. There are so many differences between the major implementations that smoothing them over in a universal API would be costly/inelegant, and the feature sets are changing very rapidly (new texture modes every 6 months!)... It would be smarter to wait a year or two, when things will have settled down a bit, and then write the standard in stone.

I will. In typical Tom fashion, this article uses a lot of words to express a small bit of information. From my reading, I get that OpenGL 2.0 is taking a long time to finalize, and the OpenGL ARB is trying to standardize a lot of the extensions without hitting copyright/Intellectual Property/etc issues. A decent read, but it needs to be re-written by someone who isn't so wordy.