Posted
by
Soulskill
on Friday March 11, 2011 @02:10PM
from the changing-horses dept.

arcticstoat writes "First-person shooter godfather and OpenGL stickler John Carmack has revealed that he now prefers Direct3D to OpenGL, saying that 'inertia' is the main reason why id Software has stuck by the cross-platform 3D graphics API for years. In a recent interview, the co-founder of id Software said, 'I actually think that Direct3D is a rather better API today.' He added, 'Microsoft had the courage to continue making significant incompatible changes to improve the API, while OpenGL has been held back by compatibility concerns. Direct3D handles multi-threading better, and newer versions manage state better.'"

They're actually pretty close (here's the original Doom Bible [doomworld.com] for comparison.) The point, though, is that FPSes without plot would have been way less established, and games of the mid-nineties would have gotten something of a better start in being story-bearing things. It would have been cheap science fiction, sure, but at least the story would have been action-movie-grade and not cereal-box-grade.

Doom is still the finest FPS ever created. A good arcade action game will stand the test of time. Like Tetris, Galaga, Pac-Man, etc., Doom is a classic that will be played for decades, if not centuries. If you play these games for the story, you're missing the point entirely.

The emphasis modern shooters place on story and graphics as opposed to level design and gameplay is a major reason they suck.

That's just one bit of magic rote. Nothing to learn here, just an (ab)use of an IEEE 754 format quirk.

So "easy" and "hard" don't exactly apply. Finding it out may be hard; using it certainly is easy. It's not like you have to change anything about it to have a (more or less accurate) fast sqrt function.

You CAN try to understand it (it's not that "magic" once you see why this applies), but you don't NEED to. It's a bit like DirectX coding. You CAN try to understand how DirectX does all the "magic" for you

I actually dealt with a variety of fast square root and inverse square root functions and profiled them when creating a particle system for an MMORPG (Eternal Lands). That algorithm is indeed fast (at the cost of accuracy)... but it's still significantly slower than the hardware SSE inverse square root routine. Hence this function:

Yeah, problems always look easy after someone else has solved them for you, I guess.

Your advocacy of blissfully ignoring the internals fits in well with all the "yay Visual Studio!" and "yay .NET!" that I'm hearing in these comments; a legion of dull-eyed "developers" that are perfectly happy to eat only the food provided them, trusting their master to give them the right balance of nutrients. I'm sorry if that sounds harsh, but this whole thing is disgusting.

However, the title of the magazine is "Custom PC". It is worth keeping in mind that if the PC and Xbox are the only platforms you are targeting then DirectX is a valid choice for development technology.

Otherwise, you are better off developing in OpenGL, where you can target PCs, PS3, iPhone, iPad, Mac OS X, WebGL, industrial Unix (not all 3D apps are games, dontchaknow?). The only thing you can't do much with is the Xbox (technically possible, but deliberately closed by Microsoft).

Also, the pace of change in OpenGL has picked up tremendously with the stewardship of the Khronos group. So OpenGL is starting to have parity in features again after lagging for some time (plus, you can get those features on Windows XP for those still on it).

isn't correct; the article seems to contradict itself. Carmack says Direct3D is better because of incompatible updates made to the API, whereas the OP says its multi-platform performance is stellar? Let me just load up a copy on my FreeBSD... yeah, that doesn't work.

His opinion also seems to contradict his own drive toward open source. If the thing you like only works with one vendor, how do you anticipate ever FLOSS'ing your code?

He's talking about OpenGL being good at cross-platform support, not Direct3D. Because Direct3D doesn't have cross platform considerations, that's allowed Microsoft to make more significant changes and pull ahead of OpenGL without worrying about breaking compatibility with other vendors' implementations.

Geez, if I were an OpenGL developer and Carmack started talking about things that OpenGL should implement to make his game engines work better, I'd be like "Yes sir, Mr. Carmack!" Seriously, those game engines are what's keeping people using OpenGL in the first place. It's too bad that id Software doesn't have the resources to fork that shit and develop it to suit their needs. I'm sure that it would be better.

It's pretty obvious that the smartest Microsoft engineers are working on game-related projects, and it's smart. Microsoft might be watching its empire erode, but games are a field where their dominance might actually be growing. DirectX is a big part of that, and the Kinect has also really stirred the pot. Lots of comments here are to the effect that Carmack is stating what has been obvious to everyone else for years. Yes, Carmack was a true believer, and his (late) heresy is a sign that MS alternatives in some fields are just... quixotic. It's not quite like RMS saying that he really should just start using Windows because it works better, but it's about 10% of the way there.

He's clearly talking about desktop environments, given that this is a thread about DirectX which is not renowned for its heavy usage on servers, and his phrase "general use". On the desktop even the most hardcore Linux enthusiast would have to admit that the sub-1% penetration is pretty lacklustre, compared to OSX let alone Windows.

Yes, there are plenty of things even in my flat that run Linux that aren't desktops/laptops; yes, Linux on the server is much bigger than Linux on the desktop is. But in penetrat

Of course, that's like comparing a new 2010 Corvette Grand Sport to a new 1932 Ford Coupe. One is an old-tech car even though it's still nice and shiny. The capabilities of phones and mp3 players (other than touchscreen and pocketability) are far inferior to those of a modern desktop or laptop no matter how you slice it. They don't have the storage, the screen res, the graphics processing power, the raw cpu power, the memory, etc. Sure, they've got some great games, and they've got 3000 slightly different versi

And this is modded insightful? Free Software is not a religion, it is a policy, and it has strong political roots, although these political roots are certainly not aligned with "bipartisan politics".

The issue with open source and OpenGL is that a large part of the implementation of OpenGL (at least the efficient implementation) does not depend on any open source activist/developer but on the good will of video card developers.

Now Microsoft has the "monopoly advantage": if they say you now need to cut off your left foot to implement Direct3D, the video card manufacturers (Intel and Nvidia, principally) will find somebody in charge of getting his or her foot cut off to keep the market. OpenGL has to reach a consensus...

Now Microsoft has the "monopoly advantage": if they say you now need to cut off your left foot to implement Direct3D, the video card manufacturers (Intel and Nvidia, principally) will find somebody in charge of getting his or her foot cut off to keep the market.

This is nonsense.

The reality is that Microsoft works closely with the hardware and software developers so that everyone remains on the same page.

Microsoft builds a consensus around what is possible and what is desirable - not only in video and sound, but in every aspect of PC gaming.

They work with the GPU manufacturers. Basically, when new GPUs are in development, so is the new DirectX. So MS has a chat with nVidia and AMD. They tell the GPU makers the kind of things they want, the GPU makers tell them the kind of things their hardware is going to have, and they are able to come to a standard that everyone supports. That is why when new GPUs come out they support all the features of the new DX. It isn't some amazing coincidence. Also it is proper support, a single standard that works well with the abilities the cards have. You write your DX driver, and everything works.

OpenGL functions in much more of a lagging capacity. New video cards come out, and then it gets support for whatever it is they bring to the table sometime later. Khronos doesn't seem to go out and engage the vendors during development and try to have OpenGL ready to meet the next gen cards. Also their strategy often seems to be "just use extensions for it," which means that you can have differences between vendors for how things work.

DirectX, OpenGL... why should the end user care? Unless you're developing a rendering engine, you should probably just be using a rendering engine and not caring about the underlying mechanics. I mean, how many times should we have to implement stencil shadows from scratch, mesh loading from scratch, paged geometry from scratch, and on and on? Odds are, if more than just a handful of people need a piece of functionality, a good rendering engine probably already has it implemented -- and a better, faster, mo

[quote]...Khronos doesn't seem to go out and engage the vendors during development and try to have OpenGL ready to meet the next gen cards.....[/quote]

Complete utter bullshit. The chair of the ARB OpenGL working group is the lead for GL driver development for NVIDIA. Much of the jazz for the OpenGL specifications is driven by the hardware makers. Many GL extensions that were first exposed by NVIDIA made it into GL3, and that tradition continues today.

The major stinks people have had with GL:
1) GL3 was late and a disappointment (the disappointment is mostly related to not getting direct state access of GL objects)
2) Direct State Access (i.e. you can edit objects without binding them first) is available as an extension. As of now, AMD and NVIDIA support that extension (it has other parts too; for reference, the extension's name is GL_EXT_direct_state_access). I'd make a substantial bet that GL 4.2 will have it in core.
3) Few tools in comparison to Direct3D (though gDEBugger is free now)
4) Inconsistent behavior and implementations of GL across different hardware vendors. Intel's GL sucks monkey balls, NVIDIA's lets you get away with murder and ATI makes you follow the spec with all i's dotted and t's crossed. Additionally, what is fast for one IHV is not fast for another (buffer object usage hints I am looking right at you).

There is also some truth to the claim that GL now "follows" Direct3D, i.e. GL4 came out after D3D11, etc. Though there are extensions in GL that completely blow the living crap out of anything you can do with Direct3D; take a look at GL_NV_shader_buffer_load. Full-blown (read-only) pointers in shaders (pointers point to data of buffer objects). Nothing in Direct3D compares to it... and it is for GeForce 8/9/1xx/2xx/3xx cards! For the next generation (GeForce 4xx) there is even write access (GL_NV_shader_buffer_store).

This post doesn't make any sense. The people who define the OpenGL spec include delegates from ATI, Creative Labs, Intel, and Nvidia. Khronos doesn't go out and engage the vendors? They're a consortium of the vendors you claim they don't engage.

The reason for the core OpenGL spec lagging with consumer level graphics stuff is largely due to its incredible breadth of applications and target platforms. With OpenGL 3.0 there was a lot of contention between the people who wanted to turn it into a streamlined rea

It's been true for almost 10 years. In fact, Microsoft's support for DirectX has always been better than what OpenGL had. Microsoft made it easy to use with all their programming tools and languages, and had great documentation. The API was always cleaner, too. There were tons of books written for DirectX. This is the area Microsoft handles extremely well - their Visual Studio development environment is the best IDE on the market and they create great tools for developers. Their mobile development tools kick Apple's and Google's (C#, Visual Studio and Silverlight against Java...).

It would be nice to see the open source community wake up and start developing a competitor, as right now Microsoft is the driving force that innovates new technologies for PC and Xbox 360 graphics and gaming. But for once it looks like the fact that they're the only one doing so isn't slowing them down - they do a good job.

Don't forget this is Slashdot, where any praise of something MS does is likely to get you modded a troll, which itself is actually trolling. Like most companies, MS has done good and bad things, sometimes with the exact same action. To blindly label all things Microsoft as evil and all things Linux as good is just illogical, unjust, and rather stupid. Unfortunately there's a lot of that around here.

So here's a few opinions bound to start the flaming from the mindless:
Microsoft has done many good thing

VS makes me want to code. It makes browsing code so much easier. Debugging is so nice that I will often fire up a debugger and step through code if I have a hard time figuring out what something is doing. I have been in several situations where my entire team couldn't find a multithreaded-related crash using gdb, but I have never had a bug escape me when debugging through Visual Studio. You can actually administer your machine a lot more efficiently through VS than you can by clicking through the control panel. It's easy on the eyes, though you don't get "hardcore" points with coworkers for a black background with neon text. In my opinion, coding is easier, bugs get found faster, and code gets written faster. Compile error? Click on the error, and it will take you to the offending line. Huge convenience. And note I am talking about raw C++ here - no managed code, .NET, or any other stuff.

Clearly there are drawbacks - Intellisense just stops working sometimes, for no apparent reason. The build system is different from make, and can be annoying at times, but it's way easier than fighting with autotools, though CMake is a little nicer (and fortunately CMake can generate vcproj files). Integration with source control other than VSS is a pain - usually I just avoid that altogether and leave them unintegrated.

I have tried alternatives - I used Emacs in college exclusively, until I got sick of a million meta commands and having to write programs just to configure my IDE the way I like it (a little exaggeration there, but not much). Browsing between files on large projects was too cumbersome.

I have tried Eclipse and its god-awful view system, where, god forbid, if I click the wrong place or hit the wrong button my whole screen just gets deranged. I find the Eclipse model very awkward, the plugins are buggy, and intellisense works even less frequently than in VS. For Java, Eclipse is good.

I tried Qt Creator, which was alright, but seemed very focused on building Qt apps. I just wanted to do general dev, and the UI was too Apple-ish for me - I don't need a shiny IDE, just one that is pleasant to look at and lets me get things done.

I have not tried NetBeans for C++, mostly because Eclipse CDT has led me to believe that Java IDEs shoe-horned into C++ IDEs don't work that well. I could give Code::Blocks another shot as well.

So seriously, what do you recommend? And what specifically do you think your recommendation does better than Visual Studio?

Just because you've only actively used that one doesn't make it "the best IDE on the market".
I, for one, have used quite a few on 3 different platforms, and I have no patience for Visual Studio; I cringe every time I have to do anything ".NET".

I am a huge "Apple Fanboi". I started my non-educational development with CodeWarrior back in 1998. I worked with the horrendous IDE Apple provided back in the pre-OSX days and also with RealBasic. I have worked with Xcode. I have worked with Eclipse, and I have worked with Visual Studio. Visual Studio has, for me, been the best IDE in the world. Fast, responsive, the best auto-complete I have tried, and the debugger is just bliss to work with.

I am a fulltime Apple user that sometimes has to do some .NET work. I don't mind VS. It is nice and gets the job done. Personally, though, I want my IDE to be as light and as out of my way as possible. I enjoy Xcode, but really I do most of my work in TextMate (although when I'm doing web development I tend to use Coda).

I can't find any faults with VS except that it's just a little too heavy for my liking, but it's still a rock-solid IDE.

Microsoft makes a fantastic debugger, but it isn't the one in Visual Studio. I'm talking about WinDbg. It's the best debugger I have used on any platform.

My problem with Visual Studio isn't with the functionality it has (although I find it slow), it's what is missing. The biggest problem for me is the lack of refactoring support. There are plugins that add this, but they add $250 to the $700 price tag (for the professional version).

I like Visual Studio, even though I'm still not 100% sold on .NET (Java and Android developer by trade). I'd say it's probably the best IDE out there. I've had nothing but bad luck with Eclipse, including a persistent issue where the menu bars disappear after an exit on Linux, and the only way to get them back is to clear out .workspace.

You say Visual Studio is too much bloat. OK. But then you cite Eclipse as a counter-example?!?!?!? Hell, for example, Eclipse takes 10x longer just to start vs. Visual Studio (on my machine), let alone compile...
Just about every IDE I've seen in 2011 has bloat, IMHO.

I use eclipse about 90% of the time and Visual Studio only 10% but I'm 100% sure you shouldn't have immediately mentioned Eclipse as a better alternative if you're going to go with the "bloat" argument. I'm often tempted to put off critical OS security updates just because I know I'll have to wait for eclipse to open again after the restart.

It would be nice to know why you feel so strongly against Eclipse and everything else that is not Visual Studio. Looking at the people who get computer science master's degrees, I would say a very large percentage of them do not use Visual Studio; I'm just using that as an example because that is what I know.

I do agree with you that almost nothing feels bloated on a good machine bought in this day and age (except some badly written Java and .NET

This is actually the reason why ye olde time programmers keep away from women and are in such scarce supply. Any erection causes irreversible instant death as all blood is suddenly evacuated from their body into their bulbous appendage. A few die off every year or so, but never as badly as when the internet first caught on; there was a relative extinction event when that occurred, due to the prevalence of adult entertainment...

Not really. I personally find the autocomplete features in Visual Studio insanely useful.

Most of my university programming was done in C/C++/Java in a Solaris environment. I run a Linux desktop at home and still do a decent amount of tinkery programming there, and I do maintain some PHP code at work. At home, I tend to use Emacs. At work, on Windows, I tend to use SciTE for PHP.

Those, compared to Visual Studio, are downright painful to code in. If I need to find the name for a random function I have to

Valve's flagship, the Source engine, is a fork of the Quake engine, if memory serves.

GoldSrc (the Half-Life 1 engine) was a fork of the Quake 1 engine. Source is a continuously upgraded version of the GoldSrc engine. The most recent version of the engine in a released game is the Left 4 Dead 2 engine, which added a few new features that weren't present in earlier versions, such as the AI Director and the fog simulation.

Having said that, at some point OpenGL was dropped from Source, and finally re-added when

So back in the Doom days, there was no such thing as DirectX. OpenGL was all there was. Of course, it was high-end cards only, no consumer stuff.

So move on up to 1996 and the 3dfx Voodoo comes out. It couldn't support full OpenGL, but Glide was based on OpenGL and it brought real 3D to consumers. DirectX was at 3.0 at this point and had no 3D. Glide, or a subset of OpenGL with a wrapper (how Quake did it) was it.

DirectX 5 came out in mid 1997 and did have 3D, but it was somewhat basic. I mean it could support what consumer cards could do, but lacked a lot that OpenGL had. Still no real comparison.

However by 2001, DirectX 8 was out and DX was showing some real competition to OpenGL. nVidia had been doing DX and OGL as native APIs for their cards for some time, and both ran just as fast. Also now the cards had programmable vertex and pixel stages, just like the high end pro card, and in fact nVidia was selling their consumer hardware in the pro market as Quadros.

From there, DX just started pulling further and further ahead. DX10 was a major update and brought some cool new GPU features, like fully unified shaders. Software support was limited since it required Vista and games have to deal with older computers, but the GPU makers loved it. OpenGL was slow to catch up.

DX11 pushes things forward again, and again OpenGL is playing catchup and doing it in a poor fashion with extensions. Not just new graphics features either, but things like support for real multi-threaded and multi-tasking rendering. The ability to treat a GPU much like a CPU and task switch on it and so on.

Then of course there's DirectCompute, part of DirectX. GPGPU integrated into the API and the same for all vendors. Of course there is OpenCL, a similar idea, but it is not integrated the way DirectCompute is into DX.

So back when Carmack was an OpenGL fan, it was because it was the best. However, it isn't anymore, and as things have changed, so have his opinions.

Maybe it would be in Khronos' best interest to fork OpenGL into two projects? OpenGL for serious business apps, and some new API for open standards gaming?

No. We are reaching a point where casual software starts relying on 3d accel and hardware video accel. Even browsers and flash (yuck) hook up into accelerated APIs. All of these need the "serious business apps" API, not the "games" API. So regardless of what Microsoft throws at DirectX in terms of resources, the amount of resources thrown at OpenGL and OpenCL in the next five years is likely to grow significantly year on year. Probably even in excess of the resources thrown at DirectX.

Did you read the article? The gist that I took away from it is that despite the headline claiming that Carmack prefers DX, he does not prefer it enough to redesign the entire codebase to leverage it. Despite all of the improvements in DX, DX is not better enough than OpenGL at this point to make it worth the transition.

It actually elevated him in my esteem. Nothing's worse than someone clinging to some technology akin to a sacred cow. If he deems it better and uses it because he considers it better, more power to him.

A good programmer (and a good project lead) will use the tools best suited to the project. Not cling to his pet technology. And everyone who ever had to serve under someone who had one such pet tool/technology/language will know the value of project leaders who can look past their favorite toy.

Why is this surprising? He has always struck me as a very smart, pragmatic engineer. If brand x widgets are now better than brand y, any good engineer will switch. His goals seem to be fairly obvious: he wants to build the fastest, most efficient graphic engine he can so he is going to pick the best tools available to do so.

His honesty is refreshing in a world of shills, ego based flame-wars, and corporate astroturfing.

'The actual innovation in graphics has definitely been driven by Microsoft in the last ten years or so,' explained AMD's GPU worldwide developer relations manager, Richard Huddy.

One would imagine that a company that develops and makes GPU accelerators would be the innovator in the field, but apparently AMD is fine with being in Microsoft's shadow.

Modern graphics hardware is nothing without a library and API to it. At least for gaming purposes (and excluding consoles, though they really aren't cutting edge by the time they're in the shops anyway), Microsoft controls what programmers can ask hardware to do for them, and therefore ultimately they control what hardware can be designed to do.

You know what you just did, /.? From now on, heck, even tonight, a Friday night no less, he will have to introduce himself at parties as 'Doom Creator' instead of waiting for either the awed silence that usually follows the mention of his name, or the alternative: THE CARMACK?!!?

Did we just fall into Digg or something... Or did the submitter just want us to *feel* old all of a sudden? GET OFF MY LAWN!!