Posted
by
Unknown Lamer
on Wednesday September 12, 2012 @12:21PM
from the after-dnf-it-had-to-happen dept.

Mesa 3D has famously never technically been OpenGL (it lacked certification), but times are changing: "This is a great day for Mesa and open-source graphics drivers. Just a tad over a month ago, I submitted OpenGL ES 2.0 conformance test results to Khronos for Intel Sandy Bridge and Ivy Bridge GPUs with Mesa 8.0.4. There were no objections during the 30 day review period, so we are now officially conformant! Finally being on that list is pretty cool. Not only is this great news for my team at Intel, but it's terrific news for Mesa. Mesa has had a long history with OpenGL, the ARB, and Khronos. This is, however, the first time that Mesa has ever, in any way, been listed as a conformant implementation. This is a big boost to Mesa's credibility."

I actually did some tests of Photoshop's 16-bit colour depth support. Can't remember if it was PS4 or PS6 though.

Anyways, it turns out that when writing 16-bit TIFF files, it only actually included 15 bits of colour depth. I have no idea if this was an internal limitation for some reason or a flaw in their TIFF routines, but I did confirm that it was true.
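The original poster doesn't describe how they confirmed this, but here is one way such a claim could be checked (a sketch only, assuming the dropped bit shows up as an always-zero least-significant bit in the stored 16-bit samples; `effective_bit_depth` is a hypothetical helper, not anything from Photoshop or libtiff):

```python
import numpy as np

def effective_bit_depth(samples: np.ndarray) -> int:
    """Smallest b such that every sample fits in the top b of 16 bits."""
    # OR all samples together: a bit is set in the result iff it is set
    # in at least one sample, so shared trailing zero bits are "unused".
    combined = int(np.bitwise_or.reduce(samples.ravel().astype(np.uint16)))
    if combined == 0:
        return 0
    trailing_zeros = (combined & -combined).bit_length() - 1
    return 16 - trailing_zeros

# Synthetic data standing in for pixel samples read from a TIFF file:
full16 = np.arange(65536, dtype=np.uint16)                          # all 16-bit levels occur
only15 = (np.arange(32768, dtype=np.uint32) * 2).astype(np.uint16)  # LSB is always zero

print(effective_bit_depth(full16))  # 16
print(effective_bit_depth(only15))  # 15
```

Applied to real samples decoded from a Photoshop-written 16-bit TIFF, a result of 15 would match the poster's observation; a writer using a 0–32768 range rather than a zeroed LSB would need a different check (counting distinct levels instead).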

Who cares? Intel's all that matters until NVIDIA and AMD get their act together and release the full specifications / documentation / source code. And no, AMD hasn't done that. They haven't even released sufficient code to get a partial driver to run on truly free platforms. Most distributions ship non-free software, which is why it works at all. It doesn't even work that great. It's just that the gamers don't care about the issues non-free drivers create, or are too stupid to realize the root cause.

Who cares whether you run a "purely free" OS or not, unless you want to be really idealistic.

I run Linux not because it's "libre", but because it is much more pleasant to use than Windows (which lacks a lot of Unix goodness, has an ultra-annoying updating strategy, has no package manager, and has a philosophy of downloaded programs that have adware or an installer that wants to install something "extra") or Mac (ok I didn't really ever try Mac but since I like

I care, and millions of people who have AMD and Nvidia cards in their devices care (or should, if they don't). I have been running Linux for 10+ years, but I am of the belief that if vendors support proprietary drivers for their devices, I am all for it. OSS drivers haven't caught up with the hardware in over a decade, and the hardware is fast evolving, with newer players entering the scene. Even acceptable-quality OSS drivers cannot be created in a meaningful way.

Even Intel's Linux drivers suck compared to their Windows drivers. And they actually develop them!

Vendors cannot just build and give everything away - it is time people accepted that.

Your argument is out of date. Intel's Linux drivers have overtaken their Windows counterparts in speed and reliability, and routinely gain cutting-edge features ahead of the proprietary drivers.

Everyone who plays 3D games or uses their gpu for any sort of distributed computing cares. Intel graphics solutions are simply not worth having on a primary desktop and I'm damned sure not going to stop doing the things I enjoy out of some fucked up sense of idealism. Most of us, even among linux users, aren't code-hippies. Sure, we'd prefer FOSS drivers but as long as there is a working solution we really don't care.

If you want to paint yourself a little sign and go picket outside their offices, have at it. The rest of us will be over here not giving a damn.

Would have been nice if one could have some Apple-like ability (I don't know how it works on Apple machines, and I assume others have done it too) where you could switch between integrated and discrete graphics, so to speak.

Use the Intel graphics in the CPU for the desktop and playing video, but switch over to the graphics card for specific tasks, like when you run a game.

With HDMI I suppose there's no issue hooking them up together, since it's digital anyway. Whether to use an internal bus or not is optional. I guess i

Mesa was supposed to be a software implementation of OpenGL. As such, it has always sucked in terms of performance, even for software rendering. I never understood why it became a sort of wrapper for 3D hardware drivers. My understanding was that Gallium3D, some state trackers, and Wayland were going to make Mesa irrelevant. Where did that go wrong?

But the good news is that OpenGL ES is just different enough to break the vast majority of apps that use the GPU on desktop Linux if your graphics drivers only implement OpenGL ES and not OpenGL. Try running Ubuntu on an OMAP. The lack of 3D acceleration support in any app (since they all expect OpenGL) is painful.

If Intel keeps up the pace, and nvidia/ati don't want to start losing market share, they'd better follow the example. Intel is already taking over the medium-end market for non-gamers, and the low-end market for gamers, especially due to the huge power-saving differences.

I bought a motherboard with an onboard integrated Intel graphics chipset, and I've always had problems with it for everything concerning 3D (even a small transparent 2D window using the 3D chipset sometimes crashed). Finally I bought an ATI card (the cheapest model, as I don't want to play games) and I'm happy with it and the free drivers.

I have been using nothing but Intel, except for gaming, for the past few years at least. These days their integrated stuff seems fairly competent for gaming too. I base that on running Portal 2 on an integrated chipset.

I recently built a system for a friend of mine using an AsRock 4 Extreme and an 1155-based i5 CPU with Ivy Bridge.

As I am not made of money, we cheesed it and coaxed an old ATX power supply with molex plugs into working for us (with some cheap adaptors), and used the integrated video. He rarely games anyway, and we agreed it was a good foundation to work from for later purchases.

As I am not made of money, we cheesed it and coaxed an old ATX power supply with molex plugs into working for us

That's the epitome of false-economy...

You can get an 80%+ efficient Seasonic PSU shipped to you for $45. The savings in electricity makes it easily worth it, and the reliability and extremely low noise make it an extremely good investment.

The older GMA chips are pretty terrible - even on Windows the drivers frequently crash and burn, and the performance is abysmal. I'm not sure if the drivers have been open-sourced, but I don't think they have. They discontinued these long ago - just how old is your motherboard?

The newer "Intel HD" chips (integrated into the CPU, not the mobo chipset) are a completely different beast. Brand-new design, fully open-source drivers with significant Intel support, and surprisingly powerful (especially per-watt).

The older stuff, yes. Those new Core i3/i5/i7 CPUs: the integrated graphics have become VERY potent.

And yes, I have an i5 that comes with HD Graphics 3000 so I should know.

In fact, I also have an Nvidia NVS 4200M sitting in my Dell Latitude that automatically kicks in when I require 3D stuff (read: games) and battery is not a concern. So for fun I started AION on the Intel hardware, goofed around a bit, closed the application, and then started it up again on the Nvidia hardware (easy thanks to Nvidia Optimus: http://en.wikipedia.org/wiki/Nvidia_Optimus). I'm not saying AION is the new standard to test things on, but for a 3D game it looks good (lots of detail and eye-candy) and I simply happen to have it on my machine (1680x1050, with the settings on automatic).

To be honest, I'm a bit shocked to notice they both look VERY alike. The Nvidia picture seems to have more, hmm, 'powerful colours'(?), while the Intel gfx were a bit more washed out, but then again seemed to have fewer visible edges (AA?), though some of the effects looked 'simplified'. Otherwise, both maintained a steady 45+ fps, which is more than good enough for me. Wow.

Seems they come pretty close to each other, and the difference in supported DirectX version might explain the observed difference in effects. They were still there, but just slightly less... hmm... complex.

Sure, there are a lot more powerful dedicated gfx cards around (hey, laptop here!), but trust me when I say that for 95% of the market, these integrated graphics are more than sufficient! If you're in the 5% of users that 'needs' a powerful 3D processor, then by all means buy one; but claiming all Intel IGPs are a piece of shit is like saying that you don't have a decent printer if you don't go for the Xerox iGen 150!

Define "piece of shit"; the most recent Intel iGPUs outperform low-end discrete nVidia and AMD cards, and their drivers are decent these days (particularly on Linux, where they've put in a lot of effort).

I fired up Portal 2 on an Ivy Bridge Mac Air recently, at native resolution (1440x900), on the default settings (which had pretty much everything on "high"). It got ~40 FPS in typical gameplay. Not exactly setting the world on fire, but a pretty impressive showing for an iGPU.

I never really understood what Mesa was. I thought it was what you installed if you wanted software rendering of OpenGL. If you wanted hardware rendering, you installed drivers for your hardware. But now Mesa is providing hardware accelerated OpenGL? What's the point if we have open source Intel drivers?

Mesa provides a software renderer. Technically, all OpenGL implementations are supposed to also provide software implementations of functions not supported in hardware. They're supposed to fall back gracefully to software (with the lovely performance that gets you).

Mesa's software renderer means that it's easier to write drivers: you can have a fully working implementation from the start while you reimplement various features with hardware support.
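You can see which renderer Mesa actually picked, and force the software fallback by hand, with a couple of one-liners (assuming an X session and `glxinfo` from your distribution's mesa-utils package; output is environment-dependent):

```shell
# Report which renderer Mesa chose for this session
glxinfo | grep "OpenGL renderer"

# Force Mesa's software rasterizer even when a hardware driver exists;
# LIBGL_ALWAYS_SOFTWARE is a standard Mesa environment variable
LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep "OpenGL renderer"
```

On hardware with a working driver the first line names the GPU; the second typically reports a renderer string like "softpipe" or "llvmpipe" instead.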

I never really understood what Mesa was. I thought it was what you installed if you wanted software rendering of OpenGL. If you wanted hardware rendering, you installed drivers for your hardware. But now Mesa is providing hardware accelerated OpenGL? What's the point if we have open source Intel drivers?

I don't get it.

Mesa is the user-space library that talks to the driver. Some drivers (e.g. nvidia) provide their own libGL.so; others (e.g. the open-source drivers) use the libGL.so from Mesa. Mesa can also be used to render in software, and even off-screen (e.g. for batch-processing image files or video).

Here's the relevant part (sorry, editing on a phone isn't that easy...):

Now comes the fun part: modern hardware acceleration. I assume everybody already knows what OpenGL is. It's not a library; there will never be one set of sources to a libGL.so. Each vendor is supposed to provide its own libGL.so. NVIDIA provides its own implementation of OpenGL and ships its own libGL.so, based on its implementations for Windows and OS X.

If you are running open-source drivers, your libGL.so implementation probably comes from Mesa. Mesa is many things, but one of the major things it provides, the one it is most famous for, is its OpenGL implementation. It is an open-source implementation of the OpenGL API. Mesa itself has multiple backends for which it provides support. It has three CPU-based implementations: swrast (outdated and old, do not use it), softpipe (slow), llvmpipe (potentially fast). Mesa also has hardware-specific drivers. Intel supports Mesa and has built a number of drivers for their chipsets which are shipped inside Mesa. The radeon and nouveau drivers are also supported in Mesa, but are built on a different architecture: gallium.

> Each vendor is supposed to provide its own libGL.so

Wrong. Each vendor is supposed to provide a *_drv.so module which is loaded either by the X server (for indirect rendering) or by libGL (for direct rendering).

For indirect rendering, libGL doesn't need to know anything about the hardware the X server is using; it just converts GL calls to GLX protocol and sends it to the X server.

Unfortunately, nVidia just has to be different, so they provide their own libGL. At least newer versions can handle talking

MesaGL is an implementation of the GL API that can use any of several backends to do its actual work, including a couple software renderers and also hardware renderers for many Intel, AMD/ATI, and nVidia chipsets. Your distribution probably splits each renderer into its own package for historical reasons.
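A quick way to see concretely whose libGL.so a given program ends up with (a sketch, assuming a Linux box with `glxgears` installed; the resolved path varies by distribution):

```shell
# Ask the dynamic linker which libGL.so an OpenGL program will load.
# On a Mesa system this resolves to Mesa's library; with NVIDIA's
# proprietary driver installed it resolves to NVIDIA's replacement.
ldd "$(command -v glxgears)" | grep libGL
```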