
An anonymous reader writes "Intel's open-source Linux graphics driver is now running neck-and-neck with the Windows 8.1 driver for OpenGL performance when using the latest drivers on each platform. The NVIDIA driver has long been able to run at similar speeds on Windows and Linux given its common code-base, but the Intel Linux driver is completely separate from Intel's Windows driver, since it is open source and built on the Linux DRM and Mesa infrastructure. The Intel Linux driver still trails the Windows OpenGL driver in OpenGL 4 support."

This has little to do with the architecture and mostly to do with vendor support. This has always been a problem for non-Windows OSes. Even Apple's OpenGL isn't exactly the best in terms of performance; Linux easily outperforms it when using NVIDIA's driver.

Of course, if you go out of your way to destroy desktop Windows in pursuit of tablet market share, it becomes a self-fulfilling prophecy.

NO executive management with the high-quality management training that is standard within MS would do anything at all like make statements to destroy the market share of their current market-leading product line! Isn't that called the Ratner Effect? The only thing even comparable would be for someone to fall prey to the Osborne Effect, and of course no one with a background in management at Microsoft would ever... oh wait a minute...

They don't care if it 'fails'. After a year or two, when the fuss starts dying down, machines start breaking and needing to be replaced, and voila: the desktop monopoly kicks in, and Windows 8.whatever is a smashing success (regardless of how many users want to smash their computers). Meanwhile, Windows Phone garners whatever market share it can manage based on a trickle of developers starting to code to the Metro UI. It may not be enough to make WP a success, but without it, WP would've been a complete non-

Microsoft blew its right foot off with Windows 8.
They went to the doctor to get it reattached with Windows 8.1, only to wake up and find a second left foot attached in its place.

Unfortunately, Win7's dual-left-foot support was actually pretty good; but it was removed because you can't operate the imaginary iPad-killing tablet that Ballmer dreams about with two left feet...

That's the weird thing about Win8: Vista, while a failure, at least had the decency to founder largely because everything kept from XP was antique and everything scrapped and rebuilt was immature. Win8 started out as a product that people (at least the Windows-using ones) mostly liked, and then was systematically m

Linux is not crap on the desktop in any way; it is crap as a games platform. Not everyone (OK, the majority, but still not everyone) wants to play games all the time. Some people actually want a reliable work platform, and for that, Windows is crap.

Unless I count devices that can't show more than one window at once. Phones can't, and I accept that because of the 4-5" screen. But why can't I run two phone-sized apps side-by-side on my Nexus 7 tablet? Windows has supported showing two apps side-by-side ("Tile Vertically") since I started using Windows in the Windows 3.1 era, and it got even easier with "Snap" in Windows 7. Even Windows 8's often-ridiculed "modern UI" allows snapping a Windows Store app to a vertical column as wide as a phone's display.

That's actually not true, unless you count desktop computers as the only type of computers.

In number of installs, Linux dominates worldwide as the most installed OS, period (including Android). It dominates the supercomputer, phone, and server markets, and it has a sizable presence in the mainframe, real-time, and embedded markets.

Compare with Windows, which has the desktop on lockdown, has a small presence server-side, and is virtually non-existent, and viewed as some form of sick joke, elsewhere.

I used to swing back and forth between Windows and Linux, but honestly the past few years it's been all Windows. Windows 7 and Windows 8 are actually decent OSes, plus they run the software I need and the games and media that I want.

Oh, they've been wanting to get into the game for a long time now. Pick any point in the last 10 years and they've always been one year away from releasing a chip that could compete with ATI/AMD and NVIDIA.

Bear in mind that today's 5000 series actually does compete pretty bloody well with NVIDIA and AMD. It's near impossible to get a faster GPU into a thin-and-light laptop. The GeForce 740M is about the same speed, and the 750M is getting into power brackets that can't be put in a thin-and-light, and even then it's only about 10-20% faster than the 5200 Pro.

For me, Intel is doing a pretty impressive job of catching up. We've gone from Intel being nowhere in terms of GPU performance to being able to equal the best NVIDIA and AMD can do, at least in the power-constrained market.

We're also hitting diminishing returns with game graphics. It used to be that generational differences between games were huge, but these days can you really tell the difference between this year's shooter and last year's shooter?

To the trained eye? Sure. For the masses? No. The difference we'll see will be mostly in the little details (hair/fur, clothing [maybe we'll finally get clothing that actually behaves like clothing instead of a mesh on all characters], reflections and just generally better lighting, non-shitty water [it's coming, it's coming!], and maybe we'll finally stop using sprites for beams and the like), and in the number of things in the scene (and their detail).

For me, the big one is texture resolution. On the 360/PS3, they use a ton of post-processing effects to cover up the low-polycount models and low-resolution textures. They give a 'wow' component the way those TVs at Best Buy do when they're all set to 'vivid' mode.

I've got some bad news for ya: these new consoles don't have any custom hardware like in the past. It's all standard PC components now for the PS4 and Xbone.

Not terribly relevant: regardless of what consoles are made of, the broad outlines of what games will look like generally depend on what's within shooting distance of a console port. There may or may not be some improvements in the PC version, if yours can handle it (Skyrim's HQ texture pack, support for higher-than-TV resolutions, etc.); but if serious surgery to the game is required to get it working on a console, that is a major limiting factor.

Uhm, the graphics are the only thing you can tell apart; engine- and gameplay-wise they're pretty much the same. A big problem, though, has been that due to consoles the models have stayed at the same detail level for ages.

We're not at diminishing returns in that regard yet, though; plenty of graphical improvement could be done with double the power...

Geometry calculations scale with the cube of the number of polygons. Non-ray-tracing engines' days are numbered, and ray tracing is coming soon(tm). Ray tracing is O(1 * K), where K is large; current engines are O(n^3 * K), where K is small. n is becoming an issue.

Does that include effects like ambient occlusion (+16 rays) and shadowing (+1 ray per light source, maybe more for soft shadows)? The advertising industry already uses real-time ray-tracing systems (with a render farm in a back room), so it's only a matter of time before that technology gets squashed into the space of a console.
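For a sense of scale, here's a rough back-of-envelope in Python using the per-pixel ray counts quoted above; the resolution, frame rate, and light count are assumptions for illustration, not figures from the article:

    # Rough back-of-envelope: rays per second needed for real-time ray tracing.
    # Resolution, frame rate, and light count below are assumed, not measured.
    width, height, fps = 1920, 1080, 60      # hypothetical 1080p/60 target
    lights = 4                               # assumed number of shadow-casting lights

    rays_per_pixel = 1 + 16 + lights         # primary ray + ambient occlusion + one shadow ray per light
    pixels = width * height
    rays_per_second = pixels * rays_per_pixel * fps

    print(f"{rays_per_pixel} rays/pixel -> {rays_per_second / 1e9:.1f} billion rays/second")

Even with those modest assumptions, that's on the order of a couple of billion rays per second, which shows why real-time ray tracing is still a big ask for console-sized hardware.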

Sure, but if you're even remotely into gaming, you're not going to want a super-thin machine, because you can't get decent performance from one. There's also the issue of driver breakage: I won't even touch Radeons for that reason, and Intel's drivers are even more broken. Intel GPUs are 'OK' in a pinch, if the game is flexible enough (i.e. a quick impromptu deathmatch at the office after work), but I would never consider them viable for gaming, CAD, or other 3D graphics design work.

Intel may be looking at Linux as a "second tier" gaming platform. Steam may very well end up like the Netflix Streaming of video games, and Intel might be happy to have their CPU/GPU package in a low-end Steam box.

My bet, and Steam as well as GOG are already showing this, is that there's a pretty good market for back-catalog titles if you price them cheaply enough and make it super easy for customers to purchase/install/play. That includes curating a catalog that will run on inexpensive, modest hardware. You can also ke

Intel has been making great strides in GPU performance, especially for notebooks. This is probably primarily driven by Apple, but if you ignore the 4x MSAA problems, it's quite competitive with an nVidia 650m. And I've heard they're working on some pretty big improvements in Skylake.

Technologies like Crystalwell, and the amount of die space Intel is committing to this these days, make Intel a much more credible co

We'll see what happens when games are no longer saddled with 2005-era GPU technology. When that happens, I suspect the gap between Intel's best GPUs and NVIDIA's/AMD's midrange will widen considerably. You can already see this in the graphs of that AnandTech article: as rendering demands go up, the gap widens. Intel's integrated GPUs just won't have the VRAM bandwidth that the dedicated cards provide, never mind the raw fillrate. This is because of limited die real estate and power draw require

The HD 4000 in my i5 runs FF14 at the lower range of the highest settings at damn near 30 FPS, and that's with PhysX at the highest settings for both characters and NPCs, @ 1366x768. I could easily get 40+ FPS by dropping quality down to medium / medium-low, but it wouldn't look near as nice, and it is more than smooth enough as it stands.

Not too shabby for a laptop not intended for gaming, especially when ATI/AMD has bitten everyone in the ass with their "equivalent" laptop GPUs that shit the bed and couldn't run anywh

Hence why even non-gamers were so excited about Valve's gambit. Even with the few games released so far, it has brought tons of much-needed development effort to the areas GNU/Linux was lacking in. Imagine how things will be if SteamOS & Co. succeed and it becomes a major gaming platform. Free software purist or not, everyone is going to benefit.

Pah! Intel already has a steady history of making very good Linux display drivers. The Intel GMA hardware has had more featureful OpenGL support under Linux than on other platforms for a long time. For example, the GMA X3100 (GM965 chipset) has OpenGL 1.5 support under Windows and OS X, but OpenGL 2.1 support under Linux.

I'm just as happy as every other Linux user out there about the Intel drivers starting to get competitive... but I'm pretty sure the reason Intel's management is dropping big bucks on Linux graphics driver development is Android, not SteamOS. SteamOS is probably a pleasant nice-to-have that they get without any large amount of extra investment.

You probably mean the 4500MHD if it's a laptop. Anyway, the Linux driver for the 4500MHD should play any kind of video without breaking a sweat. I would inspect the video player and compositor to see if they are the ones performing badly. Sadly, tearing is still something you meet way too often in the Linux world. I have found that Compiz (slow) and Compton are the ones that do not tear. Mutter can also be configured to be tear-free by putting this into /etc/environment: "CLUTTER_PAINT=disable-clipped-redraws:disable-culling".

Actually, yes, you will see a benefit. Every driver revision brings new OpenGL features that the older hardware supports. So if your card can render OpenGL 4.1 on Windows, then it will eventually render OpenGL 4.1 on Linux. Optimisations that speed up the overall OpenGL implementation apply to all generations of cards, as long as they are not hardware-specific.
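If you want to check which OpenGL versions your own driver currently exposes, a quick sketch in Python (assuming glxinfo from the mesa-utils package is installed):

    # Print the OpenGL versions the current Linux driver exposes,
    # assuming glxinfo (from the mesa-utils package) is installed.
    import subprocess

    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "OpenGL version string" in line or "OpenGL core profile version string" in line:
            print(line.strip())

Re-running that after a Mesa upgrade is an easy way to see the version bump on the same hardware.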

Not to turn this into a "why Linux isn't mainstream" thread, but I see the two biggest roadblocks for Linux adoption as lack of support for gaming and MS Office. I know you can run them in Wine/a VM etc., but an easy installation of Office is needed, at least currently; so many schools and businesses require it. Also, a lot of people who would love to run Linux exclusively are also gamers. I had my wife running on Linux, but she would get Office files for her Girl Scout troop and school (she was in college at

The headline is bad/misleading - many of those benchmarks show a disparity of more than 10% between the drivers. Using the numbers from the Phoronix article, and taking each Linux result as the highest number from any Linux driver (there are many cases where the most recent driver was not the best) to give the headline its best chance:

Windows wins by 13.1%
Windows wins by 13.1%
Linux wins by 18.1%
Windows wins by 13.1%
Linux wins by 2.1%
Windows wins by 1.1%
Linux wins by 2.1%
Windows wins by 1.1%
Windows wins by 4.1%
Windows wins by 1.1%
Windows wins by 15.1%
Windows wins by 19.1%

So out of 12 results, 5 showed a 10%+ difference between the Linux and Windows Intel drivers in favour of Windows, and 1 showed a 10%+ difference in favour of Linux. The conclusion that the drivers are neck and neck does not follow for around 40% of the results, and that's when being unfairly generous to Linux!
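To make that tally reproducible, a minimal sketch in Python using the percentage deltas exactly as quoted above (these are just the numbers in this comment, not re-checked against the Phoronix data):

    # Tally the quoted per-benchmark deltas: positive = Windows faster, negative = Linux faster.
    # These values are the ones quoted above, not re-checked against the Phoronix article.
    deltas = [13.1, 13.1, -18.1, 13.1, -2.1, 1.1, -2.1, 1.1, 4.1, 1.1, 15.1, 19.1]

    windows_big = sum(1 for d in deltas if d >= 10)    # Windows ahead by 10% or more
    linux_big = sum(1 for d in deltas if d <= -10)     # Linux ahead by 10% or more

    print(f"{len(deltas)} results: {windows_big} favour Windows by 10%+, {linux_big} favour Linux by 10%+")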

The Windows driver also had much larger spikes in the frame latency than the Intel Linux driver.

That sounds true looking at the summary statistics of the graph, but the question is at what frequency and by how much when compared to Linux on the same frames? We know the Linux statistics have at least one peak (of 45), but none are visible on that graph because the Windows values have been drawn over the top of the other values. It's a bad visualisation of the data and it's hard to learn all that much from it...
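If someone wanted to re-plot the data so neither series hides the other, something like this Python/matplotlib sketch would do it; the frame-time arrays here are placeholders, and the real numbers would have to come from the benchmark logs:

    # Sketch of a frame-latency plot where overlapping series stay visible
    # by using semi-transparent lines. The two arrays are placeholder data.
    import matplotlib.pyplot as plt

    windows_ms = [16, 17, 16, 35, 16, 18, 16, 40, 17, 16]   # placeholder frame times (ms)
    linux_ms = [17, 18, 17, 19, 45, 18, 17, 19, 18, 17]     # placeholder frame times (ms)

    plt.plot(windows_ms, label="Windows 8.1", alpha=0.6)
    plt.plot(linux_ms, label="Linux (Mesa)", alpha=0.6)
    plt.xlabel("Frame number")
    plt.ylabel("Frame time (ms)")
    plt.legend()
    plt.show()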

Do you remember the days when Linux didn't perform well in this area?

In all honesty, I don't (I remember tearing being more common, which is a slightly different i

I don't think there exists a single pro gamer who uses Intel graphics hardware.
The NVIDIA driver on Linux is more than adequate - and considering AMD/ATI's drivers are crap on Windows, it's hard to produce any meaningful comment in that area.
Also, if DirectX is so essential and magical, then why don't consoles use it?