Posted
by
Soulskill
on Tuesday March 04, 2014 @04:31PM
from the off-the-poorly-rendered-table dept.

An anonymous reader writes "Citing 'code we consider to be permanently "experimental" or "beta,"' Google Chrome engineers have no plans to enable video acceleration in the Chrome/Chromium web browser. Code has been written, but it remains disabled by default because 'supporting GPU features on Linux is a nightmare,' owing to the reportedly sub-par quality of Linux GPU drivers and the many different Linux distributions. Even the idea of a Linux GPU video-acceleration whitelist has been shot down, over fear that the Linux video acceleration code would cause stability issues and extra work for Chrome developers. What have been your recent experiences with Linux GPU drivers?"

The worst part is the Android app. It used to be pretty much perfect. Now it is badly broken.

I used it a lot in the car. There used to be zoom icons, but now you can only pinch to zoom. Worse still, when you pinch, the map stops following your location and sticks to the centre of the pinch, meaning it is impossible to zoom while following yourself.

They got rid of navigation without setting a destination, too. Most apps let you just drive around and use the map for speed-camera warnings or seeing traffic conditions.

Whatever it is, it fucked up printing instructions to 3 different addresses.

And why would I use Google Maps to find my room in the hotel if I was already at the hotel? I needed directions from the car rental to the hotel; it failed to give me that, instead jumbling my searches together into one incoherent mess, the like of which I have never seen before and hope never to see again. But here you are...

To me this all sounds like a lame excuse for the lack of quality in their own software. It's true that there are bugs in the kernel, in X, and elsewhere, but all other apps manage to play nice; only Chrome is playing the "poor little guy" part. Other software rants and complains when a bug turns up, but still manages to work around it and help everything get better. Linux is not the only platform with frustrating bugs that can cripple any piece of software, but it's the easy prey for anyone preparing to become a competitor.

This is the typical tactic: make people dependent on your software, then complain that one of the platforms it runs on lacks the quality to excuse poor performance, so you can let it work worse there, and then you have another excuse to push a bit more of your own platform, like the one running on Chromebooks or whatever is about to be launched.

For starters: Every game that makes use of 3D and is available for the three platforms, scientific software like Paraview [paraview.org], Slicer 3D [slicer.org], 3D rendering software like Blender [blender.org], the famous video player VLC [videolan.org],...

Typically, Linux applications work around bugs with various tricks and (mis)use of X calls (see Ilja van Sprundel's talk at 30C3).

Perhaps a standardized test-suite program that systematically tests all 3D features, in order and in combination (similar to the Acid browser tests), would help evaluate which GPUs are well supported under Linux/X. You know, trying to actively crash X in as many distinct ways as possible. Then people would be more pressured to make their drivers work properly, rather than saying "well, it works on my machine."
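For what it's worth, the seed of such a suite could be tiny. Below is a minimal sketch in C (my own, using only stock Xlib/GLX calls; it's not part of any real test suite) that just brings up a GLX context and reports which driver is behind it. Crash-hunting variants would follow the same skeleton:

    /* gl_smoke.c -- minimal GLX smoke test (illustrative sketch only).
     * Build: cc gl_smoke.c -o gl_smoke -lX11 -lGL
     */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int attribs[] = { GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

        /* A tiny window is enough to make a context current. */
        Window root = RootWindow(dpy, vi->screen);
        XSetWindowAttributes swa = { 0 };
        swa.colormap = XCreateColormap(dpy, root, vi->visual, AllocNone);
        Window win = XCreateWindow(dpy, root, 0, 0, 64, 64, 0, vi->depth,
                                   InputOutput, vi->visual, CWColormap, &swa);

        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
        if (!ctx || !glXMakeCurrent(dpy, win, ctx)) {
            fprintf(stderr, "context creation failed\n");
            return 1;
        }

        /* If we got this far without taking X down, report the stack. */
        printf("vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("version:  %s\n", (const char *)glGetString(GL_VERSION));

        glXMakeCurrent(dpy, None, NULL);
        glXDestroyContext(dpy, ctx);
        XCloseDisplay(dpy);
        return 0;
    }

If even this much takes down X on a given driver, a browser-side whitelist is the least of the problems.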

Yeah, but none of them work well in Linux. Half crash, and the other half are slow or glitchy. Let's face it, we're not going to get good 3D on Linux until a) someone makes some decent drivers, and b) X dies.

You're full of crap.

As for "slow": the highest framerates in some games have been recorded on Linux.

The only 3D software I use regularly is Blender, Slic3r, and Minecraft (and various 2D video players; not sure why you included them). They work flawlessly and glitch-free on Linux.

Well, I'm not surprised that you can configure a machine so that you get good performance under Linux. But if you pick some decent and otherwise random hardware, the chances of slow or broken 3D performance are much higher if you slap on some reasonably popular distro than if you go with Windows. It is irrelevant to most of us that there exists at least one person on the planet who has managed to run Minecraft well under Linux, as long as most of us feel a great sense of trepidation every time we run it.

Well, I'm not surprised that you can configure a machine so that you get good performance under Linux. But if you pick some decent and otherwise random hardware, the chances of slow or broken 3D performance are much higher if you slap on some reasonably popular distro than if you go with Windows.

It's a stock Lenovo with stock Ubuntu. No tweaking, hacking, or configuring needed.

My experience with NVidia and Intel has been that it "just works" recently. In 2005, 3D was a bit flaky, but back then it was a bit flaky everywhere.

I would add, however, that you missed a big one: hardware video acceleration in general quickly gets one into the world of DRM, patents, and other BigCorp-induced headaches that have been causing Linux trouble since day one. This has always been the major impediment to hardware acceleration, in the open-source drivers at least. Even the Linux binary drivers have had acceleration features stripped from them for DRM reasons.

As a Linux user for close to twenty years, I'd argue that the quality of the GPU drivers has improved remarkably over the past few years. For general desktop compositing and engineering 3D work I find the open-source radeon drivers work fine now; far better than they ever have in the past. Not gaming-quality yet, but improving all the time. This Google Chrome decision sounds to me more like the typical BigCorp excuse to avoid Linux support than a fair indictment of the current drivers.

Ideally, they'd remove the blacklist completely, and all the driver vendors would quickly fix the bugs (if there even are any).

Yeah, good luck with that... nVidia doesn't care about Linux users; Unity is currently super buggy because of poor drivers. nVidia only recently started working on Optimus support.
And using a laptop with an nVidia card is a nightmare: I constantly have artifacts, crashes, and things that misbehave.

On my work laptop I've disabled the nVidia card in the BIOS, because I wouldn't get anything done using it... The result is that I can't use external displays, etc.

Yeah, I used to have the worst problems with the nVidia drivers on my laptop (Quadro 3000M, hell yeah) until I realized that the problems were all caused by the weird dev configurations I was using. When I switched back to the latest gcc version, everything magically worked again. I think the drivers are using some weird configuration of the linker or something (maybe caused by the new linker version released a couple of years ago). So some of the driver problems are caused by the fact that we developers tinker.

I don't get this "nVidia doesn't work on Linux" stuff. It's the only video card I've ever gotten to work, well, not counting Intel, which until recently had abysmal 3D performance. I returned two ATI cards because they just killed the machine, but I've had 9 years running Nvidia on Linux. I think the problem with Nvidia on Unity is more because of Unity, which is still pretty buggy.

I remember these types of problems in the early days of Linux, only then it was audio drivers. Getting audio to work was a disaster. Video typically worked OK, but that was before nVidia and AMD were the major players. Now the tides have turned: audio works like a dream and video is what sucks ass.

I swear I've had more issues with video this last year than I did in the last 15 combined.

My experience maintaining a dual-seat Linux setup (with two NVidia cards) over the course of several years is that I absolutely avoid upgrading at almost all cost, because it ALWAYS breaks. And when I have to reboot, it is always video, or USB getting into some funky state.

When MS was developing their GPU acceleration for IE, it was a complete shitshow. Tons of very common drivers (the current ones for about half of the at-the-time dominant GeForce 8x00 series, if I remember the story right) were buggy, and would either cause glitches or just not render anything at all. A few others failed in other interesting ways, including crashing the browser.

They were able to get on NVidia's case and demand updated drivers that weren't shit, at least for that particular application. Google could presumably do the same.

Is this really something that's best fixed by expecting Nvidia/ATI/Intel to release higher-quality drivers for every distro? Or is this a distro problem, where Linux will simply never be able to handle acceleration very well because it's a constantly moving target?

It's an honest question. I'm curious to see what people involved with either Linux or GPU drivers think.

AFAIK the Mozilla folks have not had the same complaints about Linux graphics drivers, have they?

The solution is to avoid using the Google Chrome browser, unless you like being spied on all the time by Google. Load up Firefox with a completely fascist set of add-ons and do your best to browse safely.

This is the correct solution. I've been on FF since Opera abandoned the Linux community last year (I saw the writing on the wall with the yet-to-be-released Linux version of their Blink browser; not that it even matters, since all the functionality I loved Opera for died, or will die, with 12.x). Anyway, I was pleasantly surprised by how fast it's become vs. the last time I used it on Linux, which was around 2009/2010. Still not as fast as Chromium or Opera, but fast enough to do the job without making me wonder whether I should switch back.

I really just don't see why anyone would use Chrome. I never did get it. IE comes by default with Windows... so if you're too lazy or don't know what you're doing, you leave it on there... Opera has some neat, unique features... so OK... But Chrome? Really? What positive purpose does it serve? Firefox has had its issues over the years, but time and again it's proven to be the most stable, most user-friendly browser over the long term.

In my experience it's faster WRT opening, new tabs, etc. Also, FF was hogging memory pretty badly for me. Keeping FF open for a few weeks would invariably result in it using more and more memory and eventually needing to be restarted.

Chrome's process per tab model keeps it from having quite as much memory go to what Wikipedia calls "external fragmentation" and Firefox's about:memory page calls simply waste. These are pages that can't be decommitted because they have at least something left in them. Mozilla is pushing Firefox toward process-per-tab, but the Electrolysis project isn't quite done yet.

Also, you're doing it wrong. What website do you need to keep open for weeks on end that can not be bookmarked or session-saved?
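To make the parent's fragmentation point concrete, here's a toy C sketch (my own illustration, Linux-only; it is not Firefox or Chrome code): allocate a pile of roughly page-sized blocks, free almost all of them, and watch the resident set barely shrink, because each surviving block pins a page the allocator can't hand back to the kernel.

    /* frag_demo.c -- toy demonstration of external fragmentation.
     * Build: cc frag_demo.c -o frag_demo   (Linux only: reads /proc)
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* Resident set size in kB, from /proc/self/statm (pages). */
    static long resident_kb(void)
    {
        long size, resident;
        FILE *f = fopen("/proc/self/statm", "r");
        if (!f) return -1;
        if (fscanf(f, "%ld %ld", &size, &resident) != 2) resident = -1;
        fclose(f);
        return resident < 0 ? -1 : resident * (sysconf(_SC_PAGESIZE) / 1024);
    }

    int main(void)
    {
        enum { N = 50000, SZ = 4000 };  /* ~190 MB of near-page-sized blocks */
        static char *blocks[N];

        printf("at start:          %ld kB resident\n", resident_kb());

        for (int i = 0; i < N; i++) {
            blocks[i] = malloc(SZ);
            if (!blocks[i]) return 1;
            blocks[i][0] = 1;           /* touch it so the page is resident */
        }
        printf("after allocating:  %ld kB resident\n", resident_kb());

        /* Free 63 of every 64 blocks.  Each survivor keeps a page alive,
         * so most of this memory cannot be returned to the kernel --
         * the "waste" that about:memory reports. */
        for (int i = 0; i < N; i++) {
            if (i % 64 != 0) {
                free(blocks[i]);
                blocks[i] = NULL;
            }
        }
        printf("after freeing 98%%: %ld kB resident\n", resident_kb());
        return 0;
    }

On a typical glibc system the final figure stays close to the peak even though 98% of the data was freed; that lingering memory is exactly the kind of waste a process-per-tab model sidesteps, since killing the tab's process returns everything.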

Pages to which I expect to be able to refer while my laptop is disconnected from the Internet, such as while riding the city bus or while inside a building with no signal.

Firefox has had its issues over the years, but time and again it's proven to be the most stable, most user-friendly browser over the long term.

I think I switched from Firefox to Chrome around 2010. At that time, Firefox was definitely not the most stable or the fastest browser out there; Chrome was.

Switching back hasn't really been something that I'm willing to invest the time in at the moment, as it's easy to just download chrome, log in, and then have all your extensions, bookmarks, etc. come back to you.

I understand Firefox does that now, but it still requires me to find extension equivalents and migrate the data, which frankly isn't worth the effort.

Why did I choose Chrome over Firefox? Because I got sick of the memory-leak problems under Firefox. When I browse, I use a shit-ton of tabs. After about 3 days, Firefox is consuming over 1 GB of memory even after I close every single tab. If I let it go about a week, it's up to nearly 2 GB. Once the memory hits about 800 MB, it starts to hiccup/pause all the time. At its worst, I can't even watch a video on YouTube without it pausing for half a second every 5 seconds. I went through year after year of this.

Yep, and then when you reload all of those tabs:

1) Oops, those ones don't reload because you have to log back in, and then you lose your context.
2) Oops, those other tabs use server-side sessions which are now expired, so the page is no longer valid and can't be reloaded.
3) Oops, any pages that have any complex script state need to be put back into the proper state.

Not to mention that just closing the browser takes it like 5 minutes to unallocate its 2 GB of memory.

The question was asked: "I really just don't see why anyone would use Chrome... What positive purpose does it serve?" I was simply answering. Isn't that what we do here in Slashdot discussions?

As for any denial, there's nothing for me to be in denial about. I've been using Chrome as my primary browser for (I'd guess) approximately 2 years now and I've never had cause to complain about it. Like I already acknowledged, memory footprint is probably the big issue people complain about with Chrome, but that's a trade-off I can live with.

I must admit, I don't do gaming on my Linux rig, but... aren't there major 3D games being published for Linux via Humble Bundles, Steam, GOG, and no doubt others as well? Is this a support nightmare for those companies? And if not, how is it that they can work with GPUs in Linux, but the living gods of code over at Google can't hack it? I'm at work and can't be bothered to look up compelling examples, but I'm pretty sure The Witcher 2 runs on Linux, and that's a pretty GPU-intensive title. When something that demanding can ship on Linux, "the drivers are a nightmare" rings a bit hollow.

> Every linux distro has a different driver with a different level of support for the specific revision of the specific card a user has.

You mean like anyone with a Windows box?

Linux distributions are just collections of upstream projects. That includes the kernel, the user land, and anything else.

Someone comparable to me has either some version of the in-kernel driver or the Nvidia blob driver; that's the official driver from the hardware vendor. I might have a different version than someone else, but that hardly matters.

This is what we get when journalists get hold of some technical info and start waving it around with the safety off. I assumed from reading the summary that even if the functionality was "permanently disabled," if the code was already built into the browser, you would just have to find the right bits to twiddle in the binary to enable it. Although I guess that is indeed "permanent" for the vast majority of users.

Mind you, if I only turn on HW acceleration in the advanced settings panel, Gmail runs sluggishly. If I also enable your software-rendering override, then Gmail appears to run normally, but in both cases I still get the sluggish Jira pages. I'm not sure what to make of that.

When you take off your Linux zealot glasses and compare it to OS X or Windows, you find that Google is probably right not to release GPU acceleration. I have found, off and on, that a lot of silly glitches happen with GPU acceleration: artifacts are common, values not moving at the right speed... An advanced user knows how to deal with it (move a component, etc.), but for an end user it could be a major issue and turn people off the product; it is better off going without until it works properly.

Why should a company that uses Linux as a server OS adopt and support the development of GPU drivers that are not useful in their business context? The only companies who have a vested interest in doing something like that are game companies, specifically Valve.

I had to abort a Windows-to-Linux port because the Intel Linux graphics driver is BROKEN (Intel Atom N455). I spent weeks convincing a customer he was better off moving his code base to Linux, and when I finally got the OK to build a prototype, the UI was unusable. I really wish the GPU manufacturers would provide enough documentation so the open-source people could come in and fix it.

Wow, what an incredibly impressive rant. I always love it when my competence is brought into question by an Anonymous Coward.

FYI, my Linux port DOES work, just not on the specific platform that the client initially chose. I offered to explore this cost-reducing move, which was progressing swimmingly until I hit this Linux/Intel/QML OpenGL incompatibility. The only downside was that I spent some of MY time exploring the options, and we now need to stay on Windows Embedded for a little longer, until we can qualify another platform.

I've been using Linux as my primary OS for 10 years. My desktop PC does dual-boot into Windows for a few games but spends 95% of the time in Linux. I've done a bit of gaming and other graphics-intensive work under Linux without any problems. As a part-time gaming machine, there is a mid-range NVIDIA card hiding inside, and I've always used the proprietary NVIDIA drivers, which are as good as those on Windows. There was a time when installing those drivers was a bit of a pain, due to other developers trying to force their extremist political views on users, but it is a very simple process now.

Some drivers might have problems, but there is no reason they couldn't take the same approach as the Firefox developers: provide a user-controlled, easily accessible option to enable hardware acceleration... Maybe that last point shows why I don't care what Google does with Chrome on Linux or any other platform... Firefox works for me on Linux, Windows, and Android.

Agreed. I've been using Linux on the desktop since 2008-ish, drivers have been solid since right around 2010 in my case, though I can only speak for NVIDIA since that's been all that I've run in that span.

Not having Flash in Chromium was one of the many straws. This doesn't help.

I used to use a Chrome/Firefox combo to segregate my browsing/cookies. Just switched to multiple firefox profiles and added a "Close Tabs to the Right" plugin (to restore the one thing I missed about chrome). Much happier and I doubt I'll ever go back.

It's obvious that the Google GUI programmers just use Windows or Mac GUI APIs and don't know how to code.
Linux GPU code has been extremely stable. Maybe they can learn how to program from the folks at Steam? LMAO
The new Steam Appliance runs Linux.
I use a GTX 560 in a MacPro 2,1 running Linux on bare metal with NO ISSUES.

I've a fresh install of Mint 16 here on a ThinkPad with an AMD RV710, and the Mesa driver seems to be working fine. Steam games & Netflix work a treat. I haven't installed Chrome, though; its performance may suck, but Chrome is easily avoidable.

I understand that drivers == performance == competitive advantage, so the vendors want to keep SOMETHING secret, but hasn't the state of the art advanced quite a bit beyond what the vast majority of people need? Can't the vendors just release a plain-vanilla, rock-solid, super-basic driver that offers 90% of the performance? Or hell, even 50%? I mean, if I somehow managed to run Linux on a 75 MHz Pentium with 1 MB of onboard VRAM in 1998, surely I should be able to expect *some* acceptable level of performance today.

You do realize that Windows 8's failure doesn't really change anything for Linux, right? As usual, Microsoft is fighting against its own past: people are choosing between sticking to 7 or moving to 8, Linux almost never enters into the equation.

If Google is so confident that it is driver bugs causing issues, then I'm sure they can put together test code to test for and expose the bugs. In other words, instead of complaining, give the vendors code that will show them the issues and allow them to resolve them. You don't have to cover every issue - just share the code you intend to use and let the vendors fix their drivers - OR - show you where your own code is responsible.

Pretty much every successful video game developer does just that... The bugs get fixed... sometimes... someday... maybe... if the stars are aligned...

Realistically, coding against video drivers (regardless of platform) feels like web development: you have to fight countless (well-documented) bugs in each implementation until you're blue in the face, and if you're lucky, 5 years down the road, they'll get fixed.

Things change at firmware boot time that affect the OS from that point forward. The Linux ecosystem is catching up to (U)EFI, but on my slackbook pro I get no 2D/3D acceleration in EFI mode, because the video BIOS gets disabled. X11/DRM/Mesa requires the video BIOS for hardware acceleration. You can either patch ELILO (which does NOT boot on ia32 Macs anyway), use fakebios, or boot using CSM (legacy BIOS emulation). At least that way I get 2D/3D, but CSM breaks a host of other shit, so it's no better really.
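If you want to check whether your own boot path left a legacy video BIOS behind: option ROMs start with the bytes 0x55 0xAA, and the VBIOS is traditionally shadowed at physical address 0xC0000. Here's a rough C sketch (my own; it needs root and assumes the kernel permits /dev/mem reads of the first megabyte):

    /* vbios_check.c -- look for a shadowed video BIOS (illustrative sketch).
     * Build: cc vbios_check.c -o vbios_check ; run as root.
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        unsigned char sig[2];
        int fd = open("/dev/mem", O_RDONLY);
        if (fd < 0) { perror("open /dev/mem"); return 1; }

        /* Option ROMs, the video BIOS included, begin with 0x55 0xAA. */
        if (pread(fd, sig, 2, 0xC0000) != 2) {
            perror("pread");
            close(fd);
            return 1;
        }
        close(fd);

        if (sig[0] == 0x55 && sig[1] == 0xAA)
            printf("video BIOS signature found at 0xC0000\n");
        else
            printf("no video BIOS at 0xC0000 (pure-EFI boot?)\n");
        return 0;
    }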

That's.... kind of insane.

OS X/Macs don't have those problems, and OS X isn't even designed to be used with non-EFI cards. You may not get video at firmware time, but as soon as the video drivers load, they'll find the card and not care whether it's EFI or BIOS.

I don't know enough about PCI-E to know why that is, but I'm assuming all that needs to be done is for the driver to match a specific PCI-E card ID and then tell the card to start. No UEFI or BIOS involvement, beyond what could possibly just be the system firmware enumerating the bus.
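That's essentially how it works on the Linux side as well: a driver declares which vendor/device IDs it handles, the PCI core calls its probe routine when a match shows up, and the driver enables the device itself, with no firmware help needed. A minimal kernel-module sketch (the NVIDIA vendor ID 0x10de is real, but this binding is hypothetical; don't load it against hardware you care about):

    /* pci_hello.c -- skeleton PCI driver showing ID-based binding (sketch). */
    #include <linux/module.h>
    #include <linux/pci.h>

    static const struct pci_device_id hello_ids[] = {
        { PCI_DEVICE(0x10de, PCI_ANY_ID) },  /* hypothetical: any NVIDIA device */
        { 0, }
    };
    MODULE_DEVICE_TABLE(pci, hello_ids);

    static int hello_probe(struct pci_dev *pdev, const struct pci_device_id *id)
    {
        int err = pci_enable_device(pdev);   /* "tell the card to start" */
        if (err)
            return err;
        pci_set_master(pdev);                /* allow the device to do DMA */
        dev_info(&pdev->dev, "bound to %04x:%04x\n", pdev->vendor, pdev->device);
        return 0;
    }

    static void hello_remove(struct pci_dev *pdev)
    {
        pci_disable_device(pdev);
    }

    static struct pci_driver hello_driver = {
        .name     = "hello_pci",
        .id_table = hello_ids,
        .probe    = hello_probe,
        .remove   = hello_remove,
    };
    module_pci_driver(hello_driver);

    MODULE_LICENSE("GPL");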

From my own research, difficulty appears to vary by card manufacturer, linux distro, and specific task. If you pick the right distro, support is decent. If you pick the wrong distro, you spend many hours wandering the internet safari. I can sympathize with Google's position.

In the briefest terms, AMD/ATI = Hard Mode, or so it appears.

Most recently, it took me a significant part of a weekend to set up a GPU-based Dogecoin miner on Debian, using ATI cards. The first and most painful lesson was learning that the stock open-source driver wouldn't cut it for OpenCL.

Who needs it, given that CPU horsepower today is good enough, and tomorrow will be more so? Besides, how much video power do you need for your typical low-res Linux display?

So you are fine with Linux requiring gobs of CPU horsepower and delivering low video performance? Then it is the technologically worse option compared to Windows. Windows lets me squeeze more out of my hardware. Why would I use Linux anymore, then?

There was a time when I used Linux precisely because it was the faster option and gave me more power. There are still good reasons to use Linux. But this unoptimized, bloated software is now starting to appear everywhere in the Linux world. Not good.

Seriously, I've used GPUs from all three manufacturers and found every Intel and nvidia hardware/driver combination I've tried to work well in Linux, and every AMD combination to be the opposite. I wish it were not so, but it is, in my experience.

Somehow Intel is able to do this, but AMD is incapable of writing decent drivers. Great hardware is useless without software, which is why I don't own any AMD gear and, with the exception of an old PowerMac 9600, never have. I do use Nvidia and put up with their closed drivers on Linux because theirs at least function, unlike AMD's.

The Intel HD 3000 onwards are not horrible, especially if you are comparing performance per watt, which is the way the market is headed. The traditional desktop is dying, admittedly a long and protracted death.

I've never felt compelled to bother with such a setup. I have a rather large monitor. Dunno if I have room for another one like it. On the other hand, the whole "virtual workspace" thing seems to already accomplish a lot of what other people use multiple monitors for.

Perhaps someday when I am REALLY bored I will buy a couple of cards high end enough for this to matter and horse race both operating systems.

I find it hard to believe, but I've been using Nvidia for almost a decade and I see none of these problems either. I keep hearing people ranting that Linux sucks and 3D on it is broken, while I happily keep rolling on. Either I'm smarter than I think or a lot of them are trolling. Regardless, Google can suck my dick. I support those who support me. Chrome isn't that special anyway. Sure, it's faster; I've seen that, but it's not like it's 30,000% faster. I can give up a millisecond or two not to put up with Google's crap.