Posted
by
timothy
on Thursday June 07, 2012 @11:53AM
from the hey-man-we've-all-got-problems dept.

An anonymous reader writes "CERT/CC has called out AMD for having insecure video drivers. AMD/ATI video drivers are incompatible with system-wide ASLR. 'Always On' DEP combined with 'Always On' ASLR are effective exploit mitigations. However, most people don't know about 'Always On' ASLR since Microsoft had to hide it from EMET with an 'EnableUnsafeSettings' registry key — because AMD/ATI video drivers will cause a BSOD on boot if 'Always On' ASLR is enabled."

I'm always genuinely baffled by this kind of statement, to be honest. Linux has what, .5% or 1% share of an absolutely huge market. If I could fail like that it would be success beyond my wildest dreams! There are plenty of successful hardware vendors which sell relatively niche laptops, desktops, etc. with far less global market share, and they are successful businesses.

How anyone can consider anything a failure when it has millions of deployments is beyond me.

if gnu/linux was aiming to become the predominant desktop OS, displacing microsoft, then it certainly has failed

if gnu/linux was aiming to become a major player in the arena, maybe not the overall leader but boasting enough of a market percentage that it couldn't be successfully ignored or neglected by software developers and hardware OEMs, then yeah... it's probably failing at that too.

if gnu/linux was aiming to become a viable alternative to the market leaders for people who care about free software and people who care about being in full control of their own OS, well it has become a rousing success at that.

Windows succeeded in a very different market to what Linux now competes in...

DOS came bundled, and Windows was pushed as the natural progression from DOS. It had very few competitors, most of which were considerably more expensive, both for the software and for the hardware required to run it.

Linux on the other hand came much later, and is faced with a market already dominated by an incumbent player who has no interest in promoting linux as the natural progression away from their existing product.

If both Windows and Linux were introduced to the market new today, I think the story would be very different... Alternative OSs to the incumbents have a very rough time of it; commercial ones outright fail because they can't build market share fast enough to fund development (look at BeOS, etc.). Desktop Linux would have been considered a commercial failure and dropped years ago if it were a commercial product; only by being open source, and thus not dependent on revenue, has it been able to build a user base slowly and steadily.

then Nvidia has vastly improved them since the last time I used them (early 2010), at which point "they install" was about the best thing that could be said for them... as long as you add "just make sure not to update your kernel" to the end.

all fanboys suck. end of story. mint vs ubuntu, google vs apple, nintendo vs sega, fuckin coke vs pepsi... if you're on one side of an argument, and you can't see the cons of your own side as well as the pros of the other side, you don't really understand the argument and you shouldn't be speaking.

The reason AMD's drivers suck is that they only have to be as good as nVidia's, which these days is a very low bar to meet. It used to be nVidia made good drivers and that was the main reason to purchase a nVidia card, but sadly that doesn't seem to be true any longer. Instead of forcing AMD to come up to nVidia's level, nVidia chose to sink to AMD's level.

Instead of forcing AMD to come up to nVidia's level, nVidia chose to sink to AMD's level.

Well there's a reason: some brainiac at Nvidia decided to move their entire driver team from the 400/500 series drivers to the new 600 series drivers. And their secondary polish team is now doing the main driver writing. It gets worse though, as the driver writing itself has occasionally been outsourced to other companies.

There was a huge thread about this over on the evga forums. And Nvidia went nearly 6mo without anything even approaching a solid driver. Though the new 300's seem to be more or

HAHAHA! I have had so many piss-poor Nvidia cards in the last few years that I switched to AMD and haven't looked back. IMHO the last good card Nvidia made was the 8800. I have a pile of broken 2xx cards in my desk that I am looking at right now. They seem to last a few months to a year. RMA'd cards from MSI and Asus always come back and work for a few more months before failing to POST or creating graphics errors.

Things change, people. ATI drivers are not even that bad anymore. Sure they update a bit much and nag you to update, but they are stable, and I have had fewer problems with my current 6850 than with any Nvidia card since the 8800 was popular. I think the reason was poor solder, or too-heavy heatsinks warping the cards, or something on the 2xx series. Perhaps they have fixed it now with their newest cards, but fuck if I am switching back till ATI lets me down!

This is how the video card game always plays. Someone starts slipping and someone else takes the lead. For me, reliability is the best benchmark. Nvidia, as I said, has been shit, so they lost me (and everyone I recommend cards to) as a customer. My 6850 has been rock solid since I purchased it. Not even any driver crashes! And yes, I am aware that for the last 10 years ATI had crap drivers. This has been mostly fixed with their Windows 7 drivers, so that's a few years ago now.

And I've had 3 Seagates turn to shit whereas the Samsungs have never let me down... it's called luck of the draw, folks. They crank these things out like flapjacks and bad ones get out all the time, which is why I judge a company not by whether or not they put out the occasional dud but by what they DO about said dud and what kind of service you get. That is why I use Sapphire and Gigabyte cards, Gigabyte and ASRock boards, and I guess that now Samsung is gone I'll be stuck with WD hard drives. Not because these c

AMD may not have the best drivers, but I don't recall any AMD drivers that allowed me to play games and fry eggs with the same piece of hardware like Nvidia's did.

You don't seem to have looked hard enough at the AMD drivers, or developed software that uses them.

I write management software for supercomputing clusters, and GPUs are one of the shiny things right now. Mainstream GPU drivers have hooks that let us do things that admins like to know about: monitor temps, fan speeds, voltages, etc.
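For anyone curious what that sort of monitoring looks like, here's a minimal sketch (my own illustration, not the parent poster's actual code) of polling per-GPU health. It assumes the CSV query mode of a reasonably recent `nvidia-smi`; the function names and sample output below are fabricated:

```python
# Hypothetical sketch of GPU health polling for a cluster-management tool.
# The driver tooling would typically be queried with something like:
#   nvidia-smi --query-gpu=temperature.gpu,fan.speed,power.draw \
#              --format=csv,noheader,nounits
# Here we parse a hardcoded sample of that CSV shape instead of a live GPU.

def parse_gpu_csv(text):
    """Turn 'temp, fan%, watts' CSV lines into one dict per GPU."""
    gpus = []
    for line in text.strip().splitlines():
        temp, fan, power = (field.strip() for field in line.split(","))
        gpus.append({"temp_c": int(temp),
                     "fan_pct": int(fan),
                     "power_w": float(power)})
    return gpus

# Fabricated output for two GPUs:
sample = "71, 55, 182.3\n68, 52, 175.9\n"

for i, gpu in enumerate(parse_gpu_csv(sample)):
    print(f"GPU{i}: {gpu['temp_c']} C, fan {gpu['fan_pct']}%, {gpu['power_w']} W")
```

In a real cluster tool this would run periodically per node and feed whatever monitoring system the admins already use.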

The GTX 400 series was indeed very power hungry, with one GTX 480 eating nearly as much power as two of the equivalent ATI cards. I know firsthand: I have two computers with GTX 470s and they heat up the upstairs loft so much the house's AC can't keep up in the summer. Put those two computers into sleep mode, no problem.

The GTX 500 series was a significant improvement in power draw and heat dissipation.

This isn't very surprising; AMD/ATI have always had crappy drivers. I wish their fan base would stop apologising for them and demand AMD put more effort into their products.

This can't possibly be true. I've been troll moderated to death and beset by ATI fanboys at every turn, for years now, on slashdot in the past, all assuring me ATI not only has awesome drivers on EVERY platform, including Linux, but that NVIDIA is unusable and the choice of the foolish. These trolls can't possibly be wrong, can they? I mean the video quirks and game bugs, rendering problems, and kernel crashes commonly associated with various ATI video cards can't possibly be real can they?

I've never encountered any major problems with either NVIDIA's or AMD/ATI's drivers on Windows. Just a couple of weeks ago I had a GT 520 that I installed in an HTPC and just couldn't get working. Turns out it was the motherboard that wasn't compatible.

This isn't very surprising; AMD/ATI have always had crappy drivers. I wish their fan base would stop apologising for them and demand AMD put more effort into their products.

While I'm an Nvidia person in recent years, ATI has done a lot over the past year to address a number of the concerns with their software/driver package (specific application profiles; greater customization, etc...)

And, as bad as their Linux support may be, it's far better than it used to be (remember trying to enable accelerated graphics on an ATI setup in Linux ~7 years ago?).

so you're crying about fanboys, while running around fanboying. typical.

all fanboys suck. end of story. mint vs ubuntu, google vs apple, nintendo vs sega, fuckin coke vs pepsi... if you're on one side of an argument, and you can't see the cons of your own side as well as the pros of the other side, you don't really understand the argument and you shouldn't be speaking.

In comparison, and I don't know if it's the motherboard, video card, etc., but:

HDMI works flawlessly for me on my high-end AMD card, but I remember having a nightmare with an Nvidia card. However, neither of these reflects any form of reality - you could have the same problems with probably any graphics device, period, regardless of the manufacturer; it depends more on the quality of the individual binned chip and the driver/OS combination, etc.

I find it a bit concerning that we're now 4 levels deep into this and you still haven't even acknowledged my original point, you've constantly tried to turn this into some kind of comparison between other companies, something I never mentioned, something that isn't at all relevant here.

Please tell me what the link is between "AMD's drivers suck and I wish their fans would demand better drivers" and "well Nvidia sucks too!". Congrats, you're exactly the reason why AMD are laughing all the way to the bank. Why

stone2020 wasn't making excuses for AMD. Quite the contrary, he was saying that you're biased if you nail AMD to the cross for this, and give NVIDIA a pass on the things that they mess up. "I hope you complained..."

Don't just hold one company's feet to the fire. Hold the whole industry's feet to the fire. Fail to do this and you're just a fanboi.

I think you're just nitpicking. The OP pointed out that despite their crappy drivers AMD is still the better alternative if you have to choose a graphics card. Contrary to what you think that's both informative -- though arguably not very much, since most people knew that already -- and relevant to the point about AMD you've made.

When a problem with their chosen product is pointed out, they try to deflect it with criticism of the product offered by someone else. Happens all the time with videocards. The two camps have some really rabid fans who cannot accept any criticism of their chosen card and if it happens they instantly start screaming about the other vendor.

The two camps have some really rabid fans who cannot accept any criticism of their chosen card and if it happens they instantly start screaming about the other vendor.

What's wrong with "Sure, my card has problems, but your card also has problems, and here's how your card's problems are more noticeable in practice"? If bad isn't allowed to complain about worse, that's the perfect solution fallacy [wikipedia.org].

What's wrong is this isn't a discussion about AMD vs nVidia, it is a discussion about how AMD should fix their shit. To then try and deflect and say "No they shouldn't because nVidia isn't perfect," is stupid.

A discussion (a real discussion, not fanboy screaming) about the merits of the two cards is useful if someone is looking at which they might want to buy. However responding to a problem in AMD drivers with "But, but one time nVidia produced a bad driver that caused overheating!" is not productive. Tryi

The obvious point is that drivers for both premium GFX card vendors have significant problems due to all the chasing of the better performance at cost of everything else, often including system stability and compatibility.

The obvious point is that drivers for both premium GFX card vendors have significant problems due to all the chasing of the better performance at cost of everything else, often including system stability and compatibility.

Sure, new drivers can cause problems, but 'performance at any cost' is what the gaming market generally demands.

Either the gaming market has to change or Nvidia and ATI have to change their customer-base.

Look. I'm no fanboy here. I've built plenty of AMD and Intel boxes. Back in my gaming days, I would flip-flop between ATI and Nvidia cards. One thing is for certain, however: Nvidia is *not* worse. They may be more expensive or overpriced for what you get. Their drivers might even be bloated. But Nvidia drivers have always, and I mean always, been superior to anything ATI has coded. That whole .NET framework dependency (is that still a problem? I don't know) was always a major PITA when setting up a new PC.

I was on the phone to an nvidia rep about a week after that happened. They had a very very very bad time with that one.

Both companies have had their occasional spectacularly bad driver releases; in the course of years of business that is not a huge surprise.

I think this is more about AMD not bothering to keep their drivers in step with modern Windows software practices. There's a strong case for "if it works (which generally it does), don't break it," but eventually you have to keep up with technology, and AMD

Acronym Overload Detected. A summary is supposed to summarize but I couldn't tell what this story is about unless I already know.

Notice that the first reference to ASLR in the summary is actually a link to Wikipedia. If you hover over the link, you get the acronym expansion. While not as effective as expanding it in the text, it's nice to have the full Wikipedia article available in case you want to read up on it prior to digging into the article.

Why would I have to do that just because someone else decides for me what I do or don't need to read?
I'd like to decide for myself what's relevant or not. Having to change the browser ID every time to XP, Vista, 7, 2003, 2008, etc. to look up information on microsoft.com is not an option.

1) Don't start your post in the subject line, that is fucking annoying. Are you new? 2) What do you mean "the" article content? He doesn't know which content it showed him, and neither do you. But I notice you're anonymous and cowardly, so you're probably a shill as well.

Wrong, although you may have inferred that. He outright stated that the article is being altered for Linux users to omit information Microsoft doesn't think he needs; "It's very useful for microsoft to refuse to display content it decides I don't need to see." He didn't say "to refuse to display articles it decides I don't need to see" but you (and some other idiots) just added that word into the sentence as you read it.

Unfortunately expanding-out the acronyms doesn't make the summary any clearer:

"CERT/CC has called out AMD for having insecure video drivers. AMD/ATI video drivers are incompatible with system-wide Address space layout randomization (ASLR).

'Always On' Data Execution Prevention (DEP) combined with 'Always On' ASLR are effective exploit mitigations. However, most people don't know about 'Always On' ASLR since Microsoft had to hide it from the Enhanced Mitigation Experience Toolkit with an 'EnableUnsafeSettings' registry key — because AMD/ATI video drivers will cause a Blue Screen Of Death on boot if 'Always On' ASLR is enabled."

What?

Actually that helps. I didn't recognize the ASLR and DEP acronyms since there wasn't enough context to know what they were talking about. I didn't immediately recognize the term "Address Space Layout Randomization" either, but when I saw "Data Execution Prevention" it became much clearer what they were talking about.

But a little explanation would have been nice. Something like "DEP and ASLR are security mechanisms used to make it more difficult for malware to execute code or to predict the memory addresses where programs and their data are located."

Now look, I don't want to be a pissy elitist, but this is slashdot, news for nerds. If you don't know what ASLR or DEP are then you probably don't belong on this site at all; if you can't figure out how to use google to figure out what they are then you definitely don't belong here. These are not new technologies and there have been probably a dozen articles discussed here on slashdot about the relative merits of various operating systems' ASLR implementations (Windows best, MacOSX worst) and even highly de

ASLR = a way to secure your memory so it's harder for malware to run attacks.
EMET = a bunch of tools that Windows uses to secure the machine. ASLR is one of these tools.
BSOD = blue screen of death; your computer is frozen.
AMD = a company that was formerly known for making computer chips, but is now in the graphics card business.
ATI = a graphics card manufacturer that AMD bought.
DEP = another tool in the EMET toolkit.
CERT/CC = an organization that is viewed as an authority on computer stuff.

in short, AMD drivers suck so much that microsoft has to override its own computer protections to keep AMD from crashing your machine. so the drivers are not just unstable, they make your machine more vulnerable to malware. cert says, "epic fail".

Uhh, no. Windows DLLs have always been relatively addressed, and are capable of being loaded at different locations in the virtual address space (google "rebasing"). However, for performance reasons, most DLLs specify a preferred address the loader will attempt to slot them into. All system DLLs specify this, which results in their routines being loaded at predictable addresses (even across machines).

ASLR means that, on boot, a different location is chosen in the virtual address space to load DLLs into, so that system routines are not always at the same location, making certain types of security exploitation significantly harder.
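A toy simulation makes that concrete (this is my own illustration, not a model of Windows' actual loader; every address and range below is invented). A routine sits at a fixed offset from its module's base, so randomizing the base is enough to make the routine's absolute address unpredictable:

```python
import random

# Toy ASLR model: routines live at fixed offsets from a module base; ASLR
# just makes the base unpredictable. All addresses here are made up.

PAGE = 0x1000
ROUTINE_OFFSET = 0x1A30            # hypothetical offset of a system routine

def module_base(aslr_on, rng):
    if not aslr_on:
        return 0x7FF00000          # fixed preferred base: same every boot
    # Pick one of 256 page-aligned slots at random, as if chosen at boot.
    return 0x70000000 + rng.randrange(256) * PAGE

rng = random.Random(1)             # seeded so the sketch is repeatable
fixed = {module_base(False, rng) + ROUTINE_OFFSET for _ in range(1000)}
randomized = {module_base(True, rng) + ROUTINE_OFFSET for _ in range(1000)}

# Without ASLR the routine lands at one address an exploit can hardcode;
# with ASLR it could be at any of hundreds of addresses.
print(len(fixed), "possible address without ASLR")
print(len(randomized), "observed addresses with ASLR")
```

An exploit that hardcodes the non-ASLR address works every time; against the randomized bases it only works when it happens to guess the boot-time slot.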

The story is about DEP and ASLR effectiveness at blocking exploits. It has nothing to do with the title or the ATI/AMD aspect.

The CERT article mentions it, and it mentions it in that you cannot use the DEP/ASLR protections (in the kernel) because ATI/AMD make an incompatible driver. And since graphics drivers are kernel things, loading them means the kernel must disable DEP/ASLR, making your machine just that much less secure because of it.

DEP and ASLR are not a "security standard", and they cannot be exploited. They are techniques to mitigate a software exploit such that the chances of successfully executing an attack are much smaller. They're not foolproof, but then they were never designed to be. They do make things statistically more secure, however (as in, your chances of getting malware through some exploit are reduced, and it takes longer to write malware that can circumvent them).

Microsoft is constantly telling people that they won't sign their drivers unless they pass strict quality and certification standards. MS should just deny that to drivers as buggy as these are reported to be.

Oh wait... that would mean MS Is actually committed to quality as opposed to just needing an excuse to deny the little guy who wants to write some driver-level code.

Software can be made 100% secure assuming it is the only attack vector.

1. That's assuming too much and ignores reality (humans) so this is automatically bunk. But I'll take this as "credible" to discuss the next point.

2. People, like you, have claimed that you should be able to write a mathematical proof for your code, and if you can, it's secure (because supposedly it's only going to do what you tell it without error).

This totally ignores the concept of Complexity - complex (and even unexpected) behaviour

Perfect software doesn't exist and neither will it ever. If someone claims they can, make sure you know where your wallet is. It's like the koan, "If you meet the Buddha, kill him."

Exactly. One of the best examples that comes to mind is the guidance software written for the first space shuttle computer, and even that had bugs. It was also 20x more expensive than the normal going rate at the time, and technically speaking it contained only about one two-hundredth as many bugs per line of code.

Damn impressive for sure, but far from zero. It also cost half a billion 1960s dollars!

How wrong you are. Ever hear of a Simple Branch Prediction Analysis attack? We covered that back in 2006, if not earlier.

Your original comment said:

And since humans make both hardware and software, it can't be infallible. Hence why we have branch prediction, error correction, and more.

... which implies that you consider branch prediction to be a form of mitigation against errors, similar to error correction — i.e. that the reason branch prediction exists is to improve security.

A Branch Prediction Analysis attack makes use of branch prediction to break security, but that's irrelevant — it doesn't change the reasons why branch prediction existed in the first place, and it certainly doesn't turn branch prediction into a security feature.

Quantum physics makes 100% never a reality for anything. The point is that given enough brain power, one could prove something is secure. Whether or not it's a good idea to assume something is 100% secure is a whole other issue.

Proving the existence of unprovable statements within logically consistent systems doesn't prevent there from being provable ones... If you are very lucky indeed, the ones that are provable and the ones that you care about might even overlap...

It's not a proof that there are no provable statements, which would be self-contradictory.

Yes, that would be self-contradictory, but a statement that "most interesting statements are unprovable" would not be self-contradictory. Most statements about computer programs, for example, are undecidable as a consequence of the halting problem's undecidability.

"Undecidable in general" != "unprovable for any program". Or rather, just because it's not possible to write a general purpose algorithm that will tell you whether any given computer program {terminates, crashes, is secure against a particular threat} doesn't mean that, for any program I'm ever likely to write, there doesn't exist a proof. Or in other words - there exist billions of programs where it's difficult to know for certain whether they terminate - but those programs aren't relevant to determining w

Yea, whenever I relate my problems with my 4870 ATI cards and the BSOD on boot just about every time I turn on the computer (I have 50 or so .dmp files from Jan 30 to 1 today), I get jumped on as well. This has been going on through Windows XP Home and XP Pro as well as Windows 7 now. I'm hesitant to update drivers since I've killed the machine in the past with updates, forcing me to roll back the version, and now the cards aren't supported.

TFA basically gives AMD a downmod (consider it a +1 Sucks) because they do not care about supporting simple security features (which some other posters, along with their personal experiences, extrapolate to "they suck worse than Nvidia"). Making code compatible with ASLR is not complicated or time-consuming at all (I have been involved in Linux driver programming); it is just that they have not bothered with it. The result is that the simple and effective shield ASLR and DEP provide is broken.

My mistake. ASLR was enabled by default in VS2008, and could be selected in VS2005... and in the WDK for Vista and above, also by default.

The summary claims that "AlwaysOn" ASLR isn't enabled by default "because of AMD". The summary also claims that AMD drivers are unsafe and insecure. TFA claims that it isn't enabled because of "some software, including AMD". The fact is Microsoft declares the forced ASLR unsafe -for a reason-. Forcing it on at that level has no benefit for the things that already support i