Windows DLL-loading security flaw puts Microsoft in a bind

The rediscovery of an old attack method based on the way that Windows loads …

Last week, HD Moore, creator of the Metasploit penetration testing suite, tweeted about a newly patched iTunes flaw. The tweet said that many other (unspecified) Windows applications were susceptible to the same issue—40 at the time, but probably hundreds.

The problem has been named, or rather, renamed, "Binary Planting," and it stems from an interaction between the way Windows loads DLLs and the way it handles the "current directory." Every program on Windows has a notion of a current directory; any attempt to load a file using a relative path (that is, a path that does not start with a drive letter or a UNC-style "\\server" name) looks in the current directory for the named file. This concept is pretty universal—Unix-like systems have the same thing, called a "working directory"—and it's a decades-old feature of operating systems.

Windows, again in common with other operating systems, has the ability to load DLLs at runtime, during the execution of a program.

A systematic flaw in Windows applications certainly looks bad for Windows, but the operating system vendor is not in a position to provide a fix.

Where Windows is different from other operating systems is that it combines these two features; when a program instructs Windows to load a DLL, Windows looks in several different places for the library, including the current directory. Critically, it searches the current directory before looking in more likely locations such as the System32 directory, where most system libraries reside.

It's this that opens up the problem. When a file is opened in Windows by double-clicking it, using file associations to start up the right program, Windows sets the current directory of the newly started program to the directory that contains the file. In the course of opening the file, many programs also try to load one or more DLLs. In normal circumstances, these DLLs will be loaded from the System32 directory. However, if an attacker knows the names of any of the DLLs the program tries to load, and puts a DLL with one of those names adjacent to a data file, he can ensure that his DLL will be loaded whenever the program tries to open the data file. Programs can also change their current directory manually, so the current directory will often appear to "follow" the last data file loaded.

Hence, "binary planting." An attacker creates a data file—which can be perfectly harmless in and of itself—and "plants" a DLL in the same directory. He then entices someone into opening the data file, for example through a link on a webpage or an e-mail. The vulnerable program will then attempt to open the data file, and in so doing will load the malicious DLL, allowing the attacker to do whatever he likes.

So, for example, an MP3 and a malicious DLL are placed alongside each other. The user double-clicks the MP3, causing iTunes to start and to use the folder with the MP3 as its current directory. During its startup, iTunes loads a number of DLLs, and one of them will be the malicious DLL.

The newly reported issue is a slightly new take on this old problem; instead of placing the DLL and data file on a local or LAN disk, they're placed on Internet-accessible file servers, using either the HTTP-based WebDAV or Windows' usual SMB/CIFS file-sharing protocols. In this way, a link to a data file (with its side-by-side DLL) can be put into webpages or sent in mass e-mails.

Everything old is new again

The peculiar thing about all this is that the vulnerability has been known for a long time. The order in which directories are searched is documented, and has been for many years (that documentation dates back to 1998, and there are likely older references still), and the dangers of using the current directory for loading libraries were explicitly highlighted a decade ago. As well as warning about the dangers in the documentation, Microsoft bloggers have written about the issue in the past, telling developers how to avoid the problem.

To reduce the impact of this problem, Windows XP (and Windows 2000 Service Pack 4) changed the DLL loading behavior by introducing a new mode named "SafeDllSearchMode." With this mode enabled, the current directory is only searched after the Windows directories, rather than before. This new mode was made the default in Windows XP Service Pack 2 and all subsequent operating systems from Microsoft. It is not, however, a complete solution; if a program tries to load a DLL that does not exist in the Windows directories, it will still fall back to attempting to load from the current directory, and so will still load a malicious DLL.

To address this, Windows has, since Windows XP SP1, provided a mechanism for programs to opt out of loading DLLs from the current directory completely. This has to be something that the program explicitly chooses to do, however, as there is the possibility that some software will depend on loading libraries from the current directory; making an OS-wide change to prevent the current directory being used in this way would break those programs.

All of which adds up to a tricky situation. Programs that are vulnerable to these problems are buggy. The operating system's standard behavior may not be ideal, but it provides the ability to write safe applications—though the design is plainly poor, it's not a fatal flaw, and certainly can't be described as a bug. It's just an artifact of Windows' long history. The behavior made sense in the security-unconcerned world of single-user, un-networked 16-bit Windows, where it was first implemented, but is plainly undesirable in the modern world.

The fact that programs may legitimately depend on the behavior is another complexity: Microsoft can't easily make a unilateral decision to remove the current directory from the DLL search path, because the impact of such a change on legitimate programs could be substantial and crippling. Microsoft has been telling developers to be careful, and to make sure they do the right thing, for many years now.

This places the company in an awkward position. A systematic flaw in Windows applications certainly looks bad for Windows, but the operating system vendor is not in a position to provide a fix. Though most programs will be able to get away with disabling DLL loading from the current directory completely, that's a determination that must be left to the program's authors, not Redmond. Unfortunately, not all vendors will do this in a timely manner, or possibly even at all.

Fixing the problem

If something isn't safe by default, and requires extra development effort to be made safe, programmers are going to write unsafe programs.

Microsoft has produced a hotfix that allows system administrators to forcibly disable DLL loading from the current directory for applications of their choosing, or systemwide. This still runs the risk of breaking those applications, but this approach allows users to test that their applications keep working and apply the fix where it's harmless.

To try to maximize its usefulness, the fix is a little more complicated than simply disabling DLL loading from the current directory completely. It has three levels of protection: it can disable DLL loading from the current directory when the current directory is a WebDAV directory, it can disable it when the current directory is any kind of remote directory (whether it be WebDAV, SMB/CIFS, or anything else), and it can disable the use of the current directory for DLL loading completely. In this way, programs which legitimately need to load libraries from the current directory can still be made safe from network-based attacks.
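According to Microsoft's advisory for the hotfix, the three levels are selected through a `CWDIllegalInDllSearch` registry value; the key paths and values below are a sketch based on that advisory (the `myapp.exe` name is a placeholder), and should be checked against current documentation before use:

```
Windows Registry Editor Version 5.00

; System-wide setting (requires the hotfix to be installed):
;   1          = block DLL loads when the current directory is WebDAV
;   2          = block DLL loads when the current directory is any remote share
;   0xFFFFFFFF = block DLL loads from the current directory entirely
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager]
"CWDIllegalInDllSearch"=dword:00000002

; Per-application override for a hypothetical myapp.exe:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\myapp.exe]
"CWDIllegalInDllSearch"=dword:ffffffff
```

The middle setting (2) is the one that blocks the network-based variant of the attack while leaving local current-directory loading intact.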

This is not a perfect solution—disabling network-based libraries still permits exploitation via USB key, for example—but it does allow people to protect themselves. In conjunction with disabling WebDAV and blocking SMB at the network perimeter, it should offer reasonably robust protection against untargeted mass attacks. But it places a substantial burden on administrators to create the necessary registry entries to enable the protective behavior, and for many the only practical answer may be to enable the safe behavior system-wide and just hope it doesn't break anything.

If developers are paying attention, we should expect to see a spate of fixes that tackle this problem. In its own security bulletin, Microsoft says that it's currently investigating whether any of its programs are susceptible to the issue, and as mentioned, Apple has already updated iTunes to avoid the problem; there are sure to be many other companies with work to do for the same problem.

In that sense, the extra publicity that this old problem has been given is good news: it should raise awareness of the problem, ensuring that more developers take care to address it—and for many, it should be a simple fix, as disabling the use of the current directory for library loading will suffice.

But it does highlight a bigger issue. If something isn't safe by default, and requires extra development effort to be made safe, programmers are going to write unsafe programs. It's an issue seen time and time again with more conventional security flaws such as buffer overflows: trusting programmers to do the right thing doesn't work. They may do the right thing some of the time, perhaps even most of the time, but they won't do the right thing all of the time, and software will contain exploitable flaws as a result.

Microsoft has made great strides in eliminating common exploitable bugs from its own code, and in reducing the exploitability of those bugs that are discovered, but even these efforts have not been 100 percent successful, and bugs are still found. And as the (re)discovery of this problem shows, getting that message out to third-party developers is an uphill struggle.

The company says that it is looking into ways to make it easier for developers to avoid this mistake in future, but short of making it impossible—by removing DLL loading from the current directory entirely, or at least requiring it to be explicitly enabled—it's hard to see what the company could do to improve the situation, as for most applications the problem is already easily avoided with just a single line of code.

In a world where developers can't be trusted to follow best practices, perhaps the correct response should have been for the company to make the new hotfix opt-out, rather than opt-in; make it establish a system-wide default, and allow administrators to revert to the old behavior for those programs that require it. This means disabling documented, standard functionality, but there's little practical alternative.

Microsoft has said that it is willing to break backwards compatibility if it's necessary to solve security issues, but typically only when the backward compatible behavior is irredeemably unsafe, and not required for any legitimate purpose. This is, after all, why the company changed the search order in the past, to search the current directory after the Windows directories instead of before.

The situation here is certainly trickier—programs might quite legitimately depend on the current, dangerous behavior—but if exploitation using this attack vector becomes widespread, the company may be faced with little alternative but to break compatibility and provide bulletproof protection after all. History has shown that the ability to write safe programs is not enough: software needs to be safe by default, as painful as that may be.

Change the software. Fix it. Tell the third party vendors to update their software. Stop messing around with keeping legacy, unsecured code around for people that are too cheap to update their 10 year old software.

Apple has no problem doing it. Yes, people complain and bitch and moan...but it's for everyone's own good that Microsoft fix these problems.

Just to note: this is a flaw in any compiler which provides the possibility of dynamically linked libraries. You can do similar things in Unix, OSX or Linux. It's a basic flaw of using a system path variable to search for commonly used files, and it's a known flaw for decades, now.

This particular flaw is somewhat unique, due to the patch that Moore got it to work on remote shares, but saying that this is a systemic flaw which makes Windows look bad is just silly. More accurately, it's a systemic flaw to all operating systems which makes it ten times easier for users to actually use the damn things.

I dream of the day we are rid of DLLs. Should have been killed more than a decade ago

You really want every program that rendered HTML in 2002 to have been statically linked against MSHTML 6.0? If all you could do is statically link against a binary, then how would the OS vendor deploy security fixes to things like common controls?

So, for example, an MP3 and DLL are placed alongside each other, and when the MP3 is opened up in iTunes, iTunes loads the DLL

This sounds ridiculous. Why would iTunes load a DLL because you opened an MP3? If iTunes loaded all of its DLLs on startup, it would not be an issue. No amount of security can protect you against incompetent design. Worst case, if it really needs to load and unload DLLs when individual songs are played, it could simply call setcwd() prior to doing so, or prepend _wfullpath(argv[0]) onto the DLL name. Simple.

I use DLLs in the current directory a lot, myself. By making the DLLs open source and optional, it is a great way to make your program modular, while keeping the core from becoming bloated. Of course there's a risk that someone can replace your DLL with their own malicious one. A DLL should be treated exactly like an executable file, because that's exactly what it is.

The answer, to me, is better education. The same red flags should go up when you download a DLL or try to invoke it for the first time.

From your description it sounds like what you're saying is that apps were written in an idiotic fashion. If you're going to change your working directory to something that's not secure, you shouldn't be loading DLLs without using fully qualified paths. That said, if the OS provides a method for doing something that 9 out of 10 developers get wrong, I can't really blame the apps.

Just to note: this is a flaw in any compiler which provides the possibility of dynamically linked libraries. You can do similar things in Unix, OSX or Linux. It's a basic flaw of using a system path variable to search for commonly used files, and it's a known flaw for decades, now.

This particular flaw is somewhat unique, due to the patch that Moore got it to work on remote shares, but saying that this is a systemic flaw which makes Windows look bad is just silly. More accurately, it's a systemic flaw to all operating systems which makes it ten times easier for users to actually use the damn things.

Pretty sure I can't make a Unix program load a .so of my choosing from a location of my choosing just by having a user open a data file. Linux, for example, doesn't look in CWD for .so files. That's unique to Windows.

CWD is not PATH and is not LD_LIBRARY_PATH. CWD is something that attackers actually have a small amount of control over, if they learn that some programs change their working directory to that of the data file they're loading. Not so LD_LIBRARY_PATH or PATH.

Why not create a DLL-specific folder type that will only accept DLLs and restrict DLLs to this folder? This means the only change 3rd-party devs would have to make is creating such a folder in the app directory, moving existing DLLs to that folder, and updating the app to recognize the new folder type and how to find it.

This would prevent common data files, like MP3s, from mistakenly (or intentionally) being placed alongside DLLs. MS will have done their part, and it would be up to the devs to do theirs.

Just to note: this is a flaw in any compiler which provides the possibility of dynamically linked libraries. You can do similar things in Unix, OSX or Linux. It's a basic flaw of using a system path variable to search for commonly used files, and it's a known flaw for decades, now.

Is that the case with Unix-type OSes? My understanding was that dynamically linked libraries were searched for within specified library paths, which (by default) are owned by the root user. It would be unusual to search the current working directory, I'd have thought...

So, for example, an MP3 and DLL are placed alongside each other, and when the MP3 is opened up in iTunes, iTunes loads the DLL

This sounds ridiculous. Why would iTunes load a DLL because you opened an MP3? If iTunes loaded all of its DLLs on startup, it would not be an issue. No amount of security can protect you against incompetent design.

Loading DLLs on demand is absolutely a reasonable thing to do. Imagine you're writing a music player that supports every single codec known to man. You could A) load all the codecs that you support at startup, or B) load some small subset of common codecs and then load the rest on demand.

If you do A, your app will take forever to launch, as the DLLs containing those codecs will have to have their DllMain function executed on load, and any code pages that are touched will be paged into memory, causing a lot of disk IO. It'll also increase your active attack surface from a security standpoint.

The catch is there is no way to really tell a DLL is being malicious if it's in the current directory, without giving pop-ups for the hundreds of DLLs you load every 10 minutes to do something. I think by default any "remote" DLL loading should come with a huge red flag. However, that starts to break websites, etc., that use content delivery networks for JavaScript, Ajax, JSON scripting, etc., making it a pretty sticky situation all around. An alternative could be to signature DLLs: if the DLL's signature doesn't match the program's, then it's blocked. Not foolproof, as people will just work out collections of signatures, though. You'd almost need what video game anti-hack software does, where all applications come with a file print that checks things like file size, location, type, etc.

Or MS is just going to have to force the option out, say "stuff it" to developers, accept the cuts, and just clean its wounds and move on. I think they are trying the middle road though: give options on both, and continue to push / make it more difficult to build software with the "loop-hole" in place.

I dream of the day we are rid of DLLs. Should have been killed more than a decade ago

Yes, brilliant. Now instead of having to patch one browser to fix flaws, we should have to patch every single application that uses an embedded browser. Hundreds of 'em.

Yes, exactly. Stop farting around with patching and using tape and spit to fix this crap. Throw it all to the curb, keep what works and tell the developers to update their software. It's not impossible.

But wait, doesn't this attack first rely on someone being able to put a file on your computer, presumably in a directory that has another program in it? If this is the case, then it seems you are already screwed, because if the hacker can do this, they could also do other things as well, like just up and overwrite your notepad executable.

While this seems to be an annoyance, it doesn't seem any worse than the attacker being able to put a compromised DLL in your system32 directory in the first place. In other words, it counts on you already being primed for exploitation.

They should have fixed it years ago. They've had a whole decade to kill off the broken apps. Auto-loading DLLs from the current directory is a bug. It "made sense in win16" simply because win16 was one big bug. The OS should (reluctantly) allow programs to load DLLs from data directories, but applications should explicitly request that.

Now it's true that some program, somewhere, will rely on the automagic autoload, but -uck 'em. Sometimes you just have to break the applications. (Of course it needs to happen less often if you don't make elementary mistakes in the first place.)

So, for example, an MP3 and DLL are placed alongside each other, and when the MP3 is opened up in iTunes, iTunes loads the DLL

This sounds ridiculous. Why would iTunes load a DLL because you opened an MP3? If iTunes loaded all of its DLLs on startup, it would not be an issue. No amount of security can protect you against incompetent design. Worst case, if it really needs to load and unload DLLs when individual songs are played, it could simply call setcwd() prior to doing so, or prepend _wfullpath(argv[0]) onto the DLL name. Simple.

It's not that iTunes loads the DLL on demand. It's that iTunes was not running at the time the user clicked on a music file, which caused iTunes to load. Of course, iTunes could load the DLL dynamically only once a user requests to play the MP3, but I doubt it.

How does the DLL get into the CWD in the first place? The program files directory is protected by UAC. I guess it could happen if running out of a non-standard directory or off a USB drive, but that's not the case with iTunes. Also, couldn't this be solved if signed apps could only load signed DLLs?

So, for example, an MP3 and DLL are placed alongside each other, and when the MP3 is opened up in iTunes, iTunes loads the DLL

This sounds ridiculous. Why would iTunes load a DLL because you opened an MP3? If iTunes loaded all of its DLLs on startup, it would not be an issue. No amount of security can protect you against incompetent design. Worst case, if it really needs to load and unload DLLs when individual songs are played, it could simply call setcwd() prior to doing so, or prepend _wfullpath(argv[0]) onto the DLL name. Simple.

It's not that iTunes loads the DLL on demand. It's that iTunes was not running at the time the user clicked on a music file, which caused iTunes to load. Of course, iTunes could load the DLL dynamically only once a user requests to play the MP3, but I doubt it.

iTunes must be loading DLLs on demand (with LoadLibrary/LoadLibraryEx), because I don't believe static dependencies (imported through the executable header) search CWD.

But wait, doesn't this attack first rely on someone being able to put a file on your computer, presumably in a directory that has another program in it? If this is the case, then it seems you are already screwed, because if the hacker can do this, they could also do other things as well, like just up and overwrite your notepad executable.

While this seems to be an annoyance, it doesn't seem any worse than the attacker being able to put a compromised DLL in your system32 directory in the first place. In other words, it counts on you already being primed for exploitation.

This is exactly what I was wondering... but I'm just chalking it up to not really understanding the issue... like the whole "loading MP3 + DLL" thing. Why would it look in the directory with the MP3 instead of ITS own directory? Count me in the very confused camp.

How does the DLL get into the CWD in the first place? The program files directory is protected by UAC. I guess it could happen if running out of a non-standard directory or off a USB drive, but that's not the case with iTunes. Also, couldn't this be solved if signed apps could only load signed DLLs?

CWD can be anywhere.

This is something I should explain, I'll make an update momentarily. If you spawn a program by getting the shell to load a file (i.e. clicking in Explorer, using the default "open" handler in your browser, etc.), CWD will generally be set to be the directory that the data file resides in.

On Unix, you must add '.' to your LD_LIBRARY_PATH in order for shared objects to be loaded from the current directory. Much like running as root or other questionable security practices, this is frowned upon and is not the default (I could understand for a developer, but presumably they would know what they are doing). On Linux (not sure of other Unixes), LD_LIBRARY_PATH is reset whenever calling a setuid program, like sudo.

So for this attack to work on Linux, the user would have to manually add LD_LIBRARY_PATH to their environment (by editing their .profile). They would then have to log out of their session for the environment variable to be picked up by their WM. Then they would have to open the app, causing the malicious .so to run. However, the malicious code would still be limited to whatever privileges the user has. If you wanted the code to run as root, the user would have to edit /etc/profile as root, since LD_LIBRARY_PATH gets reset when running sudo. This requires a lot more social engineering than putting up a link to a file on some web/samba server...