The students are skeptical that turning off non-executable stacks, turning off canaries and turning off ASLR represents a realistic environment. If PaX, DEP, W^X, etc., are effective at stopping buffer overflow exploits, is there still value in learning about them?

@Fixee I'm going to be brutally honest here: if you really are questioning the idea of teaching buffer overflows, I really don't think you should be "teaching" a class about security. Considering they are one of the most basic and universal security exploits, buffer overflows should be an integral part of understanding and teaching how to attack ANY system.
–
mrnap Mar 3 '11 at 5:46

22

@mrnap I've been teaching buffer overflows and shellcode writing for 13 years. As most people in the security arena know, this was once the #1 vulnerability (5-10 years ago) and has since been supplanted by other vulnerabilities. I asked the question above to poll professionals on the continued value of teaching these techniques (which takes a lot of time) when other topics are becoming more relevant. You criticize me for even asking the question: believe me, if I'm not constantly questioning the impact of what I'm teaching, then I'm not doing my job.
–
Fixee Mar 3 '11 at 20:50

3

@mrnap We get 10 weeks to teach a security class
–
Fixee Mar 4 '11 at 4:23

18

@mrnap, whether the clear answer is a resounding "YES" or not, it is always a legitimate question to ask. In fact, someone who doesn't constantly question the current conventional wisdom is the one who shouldn't be teaching a class on security. As we all know, questioning assumptions is one of the best tools in our toolbox, and it most often leads to finding vulnerabilities.
–
AviD♦ Apr 10 '11 at 7:51

5

@avid nicely said! @mrnap, please also note the great advice in our faq: Be nice. Treat others with the same respect you’d want them to treat you. :)
–
nealmcb Apr 11 '11 at 12:12

All you need to bypass ASLR for Windows is an information disclosure vulnerability that will let you know the base address of a loaded DLL in the process (that was the first vuln that Vreugdenhil exploited). From that, you can use a ret-to-libc attack to call any function in that DLL.

The bottom line: stack (and heap) overflows are absolutely still relevant today. They're harder to exploit than they used to be, but they're still relevant.

To complement the above: in a few specific contexts buffer overflows can be easier to exploit because of memory-layout leaks, e.g. in kernels. Heartbleed was also caused by a closely related memory-safety bug, a buffer over-read, and this will happen again because we still write, and will continue writing, low-level code for hardware interfaces and OS libraries...
–
Steve DL May 23 at 10:33

Besides @Larry's and @SteveS's excellent concise answers, I want to add one very important point:

The students are skeptical that turning off non-executable stacks, turning off canaries and turning off ASLR represents a realistic environment.

Hopefully this is true for your students' systems.
In the rest of the world, however, this is unfortunately still very common. Besides the platforms that don't support these protections, there are always poorly built products that require shutting them off, older OS versions, and plain misconfigurations.
Still very realistic, sadly.

On top of all that, two more comments from an educator's point of view:
1. Somebody has to build those defenses, right?
2. Even if, hypothetically, they were right: "you only need pointers in C/C++" doesn't mean a Java developer shouldn't learn how these things work inside the computer, right?

Yes. Apart from the systems where buffer overflows lead to successful exploits, full explanations of buffer overflows are always a great way to demonstrate how you should think about security. Instead of concentrating on how the application should run, look at what can be done to make the application derail.

Also, regardless of stack execution and how many screaming canaries you install, a buffer overflow is a bug. All those security features simply alter the consequences of the bug: instead of a remote shell, you "just" get an immediate application crash. Not caring about application crashes (in particular crashes which can be triggered remotely) is, at best, very sloppy programming. Not on my watch!

For completeness, non-executable stacks and canaries do not prevent buffer overflows; they just shut off some of the easy ways to exploit buffer overflows. The traditional buffer overflow is about replacing the return address with a pointer to malicious code which is loaded as part of the data which overflowed the buffer; the malicious code gets executed when the function returns. The non-executable stack means that the attacker will not be able to place his code on the stack (he will have to arrange for a jump into some library code, e.g. the execve() implementation in standard library). The canary prevents the return address from being used if a stack buffer was overflowed (assuming the overflow is "simple": a contiguous chunk of data). But the overflow may also overwrite some other data, including function pointers (in particular in the context of C++).

Great post; a few extra points: 1. People have to learn to write less buggy code, not blindly rely on mitigations (as the author correctly indicated, stack-based buffer overflows have not been eliminated, only made harder). 2. How are they supposed to understand what DEP and ASLR do if they don't understand the attacks that made these protections necessary in the first place? 3. Looking for bugs makes them better coders, as they trust themselves and the system less, and thus pay attention to behaviors that happen not only when things work properly, but also when they go awry.
–
Marcin Mar 2 '11 at 15:00

There are already great answers here, so I won't try to re-cover them. However, since we're talking about mechanisms that prevent buffer overflow, I'd like to emphasize something that you might like to pass on to your students:

Just because a program is written in Java does not mean there are no buffer overflows. The Java Virtual Machine itself is not implemented in Java; it is implemented in (probably) C or C++, which are just as prone to buffer overflows as any other program.

I'd like to answer from the point of view of a student who has recently started learning about buffer overflow attacks in depth. I too had the same concerns and doubted the benefits of learning about buffer overflow attacks, but considering how much I have had to learn to get to the meat of it, I am very glad that I chose to do it.

I have so far had to:

Learn x86 assembly - mainly Ubuntu based

Learn some C programming

Become a ninja with gdb

Understand shell code

Scour every IT security forum I possibly can and learn the process of making software and operating systems safer

These skills are very transferable and I have no regrets. Unfortunately, I do not have someone teaching me this; I have had to rough it out with video tutorials and example code for my Master's project. Buffer overflow attacks may be less of an IT security concern than they were 5-10 years ago, but they can definitely serve as a stepping stone to learning about more complex or contemporary threats.

No, don't teach buffer overflows. Teach memory corruption. Teach exploit primitives. Yes, they need to know the history as well, but it shouldn't take up more than half of the class. I say when it comes to actual homework, give them challenges that have all of the memory protections enabled, but maybe make the vulnerable server weaker, with some memory disclosures that give hints at various offsets.

They should also understand that many memory protections are mostly gimmicks, and often the "random offsets" aren't really as random as you might think they are. They should learn to recognize bugs, and to use the bugs they have to poke at the things they know, so that they can get to the things they want.

They should be thinking in terms of overwriting function pointers rather than return addresses, and be jumping into known executable code (ret2libc, ROP) rather than jumping into shellcode on the stack/heap.

Being a college student fresh out of a C++ class (yes, I know it's not the same), I can say my professor made sure to put extra emphasis on teaching buffer overflows and how to prevent them, and I really encourage you to continue teaching buffer overflows in any programming course you teach. It can't harm your students to do so.