So I was asked to put together a syllabus for a series of courses on the basics of secure coding, for a programming team. Though the time constraints are a bit... constraining, I'm working around that...

However, I'm coming up a bit short on relevant topics, though I feel that there should be something else. It's been a while since I've done this, so these topics are admittedly not fresh in my mind...
Note that this is only one part of a larger series; the other parts deal with all the other aspects of a security course - principles, best practices, theory, SDL, etc. This part covers only the actual coding bits.

So, for a course on Secure Coding in C, what I have so far is (for each type of attack, the course will cover what it is and how to prevent it):

Buffer overflows

  Stack overflow

  Heap overflow

Integer overflows

Format string attacks

Race conditions (TOCTOU)

“Dangerous” APIs
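For the “Dangerous” APIs item, a minimal sketch of the kind of before/after example the course could use (the function name is invented for illustration):

```c
#include <stdio.h>

/* Illustration for the "Dangerous APIs" bullet: strcpy() does no
 * bounds checking, so attacker-controlled input overruns the
 * destination buffer.  snprintf() bounds the write and reports the
 * length it *wanted* to produce, turning a silent overflow into a
 * detectable truncation. */
int copy_checked(char *dst, size_t dstlen, const char *src)
{
    /* strcpy(dst, src);   <-- the dangerous original: no length check */
    int needed = snprintf(dst, dstlen, "%s", src);
    /* A return value >= dstlen means src did not fit and was truncated. */
    return needed >= 0 && (size_t)needed < dstlen;   /* 1 = fit, 0 = truncated */
}
```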

Still waiting to hear back if databases are relevant; web issues are not.
What else would you suggest, specifically for C?

Also, I see your list is missing one important thing: dealing with pointers. I think that deserves a separate topic of its own. Pointers are one real PITA, and attention should be drawn to them from the very start.
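One concrete habit that could go under the pointers topic (a sketch; the macro name is my own, not from any standard):

```c
#include <stdlib.h>

/* NULLing a pointer immediately after free() turns a later
 * use-after-free (silently exploitable) into a reproducible
 * NULL-dereference crash, and makes an accidental double free a
 * harmless no-op, since free(NULL) is defined to do nothing. */
#define FREE_AND_NULL(p) do { free(p); (p) = NULL; } while (0)
```

The `do { ... } while (0)` wrapper is the usual trick to make the macro behave like a single statement in `if`/`else` bodies.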

I know this answer is going to be unpopular, but I would tell people to stop programming in C or any other unmanaged code. In 2011, the JIT compilers in managed frameworks optimize well enough (and raw optimization is something you shouldn't be worrying about as a developer in the first place).

In the case of brownfield, legacy dev shops: they should use a different compiler, such as a safety-enforcing one built on a neighbouring language. That will be easier than writing formal specifications in Z to be translated into Hoare logic, although Klocwork Insight (as discussed in Hacking Exposed Linux, 3rd Edition) can help with the potentially 70-year project you have just created for yourself.

It would be good to know what kind of software it is, and what infrastructure and risk management is commonly applied around it. The SATE project is a good place to see that security-focused static analysis tools and fuzzing won't help on their own. If you want software assurance, you need to change the language or go to formal methods. Those are the options.

Oh yeah, I'd love to tell them to stop using prehistoric C... in fact I have to restrain myself from saying that. But that's not really going to work... I settle for being ridiculed for suggesting they minimize some of the dirty C tricks they love: creative pointer arithmetic, mangled memory management, and of course the beloved union, which just makes my fibers scream... While this answer is very much coming from the same place as me, I have accepted that this is one battle that is hopeless... :(
– AviD♦ Apr 17 '11 at 6:55

I do not agree that everybody should stop coding in C; that's just ridiculous. People who write very speed-sensitive applications should write in C, and I really like it when these are assembly-optimized too. Such code does the same thing as its Java/.NET counterpart, but some dozens of times faster. Now, this is really the exception, so I'll +1 your answer, because unless you're writing a managed framework, a graphics engine, a kernel, a video codec, or something like that, you should really consider things like a garbage collector. But C still has a vast field of uses, and you can't deny it.
– Camilo Martin Mar 20 '12 at 15:54

At the risk of not answering the question I would certainly want to point you to David Rook's (aka @securityninja) Principles of Secure Development work over at securityninja.co.uk.

While it won't help you with the specifics of a particular language, I personally find his approach spot on. He uses the analogy of learning to drive a car: rather than teach people how to crash a car in different ways (think exploits), we teach them how to perform basic manoeuvres, etc. (think input/output encoding, as @RoryAlsop said), which should hopefully mean they avoid the crashes.

Like I say, not exactly an answer to your specific question, but hopefully a valuable resource for you and those in a similar position nonetheless.

Thanks, that is indeed a very good approach, and the one I usually take with training series. However, in this case I'm looking for the specific techniques relevant to each specific language - the general principles etc. are done separately, in prior sessions.
– AviD♦ Apr 14 '11 at 13:48

If I were teaching the class, I would first introduce the students to Smashing the Stack for Fun and Profit and show how a stack-based buffer overflow can be used to corrupt the stack frame of the function you are in and control the return address. I would get an old XP SP1 machine and OllyDbg and step through the exploitation process for the entire class to see. (Or, if you go with an aleph1-style example, you can use a modern Linux distro with the memory protection systems disabled.)
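To make the classic aleph1 examples reproduce on a current Linux distro, the modern mitigations have to be switched off explicitly; something along these lines (GCC flags are real, the file names are invented):

```shell
# Build the demo binary with the usual mitigations disabled:
#   -fno-stack-protector  : no stack canaries
#   -z execstack          : stack pages marked executable (for shellcode)
#   -no-pie               : fixed load address, predictable for the class
gcc -m32 -fno-stack-protector -z execstack -no-pie -g -o victim victim.c

# ASLR is a kernel-wide setting, disabled separately (as root):
echo 0 > /proc/sys/kernel/randomize_va_space
```

This doubles as a nice segue into the next part of the lesson: each flag you had to turn off is one of the modern defences worth covering.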

Then you should cover modern defensive systems such as ASLR, canaries, and NX zones. If you look at modern exploits that are able to work in this environment, you won't see stack-based buffer overflows; you'll see dangling pointers. A great example is the 2010 Pwn2Own exploit against IE8 on Windows 7.

I also think it is necessary to cover how these issues are found in the real world, such as fuzzing frameworks like Peach (great homework assignment!). Also cover code analysis tools like RATS (open source) and Coverity ($$$). Valgrind is also interesting, especially if you go into dangling pointers.

Hmm, dangling pointers is a good one to add. Stack smashing and such is already covered, and I don't have time to get into testing, code review, etc. Anyway, that wouldn't be in this session (as I said, this is the practical, how-to coding part - anything else is done separately).
– AviD♦ Apr 14 '11 at 21:12

@AviD♦ I was under the impression it was a semester long course.
– rook Apr 14 '11 at 21:23

Naw, it's not a college course, it's training for professional developers - a "programming team".
– AviD♦ Apr 14 '11 at 21:31

You don't mention general good programming habits: design by contract, being explicit about your functions' memory management, striving for clean interfaces. The discipline involved here may be going the extra mile, but it makes it much easier to spot potential problems.
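As a small sketch of what contract-style discipline can look like in plain C (the function and its stated contract are invented for illustration):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Contract (stated explicitly in the interface comment, the habit the
 * answer above recommends):
 *   - dst must point to at least len+1 writable bytes; caller owns it.
 *   - src must be a NUL-terminated string; caller owns it.
 *   - Postcondition: dst is always NUL-terminated.
 *   - Returns the number of bytes actually copied (<= len). */
size_t bounded_copy(char *dst, size_t len, const char *src)
{
    assert(dst != NULL);     /* precondition, checked in debug builds */
    assert(src != NULL);     /* precondition */

    size_t n = strlen(src);
    if (n > len)
        n = len;             /* never exceed the stated capacity */
    memcpy(dst, src, n);
    dst[n] = '\0';           /* enforce the postcondition */
    return n;
}
```

The security payoff is that the memory-ownership rules are written down where a reviewer can check every call site against them, rather than living in someone's head.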

Thanks, decent idea, but harder to implement as a security lecture. The memory management part I might be able to push on them, since there are so many other issues involved with that (if it were C++ on Windows I'd be pushing SAL, but it's not...). I'm open to ideas on how to deliver this effectively, though.
– AviD♦ Apr 17 '11 at 7:46

+1 any practice that reduces bugs in general is good for security, as some proportion of bugs will be security relevant.
– frankodwyer Apr 22 '11 at 8:01

On UNIX-like systems a NULL dereference can actually be exploitable in a kernel context, so it's pretty important to stress religious checking of memory allocation functions.

In userland it can only lead to crashes (unless someone actually traps SIGSEGV and the signal handler is vulnerable, but then I'm not aware of any cases where trapping SIGSEGV would be a smart thing to do).
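A minimal sketch of the "religiously check allocations" habit (`xmalloc` is a conventional name for this wrapper, not a standard function):

```c
#include <stdio.h>
#include <stdlib.h>

/* Turn a failed malloc() into an immediate, loud failure instead of a
 * NULL pointer that gets dereferenced later (potentially exploitable
 * in kernel contexts, a crash in userland). */
void *xmalloc(size_t n)
{
    void *p = malloc(n);
    if (p == NULL) {
        fprintf(stderr, "fatal: out of memory (%zu bytes)\n", n);
        exit(EXIT_FAILURE);   /* fail fast; never return NULL to callers */
    }
    return p;
}
```

Funneling all allocations through one checked wrapper also makes it trivial to grep the codebase for any stray unchecked `malloc()` calls.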

The most useful security knowledge isn't language- or platform-specific, so it is better to teach general principles of secure coding and show examples in C, rather than teach "secure coding in C".

For example, understanding the principle of 'deny by default' is a key insight that works in a myriad of contexts, languages, and platforms, and covers whole classes of vulnerabilities. Deny by default leads to input validation, which in turn is a defence against buffer overflows, format string attacks, and many others.
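A C-flavoured sketch of deny-by-default input validation (the allowed character set and length limit are illustrative policy choices, not from the post):

```c
#include <string.h>

/* Deny by default: accept a username only if EVERY character is on an
 * explicit allow-list, instead of trying to enumerate bad characters. */
int username_is_valid(const char *s)
{
    static const char allowed[] =
        "abcdefghijklmnopqrstuvwxyz"
        "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
        "0123456789_-";
    size_t len = strlen(s);
    if (len == 0 || len > 32)
        return 0;                          /* deny: bad length */
    /* strspn() counts the leading run of allowed characters; anything
     * short of the full length means an unlisted character appeared. */
    return strspn(s, allowed) == len;
}
```

The same shape generalises: every input gets a positive definition of "valid", and everything outside it is rejected without needing to anticipate the attack.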

Thanks, but as I noted above, this set is part of a larger series; all the general principles and best practices are taught in other parts. This single session is specifically about C language-specific issues and implementations.
– AviD♦ Apr 22 '11 at 12:10