Something sitting on your disk

When C++ meets SAL annotations

SAL annotations, introduced by Microsoft, are an interesting attempt to improve the security and reliability of C/C++ code. They fill the gap between human-readable documentation and documentation that a compiler is able to understand. Understanding here means that the compiler can apply static analysis and check the code against common mistakes like null pointer dereferences, buffer overflows, etc. To be honest, the compiler is also able to analyze code without SAL, but the accuracy of such analysis is questionable at best. SAL annotations allow the compiler to make the analysis more accurate without producing tons of false positives. I recently had an occasion to check for myself how code analysis works in VS2012 on real-world C code. It seems to perform pretty well; for example, it was able to detect a deadlock in similar (more complicated) code:

After running code analysis on the C++ version, which uses std::mutex, VS2012 is not able to detect any flaws. It seems that the compiler does not understand C++ locking primitives, even though they use WinAPI under the hood. Fortunately, the std::mutex implementation performs runtime checks that can detect the mistake. But trading an analysis-time warning for a runtime error is not a good deal (especially in production code).

NULL dereference
Another simple example is a NULL pointer dereference. VS2012 code analysis is able to catch this kind of bug even without SAL annotations (as long as the pointer is a local variable):

char* p = 0;
if(var) p = (char*)malloc(10);   // p stays NULL when var is false
strcpy_s(p, 10, "hello");        // possible NULL dereference

As expected, the compiler shows warning C6387: 'p' could be '0'.
This code is of course in poor taste given that we are coding in C++11. I will use std::unique_ptr instead of a raw pointer, which will help to avoid memory leaks:

After running code analysis, again no flaws are detected. std::unique_ptr does not use SAL annotations, so the compiler cannot determine whether p.get() can return NULL. To avoid false positives, this situation is simply ignored.
But what if the get() method were declared with the _Maybenull_ annotation? Well, in that case you would get false positives all the time, even when p holds a correct non-null pointer (yes, I have checked it). So this is definitely not a solution.

New overload
It seems that VS2012 is at least able to deal correctly with the new operator:

dereference of memory allocated by new (std::nothrow) raises a warning

dereference of memory allocated by the default (throwing) new passes

But what if we have to use an overloaded new that does not throw? I know it is not a very good idea to overload new this way (it causes problems with STL containers), but unfortunately, in legacy code it is sometimes a must. So let's check how an overloaded new is treated:

C++ Paradox
Some wise computer scientist said that every problem in computer science can be solved by another level of indirection. Unfortunately, in the case of static code analysis this is not true: the extra level of indirection introduced by C++ abstractions (smart pointers, portable mutexes, etc.) makes static analysis much harder. There is a paradox here: you use the C++ standard library to make your code safer, only to realize that your code analysis tools have become useless. Maybe in some distant future Visual Studio will be able to verify C++ code more or less correctly. But for now, all you can do is choose the lesser evil: give up automatic code analysis, or give up the C++ standard library classes.