Undefined Behaviors in ISO-C: Their Effect on Your Embedded Software Part 1

Optimizing compilers sometimes give you...well, unexpected results. You have probably seen this before, but maybe weren't sure what happened and why.

This two-part blog is about the undefined behaviors that exist in ISO-C, the way optimizing compilers make use of them (which is often not well understood by programmers), and the unpredictable software bugs that result, which frequently show up in code attempting security checks.

Although the ISO-C language is widely used to build safety-related software, ISO-C is not a "safe" programming language. Errors are not trapped as they happen; after executing an erroneous operation, the program continues in a silently faulty way that may have observable consequences later on. Furthermore, the ISO-C standard specifies a long list of circumstances, called "undefined behaviors", in which no requirements are imposed on the behavior of the program. Compilers are not required to diagnose undefined behavior, and the compiled program is not required to do anything meaningful: it may crash, silently generate incorrect results, or coincidentally do exactly what the programmer intended.

Why does "undefined behavior" exist, and what’s good about it?

Failing to explicitly define the exact behavior of every possible program is not an error or weakness in the C language specification. Instead, it is an important feature that underpins the underlying principles of the language: impose few constraints on the programmer, allow low-level access to the underlying hardware while retaining (some) portability, and enable fast program execution and small code size.

By making the result of certain operations intentionally ambiguous, different CPU designs can be supported without sacrificing performance. Because no specific behavior is required, compilers are free to do whatever is most efficient for the target platform. For example, when adding two signed integers, the compiler does not need to verify whether the result overflows and becomes negative, nor take any action if it does.

What's bad about undefined behavior?

Undefined behavior, particularly in combination with optimizing compilers, also has a dark side: it can cause very subtle bugs with a critical impact on safety and security. Every programmer understands that dereferencing a null pointer or dividing by zero is an erroneous action that causes undefined behavior. Writing code to detect and handle such cases seems simple, but it is not. Even very experienced programmers are sometimes fooled by the precise meaning of their program when a legalistic interpretation of the semantics of the ISO-C standard is applied. Compiler developers often base their optimizations on exactly such legalistic interpretations. Sometimes the code that should detect and handle undefined behavior is "miraculously", but legally, optimized out of the executable code. I will go deeper into this in Part 2.

Other examples of undefined behavior may also be considered "easily perceived and understood", such as:

Reading from uninitialized variables

Signed integer overflow (notice that the behavior of unsigned integer overflow is defined!)

Shift equal to or greater than the width of the operand

Modifying a variable more than once in an expression

Array / buffer overflow

Pointer overflow

Violating type rules

Modifying a const variable

Negating INT_MIN

Modulo operation on a negative signed integer

Calling a library function without fulfilling the prerequisites

Data races caused by conflicting actions in different threads

The list is vast: ISO-C11 specifies 203 circumstances that cause undefined behavior. Given this large number and the subtleties involved, programmers cannot be trusted to reliably avoid undefined behavior, which can result in programs that silently misbehave.

Furthermore, misbehavior due to undefined behavior is not easy to detect using dynamic tests, since in most cases the undefined behavior is exposed only for certain inputs. As a result, code that contains undefined behavior may "work" for a while, and then "break" when ported to new hardware, or after upgrading the compiler or changing its optimization level.

What do Safety Standards say about undefined behavior?

The topic of undefined behavior is not explicitly addressed by safety standards such as ISO 26262. Most safety standards refer to other (industry) standards that provide rules for safe and secure coding, such as MISRA-C and CERT-C. On the SEI CERT website you can find an overview of all undefined behaviors, including the coding practices that mitigate each specific case.

Today some compilers, including all TASKING C/C++ compilers, detect violations of the coding practices advised by MISRA and CERT and warn the programmer accordingly. This helps ensure that the intentions of the programmer are retained in the compiled program.

In Part 2, I will show how a compiler can use undefined behavior to optimize the code and potentially outsmart the programmer in their endeavor to create safe code.

About the Author

Gerard Vink has accompanied the evolution of TASKING almost from the very beginning of the company's journey. As Head of R&D, he is responsible for compiler and debugger technology. Before joining TASKING in 1988 he worked on MCAD and computer animation software. During his more than 25 years at TASKING he has witnessed microcontrollers evolve from simple 8-bit cores into complex heterogeneous multicore systems, with the complexity of compiler and debugger technology advancing accordingly. Gerard studied mechanical engineering and computer science.