Running Valgrind on Python Extensions

Valgrind is an invaluable tool for finding memory leaks and other memory errors. However, when debugging Python programs and C extensions, the pymalloc allocator gets in the way.

There is a Valgrind suppression file distributed with Python that gets rid of most of the false positives, but it does not give particularly good diagnostics for memory allocated through pymalloc. To properly analyse leaks, you often need to recompile Python with pymalloc disabled.
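The file ships in the Python source tree as Misc/valgrind-python.supp (with usage notes in Misc/README.valgrind). Each entry silences one class of report; the pymalloc-related entries look roughly like this (an illustrative sketch, not copied verbatim from the file):

```
{
   pymalloc address-in-range check (Invalid read of size 4)
   Memcheck:Addr4
   fun:Py_ADDRESS_IN_RANGE
}
```

You pass it on the command line, e.g. valgrind --suppressions=Misc/valgrind-python.supp python my_test.py.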

As I don’t like having to recompile Python, I took a look at Valgrind’s client API, which lets a program detect whether it is running under Valgrind. Using the client API, I was able to put together a patch that automatically disables pymalloc when appropriate. It can be found attached to bug 2422 in the Python bug tracker.

The patch still needs a bit of work before it will be mergeable into Python 2.6/3.0 (mainly autoconf foo), and I also need to do a bit more benchmarking. If the overhead of the check turns out to be negligible, it’d be pretty cool to have it enabled by default whenever Valgrind is available.

Note that even with the suppressions file in effect, Valgrind will still miss many leaks if pymalloc is active. The suppressions only hide the uninitialised-read warnings that pymalloc generates; they do nothing to restore leak tracking.

If you allocate a block with pymalloc, it is carved out of a larger allocation. If you forget to free that block, Valgrind doesn’t notice, because it only tracks the larger block that pymalloc allocated (which is still properly referenced by pymalloc’s internals).

By bypassing the pymalloc code, the patch lets Valgrind track each allocation separately, so the leak becomes evident. An alternative approach would have been to annotate the pymalloc code with Valgrind’s memory pools client API, so that Valgrind knows what Python thinks is going on inside the larger blocks pymalloc allocates. That is a fair bit more work, and I don’t know if it is worth the effort (though it’d probably be useful for debugging pymalloc itself).