1. Virtual bugs

I first really knew that all software was fundamentally insecure back in 2001.

I was working for an artificial life games company. We made virtual pets – amazing ones with a simulated brain, biochemistry and genetics.

I’d just built a new networked version called Creatures Docking Station. It let the cute, furry, egg-laying Norns travel through portals, crossing the Internet directly between players’ computers.

The game engine was written in C++. It was fiendishly complicated – neuron simulators for the Norn brains, a scripting language implementing all the in-game objects. About 20 people had worked on it, with varying needs, skills and time pressures.

I knew that there were bugs in it. I’d previously stress tested the code – randomly mutating Norns and force breeding them with each other in a diabolical machine, while the game was running in a debugger. It found a new crash bug every hour – I’d tap Gavin on the shoulder and get him to fix each one. We never got them all.

The symptom – the game crashing occasionally due to a mutation – wasn’t itself a world-shattering problem. No real lives were on the line. A bad user experience, but so what?

Two reasons:

1) In C++, bugs in this category let an attacker do anything they like. Much like a chainsaw, the language’s raw power comes with very little protection.

2) With the new networked game, it would let an attacker do anything they liked, remotely and automatically from across the network.

In short, a player of our game could have their machine taken over remotely – their documents deleted, spam sent, their Internet banking password sniffed (not that many people used Internet banking back then). Whatever the attacker wanted.

At the time, there was no tool or technology or budget available for me to fix this. I did what every programmer did – closed my eyes. Ignored the problem. Hoped nobody would do bad things with it.

I knew, though, that nearly all general purpose software, particularly software written in C/C++, was likely to be insecure.

2. A simple promise

Wind forward to 2015.

I’ve been worrying for a while about the long, cold cyberwar. A small part of that war is basic security of all computer systems – so it’s hard for criminals or rogue states to, say, remotely turn on your microphone without you knowing.

Linking this to my old experience with C++, and a constant flow of security vulnerabilities which could only happen in C/C++ code, I had the idea that as an industry we should stop using C/C++.

Peter had shown me how good Go is now (we use it a lot at work), making my historical reasons for using C/C++ obsolete. Suddenly, it felt possible to completely stop using those languages.

My view is that it is particularly important to sort this out now. Embedded devices are joining the internet more and more – even if you’re writing something which is standalone now, some other programmer will connect it to something in 5 or 10 years. I don’t want my physical devices to be easy to hack into. The pledge I’d really like embedded systems developers to take is to try using and improve on the new more secure toolchains.

C++ is secure now – a few people pointed out that C++14 can now be used with safe pointers and sanitisers. Others have proposed friendly dialects of C where you turn all the safe compiler options on.

In principle I’m up for this, but only if it is enforced in an explicit language variant – otherwise someone will shoot themselves in the foot later. I’m not sure it is worth it in most cases, compared to using Go or Rust. Either way, legacy C/C++ code is the really big issue.

Go or Rust flaws – a few people don’t like them, sometimes for aesthetic syntax reasons, sometimes claiming they are hard to use. I don’t think C has a particularly great syntax – I can remember trying to learn it when I was 15, and it wasn’t easy. Sure, if you don’t like them, pick something else. That doesn’t mean you have to juggle chainsaws.

Of course, these new languages still have parts written in C, at least for now. There can always be bugs in their compilers and assemblers. I don’t think this is a big problem, as those parts are a much, much smaller attack surface – albeit a valuable one.

Application binary interfaces – what can we use instead of C as the standard ABI? Pretty well all languages in the “open source” world interoperate with each other via C bindings. If you took my promise, would you still be able to write Python bindings to an existing C library? Pretending we don’t need C is just fantasy.

This is by far the best criticism. Of course, the Java and .NET worlds have spent a decade building entirely new ecosystems which strongly discourage C bindings. So it’s perfectly possible. We will need something specific to use instead. I don’t know what it should be – this needs strong leadership, maybe from the Rust people.


3 thoughts on “Promising to make software safer”

With respect to the “standard ABI” concern: the C ABI isn’t the only interop story, and maybe it’s not even the most important. HTTP and TCP/IP are definitely up there as major ways of interfacing code written in wildly different languages and styles. Taking the “network ABI” idea to heart, then, perhaps we can isolate, contain and minimize the influence of legacy programs and libraries by encapsulating them in processes or virtual-machines, communicating with them via channels. Qubes https://qubes-os.org/ is an interesting step in this direction: it uses Xen to construct per-app-instance virtual machines. You could imagine taking this even further in the direction of an object-capability-style architecture.

I don’t want to sign your promise for the moment because C still hits a sweet spot of (i) having a very good fit to operating system interfaces and so being good for systems programming; (ii) it’s possible to write high performance code if you need to; (iii) language standard is almost comprehensible; (iv) compilers are available nearly everywhere. I am working on a couple of projects at the moment where these are key requirements and so if we had to start them again we’d still choose C, I think. But I’d sign a slightly weaker promise, along the lines of “I promise not to use C or C++ for a new project unless absolutely nothing else will do.”