Composing Software Systems

Many tools have been developed over the years to automate different aspects of software assembly:

In 1977, Stuart Feldman released `make`, which has had such a long-lasting impact on the software industry that he received the 2003 ACM Software System Award for it.

In 1993, Mark Burgess began work on cfengine, one of the first tools for automating the configuration of a large number of computers. (Today, Puppet is better known in that space.)

In 1998, Steve Traugott and Joel Huddleston published "Bootstrapping an Infrastructure", arguing that a network should be treated as 'one large "virtual machine", rather than as a collection of individual hosts.' In support of this view, they relied on tools familiar to software developers, such as version control and `make`.

The pace of development has continued to increase, and there is now a bewildering array of choices for process automation tools. Unfortunately, most have missed two key insights:

1. Software assembly and system assembly require fundamentally the same processes, just at different scales.
2. If you can't reproduce your build process reliably, then you can't maintain it.
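The second insight can be illustrated with a toy sketch (not from the talk): a "build" step that bakes in ambient state, such as a timestamp, produces a different artifact every run, so it can never be reproduced byte-for-byte; a step whose output depends only on declared inputs can.

```shell
# Unreproducible: the artifact embeds the wall-clock time of the build.
printf 'built at %s\n' "$(date +%s%N)" > build1.txt
sleep 0.01
printf 'built at %s\n' "$(date +%s%N)" > build2.txt
cmp -s build1.txt build2.txt && echo reproducible || echo not-reproducible

# Reproducible: the artifact depends only on a declared input (here, a
# fixed commit id), so repeated builds are byte-for-byte identical.
printf 'built from commit %s\n' "abc123" > build3.txt
printf 'built from commit %s\n' "abc123" > build4.txt
cmp -s build3.txt build4.txt && echo reproducible || echo not-reproducible
```

The same comparison scales up: whether the "artifact" is one binary or a whole configured host, reproducibility means being able to diff two runs and get nothing.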

We'd like to encourage all developers and system administrators to follow good, reproducible engineering practices, if for no other reason than that, frankly, we don't like it when your stuff breaks while we're using it. You may have better reasons, though: if you're managing a large-scale IT operation, or a software shop with long-term support or regulatory requirements, then process failures can mean spectacular recovery costs. If your data center caught fire tomorrow, how long would it take you to bring up a new one? If your biggest customer reported a bug in a ten-year-old version of your software, could you even compile it today?

In this talk we'll present the unmet needs we've observed, connect them with experience from non-obvious sources such as functional programming research, and share our proposed solutions.