I'm Tired of Makefiles

At Cesanta, we use Makefiles to build our firmware, our libraries, and other things. But the more I use make, the less I like it. I want my incremental builds to be reliable, and I want to be able to reuse my Makefiles as needed. make can't provide that: builds just kinda-sorta work most of the time.

The most annoying issues are:

Makefile reuse is a problem

Consider: we have a library mylib, which is a separate project in its own right, with its own Makefile. The end product is a file mylib.a. So, among others, there is a rule which looks like:

mylib.a: src1.c src2.c src3.c
.... some recipe to build mylib.a ....

Now, we have a project app with a separate Makefile, and we want to use mylib.a there, and of course we want it to be up to date. How would we do that?

We can't just include ../mylib/Makefile: it contains a lot of irrelevant stuff such as unit tests, and we don't want its variable names to clash with ours.

We could add a target with a simple recipe which just invokes make properly:

mylib.a:
$(MAKE) -C ../mylib mylib.a

The obvious problem here is that there are no prerequisites, so once mylib.a exists the rule never fires again: make won't be re-invoked when we change some of mylib's sources. That's not acceptable.
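A minimal reproduction of that staleness, assuming GNU make (the single-directory layout and file names are simplifications, not the real project):

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# A rule with no prerequisites, like the one above.
printf 'mylib.a:\n\t@echo building mylib.a\n\t@touch mylib.a\n' > Makefile

first=$(make mylib.a)    # first run: the recipe fires and builds mylib.a
echo changed > src1.c    # now "change a source"...
second=$(make mylib.a)   # ...but nothing triggers a rebuild
echo "$second"           # make reports the target is up to date
```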

We could factor out prerequisites in a third Makefile, specifically and carefully designed for inclusion, and include it in both mylib/Makefile and app/Makefile, and we'll make sure all the paths are correct, etc, etc. But this is really too much work here, in the app's Makefile: we want to just use the lib. We don't want to care about how to build it.

My next idea was to make the mylib.a target phony, so that the sub-make gets invoked every time (and is a no-op if no prerequisites changed), but it's also bad: a target which depends on a phony target will be rebuilt every time, and we don't want the app target to always get rebuilt.

Ok, something that would really work is to add the make invocation to the recipe of every target which depends on mylib.
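A sketch of that pattern, with a hypothetical app target and file layout (the original example is not shown here, so the names are assumptions):

```make
# Each target that links against mylib refreshes the library itself,
# as the first step of its own recipe.
app: app.o
	$(MAKE) -C ../mylib mylib.a
	$(CC) -o app app.o ../mylib/mylib.a
```

The downside is plain: every such recipe has to repeat the invocation, and the sub-make only runs when something else already triggered that recipe.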

Automatically generated header dependencies are fragile

The usual way to track header dependencies is to let gcc generate them and include the resulting .d files into the Makefile. Then we do some refactoring during which some_header.h is renamed or deleted, or we just switch to another Git branch which doesn't have some_header.h, and the incremental build fails with:

make: *** No rule to make target 'some_header.h', needed by 'src1.o'. Stop.

We have to do a clean build.

Well, technically, it's not a problem of make as such: we could maintain all the header dependencies manually instead of using the dependency-generation facility of gcc. But it's a very common pattern, and it has severe flaws.
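For reference, the common gcc dependency-generation pattern being discussed looks roughly like this (a sketch, not the project's actual Makefile):

```make
SRCS := src1.c src2.c src3.c
OBJS := $(SRCS:.c=.o)

# -MMD makes gcc write a .d file next to each object, listing the
# headers that object actually included.
%.o: %.c
	$(CC) -MMD -c $< -o $@

# Pull the generated dependencies in; '-include' ignores missing .d files.
-include $(OBJS:.o=.d)
```

Once a .d file records some_header.h and that header disappears, make has no rule to recreate it and stops with the error above. (gcc's -MP flag, which emits a dummy phony rule for every recorded header, is the usual mitigation for exactly this failure.)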

Timestamp-based judgement on what needs to be rebuilt is unreliable

Again, it works “most of the time”, probably for most people, but it's not perfect, e.g. when one switches back and forth between Git branches. And I do that very often.

Usually it only causes unnecessary rebuilds (which is annoying but probably not critical on small-ish projects), but sometimes it can be a bigger issue: if for whatever reason some sources were copied with their timestamps preserved, make may consider stale outputs up to date and skip rebuilding them.
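Here is a minimal demonstration of the stale-output case, assuming GNU make and coreutils touch (the Makefile and file names are invented for the demo):

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# out is "built" from src by copying it.
printf 'out: src\n\tcp src out\n' > Makefile
echo v1 > src
make out                      # builds out, which now contains v1

echo v2 > src                 # the source really changed...
touch -d '2001-01-01' src     # ...but emulate a copy that preserved an old timestamp
res=$(make out)               # src is older than out, so make does nothing
echo "$res"
cat out                       # still v1: a silently stale build
```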

Conclusion

I can actually cope with a lot of the shortcomings which stuck around for historical reasons. For example, I dislike that make does not fail if the recipe of a non-phony target did not actually create the target.
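That behaviour is easy to see with GNU make (the foo target is invented for the demo):

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# A non-phony target whose recipe never creates the target file.
printf 'foo:\n\t@echo pretending to build foo\n' > Makefile

out1=$(make foo)    # the recipe runs and make exits successfully...
test ! -e foo       # ...even though foo was never created
out2=$(make foo)    # so the recipe silently runs again on every invocation
echo "$out2"
```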

I also dislike that when we declare a variable like this: FOO = foo, the variable FOO can still be overridden from the command line. My point is that if the author of a Makefile wants some variable to be overridable, they should just use FOO ?= foo. But this is also not critical: we'll write override FOO = foo when we want that (even though we want it most of the time).
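The difference between the two declarations, demonstrated with GNU make (Makefile contents are invented for the demo):

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# Plain assignment: a command-line definition wins.
printf 'FOO = foo\nall:\n\t@echo FOO is $(FOO)\n' > Makefile
a=$(make FOO=bar)
echo "$a"                 # FOO is bar

# With override: the Makefile's value wins over the command line.
printf 'override FOO = foo\nall:\n\t@echo FOO is $(FOO)\n' > Makefile
b=$(make FOO=bar)
echo "$b"                 # FOO is foo
```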

There are plenty of other issues, some of which require ancient wisdom to write correct Makefiles, but I got used to most of them. The points I elaborated above, though, are too much for me.