> Most C compilers seem to support multiple C source files as input. This
> is similar to my scheme of compiling multiple sources one after the
> other. So if the input to a compiler looks like this:
>
>     gcc -c A.c B.c C.c
>
> where each of A, B and C include the same header file H.h, then once H.h
> has been processed for A.c, the results of that (symbol tables and so
> on) could be re-used for B and C without needing to process either H.h
> or H.pch again.

But it isn't done that way: the compilation of each .c file is treated
independently. A common header file will be (re)processed for each
code file that includes it.

Also recall that the standard headers form a nesting hierarchy: some
of the more common headers indirectly pull in many others. Nesting
is also common in wizard-generated code, in C++ template libraries,
and so on.

A precompiled header contains already-digested data that can be
plugged (more or less) directly into the compiler's internal data
structures. Loading a .pch file is very often much faster than
(re)compiling its constituent source files, and the time savings grow
with the number of files involved.

It works best when you have many code files all of which include a
common *set* of headers. You extract the common includes from the
code files into a single common header file. In each code file you
then include that one common header. Then precompile it.

For a project with just a handful of files it really doesn't matter,
but on complex projects with hundreds of source files I've gotten
~80% reductions in build time by using precompiled headers. I have
seen 45-minute clean builds turn into <10 minutes.