I think it's generally good practice to #include the header for any types used in a CPP file, regardless of what is already included via the HPP file. So I might #include <string> in both my HPP and CPP, for example, even though I could still compile if I skipped it in the CPP. This way I don't have to worry about whether my HPP used a forward declaration or not.

Are there any tools that can enforce this #include coding style? Should I enforce this coding style?

Since the preprocessor/compiler doesn't care whether the #include is coming from the HPP or CPP, I get no feedback if I forget to follow this style.

7 Answers

This is one of those "should" rather than "shall" kinds of coding standards. The reason is that you would pretty much have to write a C++ parser to enforce it.

A very common rule for header files is that they must stand by themselves. A header file must not require that some other header files be #included before including the header in question. This is a testable requirement. Given some random header foo.hh, the following should compile and run:

#include "foo.hh"
int main () {
return 0;
}

This rule has consequences with regard to use of other classes in some header. Sometimes those consequences can be avoided by forward declaring those other classes. This isn't possible with a lot of the standard library classes. There's no way to forward declare a template instantiation such as std::string or std::vector<SomeType>. You have to #include those STL headers in the header even if the only use of the type is as an argument to a function.
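For example, a header like the following sketch (the names are hypothetical) must #include <string> even though std::string only appears in a declaration:

#include <string>   // required: std::string cannot legally be forward declared

class Gizmo;        // fine: our own class, used only by reference

std::string describe (const Gizmo& g);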

Another problem is the stuff that you incidentally drag in. For example, consider the following:
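(A minimal sketch; assume Foo is defined in foo.hh and has a Bar data member named bar.)

// foo.cc
#include "foo.hh"
#include "bar.hh"

Foo::Foo ()
    : bar ()                // invokes Bar::Bar()
{
    bar.add_item (42);      // invokes Bar::add_item(int)
}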

Here bar is a data member of class Foo, of type Bar. You've done the right thing here and #included bar.hh, even though it must already have been included by the header that defines class Foo. However, you haven't included anything used by Bar::Bar() and Bar::add_item(int), and in many cases such calls bring in additional external references.

If you analyze foo.o with a tool such as nm, it will appear that the functions in foo.cc call all kinds of things for which you haven't written the corresponding #include. So should you add #include directives for those incidental external references in foo.cc? Absolutely not: it is very hard to distinguish the functions that are called incidentally from the ones that are called directly.

Can you elaborate on this? I can't see how that would really verify order-independence.
– detly, May 21 '13 at 4:17

It doesn't really guarantee order independence -- it just ensures that #include "x.h" will work without requiring some preceding #include. That's good enough if you don't abuse #define.
– kevin cline, May 21 '13 at 4:50

If you need to enforce a rule that particular header files must stand on their own, you can use the tools you already have: create a basic makefile that compiles each header file individually without generating an object file. You can specify which mode to compile the header in (C or C++) and verify that it stands on its own. Because the compiler itself does the checking, you can reasonably trust the result: there are no false positives, and every required dependency has to be declared.
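A minimal GNU make sketch of that idea, assuming g++ and headers named *.hh (the stamp-file scheme is just one way to do it):

HEADERS := $(wildcard *.hh)

check-headers: $(HEADERS:.hh=.hh.checked)

%.hh.checked: %.hh
	g++ -std=c++17 -fsyntax-only -x c++ $<    # parse the header alone; no object file
	touch $@                                  # stamp file records a successful check

Each header is compiled as its own translation unit, so a header that cannot stand on its own fails with an ordinary compile error.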

If you are using an IDE you may still be able to accomplish this without a makefile, depending on the IDE: create an additional project, add the header files you want to verify, and change the settings to compile them as C or C++ files. In MSVC, for instance, you would change the "Item Type" setting under "Configuration Properties->General".

I don't think such a tool exists, but I would be happy if some other answer proves me wrong.

The problem with writing such a tool is that it very easily reports false results, so I estimate its net advantage to be close to zero.

The only way such a tool could work is if it could reset its symbol table to just the contents of the header file it processed, but then you run into the problem that the headers forming the external API of a library often delegate the actual declarations to internal headers.
For example, <string> in libstdc++ (GCC's standard library implementation) declares almost nothing itself; it just includes a bunch of internal headers that contain the actual declarations. If the tool reset its symbol table to just what <string> itself declared, that would be nothing.
You could have the tool differentiate between #include "" and #include <>, but that doesn't help you if an external library uses #include "" to pull its internal headers into the API.

Doesn't #pragma once achieve this? You can include something as many times as you want, either directly or via chained includes, and as long as there is a #pragma once at the top of each header, the header is only included once.

As to enforcing it, maybe you could create a build system that includes each header by itself with some dummy main function, just to ensure it compiles. Guarding chained includes with #ifdef gives the best results with that method of testing.

I would say there are both advantages and disadvantages to this convention. On the one hand it is nice to know exactly what your .cpp file is including. On the other hand, the list of includes can easily grow to a ridiculous size.

One way to encourage this convention is not to include anything in your own headers, but only in .cpp files. Then any .cpp file using your header will not compile unless you explicitly include all other headers it depends on.
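A minimal sketch of that approach, with hypothetical Widget and Gadget names:

// widget.hpp — no includes; forward declarations only
class Gadget;                   // forward declaration instead of #include

class Widget {
public:
    void attach (Gadget& g);    // a reference parameter needs no definition
};

// widget.cpp — must pull in every dependency explicitly
#include "widget.hpp"
#include "gadget.hpp"           // required here, or this file will not compile

void Widget::attach (Gadget& g) {
    // code that needs Gadget's full definition goes here
}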

Probably some reasonable compromise is in order here. For example, you may decide it is ok to include standard library headers inside your own headers, but no more.

Always #include header files in the CPP file. This not only shortens compile times significantly, but also saves you a lot of trouble if you decide to go with precompiled headers. In my experience, even going to the trouble of forward declarations is a worthwhile practice. Break the rule only when necessary.