I have compiled the following program on GCC 4.7.2 and the latest 4.8, using Ubuntu 12.10 with Linux kernel 3.5.0:
#include <iostream>

const int MAXD = 24;

constexpr int count(int n, int depth = 1) {
    return depth == MAXD ? n + 1
                         : count(count(n, depth + 1), depth + 1) + 1;
}

int main() {
    constexpr int i = count(0);
    std::cout << i << std::endl;
}
Both versions of GCC use over 3.3 GB of RAM within about 30 seconds. Each time I increment MAXD, the RAM usage roughly doubles, until my machine starts swapping or the kernel kills the compiler process.
The recursion never reaches a depth greater than MAXD (24 here), but the call graph is essentially a binary tree, so evaluation visits 2^MAXD - 1 nodes. Since the recursion is so shallow, constant evaluation should only need memory proportional to the depth, not to the number of calls. Clang 3.1 compiles the same program with negligible memory use.
Before posting this report, I asked on Stackoverflow, where it was suggested I report it here.
My guess is that this has something to do with unbounded memoization of constexpr calls?