The test file defines \m macros of the form \x1, \x2, \x3, etc. They are defined via \let, which costs no main memory for the definition, because the replacement text is shared with an existing macro (the empty macro \e in this case).
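
A minimal sketch of what such a test file could look like; only the names \m, \i, \e and the \let line are taken from this answer, while the \countdef setup and the tail-recursive loop are assumptions (iniTeX is assumed, so the braces need catcodes first):

```tex
\catcode`\{=1 \catcode`\}=2 % iniTeX assigns no catcodes to braces
\countdef\m=0 \countdef\i=1 % register names as used in this answer
\m=100000                   % how many macros to define
\def\e{}                    % the empty macro shared by all \let's
\i=0
\def\step{%
  \advance\i by 1
  \expandafter\let\csname x\the\i\endcsname\e
  \ifnum\i<\m \expandafter\step\fi}
\step
\end
```

The tail call via \expandafter\step\fi keeps the input stack flat, so the loop itself does not hit a stack limit even for very large \m.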

\m=0

Here is how much of TeX's memory you used:
4 strings out of 498654
30 string characters out of 6225568
1060 words of memory out of 5000000
322 multiletter control sequences out of 15000+600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
1i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s

\m=100000

Here is how much of TeX's memory you used:
100004 strings out of 498654
588925 string characters out of 6225568
1074 words of memory out of 5000000
100322 multiletter control sequences out of 15000+600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s

\m=498654

Here is how much of TeX's memory you used:
498654 strings out of 498654
3379475 string characters out of 6225568
1074 words of memory out of 5000000
498972 multiletter control sequences out of 15000+600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s

At this point pdftex --ini test already fails, because pdftex itself preallocates more strings than plain tex:

! TeX capacity exceeded, sorry [number of strings=497943].

Number of strings (max_strings)

TeX Live 2013 sets max_strings=500000 in texmf.cnf. Some strings are already taken at startup (names of primitives, ...). Single-character command names are not stored as strings, but the multiletter macro names defined by the test file are; each string is stored as an index into the string pool.
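
As an illustration (a sketch; the names \a and \abc are arbitrary examples, not from the test file):

```tex
\def\a{}   % single-character name: stored without a new string
\def\abc{} % multiletter name: "abc" goes into the string pool
           % and is counted as a multiletter control sequence
```

This is why the string count in the logs above grows by exactly one per defined \x<number> macro.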

The number of strings can be increased, e.g. (bash):

$ max_strings=1000000 tex --ini test

But then the hash-table limit is hit (\m=800000):

! TeX capacity exceeded, sorry [hash size=615000].

Hash table size (hash_extra)

Each multiletter command name is also stored in a hash table for fast lookup.
Its base size of 15000 can be increased via hash_extra (600000 in TL 2013).

$ max_strings=1000000 hash_extra=1000000 tex --ini test

now works for \m=800000:

Here is how much of TeX's memory you used:
800004 strings out of 998654
5488925 string characters out of 6225568
1074 words of memory out of 5000000
800322 multiletter control sequences out of 15000+1000000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s

Pool size (pool_size)

The memory where the string characters are actually stored is the string pool. Its size is also limited (the default for pool_size in TL 2013 is 6250000). Increasing it as well lets the test run with \m=1500000; judging from the limits in the log below, the settings were:

$ max_strings=1600000 hash_extra=1600000 pool_size=16000000 tex --ini test

Here is how much of TeX's memory you used:
1500004 strings out of 1598654
10888926 string characters out of 15975568
1076 words of memory out of 5000000
1500322 multiletter control sequences out of 15000+1600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s

Main memory (main_memory)

With non-trivial definitions, the main memory is also used. The following line

\expandafter\edef\csname x\the\i\endcsname{\the\i}%

defines the macros \x1, \x2, \x3, ... with a replacement text that contains the number itself.
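
Embedded in a tail-recursive loop this could look as follows; the loop structure is an assumption, only the \edef line itself is from this answer (\i and \m are count registers counting up to the number of macros):

```tex
% unlike \let, each \edef here stores its own replacement text
% (the current number) in main memory
\def\step{%
  \advance\i by 1
  \expandafter\edef\csname x\the\i\endcsname{\the\i}%
  \ifnum\i<\m \expandafter\step\fi}
```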

Result for \m=1000000 and

$ max_strings=1600000 hash_extra=1600000 pool_size=16000000 main_memory=10000000 tex --ini test
Here is how much of TeX's memory you used:
1000004 strings out of 1598654
6888926 string characters out of 15975568
7889966 words of memory out of 10000000
1000322 multiletter control sequences out of 15000+1600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,62b,6s stack positions out of 5000i,500n,10000p,200000b,80000s

But the main memory cannot be increased arbitrarily; e.g., main_memory=15000000 results in a fatal internal error:

Ouch---my internal constants have been clobbered!---case 14

Summary

TeX was written at a time when memory was very limited and expensive and computers were slow. Therefore Knuth wrote his own memory management, with fixed-size tables that do not grow dynamically when their maximum is reached. Some table sizes can be increased before the TeX run, but some implementation limits remain.
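
In practice these limits are raised either per run via environment variables, as above, or persistently in texmf.cnf. A sketch using the TL 2013 default values that appear in this answer:

```
% texmf.cnf excerpt (TL 2013 defaults; adjust as needed)
main_memory = 5000000   % words of main memory
max_strings = 500000    % maximum number of strings
pool_size   = 6250000   % characters in the string pool
hash_extra  = 600000    % extra hash slots beyond the base 15000
```

After editing texmf.cnf, formats (but not iniTeX runs) need to be rebuilt for the new values to take effect.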

LuaTeX

LuaTeX reimplements the core of TeX, so some of these limitations are gone. The latest example with \m=1500000 runs without a pool_size setting:

$ max_strings=1600000 hash_extra=1600000 luatex --ini test

and uses the following memory (0.76.0):

Here is how much of LuaTeX's memory you used:
1500013 strings out of 1598958
500,14696376 words of node,token memory allocated
58 words of node memory still in use:
nodes
avail lists: 2:1,6:1
1500335 multiletter control sequences out of 65536+1600000
0 fonts using 0 bytes
3i,0n,0p,61b,6s stack positions out of 5000i,500n,10000p,200000b,100000s

I'm writing a textbook for aerospace engineers with many calculation examples. The way I managed to typeset calculation results is based on this talk. I mostly use TeX-style definitions like \def\myAltitudeMT{5000}. So, as I understand your answer, if I introduce, say, 1000 new macro definitions, I should be safe with pdflatex.
– agodemar, Aug 9 '13 at 15:15


@agodemar: 1000 macro definitions are not much of a problem. But if your book had 10000 pages with 100 labels per page, you would already have a million macro definitions just for the labels.
– Heiko Oberdiek, Aug 9 '13 at 15:56


@agodemar: Modern TeX installations have default memory limits that should be hard to hit with this kind of document; if you do hit them, you can easily extend the limits. Don't worry, it will work. :-)
– Martin Schröder, Aug 14 '13 at 9:54