TNT Is Not TeX

The curious document above was produced sometime in the spring of 1980 by Don Knuth to show off the typographical prowess of his new programs, TeX and Metafont. The software was then being introduced to the mathematical community through the publication of TeX and Metafont: New Directions in Typesetting, and I was writing a news item about it for Scientific American. At the time it seemed like a quaint, quirky and quixotic project, worth a column of type in the magazine even if nothing came of it in the long run. I would not have guessed that 30 years later TeX would be the foundation of a huge software superstructure—and would still be a part of my own professional life.

TeX is not the oldest software still in widespread use, but it may be the most stable. In the core of the system—the typesetting engine—very little has changed since 1990. And there will be even fewer revisions going forward. The current version of TeX is 3.1415926. Knuth has decreed that on his death the version number should be set equal to π and no further changes should ever be made. “From that moment on, all ‘bugs’ will be permanent ‘features.’”

I think—though this is subject to interpretation—that what Knuth wants to protect from all future meddling is not the text of the program itself, or even the underlying algorithms and data structures, but rather its operational specification. His intent in freezing TeX is to ensure that the same input should always yield the same output. Specifically, any software that calls itself TeX is supposed to pass his TRIP test suite.

I am of two minds about this policy. Mind One agrees with Knuth’s declaration: “Let us regard these systems as fixed points, which should give the same results 100 years from now that they produce today.” It’s comforting to think that all the TeX documents I’ve written over the years will still be readable a century hence. But Mind Two reminds me that in practice I have trouble maintaining TeX documents even for a few months, much less decades or centuries. What about those presentations done with the foils class that stopped working after an upgrade and that I’ve never bothered to fix? Or the articles using the pstricks package that won’t compile under pdflatex? TeX itself may be a fixed point in the software universe, but everything else spins dizzily around it.

The skeptical Mind Two has another argument as well: Under Knuth’s edict it’s not just the TeX markup language that can’t change; it’s also the architecture of the system. Knuth created his flawless soufflés and dæmon diarrhœa at an ASCII terminal wired to a PDP-10, and the only way he could see the product of his labors was to walk down the hall and retrieve hard copy from the AlphaType machine. We are no longer accustomed to such barbarities. TeX has been hauled halfway into the world of modern computing. Front-end software such as TeXShop provides a pleasanter interface. But the core programs still run in batch mode, as they did in the Dark Ages. To make even the smallest change in a document, you still need to throw away all the existing output and run a whole file (or set of files) through the compiler tool chain. Sometimes you have to do it twice. Or four times. Isn’t this ridiculous in a world of event-driven, interactive, multithreaded software? Will we still have to press the Typeset button in 2111?

Mind One replies: Of course not. By then we’ll just throw Moore’s Law at it: Automatically rerun TeX n times for every keystroke in the editor.

At this point Mind Three pipes up. (Did I mention that I’m of three minds?) The problem here, she says, is not that we can’t or shouldn’t alter TeX. It’s the utterly depressing notion that we’re incapable of building anything better, and that TeX will still be the typesetter to beat after another century. Surely, if we just stand tippytoe on the shoulders of Don Knuth, we can see a little farther. Who was the architect who said that every great building should have a bomb in the basement, set to blow itself up after 50 years and thereby clear the land for something greater still? Let’s make a new improved TeX. We’ll call it TNT.

Minds One and Two pounce in unison: You think we haven’t thought of that? What about ε-TeX? NTS? ExTeX? What about LuaTeX…?

• • •

This trinitarian meditation was inspired by a blog post I stumbled upon last week, in which an entity named Valletta Ventures, publisher of TeXPad for the Macintosh, describes an attempt to port TeX (and also LaTeX) to the iPad, an attempt that ends in defeat. The failure could be blamed on the scrutineers at the Apple App Store, who insist that every iPad program must be bundled up in a single executable. (My current TeX /bin directory has 342 entries.) But even if we were to let Apple off the hook here, the project still seems truly quaint, quirky and quixotic. Mind One says you shouldn’t expect to run a system as large and complex as TeX on a puffed-up cellphone. But Mind Two says: Why not? The iPad probably has more computational oomph than Knuth’s 1980 PDP-10.

In the end my sympathies lie with Mind Three, who sees the barrier to putting TeX on the iPad not as a lost opportunity but as a thin, bright glimmer of hope on the horizon. Maybe this protected market—the walled garden of Cupertino—will induce some young genius to create the next great mathematical writing system, an iPad app so good it will induce envy in all of us poor TeX users.

Looking at the issue more broadly, I think we often value stability and reliability a little too highly, and innovation too lowly. The world of computer science is overpopulated by walking fossils—not just TeX but also Unix, the Intel x86 architecture, TCP/IP. Quoting myself:

What has everybody been doing for the past 35 years? Can it be true that technologies conceived in the era of time-sharing, teletypes and nine-track tape are the very best that computer science has to offer in the 21st century?

As a remedy for this situation, the bomb in the basement may be a bit extreme. But I wonder if we shouldn’t try something like a reverse patent, where the whole world gets free use of an invention for the first 17 years, but then there’s an escalating schedule of royalties or taxes for those who fail to come up with a brighter idea.

• • •

One final question. When Knuth counts LAZY FOXES in his typographic specimen, where does he get the peculiar number 854.9176302? I would have thought 854.9176320.

> The iPad probably has more computational oomph than Knuth’s 1980 PDP-10

Conservatively, the iPad does about 200 megaflops[1]. The PDP-10 did about 450 kiloflops[2].

So the iPad, in strictly floating point performance terms, is about 400x as fast as a PDP-10. Even if Knuth’s PDP-10 was much faster than the one in the reference, it was unlikely to be anywhere near the performance of an iPad.
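The arithmetic behind that “about 400x” is easy to check, taking the two cited figures at face value (they are estimates, not measurements):

```python
# Rough comparison of the floating-point figures cited above.
ipad_flops = 200e6    # ~200 megaflops for the iPad [1]
pdp10_flops = 450e3   # ~450 kiloflops for the PDP-10 [2]

ratio = ipad_flops / pdp10_flops
print(round(ratio))   # 444 -- hence "about 400x"
```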

I love TeX as a great typesetting system. However, it has been the only way to write mathematics for such a long time that TeX is considered the best tool for *writing* mathematics (instead of considering it to be the best tool for *typesetting* mathematics).

It’s a bit like calligraphy versus everyday handwriting: nobody thinks less of the content of a quick note (say, one in the margin of a book claiming a short proof…), and few people use calligraphy for every personal note.

If you are not set on TeX-quality typesetting or the full LaTeX package extravaganza, then I think there is an excellent tool for quickly writing mathematics: it lacks the full power of TeX, but it is more than enough for an initial draft or a short document.

Namely, what people have become used to on MathOverflow and stackexchange sites (and elsewhere): markdown+MathJax.

It’s lightweight, pure text, and can be converted back to TeX (e.g., using pandoc). There’s a (very young) open-source editor that I’ve used successfully with students (disclaimer: the author is a friend). It’s not perfect, but I’ve found it extremely useful (especially for writing on WordPress, actually).

I agree with your analysis of TeX. I also find the markup language very arcane, and not at all compositional. It is rather hard to make sense of some of the TeX and LaTeX macros that experts write (and I say this as someone who has to work in LaTeX quite often). Surely there has to be a better interface to typesetting?

TeX syntax is awful, and even Knuth knew it when it was written. When asked for the grammar specification, he famously replied, “The grammar is whatever TeX will parse.” TeX’s macro system was quirky in 1980, and by today’s language standards it is downright archaic. The modern parser theory that makes C/C++/Java feel cluttered and verbose makes TeX look and feel quixotic.

I have to believe that the underlying typesetting engine is still a marvel, else we would not still be using it. But please, oh please, give us a better input language with more power and a cleaner syntax.

The idea of planned destruction is compelling. What is the name of the famous architect you mention who advocated putting a bomb in the basement of every great building, set to explode in fifty years to pave the way for novelty?

@Joel: “What is the name of the famous architect you mention that advocates a bomb in the basement…?” I was hoping you would tell me. It’s something I read long ago, and I’ve been unable to retrieve the details either from my own neural network or from the collective memory of Google. I don’t think I made it up. But there’s a chance the architect was Howard Roark.

In the matter of 854.9176302, the question for me is how Don Knuth pronounces the numeral “0.” I don’t know of a word for naught that falls alphabetically between three and two.
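For what it’s worth, the ordering behind the puzzle is easy to reproduce: sorting the ten digits by the alphabetical order of their conventional English names gives the sequence the post expected (a quick sketch, assuming the usual spellings):

```python
names = {0: "zero", 1: "one", 2: "two", 3: "three", 4: "four",
         5: "five", 6: "six", 7: "seven", 8: "eight", 9: "nine"}

# Sort the ten digits by the alphabetical order of their English names.
digits = sorted(names, key=lambda d: names[d])
print("".join(str(d) for d in digits))  # 8549176320
```

Knuth’s numeral instead ends …302, which is why the question comes down to what he calls the digit 0: only a name for naught falling alphabetically between “three” and “two” would produce that ordering.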