Why high-performance computing needs financial engineering

Richard Bookstaber, the quant's quant, suggests that the CPU- and GPGPU-driven …

Richard Bookstaber, one of the original MIT math geeks gone bad (a.k.a. "quants") and the guy who literally wrote the book on how to destroy Wall St. with computers, has been tracking what he calls the "arms race" in high-frequency, computer-automated trading. If his recently floated hypothesis that "the days for high frequency trading are numbered" proves correct, that would be pretty bad news for Intel, AMD/ATI, and NVIDIA.

To understand why the demise of high-frequency trading would be bad for these companies, you first have to understand that to successfully develop, fabricate, and sell high-powered number-crunching chips, you need at least two market segments: 1) a high-end, high-margin segment where you can launch your top-tier, most expensive products, and 2) a much larger market where you can sell last quarter's and last year's top-tier parts in order to achieve the economies of scale that enable you to keep producing the high-end parts that perpetuate this trickle-down product cycle.

For NVIDIA, Intel, AMD/ATI, and, to a certain extent, IBM, that much-coveted high-end market segment is made up of a collection of niches, and any journalist or analyst who has sat through a GPGPU or high-performance processor presentation for any of these companies can recite from memory the most popular members of this collection: medical imaging, defense, oil and gas exploration, pharmaceutical research, 3D rendering for movies, and finance. The last item on this list is especially popular, and if I had an ounce of diamond-studded platinum for every time I've seen a Black-Scholes options pricing benchmark result in a presentation, I could out-bling Soulja Boy at next year's BET awards.
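For the curious, the benchmark in question is easy to sketch. The closed-form Black-Scholes formula prices each option independently from a handful of inputs, which is exactly why vendors love it: millions of options can be priced in parallel with no communication between threads. Here is a minimal, illustrative version (the parameter values at the bottom are hypothetical, not drawn from any vendor's benchmark suite):

```python
# Illustrative Black-Scholes European call pricing -- the kind of
# embarrassingly parallel kernel that shows up in GPGPU benchmark slides.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S: float, K: float, T: float,
                       r: float, sigma: float) -> float:
    """Price a European call option.

    S: spot price, K: strike, T: time to expiry in years,
    r: risk-free rate, sigma: annualized volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# One hypothetical option: $100 spot, $105 strike, 1 year, 5% rate, 20% vol.
price = black_scholes_call(100.0, 105.0, 1.0, 0.05, 0.20)
```

Each option's price depends only on its own five inputs, so a GPU can evaluate one option per thread across an entire portfolio; that data-parallel shape, more than the math itself, is what makes it a stock demo.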

The thing that always strikes me about these presentations is that this handful of compute-intensive problem domains is typically listed as "examples" of the sort of profitable niches you can target with a vendor's hardware, the implication being that there are just oodles of other such workload families out there that the presenter could adduce if he or she had enough space on the PowerPoint slide. Except that this isn't really the case. Those six items aren't just examples—they're basically the whole list, and this short list needs to grow, not shrink. This is one reason why lopping finance off that list would be very bad for everyone in the teraflops business.

But it's not just that the end of high-frequency trading would cost the teraflops vendors a small but high-margin niche to sell into; they'd lose something even more critical to a publicly traded company's prospects: a growth story.

NVIDIA in particular is a great example of what I'm talking about. The GPU maker's premium Tesla line of math coprocessors is at present a very small percentage of the company's revenues, but it's a mammoth part of NVIDIA's growth story, and for good reason. Before the meltdown turned "quant" into a four-letter insult, GPU hardware had a real future in high-frequency trading. And it probably still does—the real question is, does high-frequency trading have a future?

Is high-frequency trading really a goner?

Bookstaber's problem with high-frequency trading isn't that he has had some Nassim Taleb-style vision of a Godzillion-Teraflop monster rampaging through Manhattan, destroying markets with a hot blast of twenty-sigma events. Rather, he has a two-part criticism of the practice, the first of which has to do with a high-frequency trader's size and role in providing market liquidity. I'm not qualified to comment on that one, so I'll skip to the second part of his criticism, which is about the "arms race" aspect of these kinds of program trades:

A second reason is that high frequency trading is embroiled in an arms race. And arms races are negative sum games. The arms in this case are not tanks and jets, but computer chips and throughput. But like any arms race, the result is a cycle of spending which leaves everyone in the same relative position, only poorer. Put another way, like any arms race, what is happening with high frequency trading is a net drain on social welfare.

In terms of chips, I gave a talk at an Intel conference a few years ago, when they were launching their newest chip, dubbed the Tigerton. The various financial firms who had to be as fast as everyone else then shelled out an aggregate of hundreds of millions of dollars to upgrade, so that they could now execute trades in thirty milliseconds rather than forty milliseconds—or whatever, I really can’t remember, except that it is too fast for anyone to care were it not that other people were also doing it. And now there is a new chip, code named Nehalem. So another hundred million dollars all around, and latency will be dropped a few milliseconds more.

I once heard from some Intellers about a study that the company had done that determined the number of millions of extra dollars per day you could make on various high-frequency trading strategies by going up a speed bin on their Xeon line. I don't remember the details, but it turns out that (according to Intel, at least) trading houses can make a non-trivial amount of money just by moving up one speed bin, because this puts them a few beats ahead of the next buyer.

Bookstaber's remarks above lend credence to Intel's claim about the returns on those processor upgrades, but his (correct, I think) description of this whole game as a negative sum arms race gives high-frequency trading a distinct air of "this can't possibly go on forever." And as we've all learned over the past six months, things that can't go on forever don't.

I'll be the first to admit that I can't actually envision a particular set of circumstances under which the arms race spontaneously grinds to a halt, and the big teraflops vendors suffer the fate of defense contractors in the immediate wake of the end of the Cold War (pre-Iraq II, of course). Bookstaber doesn't seem to have a particular scenario in mind, either, which is why he ends up (only half-seriously) proposing that everyone should just stop doing it for the greater good. Still, it's hard to escape the feeling that, of the six compute-intensive application domains that I listed above, finance is the one that seems the most unsustainable in the long term (right ahead of oil and gas exploration).

Nonetheless, even in the absence of high-frequency trading, if the market can continue to offer up a major new way to turn compute cycles into dollars at least once per decade, then this might be sufficient to fuel the ongoing development of the kind of high-end hardware that Intel and NVIDIA provide. Biotech is the most likely source of the necessary workloads, as many of its problems are moving from the realm of lab science into the realm of computation and data management (DNA sequencing is one example of this).