A Wall Street Software Master Explains NASDAQ's Shutdown With One Simple Concept

Traders gather for the IPO of Phillips 66 Partners LP, on the floor at the New York Stock Exchange, July 23, 2013. Phillips 66 Partners LP, a limited partnership formed earlier this year by Phillips 66, opened at $23.00 a share.
REUTERS/Brendan McDermid
This week the market has seen two massive trading screw-ups: Goldman's errant trades on Tuesday, and yesterday's mid-day NASDAQ shutdown caused by a technical glitch.

So what was the glitch? A "connectivity issue" with NYSE's relatively new Arca system, says NASDAQ. Arca, you should know, is simply an NYSE system that allows traders to trade stocks listed on other exchanges.

So to explain this problem as a "connectivity issue" is, to put it mildly, vague. Of course there was something wrong with Arca's connection to NASDAQ; connecting exchanges is essentially all the program does. The real question is what went wrong with that communication.

Fixing these problems is a job for Wall Street's software geniuses — the guys that build the systems upon systems that communicate around the country (and the world), constantly updating themselves and handling massive amounts of data.

That's why Business Insider reached out to one such software savant: Lev Lesokhin at CAST. His firm specializes in analyzing risk for banks and other companies that have to deal with all this data on a constant basis. When the SEC solicited comment on these matters last fall, Lesokhin and CAST (along with NYSE and BATS, though not NASDAQ) weighed in (you can read CAST's comments here).

And this is what you, as an observer, need to know: financial firms and exchanges need not only to write a system that performs a certain task; they need to write it so that it knows NOT to perform other tasks.

In that sense, Lesokhin said, Knight Capital's trading glitch last year, Goldman's trading loss this week, and NASDAQ's halt yesterday all have something in common: these complex systems did something no one predicted they would do, and they were not structurally "robust" enough to handle the surprise scenario.

"These are all highly data-intensive systems," said Lesokhin. "They pass information, monitor it... If you have a complex system that manages master data, you also have multiple components that can act on that data — change it, remove it, or add it. If you're not managing the way that's handled — say you have two different components that manage the data — you can run into corruption issues."

Say component A tries to update a ticker, and component B is supposed to act on it afterward. Well, imagine component A is slow, and component B gets there first... or imagine component B just never shows up.

Imagine those components are all coming from different places. You can see the chaos here.
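The ordering problem above can be sketched in a few lines of Python. This is a hedged illustration, not anything resembling actual exchange code: the record fields, the `component_a`/`component_b` names, and the `updated` flag are all hypothetical, chosen only to show what happens when component B acts before component A's update lands.

```python
# Hypothetical sketch of two components sharing one master-data record.
# Component A writes a new price and marks the record fresh; component B
# is only supposed to act on fresh data.

ticker = {"symbol": "ZVZZT", "last_price": 10.00, "updated": False}

def component_a(record):
    """Writes the new price and flags the record as updated."""
    record["last_price"] = 10.05
    record["updated"] = True

def component_b(record):
    """Acts on the record. A robust component checks freshness first
    and refuses to act on stale data, returning None instead."""
    if not record["updated"]:
        return None            # B arrived first, or A never showed up
    return record["last_price"]

# Correct ordering: A runs, then B acts on the fresh price.
component_a(ticker)
assert component_b(ticker) == 10.05

# Broken ordering: B arrives before A. Without the freshness check,
# B would happily act on the stale 10.00 price — silent corruption.
stale = {"symbol": "ZVZZT", "last_price": 10.00, "updated": False}
assert component_b(stale) is None
```

The point of the sketch is the guard clause: the "robust" system Lesokhin describes is one that knows when NOT to act, not just how to act.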

For NASDAQ's part, it's known for having been at the forefront of automating its systems, but that pedigree cuts both ways: its systems are old and have been updated many times. There's a lot of code under that hood, and as newer systems like Arca are forced to communicate with it, there can be unforeseen consequences.

Remember that Knight Capital's trading loss was caused by an unforeseen interaction between new live code and old, supposedly dead code within its systems.
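That failure mode is easy to sketch in miniature. Everything below is hypothetical — the function names, the `power_mode` flag, and the routing logic are invented for illustration and are not Knight's actual code — but it shows the shape of the bug: a flag guarding a dormant legacy path gets repurposed by a new release, silently reviving code everyone assumed was dead.

```python
# Hypothetical sketch: a "dead" legacy path guarded by a config flag.

def route_order(order, flags):
    """Routes an order. The legacy branch was assumed unreachable
    because nothing set the old flag anymore."""
    if flags.get("power_mode"):       # old flag guarding dormant code
        return legacy_blast(order)    # supposedly dead path
    return modern_route(order)

def legacy_blast(order):
    # Old logic with no fill tracking: fires one child order per share.
    return ["child"] * order["qty"]

def modern_route(order):
    # Current logic: one consolidated parent order.
    return ["parent"]

# A new release repurposes "power_mode" for something unrelated and
# sets it — and the zombie path wakes up, flooding the market.
assert route_order({"qty": 5}, {"power_mode": True}) == ["child"] * 5
assert route_order({"qty": 5}, {}) == ["parent"]
```

The lesson is the one Lesokhin draws: old code that "still works" isn't inert. Any shared flag, field, or message it reads is a live wire that new code can trip.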

So the question is, how often do exchanges and banks update their systems? When has a system been updated so many times that it's become too complex and needs to be scrapped? Why throw out an old system that still works just to head off potential problems? The SEC was asking all of these questions last fall, and it's becoming increasingly clear they need to be answered as soon as possible.