
Major market crashes have occurred, and will continue to occur, because the market no longer has a human interface, MIT Media Lab's Kevin Slavin told Wired Money today. "All the tools designed to make it legible have made it impossible to read, and I think that's weird. Isn't it strange that when we add computers to physics we unleash the Higgs boson, but apply them here and we're further away from understanding the market than we have ever been?"

Slavin cited a series of major errors in recent market history that can all essentially be blamed on one thing: we're so desperate to get ahead of the game and predict outcomes that every time we solve a problem we create a bigger one.


It is well explained, he said, by a well-known phrase: the invention of the ship was also the invention of the shipwreck.

For instance, when developers created Bats, an alternate trading system, and the crash referred to as 2:45 wiped 75 percent of the stock exchange in five minutes, data scientists put it down to a system error and did not consider that someone had created a way around their tech. "But it wasn't a bug," said Slavin. "That was a deliberate attack on Bats -- but we're not quite sure yet. They're better off saying it's a bug than an attack. Physics researchers have estimated there have been 18,000 of these black swan events in the last five years. But that number doesn't sound very black swanny to me."

There are only two ways these kinds of algorithms can be built, says Slavin, and both are "sort of scary": they are either written by humans, or they are not. "Both are extremely problematic. [If it's the former] it means there's a human idea in there -- actually an ideology. A good example is that of Robert Parker, who decides the Parker index for wine." An economist set out to reverse-engineer an algorithm for wine tasting based on Parker's parameters, and produced a model whose scores fluctuated less than Parker's own judgement -- but Parker remained at the core of that judgement. "These types of algorithms have somebody inside.


"Vision detection, for instance, is at its core trained on Playboy centrefolds. Every algorithm reflects somebody's idea of the world, yet we treat them like math."

Scarier yet, says Slavin, is if there's nobody there.

Displaying a simple genetic algorithm on the screen behind him, he showed how a computer that cannot initially grasp a simple problem can, given enough time to work on it, figure it out by itself. But it works without context or explanation, so it is just as likely to miss the point of a project completely. "It figures it out not because it's smart, but because it seems smart without underlying sensibilities."
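The blind search Slavin is describing can be sketched in a few lines of Python. This is a toy illustration, not the algorithm he showed on stage: it "solves" the problem of matching a target phrase purely by random mutation and selection, with no notion of what the phrase means.

```python
import random

# Hypothetical target: the algorithm never "reads" it, it only scores against it.
TARGET = "to be or not to be"
CHARS = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate):
    # Count characters that match the target; this number is optimised blindly.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    # Randomly replace each character with small probability.
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in candidate)

def evolve(pop_size=100, generations=1000, seed=0):
    random.seed(seed)
    population = ["".join(random.choice(CHARS) for _ in TARGET)
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            break
        # Keep the fittest half, refill with mutated copies of survivors.
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

print(evolve())
```

The point of the demo is visible in the code: nothing in it understands English, yet the output drifts toward the target -- it seems smart without any underlying sensibility.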

He gave the example of a Department of Defense story that is rumoured to be true, in which an algorithm was developed to automatically detect tanks in the field. They trained the cameras on 100 photos, but when it came to testing in the field it was total chaos. It had been performing so well that they went back and had a human look at the photos: all the photos with tanks in them had been taken on a sunny day, and all the photos without tanks on a cloudy day. It turned out what they'd trained it to do was tell the weather. "This is what happens without a theory -- it's in the explanation," Slavin says. "We need explanations because, for better or worse, they're the protocol for the human mind."
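The tank story can be reconstructed as a tiny sketch, with made-up data standing in for the photos. Each "photo" is reduced to a single brightness value, and a nearest-centroid classifier trained on the confounded set learns the weather, not the tanks:

```python
# Hypothetical data: every "tank" photo is bright (sunny), every "no tank"
# photo is dark (cloudy) -- the confound from the story, not real DoD data.
train = [(0.9, "tank"), (0.85, "tank"), (0.8, "tank"),
         (0.2, "no tank"), (0.15, "no tank"), (0.1, "no tank")]

def mean(xs):
    return sum(xs) / len(xs)

# One centroid (average brightness) per label.
centroids = {
    label: mean([x for x, l in train if l == label])
    for label in ("tank", "no tank")
}

def classify(brightness):
    # Nearest-centroid "tank detector" -- really a weather detector.
    return min(centroids, key=lambda label: abs(brightness - centroids[label]))

print(classify(0.88))  # a tank on a sunny day -> "tank"
print(classify(0.18))  # a tank on a cloudy day -> "no tank"
```

On photos that share the confound the detector looks flawless; the first tank photographed under clouds exposes that it never learned about tanks at all. That is the missing theory, and the missing explanation, Slavin is pointing at.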

A far more tragic example of this missing link between the human and the machine was the case of the Air France flight that went down in 2009. Afterwards, one commentator noted: "This is the problem -- people make poor monitors for computers."


Because the disaster, like so many others, was driven not by human error alone, but by human and system error combined -- and by fatal flaws in how the two are expected to interact.

More recent examples include the Syrian Electronic Army's hack of the Associated Press Twitter feed, which produced a tweet claiming the White House had been attacked and triggered a massive market drop. "Only about 1,000 people actually believed this -- any human could look at it and go, 'it's probably bullshit'. But that's what happens when you remove human evaluation and judgement from the system -- it renders the system increasingly fragile, even in something as straightforward as the news. It's not just the news that bends, it's the whole world.

"The danger we're listening for is the danger we've created. It's the dark version of the singularity. Everything I've learned about giving computers autonomy is that they crash, so I'm not so much worried about them taking over as I am about the present, in which they fail dramatically. [We need to change things, and] it might not look like the market structure of today, but something built on human time, working together with machines."