Algorithms Gone Wild: 3 Cases of Computers We Trusted Too Much

Remember a few years ago when Facebook (FB) started suggesting you "reconnect" with old friends, and some of its suggestions were exes and ex-friends you really didn't need to be reminded of? That was an algorithm it used -- and fortunately got rid of -- to help people connect with friends they hadn't had much recent activity with on the site.


Because of embarrassing algorithm fails like this one, companies are slowly starting to delegate work away from the computers and back to humans. Twitter, for example, has real people ready to interpret the context of a search term when there is a sudden surge in its frequency. When Mitt Romney referenced Big Bird during the presidential debates last year, a flood of people searched for "Big Bird." A computer would automatically direct these people to "Sesame Street"-related hits, but real humans knew otherwise.

Still, automated algorithms remain the most obvious and efficient way to keep up with today's fast-paced world: Computer programs can be better equipped to spot trends or even deal with consumers' intricate demands than flesh-and-blood employees. But what happens when an algorithm fails? Putting a bit too much authority into the virtual hands of a program can lead -- and has led -- to some epic fails. Before they replace any more workers with software, companies might want to consider the following cases of good algorithms gone bad.

Random Rudeness on T-shirts

Not so long ago, business was booming for Michael Fowler's online T-shirt company, Solid Gold Bomb. The tiny, five-worker company had become known for its customizable "Keep Calm and Carry On" T-shirts, which let customers modify the now-famous logo in countless ways. Fowler had designed an algorithm that would randomly select short, two-word phrases to follow "Keep Calm and ... ," thus creating a unique shirt that the customer could order. If they didn't like what the computer picked, they could simply try again for a phrase more to their liking.
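Solid Gold Bomb's actual code isn't public, but the generator described above can be sketched in a few lines -- along with the safeguard it turned out to be missing, a blocklist that vets each phrase before it is offered for sale. The word lists below are purely illustrative:

```python
import random

# Hypothetical word lists -- the company's real dictionary is not public.
VERBS = ["drink", "eat", "love", "hug", "hit"]
NOUNS = ["beer", "coffee", "cats", "her"]

# The missing safeguard: phrases that must never reach a shirt.
BLOCKLIST = {"hit her"}

def keep_calm_slogan(rng=random):
    """Randomly combine a verb and a noun, rejecting blocklisted phrases."""
    while True:
        phrase = f"{rng.choice(VERBS)} {rng.choice(NOUNS)}"
        if phrase not in BLOCKLIST:
            return f"Keep Calm and {phrase.title()}"
```

Without the `BLOCKLIST` check, every verb-noun combination is equally likely -- which is exactly how "Hit Her" ended up alongside "Drink Beer."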

Slogans like "Keep Calm and Drink Beer" quickly became favorites. But then something went wrong. A few weekends ago, Fowler was relaxing in his Melbourne home when Facebook notifications started pouring in: death threats and accusations of misogyny. Turns out, the algorithm's dictionary hadn't been culled for potentially offensive content, and suddenly customers were being offered shirts with phrases like "Keep Calm and Hit Her." Indeed, some had already been ordered. Fowler canceled them all and deleted the offensive options -- in fact, almost all of the parody "Keep Calm" shirts -- from the site. He claims these were just computer-generated images and that none had actually been printed.

The company has posted a lengthy apology on its site, but sales have plummeted and Fowler is afraid the company will soon go under if the public can't get past this incident.

Insane Prices on Amazon

Would you spend $23.6 million to buy one copy of a book titled "The Making of a Fly: The Genetics of Animal Design"? Unless you really, really want to create your own fly from scratch, it's probably a bad investment.

However, this book -- now available for a mere $79.99 on Amazon -- reached that unusual selling price courtesy of two unknowingly price-warring third-party merchants on Amazon (AMZN). A UC Berkeley post-doc who actually wanted to purchase it was amused by the absurd prices, and passed the puzzle on to biology professor Michael Eisen, who tried to figure out what was going on.

Turns out, Amazon merchant Bordeebook had programmed its algorithm so that, once each day, it set the price for that book at about 127 percent of what competing merchants charged. In this case, another seller, Profnath, was trying to keep its price for the out-of-print text at about 99 percent of what the competition charged, and it too, used the algorithm to reset its prices daily.

Why would Bordeebook intentionally charge more than the competition? Eisen speculates that Bordeebook didn't actually have the book in hand, and would have had to buy it elsewhere to satisfy a customer's purchase request. Thus, to ensure it would make a profit, it quoted a higher price than other merchants, hoping its good customer ratings would lure a buyer regardless.

On the other hand, Eisen suggests, Profnath probably had the book, and wanted to keep its price just a hair lower than the competition. The result: Each day, Bordeebook set its price higher, and Profnath followed close behind. Because both were using algorithms, neither was aware of just how wildly overvalued the book on insect anatomy had become. When someone finally noticed and the crazy algorithm-induced bubble popped, the prices deflated to around $100. Today, several mint-condition copies are available on Amazon, all for under $200.
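Eisen's reconstruction of the two repricing rules is easy to simulate. Each daily cycle multiplies Profnath's price by roughly 1.27 × 0.99 ≈ 1.26, so the price grows exponentially -- no one has to touch anything for a $100 book to become a multimillion-dollar one. The starting price and day count below are illustrative, not the actual figures:

```python
# Hypothetical starting price for the out-of-print text.
profnath = 100.00

for day in range(45):
    # Bordeebook reprices at ~127% of the competition (per Eisen's analysis)...
    bordeebook = round(profnath * 1.27, 2)
    # ...then Profnath reprices at ~99% of Bordeebook, staying just under it.
    profnath = round(bordeebook * 0.99, 2)

print(f"After {day + 1} daily cycles, Profnath's price is ${profnath:,.2f}")
```

Run long enough, the loop blows past $1 million within a couple of months of simulated time -- the same runaway feedback that carried the real listing to $23.6 million before anyone noticed.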

Bungee Jumping With the Dow

Million-dollar books are entertaining, but the Flash Crash was a pricing algorithm implosion too, and one that affected a significantly larger population.

On May 6, 2010, toward the end of the trading day, the Dow plummeted 600 points in just a few minutes, then recovered most of the loss almost immediately. A Securities and Exchange Commission investigation later revealed that a mutual fund had dumped an unusually large volume of contracts in the market during a 20-minute period. That much activity taking place in such a short time triggered a response from the computers of high-frequency traders, which started to aggressively sell too, accelerating the effect and causing the market to crater.

Though the market recovered quickly, some people lost thousands of dollars, and some companies lost millions.

High-frequency traders use computers to trade assets entirely based on algorithms. They are programmed to analyze the behaviors of various markets, buying and selling stocks that they often hold only for a matter of seconds. Regulations have changed since then, but algorithms still rule our stock markets, and those algorithms can make expensive mistakes.