New Release 2.0 on Money 2.0

One year ago, we published an issue of Release 2.0 entitled “When Markets Collide,” in which we considered what Wall Street and Web 2.0 might have to teach one another. Quite a bit, it turned out: the key parallels we uncovered include latency (both have to do their jobs more or less instantly), connectivity (it’s the liquidity of Web 2.0), sensors and actuators (and how to use them), and reputation (stockbrokers are no longer curators — they’re rated).

So it’s a ripe time to consider the status of the relationship. What’s new? What’s changed? The amount of financial data available publicly is astonishing. That doesn’t mean it’s all useful. There’s plenty of data out there, but it’s plenty confusing. You can’t extract alpha until you understand what you’re looking at. As Michael Simonsen, president and CEO of Altos Research, puts it, “free data on the internet is a mess.”

If anyone doubts that financial markets and technology markets are deeply intertwined, consider this: the same day that JPMorgan Chase revealed its “purchase” of Bear Stearns, a Gartner Group analyst released a report showing that “the financial services industry continued to lead all vertical markets in server revenue, as it accounted for 25.3 percent of worldwide server revenue in 2007.” As goes one set of markets, so goes the other.

In this issue of Release 2.0, we consider the Wall Street/Web 2.0 mashup from a number of angles. We talk to Paul Kedrosky, chair of our Money:Tech conference and an influential blogger on the topic (as well as others), about why some on Wall Street hate Web 2.0 — and what Web 2.0 can do to infiltrate Wall Street nonetheless. Entrepreneur Marc Hedlund, now chief product officer for OATV-funded personal finance startup Wesabe, examines what happens when hidden data gets surfaced. Cathleen Rittereiser talks to hedge fund managers to discover what they want from Web 2.0 — and what they’re actually getting. Longtime Radar contributor Nathan Torkington digs deep into prediction markets and spells out both how to manage them and what companies can gain from implementing them.

It’s a truism that alpha lasts longest when it’s hidden. That may have been true in the past, but the growing use of Web 2.0 tools means that less data will stay hidden, and what’s hidden will stay hidden for a shorter period of time. As James Altucher of Stockpickr said at Money:Tech, “When it comes to data nowadays, closed source is a myth.”

At this point, I doubt information is the key to alpha. The “investment” methods that have blown up, from the LTCM debacle in 1998 to today’s subprime mess, show that alpha is really difficult to achieve, and that the huge transaction flows in the financial markets end up lining the pockets of the players even when their alpha is negative.

I don’t mean to disparage the aim of getting unique information, but as Gekko implied in the movie “Wall Street,” that information needs to be unknown to almost everyone else. That really undermines the premises behind Web 2.0, which is about openness, connectivity, and collective intelligence.

No doubt Web 2.0 technologies will pervade Wall Street, and no doubt the vendors will not be the traditional players like Reuters and Bloomberg.

Once upon a time, people looked at the basic hierarchy of human needs, concentrated on some basics, and the wise ones picked investments in firms that were building infrastructure that could produce basic needs efficiently.

“Those are very good machines.” “That is a quality farm.” That kind of thing.

The rumor for a couple of decades now is that you don’t need that kind of attention to detail to invest. Just look for patterns in questionable numbers. Or, ask the crowd what patterns they see in questionable numbers. “Just because you can” has come to mean “you must”.

Thomas Lord: “ask the crowd what patterns they see in questionable numbers”

It has always been thus. Maynard Keynes understood that stock picking was like judging a beauty contest: it was about understanding which contestant the other judges would pick, not necessarily choosing the most beautiful woman.

What has happened in the markets is that long-term value investing has become secondary to trading. This means that returns are based on short-term price movements, i.e. guessing what the crowd is doing, or even influencing counterparties to trade in ways advantageous to you.

The Release 2.0 document draws some parallels between markets and Web 2.0 features, but I think the comparison is somewhat shallow. A better model for understanding where these markets might go is to ask where the value of information resides. This echoes Tim’s arguments about open source vs. proprietary code in the IT stack.

Once it was profitable for brokers to live off commissions. The inexorable decline in commission rates has pushed the big banks toward proprietary trading with large pools of capital. Research is now a marketing tool, used to suck in the punters to buy proprietary, high-fee funds or to become the sucker counterparty.

Web 2.0 (and 1.0) is being forced into the same path because revenues from content (free views and declining ad rates) are not sufficiently attractive compared to proprietary use of the data. Facebook is a prime example of what happens as we trend in that direction.

From an engineering perspective, there is no social value in “Web 2.0” that could not have been delivered better by more fully developing multimedia email, mailing lists, and netnews. The functionality in those fundamentally sane technologies got carved up, repackaged, branded N different ways from Tuesday, broken, and sold back as a panopticon in disguise.

However, your language in explaining this is more palatable to many of the people who need to hear it.