And who doesn’t love a good poker player? Rafe Furst is a pretty good poker player. He is also in the midst of inventing a new prediction market. It is called Truth Markets and I think the concept is fascinating, though it isn’t live yet.

I met Rafe a few months ago in Las Vegas when I went out for some poker research and a Rock Paper Scissors tournament. I had never heard of Rafe Furst, even though he had just won a World Series of Poker bracelet. I went up to him during the RPS tournament just because he looked like an interesting guy to interview. He had just lost his match, but he’d played with a weird intensity that made me want to know more about him. After about ten minutes of conversation — people kept stopping by to pat him on the back — I figured out he’s a big-time poker player. More important, at that moment, was the fact that he’d been coaching Annie Duke in RPS, and so I stood with him as Annie went on to win the tournament.

He told me a bit about his idea for Truth Markets that day. And then not long ago, he sent this e-mail in reply to a posting I made here about Wikipedia. I’m going to include his entire e-mail here because it not only explains Truth Markets a bit but also provides a good insight into the mind of the guy who invented it. He’s looking for feedback so feel free.

Have you guys checked out the controversy over Digg recently? It’s similar to Stephen’s issue with Wikipedia, but the misinformation is more intentional.

Speaking of which, I think the guy who talked about having a trustworthiness rating for individuals is on the right track. Except anything that distracts users from making immediate changes (such as logging in, or a notion that I have to build my reputation to have an equal voice) could be the death of wikipedia. My approach to improving wikipedia would be to include an “information liquidity” metric along with each page, similar to a stock’s trading volume. Pages could be grey-scale coded based on the page change history, with high-volume pages appearing darker, more solid. Of course this can be gamed, but here gaming has visible artifacts. As far as accuracy goes, I think this would solve Stephen’s objection because the informational backwaters — pages with lower liquidity — would appear visually distinct from heavily modified ones. In machine learning there’s a construct called a Boltzmann Machine (aka simulated annealing machine) which describes the dynamics of systems like wikipedia, but it requires a metric like volume/liquidity/energy. An alternative approach to social networking for solving accuracy/trust problems is this one, which I’m very intrigued by, but the system isn’t big enough yet to bear out the promises.
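To make the liquidity idea concrete, here is a minimal sketch of how such a metric might be computed and rendered. Everything here is an assumption on my part — the window length, the saturation point, and the function names `liquidity_score` and `gray_level` are all hypothetical, since Furst doesn’t specify a formula — but it shows the basic shape: count recent edits like trading volume, then map the count to a grey value, with rarely edited "backwater" pages fading toward white.

```python
from datetime import datetime, timedelta

def liquidity_score(edit_times, window_days=30, now=None):
    """Rough 'trading volume' for a page: the number of edits
    that fall inside a recent window."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    return sum(1 for t in edit_times if t >= cutoff)

def gray_level(score, saturation=50):
    """Map a liquidity score to a grey value: 0 (black, heavily
    edited and 'solid') up to 255 (near-white, an informational
    backwater). Scores at or above `saturation` render fully dark."""
    score = min(score, saturation)
    return int(255 * (1 - score / saturation))
```

A page with no recent edits renders at 255 (white), a hotly contested one at 0 (black), so a reader can judge at a glance how much scrutiny a page has received — and, as Furst notes, gaming the metric (flooding a page with edits) would itself leave a visible artifact.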

When coming up with the concept for Truth Markets, I was originally looking to create an “information reputation authority” based on the Device Reputation Authority model. I was unable, though, to get the model to work because of the trust-bootstrapping problem. Earlier incarnations of truth markets suffered the same fate, at least in theory. The fatal argument was essentially: with no agreed-upon underlying value, what’s to keep truth claims from becoming tulips? So based on the advice of Robin Hanson and Patri Friedman and a bunch of bloggers, I modified the model to have a quasi-external “authority” to arbitrate truth claims. The interesting thing is that the expiration date for current truth claims is a statistical distribution rather than a hard-and-fast date. Namely, when you trade a truth claim, all you know is that at some point in the future the issue will be voted on by a random sample of the entire market (there are mechanisms for preventing collusion or any sort of coordinated cheating). My hope is that the uncertainty of the exact judgment date is enough to make individual traders act rationally with respect to their wallets at all points in time, thus keeping price in line with actual truth value and mitigating bubbles.
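The settlement mechanism Furst describes can be sketched in a few lines. This is my illustration, not his specification: I’ve assumed an exponential distribution for the judgment date (its memoryless property means a trader can never conclude that judgment is “due soon,” which is exactly the incentive he wants) and a uniform random jury. The names `judgment_delay` and `draw_jury` and the 90-day mean are hypothetical.

```python
import random

def judgment_delay(mean_days=90, rng=random):
    """Sample the time (in days) until a claim is judged from an
    exponential distribution. Because the exponential is memoryless,
    the expected wait is the same no matter how long the claim has
    already traded, so prices must track truth at all times."""
    return rng.expovariate(1.0 / mean_days)

def draw_jury(traders, jury_size, rng=random):
    """Pick a uniform random sample of the whole market to vote on
    the claim; an unpredictable jury makes coordinated cheating
    expensive to organize."""
    return rng.sample(traders, jury_size)
```

The key design choice is that neither the date nor the jury is knowable in advance, so a trader’s only rational strategy is to price the claim at what they believe a representative sample of the market will judge true.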

Bringing it back to the Digg controversy, I think that a variant of truth markets (call it relevance markets) could fix the problem. Digg is essentially a current popularity poll for news items that works on the aggregate of people surfing as they normally would, but giving a thumbs up rating to anything that they “digg”. The stuff that bubbles to the top has a decidedly techy-libertarian slant, which makes sense because of the self-selecting population of Digg users. The controversy is about whether this truly is wisdom of the crowds, or whether there’s too much mob behavior and undue influence by high-profile Diggers. By substituting “truth” with “relevance” or “importance”, the truth market mechanism could solve the Digg dilemma. In general, the underlying value of the claims in the interlocking market system described in the truth markets document could be anything that people care about: truth, importance, humor, attractiveness, morality, etc.

I’m curious as to what your take on any of this is, especially the viability and utility of truth markets.

COMMENTS: 16

How about a wikipedia (or digg, or …) where users accumulate reputation points (think Cory Doctorow’s “Whuffie,” described in Down and Out in the Magic Kingdom) through “good” behavior and lose them through “bad” behavior, with good and bad defined in some sensible way for the site in question? Their reputation gets attached to their contributions.

In parallel, there’s some sort of market mechanism that operates in terms of the reputation points (rather than money). Perhaps a user can trade some reputation points for a contract position? I’m still kind of new to the idea of this kind of market, so I’m not sure how it would work.

Or would a wikipedia-style article represent a Furstian “Claim” with everyone who contributes to it being a Claimant? The system randomizes judgement windows and judges, as Furst describes, and one can stake one’s reputation (or fractional pieces of it) for or against articles based on one’s trust in the Claimants who contributed to the article. Is that how it would work?
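One way to read the commenter’s proposal is as a simple staking ledger: an article is a Claim, contributors and bystanders stake reputation points for or against it, and a randomized judgment settles the stakes. The sketch below is purely illustrative — the `Claim` class, its method names, and the symmetric win/lose payout rule are all my assumptions, not anything Furst or the commenter specifies.

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """A wiki-style article treated as a claim that users can
    stake reputation points on, for or against."""
    # user -> signed net stake: positive backs the claim, negative opposes it
    stakes: dict = field(default_factory=dict)

    def stake(self, user, points, side):
        """Commit reputation points on one side of the claim."""
        signed = points if side == "for" else -points
        self.stakes[user] = self.stakes.get(user, 0) + signed

    def settle(self, verdict_true):
        """At the randomized judgment, users on the winning side gain
        their stake back as profit; the losing side forfeits theirs."""
        payouts = {}
        for user, s in self.stakes.items():
            won = (s > 0) == verdict_true
            payouts[user] = abs(s) if won else -abs(s)
        return payouts
```

Under this reading, trusting a Claimant just means staking points alongside them; the market price of the claim would be implied by the ratio of “for” to “against” stakes at any moment.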

So, in a Truth Market, a person could theoretically make money by correctly identifying true statements as true and false statements as false? In other words, if I’m absolutely sure that a certain religion is not true, could I make money from that? Even if a majority of people believe that religion?

I need some time to study these links later, but the idea intrigues me. The problem is how to objectively and accurately (reliably) determine truth and falsehood.

BlackSkink, there is a site that works exactly as you describe, called http://www.reddit.com. They let you vote both positively and negatively for each article, and each person’s “karma” is tallied accordingly. You might want to check it out, especially if you are a fan of digg.
