Cory Doctorow: Wealth Inequality Is Even Worse in Reputation Economies

I need to confess something: ‘‘Whuffie’’ would make a terrible currency.

In 2003, I published my first novel, Down and Out in the Magic Kingdom, in which all society’s scarcities, even death and energy, have been overcome, and where conflicts over resources – notably, who gets to run Walt Disney World and what they get to do there – are apportioned using a virtual currency called ‘‘Whuffie.’’ Unlike other virtual currencies like Bitcoin, Whuffie isn’t something you buy and sell: it’s a score that a never-explained set of network services calculates by directly polling the minds of the people who know about you and your works, reducing their private views to a number. The number itself is idiosyncratic, though: for me, your Whuffie reflects how respected you are by the people I respect. Someone else would get a different Whuffie score when contemplating you and your worthiness.

The characters in the novel generally love Whuffie, even though it’s destroying them.

Whuffie has all the problems of money, and then a bunch more that are unique to it. In Down and Out in the Magic Kingdom, we see how Whuffie – despite its claims to being ‘‘meritocratic’’ – ends up pooling around sociopathic jerks who know how to flatter, cajole, or terrorize their way to the top. Once you have a lot of Whuffie – once a lot of people hold you to be reputable – other people bend over backwards to give you opportunities to do things that make you even more reputable, putting you in a position where you can speechify, lead, drive the golden spike, and generally take credit for everything that goes well, while blaming all the screw-ups on lesser mortals.

If this sounds familiar, it’s because that’s how money works. Inherit (or luck into) a large fortune, and give a couple million to a good cause – never mind that it will affect your quality of life not at all – and you’ll be lionized as a hero. The great and the good will invite you onto their podiums – but a poor person who takes in a foster kid gets virtually no recognition, even if fostering involves real sacrifice on their part.

The story of ‘‘meritocracy’’ – a society that migrates wealth, status, and decision-making power into the hands of the most capable – is seductive. Rich people love the idea of meritocracy, because the alternative is that their lion’s share is unfair, the product of luck, or, worse, cheating. But many of meritocracy’s losers love it, too. In the words of John Steinbeck, ‘‘Socialism never took root in America because the poor see themselves not as an exploited proletariat but as temporarily embarrassed millionaires.’’

Meritocracy is a tautology, of course. There’s no objective measure of ‘‘merit,’’ so there’s no way to know whether your society is meritocratic or not. Every famous, powerful, rich person owes their status to a combination of skill, luck, and persistence. The best luck of all is to be born to fortunate circumstances, well fed and well educated and well loved. We know for a fact that billions lack some or all of these forms of luck, and among those people are innumerable potential Stephen Hawkings and Steve Jobses and Albert Einsteins. The fact that Jobs was born to a Syrian immigrant and that Hawking struggles with a debilitating illness just shows you how fickle luck is – unless you believe that evolution produced exactly one brilliant tech entrepreneur among the children of Syrian immigrants and one brilliant scientist with ALS, then you have to believe that the others just didn’t get quite so lucky.

It’s bad enough when the meritocratic delusion takes root in a money-driven economy, but reputation’s one percenters are even more toxic. They can go spectacularly bankrupt, financially ruining their investors, and promptly raise another fortune to gamble on.

Reputation is a terrible currency.

Currencies need to serve as units of account – so you can price everything from vintage Star Wars figures to anti-fungal cream and calculate their total worth. They need to serve as media for exchange, so that someone who has Kenner Star Wars figures and needs anti-fungal cream can convert one to the other. They need to serve as stores of value – so you can convert your action figures to something more stable that you can use in your dotage, in case Star Wars ceases to be cool in another 50 years.

Reputation is pretty much useless for any of these things. Instead, reputation systems are literally popularity contests: ‘‘more people like me than you, so I win and you lose.’’ In theory, this kind of jerky behavior will cost you reputation – but in reality, many people are delighted to treat such jerks as ‘‘strong, decisive people who tell it like it is.’’

The Internet has been trying to figure out how to make reputation work for decades now. The scores that appear next to Ebay sellers’ names and on the profile pages of ‘‘sharing economy’’ workers – Uber, Lyft, Airbnb – attempt to establish a basis for strangers to trust one another.

Ebay’s reputation system is one of the oldest surviving ones, and it’s a good example of how explicit reputation systems fail to solve their major problem. Most people who buy and sell on Ebay do a good job of it, because most people aren’t crooks. A few people do very badly, and get downranked and eventually punted off the system – something that a normal complaints tipline would handle just as well.

But reputation is useless as a hedge against the real nightmare of a setup like Ebay: the long con. It doesn’t cost much, nor does it take much work, to build up sleeper identities on Ebay, fake storefronts that sell unremarkable goods at reasonable prices, earning A+++ GREAT SELLER tickmarks, even for years, until one day, that account lists a bunch of high-value items on the service, pockets the buyers’ funds, and walks off.

Reputation works badly and fails badly – it’s a lose-lose situation all around.

But that hasn’t stopped companies from doubling down on reputation. Given the role reputation has played in the ‘‘sharing economy’’ bubble, it was inevitable that some would-be titans would launch companies based on nothing but reputation.

One notorious example is Peeple, the vaporware app launched in September 2015, which (it was announced) would let you rate other human beings on a scale of one to five. If you wanted to highlight the dystopian nature of Whuffie, you need go no further than this vision for Peeple. If it ever took off, it’d be a lever that the likes of Gamergate could use to destroy your employment and personal life, possibly permanently, just by mass-one-starring you.

This was forcefully pointed out to Peeple’s relentlessly defensive founders at such enormous length that it provoked a ‘‘pivot’’: the company now promises that if it ever ships anything, its app will be opt-in only, and you’ll have the ability to censor any negative reviews. This would render it worse than useless, like a Yelp in which no negative reviews were allowed.

But Peeple is a modest effort compared to ‘‘Citizen Scores,’’ the for-now-voluntary service run by the Chinese government in partnership with Tencent (a huge social media and games company) and Alibaba (China’s answer to Amazon). Your citizen score is visible to everyone, and it goes up when you do the things the government wants – buying socially approved items, undertaking approved leisure activities, adhering to rules and regulations, and socializing with other high-score individuals. Of course, not doing these things makes your score go down. Just being friends with low-scoring individuals drags your own score down, creating a powerful incentive to conform.

Mandatory Citizen Scores are being phased in over the next decade, and together with the other ‘‘soft’’ tools of control developed by China, they promise to be more powerful than any overt coercion.

Years ago, China transitioned from relying primarily on the Great Firewall of China to censor messages it didn’t want people to see, to using something called the ‘‘Fifty Cent Army.’’ This army of hyper-patriotic social media users gets paid half a yuan (fifty cents) for every pro-government post they write, with an emphasis on discrediting people and reports that put the government in a bad light. As anyone who’s ever tried to figure out the he said/she said campaigns run by the US climate denial lobby can attest, doubt is much more powerful than outright suppression. When someone reports on state corruption – whether that’s a black child murdered by a US cop, or a Chinese dissident tortured by a politburo operative – and the media is flooded with unsourced reports, innuendo, and accusations about the victim’s low character and alleged past misdeeds, the whole issue quickly dissolves into a muddle.

Citizen Scores are a similar soft-power move: rather than arresting you for being friends with dissidents, the politburo will just downrank you – and everyone will see that your rank has been decreased, and will know that befriending you will endanger their own score.

Citizen Scores are a near-perfect expression of reputation economics: like most other forms of currency, they are issued by a central bank that uses them to try to influence social outcomes – in this case, perfect obedience to the state.

“If it ever took off, it’d be a lever that the likes of Gamergate could use to destroy your employment and personal life, possibly permanently, just by mass-one-starring you.”

I’ve been in Gamergate since the start, and I don’t remember any ops to destroy someone’s personal life. It was mostly just boycotts and charity drives. But you know, mass-downvoting and 1-star spamming existed long before Gamergate did.

Insightful and informative – I wasn’t even aware of Peeple or Citizen Scores. I’ve been a fan of Whuffie (and your work in general) for over a decade, so it’s great to see you revisit this subject with such authenticity and the benefit of hindsight.

Coincidentally, I recently started work on a personal project in this arena, with some spins that will hopefully address some of the problems you discuss above. I’ve been researching similar reputation systems (and their missteps) to prevent repeating the mistakes and false starts of pioneers. If I manage to get it past the alpha stage, I’ll let you know. I would love to get your $0.02.

I think the main issue is that when people have a sense of ownership over whatever commons – whether it is a reputation system, a crop of land, a message board, or something else – they respect it. When the commons has enough people that you have no sense of influence over it or stake in it, and no sense of the consequences of your actions and how they directly affect other participants – no sense of responsibility to it – many people will then opt to treat it like crap. And yet we have seven billion people, and face the question of how to run a planet despite our neurological limitations in this regard.

p.s. — Sorry meant to say “a chunk of land”, somehow it came out crop…also obviously running it is not necessarily the big conundrum unless we’re really power hungry, but the question of how to make systemic improvements that benefit the whole system is, attempting to ‘run’ it in some fashion being one of many possible approaches, developing various abstract economic systems another. Personally I’m a fan of designing machines that eat garbage and ruined ecosystems and turn them into healthy ecosystems and shelter and food for people, and then letting people press the ‘copy’ button as often as they care to. Or, something.

I’ve designed a reputation system that doesn’t have the flaw you claim is endemic to all reputation systems:

github.com/neyer/respect

Specifically, the idea that a mass of one-star reviews would ruin a person. Look up the ‘soundness’ metric. Under my system, a sudden mass of one-star reviews on a person would just lead the people giving the one-star reviews to have lower credibility.
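[For illustration, here is a minimal sketch of how a credibility-weighting scheme of this general kind could work. This is a hypothetical reconstruction, not the actual code or metric from the linked repo: raters whose reviews consistently diverge from the consensus on other targets carry less weight, so a brigade of one-star reviews moves the target’s score very little.]

```python
# Hypothetical sketch (not the linked repo's implementation) of a
# credibility-weighted rating scheme: a rater's weight shrinks with
# their average disagreement from the per-target consensus.
from statistics import median

def consensus(all_reviews):
    """Median stars per target, pooled across every rater."""
    by_target = {}
    for reviews in all_reviews.values():
        for target, stars in reviews.items():
            by_target.setdefault(target, []).append(stars)
    return {t: median(s) for t, s in by_target.items()}

def credibility(reviews, cons):
    """1.0 for raters who track consensus, shrinking as they diverge."""
    if not reviews:
        return 1.0
    # Mean absolute error against consensus, on a 1-5 star scale.
    err = sum(abs(stars - cons[t]) for t, stars in reviews.items()) / len(reviews)
    return 1.0 / (1.0 + err)

def score(target, all_reviews, cons):
    """Credibility-weighted mean of the stars a target received."""
    num = den = 0.0
    for reviews in all_reviews.values():
        if target in reviews:
            w = credibility(reviews, cons)
            num += w * reviews[target]
            den += w
    return num / den
```

[With three honest raters and two brigaders who one-star everything, the brigaders’ divergence from consensus cuts their weight to a fraction of an honest rater’s, and the target’s weighted score stays near the honest average. The obvious caveat, in the spirit of the essay’s argument: once brigaders outnumber honest raters, they become the consensus. – ed.]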

Reputation is nothing less than the tragic irony of the human species. We rely on it because it promises to protect us against bad actors. However, as you described brilliantly, the bad actors are the quickest to realize that reputation is just another tradable asset, and to amass it – and the best at doing so. Suddenly, what was supposed to be human society’s immune system (and may have worked when we lived in fairly permanent tribal arrangements and our social lives involved no more than a couple hundred people in a lifetime) is now a weapon in the hands of the very people that immune system evolved to protect us against.

If you look at Donald Trump, he stands a nonzero chance of becoming our next president. That’s worrying. Or, to stay within the tech industry, look at Y Combinator, the ethically-challenged but now unduly powerful incubator that began around a so-called “AI programmer” (who wrote a couple good books on Lisp, but wasn’t really an AI programmer) monetizing his own reputation. It’s getting absurd– and I don’t know what the solution is.

But we need some way to add the dimension to our transactions that is neglected by currency. Currently, currency is great for all those ‘things’ that are fungible. But what about all those IMPORTANT things that make us want to go on living: love, caring, security, hope, freedom, confidence, etc.? When you do something that gives me hope, for example, how do I reward you? With money? Does that put a price on hope? Now we have an equation that can relate dollars to twinkies and to hope. How much hope can a twinkie buy? Shouldn’t you prevent a human brain from substituting twinkies for hope? How can we encourage the consumer of the hope you provide to also become a provider of hope for someone else? If there are too many consumers of hope and too few producers, we feel ‘hopelessness’. Whuffie and things like it fail because we are aware of them, and inevitably some will try to game the system. What if there was a currency deposited into our account by an invisible third party? Two ways to achieve this: 1) establish Big Brother; 2) anonymize every transaction and have it reviewed by another human for a price – sort of a human blockchain?

A simple metric of one’s notoriety is a very shallow and fragile conception of reputation. In the real world, reputation is a reflection of everything that is known about an individual, and is based on a complex system of human values. A quantified reputation system is only as valuable as it is useful. Systems like these are being worked on, and they are a lot more complex than the currencies in use today. The essential problem here is that you’re oversimplifying what reputation actually is. Here is the beginning of an attempt to develop such a system:

Isn’t reputation what open source runs on? Is the inefficiency of a reputation economy a weakness for open source? And does this fact have any significance for the obscene gender gap in open source? I think digging into reputation economy and gender would be an interesting line of research.