Karl Marx said that the world would be divided into people who owned the means of production—the idle rich—and people who worked for them. In fact it is increasingly being divided between people who have money but no time and people who have time but no money. The on-demand economy provides a way for these two groups to trade with each other.

...

The “transaction cost” of using an outsider to fix something (as opposed to keeping that function within your company) is falling. Rather than controlling fixed resources, on-demand companies are middle-men, arranging connections and overseeing quality. They don’t employ full-time lawyers and accountants with guaranteed pay and benefits. Uber drivers get paid only when they work and are responsible for their own pensions and health care. Risks borne by companies are being pushed back on to individuals—and that has consequences for everybody.

Good piece from The Economist on the implications of the on-demand economy. It's given workers more flexibility at the cost of stability, and an increasingly freelance workforce means that delivering so much of our welfare (like health insurance) through employers is more and more inefficient.

One way to see it is to look at the two gym brands commonly cited as the fastest-growing in America: CrossFit and Planet Fitness. Both are expanding like crazy. CrossFit has gone from having 13 affiliate gyms in 2005 to 10,000 today. And Planet Fitness has more than tripled in size over the past five years.

But aside from their bang-up growth rates, the two could not be more different. The former is expensive and intense, appealing to competitive individuals ready to commit thousands of dollars and many hours to working out. I go to a CrossFit gym a few blocks from my apartment. It costs $160 a month, often more than I spend on groceries. The latter is perhaps most famous for giving out free pizza — a fact it embraces and publicizes, no less. Its monthly fees are normally around what a movie ticket costs.

...

It is the middle that is growing more slowly, with some chains struggling to demonstrate their value to consumers — your Bally Total Fitness, now all but defunct, or Curves. It is the gyms with considerable but not intolerable monthly fees and decent amenities, but no sheen of luxury or promise of extraordinary results, that are struggling.

Or perhaps this is some variant of the smiling curve, with high-end gyms offering, as their value-add, the company of other well-heeled folks who embrace a high-cost public signal of their willingness to spend on physical fitness (itself a signal of many other things, including self-discipline, health, and a matching level of vanity).

Here at interfluidity, we are not in the business of useless economics, so we will adopt a very conventional utilitarianism, which assumes that people derive similar but steadily declining welfare from the wealth they get to allocate. Which brings us to our first result: If our single producer and our single consumer begin with equal endowments, and if the difference between consumer and producer surplus is not large, then letting the market clear is likely to maximize welfare. But if our producer begins much wealthier than our consumer, enforcing a price ceiling may increase welfare. If it is our consumer who is wealthy, then the optimal result is a price floor. This result, a product of unassailably conventional economics, comports well with certain lay intuitions that economists sometimes ridicule. If workers are very poor, then perhaps a minimum wage (a price floor) improves welfare even if it does turn out to reduce the quantity of labor engaged. If landlords are typically wealthy, perhaps rent control (a price ceiling) is, in fact, optimal housing policy. Only in a world where the endowments of producers and those of consumers are equal is market-clearance incontrovertibly good policy. The greater the macro- inequality, the less persuasive the micro- case for letting the price mechanism do its work.
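Waldman's result is easy to reproduce numerically. Here's a minimal sketch of my own (a toy model, not his actual math): one trade between a single producer and a single consumer, log utility of wealth, and all dollar figures made up for illustration:

```python
import math

def welfare(p, wc, wp, v=100, c=20):
    """Total log-utility welfare after a single trade at price p.

    wc, wp: starting wealth of consumer and producer (illustrative numbers).
    v: the good's dollar-equivalent value to the consumer.
    c: the producer's cost of providing the good.
    """
    consumer = math.log(wc + v - p)  # consumer pays p, gains value v
    producer = math.log(wp + p - c)  # producer receives p, incurs cost c
    return consumer + producer

# Equal endowments: the welfare-maximizing price sits in the middle.
equal = max(range(21, 100), key=lambda p: welfare(p, wc=1000, wp=1000))

# Rich producer, poor consumer: the welfare-maximizing price collapses
# to the bottom of the range -- the logic behind a price ceiling.
unequal = max(range(21, 100), key=lambda p: welfare(p, wc=200, wp=5000))

print(equal, unequal)
```

Swapping the endowments (poor producer, rich consumer) pushes the optimal price to the top of the range instead, which is the price-floor case.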

...

Let’s consider another common case about which many economists differ with views that might be characterized as “populist”. Suppose there is a limited, inelastic supply of road-lanes flowing onto the island of Manhattan. If access to roads is ungated, unpleasant evidence of shortage emerges. Thousands of people lose time in snarling, smoking, traffic jams. A frequently proposed solution to this problem is “congestion pricing”. Access to the bridges and tunnels crossing onto the island might be tolled, and the cost of the toll could be made to rise to the point where the number of vehicles willing to pay the price of entry was no more than what the lanes can fluidly accommodate. The case for price-rationing of an inelastically supplied good is very strong under two assumptions: 1) that people have diverse needs and preferences related to the individual circumstances of their lives; and 2) willingness to pay is a good measure of the relative strength of those needs and values. Under these assumptions, the virtue of congestion pricing is clear. People who most need to make the trip into Manhattan quickly, those who most value a quick journey, will pay for it. Those who don’t really need the trip or don’t mind waiting will skip the journey, or delay it until the price of the journey is cheap. When willingness to pay is a good measure of contribution to welfare, price rationing ensures that those more willing to pay travel in preference to those less willing, maximizing welfare.

Unfortunately, willingness to pay cannot be taken as a reasonable proxy for contribution to welfare if similar individuals face the choice with very different endowments. Congestion pricing is a reasonable candidate for near-optimal policy in a world where consumers are roughly equal in wealth and income. The more unequal the population of consumers, the weaker the case for price rationing. Schemes like congestion pricing become impossibly dumb in a world where a poor person might be rationed out of a life-saving trip to the hospital by a millionaire on a joy ride.

When people say that a price-based scheme for rationing water is most efficient, they mean that prices will deliver the most efficient distribution of dollars and water. The idea is that how much people are willing to spend on something is a good proxy for how much they care about it, or how important it is to their well-being. Different people like different things, but you can buy all kinds of different stuff with dollars, and seeing what people choose to spend their money on tells you a lot about their preferences.

But dollars aren't a perfect proxy for well-being, because money means different things depending on how rich or poor you are. To a middle class American, $5,000 is a really big deal. To a multi-millionaire like Mitt Romney or Hillary Clinton, it's totally trivial — the value of their stock portfolios bounces up and down by that much all the time. To a person living paycheck-to-paycheck with no access to credit beyond very expensive payday loans, $5,000 could be a life-changing amount.

The technical term here is the "declining marginal utility of money." A given dollar produces less happiness in the pockets of a rich person than a poor one. That means that in a society with substantial economic inequality, an efficient distribution of dollars and water isn't going to be the same as an efficient distribution of happiness and water. This is what we're seeing in the North Carolina water case — the dollars are just a lot more important to the poor than the rich, so all the burden of adjusting to reduced water usage falls on them.
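To make "declining marginal utility of money" concrete, here's a toy calculation using log utility, a standard textbook choice; the wealth figures are mine, not from the article:

```python
import math

def utility_gain(wealth, amount=5_000):
    """Extra log-utility from receiving `amount`, starting from `wealth`.

    With log utility, the same $5,000 matters less the richer you are.
    All wealth figures below are illustrative assumptions.
    """
    return math.log(wealth + amount) - math.log(wealth)

rich = utility_gain(50_000_000)   # a multi-millionaire
middle = utility_gain(100_000)    # a middle-class household
poor = utility_gain(10_000)       # living paycheck to paycheck

print(f"{rich:.4f} {middle:.4f} {poor:.4f}")
```

The same $5,000 produces orders of magnitude more utility for the poorest person than for the richest, which is exactly why willingness to pay stops tracking well-being under high inequality.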

The reflexive response of many folks in tech to issues like Uber's surge pricing and net neutrality is to bow down before the power of the free market, and I count myself generally in that camp. Generally, the market is an optimal scheme for efficient allocation of scarce resources.

I'm also generally sympathetic to companies pricing their products according to the market. Apple charges a hell of a lot more for its phones than it costs to manufacture them, but it has earned that surplus by producing a great product that people want. Uber turns on surge pricing at certain times, and enough people are willing to pay the multiple to get a ride because of the sheer convenience of the experience.

However, let's not blindly accept that it's welfare-maximizing for society without a skeptical analysis. As Waldman notes:

And there are lots of choices besides “whatever price the market bears” and allocation by waiting in line all day. Ration coupons, for example, are issued during wartime precisely because the welfare costs of letting the rich bid up prices while the poor starve are too obvious to be ignored. Under sufficiently high levels of inequality, rationing scarce goods by lottery may be superior in welfare terms to market allocation.

Those who spent the most money saw the lowest level of inflation, the ONS concluded.

This could be explained, in part, by prices of package holidays and education barely rising over recent years.

Full article here. It's not just absolute but effective income inequality that seems to be rising.

Among some sizable number of people I follow on Twitter, ad-supported products and services in tech are seen as evil. “You get what you pay for!” and “If you're not paying, you are the product” and variants thereof are common dismissals or denunciations of any ad-supported product.

They lament the proliferation of free apps in mobile app stores, turn their noses up at Facebook, and pine for the ad-free days of Twitter. Look at anyone who writes such things and a couple of patterns become clear. They're almost always fairly well-off (middle to upper middle class on up), so spending a few extra bucks is no big deal to them. And they almost always generalize from one egregious example.

Ad-supported business models have enabled many people without the financial means otherwise to access many products and services. You may not think it's that important for someone who's poor to access Instagram without paying, but that's a very privileged stance, one that ignores how many people in other countries use such services.

Many businesses can only achieve the scale necessary to be useful with a free, ad-supported business model. Facebook is just one example. Sure, it means that many companies that set off in that direction will fail—scale businesses require, well, massive scale—but a high extinction rate for those who attempt to build a scale business is to be expected.

Finally, it's hardly clear that either a pay or ad-supported business is more friendly to customers. Derek Powazek had a great post on this a few years ago.

I don't mind paying for services I love. For example, I'm strongly anti-piracy when it comes to people who can afford the things they pirate, no matter what reasons they come up with to justify their behavior.

I'm happy, though, that the things I enjoy come in a mix of pay and ad-supported models. As I've noted before, I just hope ad-supported businesses embrace the natural evolution to native ad units more and more in 2015, especially all the old media sites I enjoy but whose user experiences are being destroyed by their advertising unit selection.

I don't know how I got from an article on income inequality to a discussion of ad-supported business models, but all my New Year's Eve alcohol consumption seems to have connected some strange regions of my brain this morning.

Superficial observers will see further evidence that economists can't shut up about selfishness. But on reflection, the logic of collective action is compelling evidence for the power of altruism. How so? Because actual human beings often engage in collective action despite the strong selfish case for inaction! Many people give blood without the slightest recompense. Many people voluntarily join the army when they see their country in danger, despite high risk and low wages. Many people donate to charity even though eligibility for charity has nothing to do with their donation history. If altruism is not their motive, what is?

Sure, true believers in ubiquitous selfishness can grasp at straws to protect their dogma. Perhaps people donate blood for the free cookie, join the army because they might run for office one day, or give to charity in order to make business connections. Or maybe millions of average joes are clueless enough to believe that the blood supply, the safety of the free world, and the availability of charity hinge on whatever they personally choose to do.

Anything is possible, but that doesn't mean that anything is plausible. Once you grasp the logic of collective action, basic economics strongly supports a conclusion that economists rarely advertise: Genuine altruism is all around us. Benevolence doesn't explain why bakers bake bread for paying customers, but it does explain why blood donors give blood to strangers for free.

Which goes to show that while movies can teach important economic lessons, that might not be the best way to go about making a good movie (Begin Again is by the director of Once, and it is basically Once with actors instead of musicians as leads; if you choose to see one, see Once).

One of the difficulties of building innovative payment services in the US is that we're a nation that loves our credit cards. We may totally overvalue the random rewards and points and other benefits the credit card companies give us, but since the merchants often cover a lot of the cost of those benefits, we'll take all the rewards we can get, as financially irrational as that may be.

[Aside: Some cards have annual fees, and the credit card users that carry a balance each month subsidize other users who pay off their balance in full each month, but many credit card rewards would be acquired more cheaply just by paying for them directly. A free lunch remains a rare thing.]

This presents a cost problem for payment startups: if you build a payment service that leverages your users' credit cards, you have to pay companies like Mastercard, Visa, American Express, and so on their fees. But if you charge merchants an additional markup on top of this fee when one of your users pays with your service, merchants have very little incentive to accept your payment method.

You could just pass through the fee, but then you have to make your profit elsewhere. Or you could be bold and charge less than the credit card fees, but then your entire business is a loss leader. The more you sell, the more you lose. Ask Square how that model has worked out from a cash flow perspective. Paypal was able to reverse the bleeding by making it near impossible for its users to pay with a credit card instead of with an eCheck or any Paypal balance they might be carrying. I know this because I tried to switch to using a credit card when logged into Paypal once and ended up in this strange endless loop where I kept adding a credit card to use and then not being able to find it. I tried several more times in a row until I decided to just take a rock and hit myself in the head repeatedly because it was less painful and frustrating.

An eCheck costs Paypal some negligible amount (I recall it being something like $0.01 from my time at Amazon), and using the Paypal balance costs Paypal nothing, of course. That means whatever Paypal charges the merchant on that transaction becomes profit. It's not easy to get all the way there, though. You have to encourage enough usage that users want to give you their checking account information so they can withdraw any balances they might have. Once you have that, you can enable money to flow the other direction, as an eCheck, too.
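The economics above can be sketched in a few lines. Every rate and fee here is an illustrative guess, not an actual Paypal or card-network number:

```python
def margin(merchant_rate, funding_cost, amount):
    """Per-transaction gross margin for a Paypal-style intermediary.

    merchant_rate: fraction of the transaction charged to the merchant.
    funding_cost: function giving the intermediary's cost to fund
    the payment (card interchange vs. a near-free eCheck).
    All numbers are hypothetical, for illustration only.
    """
    return merchant_rate * amount - funding_cost(amount)

card = lambda amt: 0.022 * amt + 0.20   # hypothetical card interchange + fixed fee
echeck = lambda amt: 0.01               # roughly flat and negligible, per the text

amount = 100.0
card_margin = margin(0.029, card, amount)      # thin margin on card-funded payments
echeck_margin = margin(0.029, echeck, amount)  # nearly the whole fee is profit

print(card_margin, echeck_margin)
```

Under these made-up numbers the eCheck-funded transaction is several times more profitable than the card-funded one, which is why Paypal steers users so aggressively away from their credit cards.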

Payments, as a multi-sided market, will always present entrepreneurs with this chicken-egg conundrum. To get merchant adoption, you need a huge number of consumers carrying your payment method, but to get consumers to want to use your payment method over their beloved credit cards, you need a ton of merchants to accept that payment method.

Which brings me to Apple Pay, which launched today. I didn't realize today was the public launch until I was at Whole Foods buying breakfast this morning and saw this at the checkout counter on the payment terminal:

I hadn't downloaded the iOS 8.1 update yet or added any credit cards to Apple Pay on my phone, so I didn't use it just then, but I went back later after I had added a credit card and tried it out, and it was painless. Held my phone up to the terminal, Apple Pay popped up on my screen and asked me to verify with Touch ID, and that was it.

There are many reasons to think Apple Pay might succeed where so many other alternative payment methods have failed.

First, Apple is building off of the existing credit card system rather than fighting customer inertia. As noted above, so many US consumers love their credit cards and the rewards they get from them. Apple Pay doesn't ask them to give that up.

Another thing about credit cards: most consumers find them easy to use, not much of a hassle to bring out and swipe. To surpass them, an alternative has to be as easy or easier to use for the consumer, or much cheaper for the merchant. Apple Pay fulfills the first of those requirements once you've added your cards to Apple Pay on your phone, something that can be done as easily as snapping a photo of each credit card. Touch ID has finally reached an acceptable level of reliability, so the overall user experience is simple and solid.

Perhaps most importantly, Apple Pay starts on day one with a good whack at both the chicken and the egg. On the industry side, Apple managed to corral an impressive number of partners, from credit card companies (the big three of Visa, Mastercard, and American Express) to banks (all the biggest, like Bank of America, Chase, Citi, and Wells Fargo). On the merchant front, Apple Pay's marketing page claims over 220,000 stores, which actually makes merchants the weakest link of these three groups, but it's still a decent starting point for day one. That will be the hardest nut to crack, but more on this point later.

On the consumer side, whether that's the chicken or the egg, Apple has a massive and growing installed base of iOS and iPhone users who can use this system. The only other technology company I can think of that could tackle this space is Amazon, given its huge database of consumer credit cards, but it lacks the device or app installed base to deliver a good user experience to enough users.

Apple Pay offers some additional benefits, like added privacy, something Apple has been touting across the board as one of their key consumer benefits, though I'm still not sold it's a huge selling point for most average consumers. Still, it's worth noting that Apple doesn't keep records of your transactions, and you don't have to hand over your credit card to some waiter or clerk who then has access to the card number, expiration, and security code. Again, I think this is of minor psychological comfort for the vast majority of consumers, but it's at least not a negative.

But more than anything, my excitement for Apple Pay stemmed from this post by Uber:

The beauty of Apple Pay is that it simplifies Uber’s signup process to a single tap. If you have an eligible credit card already added to Apple Pay, you don’t need to enter it again to ride with Uber. Instead, merely place your finger on the Touch ID sensor of your iPhone 6 or iPhone 6 Plus, and your Uber is on its way. No forms, no fuss. We’re calling this new Uber feature Ride Now, and it’s the product of a close collaboration between Uber and Apple over the past few months.

Place your finger on Touch ID to confirm payment, and your Uber is en route!

The rest of the Uber experience remains exactly the same. A receipt for your ride, with the fare breakdown and trip route, is sent to the email you have for Apple Pay; riders rate their drivers at the end of each trip. Existing Uber users are unaffected and can continue using Uber as before.

Innovation in payments in the US has been difficult because of the entrenched incumbent stack, but Apple has just moved up the stack and innovated above it all. What they've done here is abstract away the credit card number entirely. Extrapolate out into the rosiest future, and perhaps someday the only time you might have to deal with your credit card number is when you get the card in the mail and input it into Apple Pay. Who really cares what the number, expiration date, and security code are? If you can prove you are who you say you are with Touch ID, that's a more efficient way to prove your identity and authorize the payment.

But we're a long way away from that day, and for now, 220,000 stores is actually not close to a majority of merchants. The Uber example, though, demonstrates the near-term potential.

I long ago memorized all my credit card numbers and security codes just so I wouldn't have to deal with the hassle of pulling the cards out every time I had to punch the number in for an online transaction. Still, it's a hassle, especially on my phone, to have to either enter my credit card details or to go round trip to 1Password to remember my crazy long, random, difficult-to-memorize iTunes password.

Apple Pay reduces that pain by a lot. A whole lot.

Even if its near-term impact is restricted to making payments in apps on my phone, that's a big deal, and perhaps sufficient incentive for me to actually upgrade to one of the new iPads with Touch ID.

When Amazon first received its 1-click patent, one of the first and only companies to license it was Apple. It paid off for both sides. For Amazon, Apple's license strengthened the patent, allowing them to enforce it against companies like Barnes & Noble (for the record, I don't believe in software patents like these, but that's a topic for another day). For Apple, the 1-click license let them enable users to purchase songs off of iTunes with one click, one part of a superior experience spanning iPods and the iTunes music store that catapulted them to the digital music throne. Can you imagine how painful it would have been to go through multiple steps to purchase each single?

What Apple has done with Apple Pay is extend 1-click purchasing to the mobile app world and many real world stores as well. Or maybe we should call it 1-touch purchasing.

Years from now, when we look back on Apple's 2014 announcements, I suspect Apple Pay will be the most important by a wide margin.

Metcalfe's Law states that the value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is proportional to the inverse square of the cost per mile of transportation. Combine this with Metcalfe's Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network is proportional to the inverse fourth power of the cost of that transportation. A reduction in transportation costs in a trade network by a factor of two increases the potential value of that network by a factor of sixteen. While a power of exactly 4.0 will usually be too high, due to redundancies, this does show how the cost of transportation can have a radical nonlinear impact on the value of the trade networks it enables. This formalizes Adam Smith's observations: the division of labor (and thus the value of an economy) increases with the extent of the market, and the extent of the market is heavily influenced by transportation costs (as he discussed extensively in The Wealth of Nations).

I learned that and more from this post on how critical horses were to the industrial revolution. Because Europe had horses to move natural resources while China relied on human porters, the 1800s saw Europe surge past China. Later, non-European countries like Japan simply skipped the horses and went straight to steam engines to play another round of leapfrog.
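The inverse-fourth-power arithmetic in the quoted passage is easy to check:

```python
def network_value(cost_per_mile, k=1.0):
    """Potential value of a trade network under the stylized model above:
    accessible nodes ~ 1/cost^2, and value ~ nodes^2 (Metcalfe's Law),
    so value ~ k / cost^4. k is an arbitrary scale constant."""
    return k / cost_per_mile ** 4

# Halving transportation cost multiplies potential value by 2^4 = 16.
ratio = network_value(0.5) / network_value(1.0)
print(ratio)  # 16.0
```

As the passage notes, an exponent of exactly 4.0 overstates the effect in practice, but the cubic-or-better scaling is the point: small drops in transport cost explode the value of the network.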

We continue to see leapfrogging all over the world with a variety of technologies, like cellular networks (skipping landlines) and near-field payments (hopping past credit cards). To take a more recent example, it would not surprise me if we first saw widespread deployment of drone delivery in countries other than the U.S., since here both heavy regulation and solid delivery alternatives stand in the way. It's not surprising to hear that Amazon is looking to test drone delivery in India first.