Tuesday, August 19, 2014

Well, I'll be doing a staycation of sorts, if by staycation we mean the United States. In the end, it came down to a $250-300 flight out of my local airport (which is a short bus ride down the street), or paying $1000+ to head down to O'Hare and spend 15+ hours in air transport, switching planes and layovers. Then there's the disorientation and language barrier. Laziness and cheapness took over. Although, I have to say, the gambling, prostitution and endangered species markets of Mong La were awfully tempting. And apparently India needs some tourism help.

So I'll be on the West Coast. I'll be in Los Angeles tomorrow, staying in Topanga Canyon for a week. After the 27th, I'll need to find a new place, and I'll be driving north along the coast to San Francisco. I'll be leaving from San Francisco on the 31st.

So if anyone wants to kick it West Coast Style, let me know. My itinerary is totally open. If you know of a good place to stay between L.A. and San Francisco, or know of a good spot to stay in S.F. I'd appreciate the tip.

Monday, August 18, 2014

Now that all the robots have been trained and are being trained to build our stuff for us, what’s next? What age are we entering?

I kind of want to complain at this point that economists are kind of useless when it comes to questions like this. I mean, aren’t they in charge of understanding the economy? Shouldn’t they have the answer here? I don’t think they have explained it if they do.

Instead, I’m pretty much left considering various science fiction plots I’ve heard about and read about over the years. And my conclusion is that we’re entering “the age of service.”

The age of service is a kind of pyramid scheme where rich people employ individuals to service them in various ways, and then those people are paid well so they can hire slightly less rich people to service them, and so on. But of course for this particular pyramid to work out, the rich have to be SUPER rich and they have to pay their servants very well indeed for the trickle down to work out. Either that or there has to be a wealth transfer some other way.
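To see why the math of this pyramid is so demanding, here's a toy sketch in Python. The tiers, pass-through rate, and top income are all made-up numbers, purely for illustration: even when each tier spends a generous 60% of its income hiring the tier below, incomes collapse geometrically after just a few levels.

```python
# Toy model of the "age of service" pyramid (illustrative assumptions only).
# Each tier spends a fixed fraction of its income hiring the tier beneath it.

def service_tier_incomes(top_income, pass_through, tiers):
    """Income received by each tier when every tier pays out
    `pass_through` of its income to the tier below."""
    incomes = [top_income]
    for _ in range(tiers - 1):
        incomes.append(incomes[-1] * pass_through)
    return incomes

# A $10M earner at the top, 60% passed down at each of 6 tiers:
incomes = service_tier_incomes(top_income=10_000_000, pass_through=0.6, tiers=6)
for i, inc in enumerate(incomes):
    print(f"tier {i}: ${inc:,.0f}")
```

By tier 5 the income is already under $800,000 total for that whole tier, which is the point: unless the top is SUPER rich (or wealth is transferred some other way), the bottom of the pyramid gets crumbs.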

So, as with all theories of the future, we can talk about how this is already happening.

I noticed this recent Bloomberg View article about how rich people don’t have normal doctors like you and me. They just pay out of pocket for super expensive service outside the realm of insurance. This is not new but it’s expanding.

Here’s another example of the future of jobs, which I should applaud because at least someone has a job, but which instead just kind of annoys me. Namely, the increasing frequency with which I try to make a coffee date with someone (outside of professional meetings) and have to arrange it with their personal assistant. I feel like, when it comes to social meetings, if you have time to be social, you have time to arrange your social calendar. But again, it’s the future of work here and I guess it’s all good.

More generally: there will be lots of jobs helping out old people and sick people. I get that, especially as the demographics tilt towards old people. But the mathematician in me can’t help but wonder, who will take care of the old people who used to be taking care of the old people? I mean, they by definition don’t have lots of extra cash floating around because they were at the bottom of the pyramid as younger workers.

“Did you know that there are nearly 102 million working age Americans that do not have a job right now? And 20 percent of all families in the United States do not have a single member that is employed. So how in the world can the government claim that the unemployment rate has “dropped” to '6.3 percent'?”

“Well, it all comes down to how you define who is 'unemployed'. For example, last month the government moved another 988,000 Americans into the 'not in the labor force' category.”

102(!) million working age Americans that do not have a job right now? 20%(!) of all families in the US do not have a single member that is employed? Do you find these numbers exaggerated? Maybe not, if you read the following story by the "eyewitness" Yanis Varoufakis.

What is going on in front of our eyes?

Yanis Varoufakis, in his article "What if the capital of the future doesn't need us?", describes what he saw in Austin, Texas and what it means:

“While I was looking out my window in Austin, Texas, I saw a big cloud of dust deep on the horizon. Two days earlier I had been walking in that area and was struck by the sight of the big site where bulldozers and machines were working continuously, producing the dust. From the front of the building under construction it was obvious that (fortunately) they were not building a new trade center or apartment blocks. No, it was a big industrial center.”

“Although I didn't notice it at first, after a few seconds I realised that something was missing from this factory: people! Specifically, I counted three. All of them wore helmets and protective suits and were stationed in a small outdoor office with a few computers, sheltered under a tent like those used by the army. Ten bulldozers, three cranes and roughly ten other moving machines, at least from what I could see, were operating without drivers, operators, or workers of any kind.”

“When I returned to my office, I went straight to a colleague who knows well what's going on. He informed me that the workplace I had seen was Apple's new factory for producing the MacBook Pro. It was true: it was being built through almost complete automation. The materials had been chosen in such a way that the automatic machines - robots connected to each other through a local wireless network (an intranet) - could construct without human interference; even the building's plumbing will be installed by robot plumbers. A factory that under normal conditions would employ thousands of workers is functioning with fewer than one hundred souls present...”

How long could a society tolerate an unemployment rate of 30-50% without an uprising that would bring "instability" to the system? What happens when the unemployment rate for young people has reached 60%? When pensions and wages have been cut? How far can unemployment benefits be cut while still allowing people to survive and consume without revolting? How many can be left totally helpless, without any kind of social benefit? How much does mainstream media propaganda keep people scared inside their homes, too frightened to rise up?

The nightmarish new facts of the Greek experiment could prove quite useful to the big companies that hyper-automate production and seek to eliminate the cost of human labor.

Robots make up for China's manual labor shortages (Want China Times) To solve the problem of chronic labor shortages and soaring labor costs, manufacturers in the Pearl River Delta in southern China are lining their production belts with robots and lining the pockets of the robotics industry with explosive growth, reports the Chinese-language Economic Daily. Labor "shortages" in a nation of over a billion people!?

Let’s Get Rid of Wage Labor (Filip Spagnoli) I’m serious: make it illegal. But not before we have a universal basic income. A UBI will encourage self-employed or cooperative ventures freely chosen by those who engage in it. In the absence of a UBI, many of us have a job not because the activities associated with the job allow us to pursue our goals, but because the job comes with a salary and because this salary can buy the necessities of life. We then either pursue our goals during our leisure time, or convince ourselves that the goals of our jobs are somehow also our own goals (the burger-flipper telling himself that “making kids happy is all I want”)...

Voltaire once said that “work saves a man from three great evils: boredom, vice and need.” Many people endorse this sentiment. Indeed, the ability to seek and secure paid employment is often viewed as an essential part of a well-lived life. Those who do not work are reminded of the fact. They are said to be missing out on a valuable and fulfilling human experience. The sentiment is so pervasive that some of the foundational documents of international human rights law — including the UN Declaration of Human Rights (UDHR Art. 23) and the International Covenant on Economic, Social and Cultural Rights (ICESCR Art. 6) — recognise and enshrine the “right to work”.

But what about the right not to work? Although the UDHR and ICESCR both recognise the right to rest and leisure, they do so clearly in the context of a concern about overwork. In other words, they recognise the right to work under fair and reasonable conditions. They do not take the more radical step of recognising a right to opt out of work completely, nor to have that right protected by the state. But maybe they should? Maybe the right not to work is something that a just and humane society should recognise?

That, at any rate, is the argument developed by Andrew Levine in his article “Fairness to Idleness: Is there a right not to work?”. In this post, I want to take a look at that argument. In broad outline, Levine defends the claim that a right not to work is entailed by the fundamental principles of liberal egalitarianism (of a roughly Rawlsian type). He does so, not because he himself endorses liberal egalitarianism, but because he wishes to highlight the more radical implications of that view.

The idea that humans are meant to work 40 hours a week is a relatively recent innovation. During the Industrial Revolution, factory workdays stretched anywhere from 10 to 16 hours. The eight-hour-day movement came about in reaction to those conditions in the early 19th century. In 1817, the Welsh social reformer Robert Owen was calling for "Eight hours labor, eight hours recreation, eight hours rest." By 1886, the U.S. Federation of Organized Trades and Labor Unions declared that eight hours constituted a legal day's labor.

Yet for all our innovations, workers may have actually been better off in pre-industrial times, when they already knew how to structure a sustainable, lighter work schedule without the help of robots. In her book The Overworked American, Juliet B. Schor explains: "Before capitalism, most people did not work very long hours at all. The tempo of life was slow, even leisurely; the pace of work relaxed." Medieval labor was broken up multiple times a day for meals and refreshments. A full "day of work" constituted only "half a day," Schor writes. In 14th-century England, servile laborers worked only 175 days out of the year, and farmers and miners just 180. Even with these schedules, they were able to sustain themselves.

So why don't we aim for something like this now, cutting down our commitments instead of dreaming up machines that will enable us to labor better instead of less? The problem is systemic, as Marx suggested in his 1867 Capital: "Capital is dead labor, that, vampire-like, only lives by sucking living labor, and lives the more, the more labor it sucks." The more labor we produce, the more demand for labor we drive.

Wadhwa and Summers' dream of a robot-run utopia is not a new one, of course. It's a myth that exists in the Jetsons and in Pixar's Wall-E, which increasingly resembles a cautionary, rather than fairy, tale. In 1960, the Japanese architectural critic and poet Noboru Kawazoe sketched his similar vision: "Soon the time will come that everything will be done by machine. The only thing we have to do will be dreaming." That time hasn't arrived, but we can certainly dream.

Sunday, August 17, 2014

The experts on both sides, according to Pew, did agree on several main points. First, our political and educational structures are not equipped to handle the technological shift in employment. They also predicted that a more robotic workforce would redefine what “work” is for humans and what constitutes a “job.”

The future of jobs, however, the experts explained, is not absolute. We humans still have the ability to shape the path of robotics, Artificial Intelligence and employment.

“Increasingly we will see work places, institutions and societies debate the benefits of new technologies and these debates will include the social impacts of adoption,” Ben Fuller from the International University of Management in Windhoek explained. “The important thing to remember is that we have a choice to adopt one hundred per cent, partially or not at all.”

“The biggest exception will be jobs that depend upon empathy as a core capacity — schoolteacher, personal service worker, nurse. These jobs are often those traditionally performed by women. One of the bigger social questions of the mid-late 2020s will be the role of men in this world.”

— Jamais Cascio, technology writer and futurist

Most grim:

“The degree of integration of A.I. into daily life will depend very much, as it does now, on wealth. The people whose personal digital devices are day-trading for them, and doing the grocery shopping and sending greeting cards on their behalf, are people who are living a different life than those who are worried about missing a day at one of their three jobs due to being sick, and losing the job and being unable to feed their children.”

“The technodeterminist-negative view, that automation means jobs loss, end of story, versus the technodeterminist-positive view, that more and better jobs will result, both seem to me to make the error of confusing potential outcomes with inevitability. A technological advance by itself can either be positive or negative for jobs, depending on the social structure as a whole. This is not a technological consequence; rather, it’s a political choice.”

— Seth Finkelstein, software programmer and consultant

Most utopian:

“How unhappy are you that your dishwasher has replaced washing dishes by hand, your washing machine has displaced washing clothes by hand or your vacuum cleaner has replaced hand cleaning? My guess is this ‘job displacement’ has been very welcome, as will the ‘job displacement’ that will occur over the next 10 years. This is a good thing. Everyone wants more jobs and less work.”

Last fall economist Carl Benedikt Frey and information engineer Michael A. Osborne, both at the University of Oxford, published a study estimating the probability that 702 occupations would soon be computerized out of existence. Their findings were startling. Advances in data mining, machine vision, artificial intelligence and other technologies could, they argued, put 47 percent of American jobs at high risk of being automated in the years ahead. Loan officers, tax preparers, cashiers, locomotive engineers, paralegals, roofers, taxi drivers and even animal breeders are all in danger of going the way of the switchboard operator.

Whether or not you buy Frey and Osborne's analysis, it is undeniable that something strange is happening in the U.S. labor market. Since the end of the Great Recession, job creation has not kept up with population growth. Corporate profits have doubled since 2000, yet median household income (adjusted for inflation) dropped from $55,986 to $51,017. At the same time, after-tax corporate profits as a share of gross domestic product increased from around 5 to 11 percent, while compensation of employees as a share of GDP dropped from around 47 to 43 percent. Somehow businesses are making more profit with fewer workers.
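A quick back-of-envelope check on those figures helps make the "more profit with fewer workers" point concrete. The numbers are from the paragraph above; the arithmetic is mine, sketched in Python:

```python
# Figures cited above, sanity-checked.
median_2000, median_now = 55_986, 51_017
drop_pct = (median_2000 - median_now) / median_2000 * 100
print(f"Median household income fell about {drop_pct:.1f}%")

# Shares of GDP, in percentage points (profits ~5% -> 11%, labor ~47% -> 43%)
profit_share_gain = 11 - 5
labor_share_loss = 47 - 43
print(f"Profits gained ~{profit_share_gain} points of GDP; "
      f"labor compensation lost ~{labor_share_loss}")
```

An 8.9% real decline in the median, alongside a six-point swing of GDP from wages to profits, is exactly the pattern the paragraph describes.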

The conventional economic wisdom has long been that as long as productivity is increasing, all is well. Technological innovations foster higher productivity, which leads to higher incomes and greater well-being for all. And for most of the 20th century productivity and incomes did rise in parallel. But in recent decades the two began to diverge. Productivity kept increasing while incomes—which is to say, the welfare of individual workers—stagnated or dropped.

Plenty of economists disagree, but it is hard to referee this debate, in part because of a lack of data. Our understanding of the relation between technological advances and employment is limited by outdated metrics. At a roundtable discussion on technology and work convened this year by the European Union, the IRL School at Cornell University and the Conference Board (a business research association), a roomful of economists and financiers repeatedly emphasized how many basic economic variables are measured either poorly or not at all. Is productivity declining? Or are we simply measuring it wrong? Experts differ. What kinds of workers are being sidelined, and why? Could they get new jobs with the right retraining? Again, we do not know...

Erik Brynjolfsson, Andrew McAfee, and Michael Spence offer some informed speculation on how they see the course of technology evolving in "New World Order: Labor, Capital, and Ideas in the Power Law Economy," which appears in the July/August 2014 issue of Foreign Affairs (available free, although you may need to register).

Up until now, they argue, the main force of information and communications technology has been to tie the global economy together, so that production could be moved to where it was most cost-effective. As they write: "Technology has sped globalization forward, dramatically lowering communication and transaction costs and moving the world much closer to a single, large global market for labor, capital, and other inputs to production. Even though labor is not fully mobile, the other factors increasingly are. As a result, the various components of global supply chains can move to labor’s location with little friction or cost."

But looking ahead, they argue that the next wave of technology will not be about relocating production around the globe, but about changing the nature of production--and in particular, automating more and more of it. If the previous wave of technology made workers in high-income countries like the U.S. feel that their jobs were being outsourced to China, the next wave is going to make low-skill workers in repetitive jobs--whether in China or anywhere else--feel that their jobs are being outsourced to robots.

Momentum Machines was founded four years ago and aside from receiving the occasional libertarian shout-out — “Meet the robot that’s a minimum wage killer” — the company has mostly been low-profile. But at some point in the not too distant past, the website added a kind of disclaimer that aims to nullify criticism that the company might be destroying jobs.

It is encouraging to hear optimism expressed about the possibility of training displaced line cooks to become robot design engineers, although I expect the educational resources necessary to replace entry-level minimum wage jobs with higher-on-the-food-chain, yet-to-be-automated-away engineering design jobs will be considerably greater than any single San Francisco start-up can muster.

But as for the confidence that “technology like ours actually causes an increase in employment”? Certainly that’s been true for a long time, but as the sophistication and pace of technological change continue to accelerate, at least a few economists are now beginning to wonder whether old axioms still hold true. One thing that’s certainly not in doubt — disrupting the fast food industry and erasing countless jobs will provide a direct test of this thesis!

So how best to brace ourselves for that hiccup on the road to utopia? Here's where Drum and Andreessen part ways. In Andreessen's vision, we "create and sustain a vigorous social safety net" for the economically stranded. Sounds great, but how do we pay for it? He veers into late-night infomercial territory here: "The loop closes as rapid technological productivity improvement and resulting economic growth make it easy to pay for the safety net." The machine will pay for itself!

In other words, robots make everything faster, easier, and better, so humans will make more money selling goods and services, and we'll all end up with more dimes to spare for those still finding their feet in the robot-powered economy. So we shouldn't listen to the "robot fear-mongering" about machines coming to eat our jobs—the robot revolution is also a personal-tech revolution, and iPhones and tablets are new reins on the global economy:

"What never gets discussed in all of this robot fear-mongering is that the current technology revolution has put the means of production within everyone's grasp. It comes in the form of the smartphone (and tablet and PC) with a mobile broadband connection to the Internet. Practically everyone on the planet will be equipped with that minimum spec by 2020. What that means is that everyone gets access to unlimited information, communication, and education. At the same time, everyone has access to markets, and everyone has the tools to participate in the global market economy."

Yet plenty of people are less worried about job-stealing robots than about the people who will own the robots. As technologist Alex Payne points out, using a smartphone doesn't mean you've got your hands on the "means of production." Using a robot will never be a fraction as profitable as owning a robot, or a robot factory, or the data center that stores the information the robot collects. "The debate, as ever, is really about power," argues Payne. And it's no secret that a narrow segment of white and Asian males currently occupies nearly all the ergonomic chairs at that table.

Drum has no doubt that robots are in fact coming to eat our jobs, and it's the folks with the social and financial capital to buy robots that will call the shots: "As this happens, those without money—most of us—will live on whatever crumbs the owners of capital allow us." If the robot-owning 1 percent of tomorrow is anything like today's, then there is little indication that they're willing to share their spoils. Take a look at this chart of productivity versus worker wages over the last 60 years. Productivity has been shooting up, helped in no small part by greater efficiencies thanks to technology. But worker pay hasn't been rising alongside these productivity gains...

So where's all the extra money, the "resulting economic growth" from all this "rapid technological productivity improvement" that Andreessen promises? It's parked in the pockets of the 1 percenters. Here's how the share of income is divided between capital owners—the people who own the technology—and labor:...Drum says these metrics are a few of the economic indicators that make up the "horsemen of the robotic apocalypse" in which "capital will become ever more powerful and labor will become ever more worthless." The other indicators are fewer job openings, stagnating middle-class incomes, and corporations stockpiling cash instead of investing it in new goods and factories.

Software firm Autodesk, founded in 1982, creates virtual design tools used by millions of architects and designers every day. Last year alone, the company produced revenues of $2.3bn. British vice president Pete Baxter is responsible for its architecture, engineering and construction operations in Europe, Asia and the Middle East.

He believes architects have little to fear from artificial intelligence. "Yes, you can automate. But what does a design look like that's fully automated and fully rationalised by a computer program? Probably not the most exciting piece of architecture you've ever seen."

Technology won't destroy the profession, but it will, he says, democratise it. "There's a paradigm shift now: the one-man architect working from home with a bright idea now has access to an infinite amount of computing power in the cloud. That means a one-man designer, a graduate designer, can get access to the same amount of computing power as these big multinational companies. So suddenly there's a different competitive landscape."

It’s a question that economists and workers have been asking since at least the Industrial Revolution. And in the past, the answer has generally been a straightforward “no.” Automation makes certain low-skill human jobs obsolete, sure, but it also ushers in new categories of high-skill employment, from engineering and equipment operation to banking and blogging. Its greatest effect is to increase productivity, which should raise incomes and stimulate demand for new products and services.

Yet the current jobless recovery, along with a longer-term trend toward income and wealth inequality, has some thinkers wondering whether the latest wave of automation is different from those that preceded it.

Replacing manual labor with machines on farms and in factories was one thing, the worriers say. Those machines were dumb and highly specialized, requiring humans to oversee them at every stage. But the 21st century is witnessing the rise of far smarter machines that can perform tasks previously thought to be immune to automation.

Today’s software can answer your calls, organize your calendar, sell you shoes, recommend your next movie, and target you with advertisements. Tomorrow’s software will diagnose your diseases, write your news stories, and even drive your car. When even high-skill “knowledge workers” are at risk of being replaced by machines, what human jobs will be left? Politics, perhaps—and, of course, entrepreneurship and management. The rich will get richer, in other words, and the rest of us will be left behind.

About that Pew study:

The results of the survey were fascinating. Almost exactly half of the respondents (48 percent) predicted that intelligent software will disrupt more jobs than it can replace. The other half predicted the opposite. The lack of expert consensus on such a crucial and seemingly straightforward question is startling...

The New Luddites (Slate) What if technological innovation is a job-killer after all?

The solution is not higher minimum wages. The solution is not a tax on robots like Paul Krugman wants. The solution is not a guaranteed income.

The solution is to eliminate the Fed, eliminate fractional reserve lending, and give the free market a chance to create jobs at its own pace, without all this government and central bank interference.

The alternative "solution" and not one I support, is to kill off a lot of needless people by starting WWIII.

So eliminate the Fed and all our problems are solved? Why no mention of the gold standard? Why does anyone listen to these idiots? I guess we know the Libertarian view. WWIII here we come.

Robots are taking all the jobs. But are we, the average, moderately skilled humans, screwed, or aren't we? Let me just get it out of the way now: We are, unless there are drastic, immediate changes to education and economic systems around the world.

The dominant narrative going around today about Pew Research's new report on artificial intelligence and the future of jobs is that experts can't really decide whether automation is going to make working obsolete, that it's really a toss-up whether robots will simply create new jobs in some sectors as they destroy them in others.

That's true, in one sense: The 1,896 futurists, CEOs, journalists, and university professors questioned for the report were split nearly in half over whether robots will "displace significant numbers of both blue- and white-collar workers," with 52 percent of respondents agreeing that "human ingenuity will create new jobs, industries, and ways to make a living, just as it has been doing since the dawn of the Industrial Revolution."

But there's one major caveat: The respondents overwhelmingly agree that this lovely future where robots do the work and humans design the robots and everyone has leisure time and lots of money only exists in a fantasy future where the school systems pump out a shitload of Elon Musks and Sergey Brins—or, at the very least, people who can reliably work at the companies those guys own.

If the education system doesn't change to start pumping out technologically savvy, creative people as the rule, not the exception, the rise of robot workers is "certain to lead to an increase in income inequality, a continued hollowing out of the middle class, and even riots, social unrest, and/or the creation of a permanent, unemployable 'underclass,'" the Pew report concludes.

Yes, historically, technology has killed certain types of jobs while creating others. But what we're seeing happen right now isn't merely a redistribution of unskilled jobs to other sectors over the course of a couple decades, or the outsourcing of factory workers to other countries or cities with better tax breaks.

Instead, it's wiping out entire industries, entire swaths of the economy, in years, not decades. And it's killing white collar jobs as frequently as it's killing blue collar ones.

Two hugely important statistics concerning the future of employment as we know it made waves recently:

1. 85 people alone command as much wealth as the poorest half of the world.

2. 47 percent of the world's currently existing jobs are likely to be automated over the next two decades.

Combined, those two stats portend a rapidly worsening dystopia. As more and more automated machinery (robots, if you like) is brought in to generate efficiency gains for companies, more and more jobs will be displaced, and more and more income will accumulate higher up the corporate ladder. The inequality gulf will widen as jobs grow permanently scarce—there are only so many service sector jobs to replace manufacturing ones as it is—and the latest wave of automation will hijack not just factory workers but accountants, telemarketers, and real estate agents.

That's according to a 2013 Oxford study, which was highlighted in this week's Economist cover story. That study attempted to tally up the number of jobs that were susceptible to automation, and, surprise, a huge number were. Creative and skilled jobs done by humans were the most secure—think pastors, editors, and dentists—but just about any rote task at all is now up for automation. Machinists, typists, even retail jobs, are predicted to disappear.

And, as is historically the case, the capitalists eat the benefits. The Economist explains:

"The prosperity unleashed by the digital revolution has gone overwhelmingly to the owners of capital and the highest-skilled workers. Over the past three decades, labour’s share of output has shrunk globally from 64% to 59%. Meanwhile, the share of income going to the top 1% in America has risen from around 9% in the 1970s to 22% today. Unemployment is at alarming levels in much of the rich world, and not just for cyclical reasons. In 2000, 65% of working-age Americans were in work; since then the proportion has fallen, during good years as well as bad, to the current level of 59%."

Those trends aren't just occurring in the US, either. That second stat up there is from an Oxfam report entitled Working for the Few, just out this week. It was launched in tandem with the beginning of the World Economic Forum in Davos, in an effort to get the gazillionaires attending it to consider the gravity of their wealth. It finds that "those richest 85 people across the globe share a combined wealth of £1 [trillion], as much as the poorest 3.5 billion of the world's population." Yes, you read that correctly: The 85 richest people have $1.64 trillion between them, the same amount of money as 3.5 billion of the world's less fortunate souls.
The trend extends beyond a few handfuls of the planet's most mega-tycoons, of course: "The wealth of the 1% richest people in the world amounts to $110tn (£60.88tn), or 65 times as much as the poorest half of the world." And they and their corporations are building robots that will have the net effect of letting them keep even more of that capital concentrated in their hands.
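For scale, it's worth running the quoted Oxfam numbers through a quick sanity check. The figures are from the report as quoted above; the per-head arithmetic is mine, sketched in Python:

```python
# Oxfam figures as quoted above.
combined_wealth = 1.64e12   # dollars held by the 85 richest people
rich_count = 85
poor_count = 3.5e9          # people in the poorest half of the world

per_rich = combined_wealth / rich_count   # average wealth per mega-tycoon
per_poor = combined_wealth / poor_count   # average wealth per person, bottom half
ratio = per_rich / per_poor               # equals 3.5e9 / 85

print(f"~${per_rich/1e9:.1f} billion each vs ~${per_poor:.0f} each "
      f"(a ratio of about {ratio/1e6:.0f} million to one)")
```

Roughly $19 billion apiece at the top versus under $500 apiece for the bottom half: a wealth ratio of about 41 million to one.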

...[I]f robotics companies play a direct role in rapidly rendering entire types of professions obsolete, will both government and society expect them to take more responsibility for helping those who have been displaced?

One example of where this is already happening is in San Francisco, where the company Momentum Machines has been working for several years to develop a robotic system that can make the perfect custom hamburger—cooking the burger, slicing and placing the toppings, even, eventually, grinding meat to order—at the rate of 360 burgers an hour.

The company's aim is to replace the line cook at fast food restaurants. Initially, it plans to set up its own chain of restaurants; eventually, it expects to sell its burger-robot to competitors.

The company recognizes that when a restaurant brings in its system, jobs will be eliminated; it wants the men and women who lose their jobs to become engineers and work to design more automated systems. On the website, the company states: "We want to help the people who may transition to a new job as a result of our technology the best way we know how: education. Our goal is to offer discounted technical training to any former line cook of a restaurant that uses our device. We will certainly need more engineers to design new devices and technicians to service a growing line of automated restaurant solutions. These are the minds that can do this job."

It also is asking for ideas about other ways it can "help with the transition" as robots replace workers.

There appears to be a new economic order on its way. It goes by the emoticon-friendly name of the “sharing economy,” and its golden promises are legion. Sharing one set of power tools among neighbors sounds upbeat and sensible. And defraying some of your enormous rent by periodically letting strangers review your hospitality and the fluffiness of your towels is most enticing. To teachers without tenure or retirees without pensions, supplementary income might be a lifesaver.

But in some ways, that new order is crueler than even the dreariest cubicle farm. Corporate America has viewed labor as its biggest liability for decades now, and data-driven CEOs are sharpening the spear. Former GE Chairman Jack Welch once said the ideal business would involve placing every factory on a barge so that it could be towed to wherever the law was most hospitable to capital. Venture capitalist Peter Thiel is actually trying to implement this. Less fantastically, so is Uber’s Travis Kalanick. His hostility towards human drivers, whose contractual employment is but an expendable waystation on the way to a post-human transportation grid, is well-documented. Kalanick breezily told an audience at the CODE Conference in May that as self-driving cars become a possibility, yes, Uber’s workforce will be history: “I would say to them this is the way the world is going. We have to find a way to change with the world.”

Let’s be clear: the sharing economy isn’t exactly the same thing as artisanal guilds. And the potential phase-out of part-time Uber drivers won’t be like the disappearance of centuries-old trades, neither for its overall economic importance nor the more romantic reason of seeing age-old skills die off. Almost nobody becomes an Uber driver unless they have to—and that goes double for TaskRabbits. These are shitty jobs that college-educated people are forced into because their professions have been gutted, and any job stability they might once have enjoyed has been eviscerated. What’s being destroyed here might have little sentimental value, but it’s the type of contingent, piecemeal, on-call or freelance income people turn to when traditional means of employment have failed them. And even its days are numbered.

You don’t even have to be replaced by an actual robot to feel the heat. The New York Times’s Jodi Kantor had a superb article on how major retailers use advanced software to schedule employees based on a number of factors, keeping those companies “nimble” (to use a favored buzzword) but all but guaranteeing that workers’ schedules are capricious, erratic and outright destructive of their personal lives. Such invisible automation choreographs workers in precise, intricate ballets, using sales patterns and other data to determine which of [Starbucks’] 130,000 baristas are needed in its thousands of locations and exactly when. Big-box retailers or mall clothing chains are now capable of bringing in more hands in anticipation of a delivery truck pulling in or the weather changing, and sending workers home when real-time analyses show sales are slowing. Managers are often compensated based on the efficiency of their staffing.

In other words, the better you are at messing with your underlings’ lives, the bigger your bonus. Flexibility, the article notes, is usually considered a boon by professionals with office jobs; for some, the right to telecommute once each week is practically expected in the terms of their employment. But as it’s frequently a tool for maximization of profit at the expense of workers’ income stability, its disruptive side may not appear in economic statistics. No jobs are created or destroyed if a store’s employees are ordered to work 35 hours one week and 20 the next, even if the actual employees are barely hanging on. Unemployment statistics look to become even more hollowed-out and skewed than they already are.

Techno-utopians aren’t confronting their immiserating dystopia very honestly. If anything, disruption is celebrated to a fault, and if Airbnb quietly tosses hotel chambermaids to the wolves, well, that’s unappealing grunt work, anyway. The Pew Internet study’s most pessimistic forecasters have lots of fascinating tidbits to chew on, but they essentially agree that every job that can be automated will be, a trend that will go far beyond peripheral or admin or “pink-collar” positions such as X-ray technician or paralegal—something tech’s cheerleaders are reluctant to broadcast, for obvious reasons.

In general, McAfee says, Uber is amazing. But the experience of his driver shows we have entered a mostly uncharted phase of economic history:

My driver said he’d been with Uber ever since he’d graduated from his master’s program in IT project management last year. This profession was, according to him, going through hard times. In the wake of the great recession steady jobs had been replaced by short-term contracts, and there weren’t even a lot of these to be had. As a result he was now competing against much more experienced people for each new gig that came up, and he hadn’t had a lot of success since graduating.

So to cover his monthly fixed costs of student loan payments (on more than $100k in debt), rent, and healthcare he was driving for Uber. A lot. He estimated that he spent more than 60 hours a week behind the wheel. This allowed him to pay his bills, but not to build up any real savings.

To which I say good for him, and for Uber. This is a guy who could be sitting around waiting for the dream job he’d gone to school for, collecting unemployment, defaulting on his loans, and/or dropping out of the labor force for good. Instead, he was working hard at a job that was available.

He goes on to recount a subsequent conversation with futurist Ray Kurzweil on the subject of what the rise of Uber will mean for everyone else:

As far as I can tell [Kurzweil] thinks that there are no real challenges accompanying today’s rapid tech progress. He predicted that there will be plenty of jobs, and that they’ll be fulfilling ones that allow people to pursue their passions. Well, my driver couldn’t find work doing what he went to school for, and he didn’t describe driving people around as his passion. My read of the evidence is that good, secure, fulfilling jobs are declining as we head deeper into the second machine age, not spreading throughout the economy.

Those of you who've been reading this blog for a while (and thanks to those who have) know that the replacement of the workforce with machines has been a theme since nearly the very first post. Over the years, I've tried to bust the silly myths surrounding this - that new jobs will magically appear even though de-industrialization has turned vast swaths of the country into depopulated dystopias, that automation consists of walking, talking robots rather than algorithms, that creativity is irreplaceable even though the vast majority of jobs are rote and repetitive and always have been since the Neolithic revolution.

I've seen this topic get much more attention over the past week, thanks to a recent Pew report looking at the future of automation. Like most discussions of the topic, it comes to no definite conclusions. It seems the "experts" once again have no idea what they are talking about ("give me a one-handed economist!"). I would hazard that the experts arguing that new jobs will magically appear are applying the economists' standard logic of "X has always happened before, therefore it will always happen." By contrast, the economists predicting something else have actually spent time and effort studying the machines they are talking about!

Now comes this excellent explanatory video that you may have already seen (it seems to have gone viral). It does such a good job in 15 minutes that hardly anything more need be said:

I've been collecting links for months now, so I think it's time to dribble these out.

About the video above:

While the first wave of automation took the form of mechanical muscle, says Grey, the next generation is about artificial intelligence, “mechanical minds,” and smart bots that can teach themselves things they aren't pre-programmed to do. That type of vitality will kick the robot economy into high gear.

"General purpose" robots, like Baxter, Rethink Robotics’ adaptive industrial robot, cost less than the average human salary and have a nearly endlessly versatile skillset, Grey says. "Baxter today is the computer of the 1980s," says the video. "He's not the apex, but the beginning."

In other words, as robots get more useful, demand goes up, prices go down, and they get smaller, more user-friendly and start to proliferate, just like computers have over the last 30 years.

The rather depressing video makes a strong case for why just about zero jobs are safe, and it's high time we wise up to that fact. But opinion is still split on the idea. A recent Pew Research report found that experts are split 50/50 as to whether artificial intelligence will create or destroy jobs. It also, more frighteningly, found that unless our current education system drastically changes, the rise of robot workers is certain to lead not to a post-scarcity age of mass leisure, but "to an increase in income inequality, a continued hollowing out of the middle class, and even riots, social unrest, and/or the creation of a permanent, unemployable 'underclass.'"

The Reddit discussion thread about Grey's explainer video was similarly divided between dystopian and utopian future-predictors. But the video didn't bother to speculate as to whether we should welcome or dread automation, only that it's inevitable, and indeed is already here in nearly every sector of the economy.

"This is an economic revolution," it says. "You may think we've been here before, but we haven't. This time is different."

While the video may make you feel sad for the horses, you have to remember that there are just fewer of them. This is because horses were not labour but, in fact, capital. So when they were replaced by oil and steam powered capital, it was a substitution of capital goods or, more to the point, fuel. So for the bourgeoisie who were invested in horses, that substitution was bad news. For the bourgeoisie invested in wheels, it was another matter.

Labor is also capital - has he never heard the term "human capital?" Really?

But it is instructive to consider how that substitution arose. Basically, for horses to compete, their ‘rental cost’ to productivity ratio had to be sufficiently low that they could be employed in competition with other capital that had their own ‘rental cost’ to productivity ratio (usually worse than horses) plus a quality advantage. So as the lot of other capital improved (both in productivity and quality), horses ‘rental cost’ had to drop. There came a point, however, where the rental cost couldn’t drop far enough to make horses competitive and that was it for them.

Economist technobabble meaning machines became cheaper and more efficient to use than horses.

So let’s now translate this for the robots versus humans equation. There are lots of differences but let’s start with the humans as horses point. For a human to be displaced by a robot in a job the robot (a) has to have a quality advantage and (b) the human productivity must be so low that even if the wage drops to minimum (by law or subsistence or opportunity cost), they will be uncompetitive. What that means is that the routine jobs listed in the CGP Grey video are vulnerable (as they have been to capital substitution in the past) but the other jobs (including journalism and actually being a lawyer) are less so. To think that they are means you have to have a robot technology that is actually higher quality in doing those jobs. The video points to certain tasks being vulnerable but, at the moment, it does not look like the higher cognitive bits (such as they are) are in trouble. But I’m a technology optimist, so who knows? I’m just saying that the quality advantage needs to appear.
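The rental-cost-to-productivity comparison described above can be sketched numerically. All the figures below are invented for illustration (the post quotes only the robot's 360 burgers an hour); the point is the shape of the condition, not the numbers:

```python
# A sketch of the substitution condition described above: a worker (or a
# horse) stays competitive while their cost per quality-adjusted unit of
# output beats the machine's. All dollar figures here are hypothetical.

def effective_cost(cost_per_hour, output_per_hour, quality=1.0):
    """Cost per quality-adjusted unit of output."""
    return cost_per_hour / (output_per_hour * quality)

# A line cook vs. a burger robot (wage and robot costs invented):
human_ratio = effective_cost(cost_per_hour=9.0, output_per_hour=30)    # $0.30 per burger
robot_ratio = effective_cost(cost_per_hour=15.0, output_per_hour=360)  # ~$0.04 per burger

# Displacement happens when even the lowest feasible wage can't close the gap:
minimum_wage = 7.25
floor_ratio = effective_cost(minimum_wage, 30)  # ~$0.24 per burger -- still too high
print(robot_ratio < floor_ratio)  # True: the robot wins even against a wage floor
```

On these made-up numbers the human's cost per burger can't fall below about $0.24 even at the wage floor, while the robot's sits near $0.04, which is exactly the "there came a point where the rental cost couldn't drop far enough" situation the excerpt describes for horses.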

But what of the humans that have no quality advantage with respect to robots for what they are currently skilled at? The Brynjolfsson-McAfee line is that they will need to acquire other skills and, more to the point, it is not clear the market can work (or at least work fast enough for our liking) to make that happen. So there is a danger of the humans going down the lot of horses there.

What skills are those? The idea that people are not "creative" enough is ludicrous - many of us already have no outlet for our creativity (please talk to your local frustrated architecture school grad. Or music, or art, or literature, or...). As the video points out, we can't have an economy based on painting and poetry as currently constituted.

There are, however, two important differences. First, unlike the horses, the humans are also useful as consumers. They are the people who will value the products the robots (and other humans) produce. Think about that for a moment. For each person who is disengaged from society because of a robot, if you cut them off from consumption as well (by say not giving them any money), that is a unit of demand gone. So this pool of unemployed are left outside the system and do not interact in any way with the robot-employed economy.

So? We've already been doing that for decades. Has he been to Appalachia? Any inner-city slum? Whatever remains of the middle class will sell to each other, and the temporarily increasing middle classes overseas. The rest can go to hell, since not having a job will be spun as individual fault, and people will believe it.

If that sounds unsustainable, it is. There is a contradiction in the story. You have a person who values the product produced by robots by more than the ‘total cost’ in terms of resources to supply them with that product. You have to feel pretty ill about the prospects of capitalism to suppose that such an opportunity (certainly at scale) will go unexploited.

I do feel so ill about the prospects of Capitalism. See above. Has he been to Detroit? And since when has "sustainability" had anything to do with capitalism?

How will this happen? The most obvious way is that a collective agency will step in to ensure those people can pay for the product so that normal market based prices will be formed and transactions will take place. That agency is obvious in democratic society: the government. And before you think that this is some leftist notion (not that there’s anything wrong with that), I’m not theorising that here: I have faith that if it is in the interests of both business and consumers that money go from the employed to unemployed, it will. It will happen. So there may be unemployment but that does not mean that the humans adversely affected will become horse meat.

Would that be the same government that is increasingly staffed with Tea Party members (social insurance is a hammock! Dependence on government!! Self-reliance!!!), where food stamps are curtailed, unemployment extensions vetoed, taxes are cut on the rich, and health care expansions for the poor refused? The same government that is dysfunctional and been de-funded for decades (the national debt!!)? The same government where campaigns are sponsored exclusively by the wealthy ownership class? The same government where even "leftists" constantly offer paeans to the untrammeled "free" market? That government??

But there is another mechanism which goes back to the title of this post. The presumption is always that the bourgeoisie rather than the proletariat owns the machines. But why should that be the case?

Because that is the very definition of capitalism you bloody idiot! Just because communism failed does not mean you need to disregard even common-sense observations about our economy because they were made first by Marx. If the means of production were common-pool we would not be living under capitalism. The problem is, ownership has consistently become more, not less concentrated over the last several decades. Why would that change? If anything, automation will increase profits to those who can afford to purchase them first, leading to even more concentrated ownership through mergers and acquisitions. Indeed, this is already happening.

Thursday, August 14, 2014

Well, I've decided to stop working for the rest of August. It's just hard to sit in front of a screen and give a shit here in late summer, and I don't know why anyone has to do it. Work's slow enough, and I have the hours (having taken none so far this year), so it makes sense to take off rather than charge my project for me to sit and be miserable.

I get the same reaction of shock when I tell outside people I'm leaving for the rest of the month. "That's a long time," is the universal comment here in America, even though no one would bat an eyelash for that in the entire rest of the friggin' planet. I've just got to get out of this country somehow.

I don't know what it is about this time of year. I just want to tune out everything, including the Robin Williams suicide, the police state rollout in Fergustan, and the Islamic state in Syria. Much of the government is on vacation, and as with health care, it's interesting to note how the elites in America give themselves perks and benefits that they steadfastly refuse to extend to the people they govern.

Anyone have any suggestions where to go? I've been paralyzed by too many choices. South America or Europe? I was considering Ireland, but I'm also contemplating London, Amsterdam, Prague, maybe even elsewhere. Any readers want to put up a blogger in their house? Anyone in Europe want to have a meetup?

Obviously posting will be a bit unpredictable and intermittent from now until Labor Day.

Every August, Paris now sees another rapid transformation. Tourists rule the picturesque streets. Shops are shuttered. The singsong sounds of English, Italian, and Spanish float down the street in place of the usual French monotone. As French workers are required to take at least 31 days off each year, nearly all of them have chosen this month to flit down to Cannes or over to Italy, Spain, or Greece, where the Mediterranean beckons and life hasn’t stopped like it has here.

Some might call it laziness, but what French people are really doing by vacationing for the entirety of August is avoiding the tipping point of overwork. Just as the city transforms overnight, so do French work habits—and this vacation time pays dividends. That’s because, even though the amount of time you work tends to match how productive you are as if on a sliding scale, length of work and quality of work at a certain point become inversely related. At some point, in other words, the more you work, the less productive you become.

For example, working long hours often leads to productivity-killing distractions. Such is an instance of the saying known as Parkinson’s law, which states that work expands so as to fill the time available for its completion. Work less, and you’ll tend to work better.

It has long been known that working too much leads to life-shortening stress. It also leads to disengagement at work, as focus simply cannot be sustained for much more than 50 hours a week. Even Henry Ford knew the problem with overwork when he cut his employees’ schedules from 48-hour weeks to 40-hour weeks. He believed that working more than 40 hours a week had been causing his employees to make many errors, as he recounted in his autobiography, My Life and Work.

It seems silly that many work long hours simply for the sake of having worked long hours. Perhaps the reason people overwork even when it is not for “reward, punishment, or obligation” is because it holds great social cachet. Busyness implies hard work, which implies good character, a strong education, and either present or future affluence. The phrase, “I can’t; I’m busy,” sends a signal that you’re not just an homme sérieux, but an important one at that.

There is also a belief in many countries, the United States especially, that work is an inherently noble pursuit. Many feel existentially lost without the driving structure of work in their life—even if that structure is neither proportionally profitable nor healthy in a physical or psychological sense.
In his 1932 essay “In Praise of Idleness,” the British philosopher Bertrand Russell corrects this idea, writing, “A great deal of harm is being done in the modern world by belief in the virtuousness of work.” Rather, “the road to happiness and prosperity lies in an organized diminution of work.”
That is to say happiness is ultimately not found in late nights spent at work, but in finding a way to work less, even if that means buying fewer things or recalibrating your perspective such that having free time no longer suggests moral shortcomings.

We Americans work hard. Weekends are more like workends. We sleep with our smartphones. And we think vacations are for wimps. So we don’t take them. Or take work along with us if we do.

But what if taking vacation not only made you healthier and happier, as a number of studies have shown, but everyone around you? And what if everybody took vacation at the same time? Would life be better, not just for you, but for the entire society?

Yes, argues Terry Hartig, an environmental psychologist at Uppsala University in Sweden. Yes, indeed. When people go on a relaxing vacation, they tend to return happier and more relaxed. (The operative word here being relaxing, not frenzied whirlwind.) Traffic? A smile and nod instead of flipping the bird. An upset at the office? A deep breath and a focus, not on the drama, but on the task at hand.

And those mellow, good vibes, he said, spread “like a contagion” to everyone you come in contact with...To test his theory, Hartig and his colleagues studied monthly anti-depressant prescriptions in Sweden between 1993 and 2005. In a recently published study, they found that the more people took vacations at the same time, the more prescriptions dropped exponentially. That was true for men and women, and for workers as well as retirees.

Summer, by far, was the happiest time – or at least saw the steepest declines in anti-depressant prescriptions. It’s no surprise why: Since 1977, Swedish law has mandated that every worker have five weeks of paid vacation every year. And workers can take four consecutive weeks off in the summer. Europeans, with their 20 and 30 days of paid vacation every year, live longer and spend less on health care than Americans, Hartig said.

But that kind of widespread, vacation-induced health and euphoria is unlikely to hit the United States anytime soon. “Collective restoration,” Hartig said, is only possible if the entire population can coordinate time off. And the only way to do that, he argues, is through national policy.

The US is the only advanced economy with no national vacation policy. (Unless you count Suriname, Nepal and Guyana.) One in four workers, typically in low-wage jobs, have no paid vacation at all. Those that do, get, on average, 10 to 14 days a year.

American workers don’t take all their vacation days, leaving, by some estimates, 577 million unused days on the table every year. And even when they do, many say they take work along with them. (All those unused days add up to $67 billion in lost travel spending and 1.2 million jobs, according to a recent report by Oxford Economics, an economic forecasting group.) The closest that Americans may come to collective restoration, Hartig said, is the quiet week between Christmas and New Years, when large swaths of the population leave the office behind.

William Howard Taft didn’t want Americans to have to go on vacation alone. In 1910, he proposed giving American workers two to three months of paid vacation every year. The naturalist John Muir said that, better than compulsory schooling, the U.S. should consider compulsory vacationing. In 1938, Congress proposed the 40-hour work week, a minimum wage and two weeks of paid vacation. In both instances, the vacation proposals died.

Daimler employees can head to the beach this summer without worrying about checking emails, sparing their partners and children the frustration of work-related matters intruding on the family vacation.

The Stuttgart-based car and truckmaker said about 100,000 German employees can now choose to have all their incoming emails automatically deleted when they are on holiday so they do not return to a bulging in-box.

Wednesday, August 13, 2014

On January 6, 2004, Paul Craig Roberts and US Senator Charles Schumer published a jointly written article on the op-ed page of the New York Times titled “Second Thoughts on Free Trade.” The article pointed out that the US had entered a new economic era in which American workers face “direct global competition at almost every job level–from the machinist to the software engineer to the Wall Street analyst. Any worker whose job does not require daily face-to-face interaction is now in jeopardy of being replaced by a lower-paid equally skilled worker thousands of miles away. American jobs are being lost not to competition from foreign companies, but to multinational corporations that are cutting costs by shifting operations to low-wage countries.” Roberts and Schumer challenged the correctness of economists’ views that jobs off-shoring was merely the operation of mutually beneficial free trade, about which no concerns were warranted...in response to a question about the consequences for the US of jobs off-shoring, Roberts said: “In 20 years the US will be a Third World country.”

It looks like Roberts was optimistic that the US economy would last another 20 years. It has only been 10 years and the US already looks more and more like a Third World country. America’s great cities, such as Detroit, Cleveland, St. Louis have lost between one-fifth and one-quarter of their populations. Real median family income has been declining for years, an indication that the ladders of upward mobility that made America the “opportunity society” have been dismantled. Last April, the National Employment Law Project reported that real median household income fell 10% between 2007 and 2012.

Republicans have a tendency to blame the victims. Before one asks, “what’s the problem? America is the richest country on earth; even the American poor have TV sets, and they can buy a used car for $2,000,” consider the recently released report from the Federal Reserve that two-thirds of American households are unable to raise $400 cash without selling possessions or borrowing from family and friends.

Although you would never know it from the reports from the US financial press, the poor job prospects that Americans face now rival those of India 30 years ago. American university graduates are employed, if they are employed, not as software engineers and managers but as waitresses and bartenders. They do not make enough to have an independent existence and live at home with their parents. Half of those with student loans cannot service them. Eighteen percent are either in collection or behind in their payments. Another 34% have student loans in deferment or forbearance. Clearly, education was not the answer.

Jobs off-shoring, by lowering labor costs and increasing corporate profits, has enriched corporate executives and large shareholders, but the loss of millions of well-paying jobs has made millions of Americans downwardly mobile. In addition, jobs off-shoring has destroyed the growth in consumer demand on which the US economy depends with the result that the economy cannot create enough jobs to keep up with the growth of the labor force.

Between October 2008 and July 2014 the working age population grew by 13.4 million persons, but the US labor force grew by only 1.1 million. In other words, the unemployment rate among the increase in the working age population during the past six years is 91.8%

[...]

In the ten years since Roberts and Schumer sounded the alarm, the US has become a country in which the norm for new jobs has become lowly paid part-time employment in domestic non-tradable services. Two-thirds of the population is living on the edge unable to raise $400 cash. The savings of the population are being drawn down to support life. Corporations are borrowing money not to invest for the future but to buy back their own stocks, thus pushing up share prices, CEO bonuses, and corporate debt. The growth in the income and wealth of the one percent comes from looting, not from productive economic activity.

This is the profile of a Third World country.
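The 91.8% figure in the excerpt is straightforward to reproduce from the two quoted numbers (strictly speaking it is the share of the population increase that never entered the labor force, rather than an unemployment rate):

```python
# Reproducing the arithmetic behind the quoted 91.8%, taking the two
# numbers in the excerpt at face value (millions of persons, Oct 2008
# to Jul 2014).
working_age_growth = 13.4
labor_force_growth = 1.1

share_outside_labor_force = (working_age_growth - labor_force_growth) / working_age_growth
print(f"{share_outside_labor_force:.1%}")  # 91.8%
```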

The De-industrialization of America (Paul Craig Roberts)

Third World country indeed. Perhaps that's why our politics increasingly resembles that of a banana republic and our rural areas look more like post-crash Russia (or tribal Afghanistan). It's something to keep in mind when "experts" talk about automation. According to them, America's de-industrialization was just fine, because people just went out and got new jobs in the "service economy" and the unemployment rate was only five percent! Now they are spinning the same "new jobs will magically appear" story with regard to automation. But the story above offers a different take, one actually based in the real world most of us live in. The term "Clerisy" comes from this excerpt of a forthcoming book:

Despite America’s egalitarian roots, the prospect of mass downward mobility has been embraced widely by some business oligarchs and much of the Clerisy. The future being envisioned is one dominated by automated factories and computer-empowered service industries that will continue to pressure both jobs and wages in the future. In this scenario, productivity will rise, but wages may stagnate or decline. This leads some to propose that the American middle and working classes have become economically passé. Steve Case, founder of America Online, has even suggested that future labor needs can be filled not by current residents but by some thirty million immigrants.

Arguably the first group to feel the downward pressure has been blue collar workers, whose lot has declined over the past few decades. After World War Two, as the United Autoworkers’ Walter Reuther noted, “the union contract became the passport to a better life” that was creating “a whole new middle class.” But with the shifting of industry overseas and the decline of private sector unions, the path for blue collar workers to enter the middle class has become more difficult.

Although they often claim to defend the middle class, the political
stance adopted by the Clerisy, as well as the tech oligarchs and the
investors, tends to worsen this trajectory. Environmental concerns
impose themselves most against basic industries such as fossil fuels,
agriculture and much of manufacturing. These employ many in highly paid
blue-collar fields, with average salaries of close to $100,000. In the
last decade, top U.S. firms, notes the liberal Center for American
Progress, have cut almost three million domestic jobs. Automation also
leads to the diminution of traditional white collar professions as well
as the shift of high-end service jobs offshore.

Overall, it has become increasingly common to regard the middle class
as threatened and even doomed. Indeed, as early as 1988 Time magazine
featured a cover story on the “declining middle class,” which at that
time was considerably more healthy than today. After the great
recession, the American blue-collar worker has been pitied, but
certainly not helped by the clerisy, which believes that there is no
hope for manufacturing or similar outmoded jobs in an information age.
Blue collar workers were described in major media as “bitter,”
“psychologically scarred” and even an “endangered species.” Americans,
noted one economist, suffered a “recession” but those with blue collars
endured a “depression.”

This perspective extends across ideological lines. Libertarian economist Tyler Cowen suggests that
an “average” skilled worker can expect to subsist on little but rice
and beans in the future U.S. economy. If they choose to live on the East
or West Coast, they may never be able to buy a house, and will remain
marginal renters for life. Left-leaning Slate in
2012 declared that manufacturing and construction jobs, sectors that
powered the yeomanry’s upward mobility in the past, “aren’t coming back.”
Rather than a republic of yeomen, we could instead evolve, as one
left-wing writer put it, into a society living at the sufferance of our
“robot overlords,” as well as of those who program and manufacture them,
likely using other robots to do so.

Contempt for the middle class is often barely concealed among those most comfortably ensconced in the emerging class order. Financial Times columnist Richard Tomkins declared that
the middle class, “after a good run” of some two centuries, now faces
“relative decline” and even extinction. This historical shift towards
mass downward mobility elicited only derision, not concern: “Classes
come and classes go,” and when the middle orders disappear, about the
only ones that will be sorry to see them go might be “the middle
classes themselves. Boo hoo.”

Monday, August 11, 2014

One common justification for unemployment and inequality that's often trotted out is that there are plenty of jobs; it's just that people aren't studying the right things (that is, 'the economy would be fine if it weren't for the damn people living in it!'). If everyone just got a STEM degree (or, in the latest variant, a STEM-H degree, adding health care), the argument goes, then there would be enough jobs to go around for everybody. If you don't get a STEM degree for whatever reason, well, then it's YOUR FAULT and you deserve to be unemployed and paid minimum wage. If you major in art or literature or philosophy, then god help you. But the real picture is somewhat more complicated.

It’s puzzling that so few American students graduate with engineering and science degrees, even though these majors would grant them much higher salaries. Maybe the reason is that these majors require so much homework.

But maybe students will take comfort from a study suggesting that the choice of major may be influenced by the way some people’s brains are wired.

The study used a survey of Princeton’s incoming freshman class of 2014 to examine correlations between major choice and neuropsychiatric disorders. It was conducted by Benjamin C. Campbell, a researcher at Princeton’s Neuroscience Institute, and Samuel S.-H. Wang, a molecular biology professor at Princeton.

Students answered questions about their academic discipline intentions, as well as whether they, their immediate family members or grandparents had one or more of a number of neurological and mental disorders, including Alzheimer’s, attention deficit hyperactivity disorder (A.D.H.D.), autism spectrum disorder, bipolar syndrome, epilepsy, Parkinson’s and schizophrenia.

Most of the illnesses the researchers asked about didn’t seem to have any relationship with which academic discipline students were drawn to. But a few did.

Students pursuing STEM degrees (science, technology, engineering, math) were more likely than other students to report having a sibling with an autism spectrum disorder. (Of the 1,077 students who responded to the survey, 16 aspiring technical majors and four aspiring non-technical majors said they had siblings with an autism spectrum disorder.)

Additionally, students intending to major in the humanities were more likely to say that they, an immediate family member or their grandparent had been diagnosed with a major depressive disorder, bipolar disorder or substance abuse problems.

Intellectual interests, it seems, do have some relationship to mental and neurological disorders. At least one earlier study, based on family histories of 30 creative writers, had similarly found a connection between literary creativity and mental illness. And the findings resonate with high-profile examples of brilliant artists who suffered from mental illness (Ernest Hemingway, Kurt Cobain, Virginia Woolf, Edvard Munch, etc.).

Prior studies have also supported the link between autism and familial interest in STEM studies.

Perhaps swayed by statistics about the shortage (and correspondingly high wages) of engineers and scientists in the United States, in the last decade nearly one incoming freshman in 10 has said they expected to major in engineering. (Overall, about a third of incoming freshmen said they planned to major in one of the science and engineering fields.)

But the share who actually complete degrees in engineering has been about half that. Certain demographic groups planning to major in the natural sciences also had relatively high dropout rates.

What accounts for the high attrition rates? Maybe some of it has to do with aptitude, or encouragement, or good role models and mentors. But Philip Babcock, an economist at the University of California, Santa Barbara, suggests that a lot of it has to do with homework...

As my colleague Christopher Drew wrote in an article in November, STEM fields (science, technology, engineering, mathematics) have also had less grade inflation than the humanities and social sciences have in the last several decades. Given the study habits shown above, this probably isn’t surprising; courses with higher grading standards will often require students to study harder to get an A.

So maybe students intending to major in STEM fields are changing their minds because those curriculums require more work, or because they’re scared off by the lower grades, or a combination of the two. Either way, it’s sort of discouraging when you consider that these requirements are intimidating enough to persuade students to forgo the additional earnings they are likely to get as engineers.

Ralph Stinebrickner and Todd Stinebrickner say lots of kids come into college thinking they want to major in science, but then quit because it's too hard:

"Taking advantage of unique longitudinal data, we provide the first characterization of what college students believe at the time of entrance about their final major, relate these beliefs to actual major outcomes, and, provide an understanding of why students hold the initial beliefs about majors that they do. The data collection and analysis are based directly on a conceptual model in which a student’s final major is best viewed as the end result of a learning process. We find that students enter school quite optimistic/interested about obtaining a science degree, but that relatively few students end up graduating with a science degree. The substantial overoptimism about completing a degree in science can be attributed largely to students beginning school with misperceptions about their ability to perform well academically in science."

This is important to keep in mind when you hear people talk about the desirability of increasing the number of students with STEM degrees. To make it happen, you probably either need better-prepared 18-year-olds or else you have to make the courses easier. But it's not that kids ignorantly major in English totally unaware that a degree in chemistry would be more valuable.

Scott Zeger really wants to get away from the idea that intro science classes are weeder courses. The jam-packed lectures that about half of all Johns Hopkins students take are among the most difficult both to teach and to take, Inside Higher Ed reports today. This stat was startling:

"At Johns Hopkins, more than 60 percent of incoming freshmen in 2006 indicated an interest in a STEM (science, technology, engineering or math) career. Of those students, 57 percent earned a degree in one of those fields."
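
To put that statistic in context, here is a minimal back-of-the-envelope calculation (the 60 percent and 57 percent figures come from the quote above; the code framing is mine):

```python
# Back-of-the-envelope arithmetic for the Johns Hopkins numbers quoted above.
interested = 0.60                # share of 2006 freshmen interested in a STEM career
completed_given_interest = 0.57  # share of those who went on to earn a STEM degree

# Share of ALL incoming freshmen who ended up with a STEM degree:
stem_grads_overall = interested * completed_given_interest
print(f"{stem_grads_overall:.1%} of all freshmen earned a STEM degree")  # → 34.2%
```

In other words, even at a school as selective as Johns Hopkins, roughly four in ten students who arrived interested in STEM left that pipeline before graduation.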

Zeger wants to make such courses more interactive, using tricks like breaking chocolate bars into pieces to illustrate a point about clinical trials. I certainly remember taking intro science courses: they were boring and difficult and didn’t do much to entice folks like me to pursue a career in science. Making them more engaging could be a way to get more folks into the pipeline to begin with.

Summary: Another day, another astonishing bogus crisis (the STEM shortage) in which well-meaning Americans labor against their own interests to further enrich the 1%. The true nuggets of insight in the news media reveal so much, but accomplish nothing unless they spark action.

That the shortage of STEM workers was bogus was quite obvious from the start, as Klein explains. It is Econ 101:

If there were a big, general shortage of these workers, you would expect to see their wages rising. That hasn’t happened.

There would be relatively low and declining unemployment rates compared with people of similar educational levels. Hasn’t happened.

There should be faster-than-average employment growth, which is occurring in some occupations but not others.

Scores of articles pointed out these obvious facts (including two posts on the FM website; see below). But nothing stops the needs of the 1%; we eagerly bow to their truth and march to their tune.

The shortage of STEM workers: another bogus crisis crafted to benefit the 1% (Fabius Maximus) It's a ginned-up crisis. If there really were a shortage and we really wanted Americans to have these jobs, we wouldn't charge an arm and a leg for a STEM degree. In fact, we want to confine these jobs to the "right" people, and any shortages can be made up by importing people from the global elite labor pool - Americans be damned. The STEM shortage is just another way of blaming American workers for their own destruction by the plutocracy.