Thursday, March 31, 2011

4. Darkness on the Edge of Town

The wheels started to come off in the 1970's, prompted in part by several oil crises. At the same time, the United States began to face steep international competition from other nations, particularly Japan, which had developed highly efficient, rational, and highly automated manufacturing systems based upon the ideas of American industrial engineers such as Frederick Taylor and W. Edwards Deming. To compete, U.S. factories began to automate and consolidate, a trend that picked up steam with the financialization of the American economy in the 1980's under Ronald Reagan. America began to undergo the process of deindustrialization. For the first time since the war, joblessness became a major concern. As factory jobs disappeared, politicians told an anxious public that their loss was nothing to fear. As American companies became more "competitive," they argued, new jobs would be created in other areas through "innovation." The capital freed up by automating repetitive factory jobs would be invested in new industries that would easily reabsorb the displaced workers. All that was needed were low taxes and deregulation.

Manufacturing had absorbed the mass of dispossessed workers as agriculture became mechanized. With manufacturing gone, what was to absorb those displaced workers? Some touted a "service economy" in which low-wage service work would take them in. But the low status and wages of these jobs led to a downward spiral of living standards. Industrial cities like Cleveland and Detroit started to fall apart. Many white Americans got into their cars and fled to the suburbs, where they took clerical and office jobs. For many African-Americans who had lost manufacturing jobs and had no access to transportation, things did not turn out so well. The nation's urban areas became ghettos, with crime and drugs taking their toll. Many towns throughout the Midwest had been founded entirely on manufacturing, converting local raw materials like wood, stone, agricultural products or metal ore into products and shipping them off via railroad or seaway. America had once been the world's factory floor. Without access to education, these towns were devastated, with only low-wage service or health care jobs available. Some residents fled to cities to look for work, while others took civil service jobs or enlisted in the military as a way out. Wal-Mart took over from General Motors as the nation's largest employer. Food and durable goods were cheap and plentiful, but the costs of housing, education, transportation and health care skyrocketed. Banks extended easy credit to make up for lost income, and people sank heavily into debt to pay for what they could no longer buy with their stagnant wages.

Then came the double-whammy: the Internet and globalization. China, with its billion-plus population, provided an unbeatable combination. Unlike many developing countries, it was thoroughly industrialized thanks to the legacy of Mao, and it had a vast, seemingly inexhaustible pool of cheap, easily exploited labor. The Internet and computers allowed vast supply chains to stretch around the world, with the coup de grâce provided by discount retailers such as Wal-Mart, which pressured suppliers to lower prices at all costs or lose contracts to those who could. This perfect storm hollowed out American manufacturing. During the nineties, the vast pool of cheap Chinese labor substituted for automation. Due to the relative difference in currencies, even American workers paid a dollar an hour could not compete on price. America entered what was termed the "post-industrial economy."

Eventually, American politicians no longer even paid lip service to a revival of American manufacturing. In the nineties, President Bill Clinton told the American people that "these jobs are gone, and they aren't coming back." Clinton told unemployed workers that they needed to retrain for new "high-tech" jobs. Economists trotted out a series of buzzword-centric economic models like the "knowledge economy," the "information economy," and the "experience economy," claiming that these new models would take the place of manufacturing and provide stable employment for the masses. None of them panned out. The wealth gap between workers and the investor class grew to obscene levels. Corporations took advantage of the vast labor pool to increase profits, while workers were increasingly left out in the cold. The jobs created since the 1970's were predominantly low-paying and of lesser quality than those of the past, at least for the majority of workers.

In 1996, iconoclastic economist Jeremy Rifkin published The End of Work. Rifkin contended that a combination of higher productivity and automation would eventually displace workers in every field as computers became cheaper and artificial intelligence became more refined. Rifkin provided an extensive history of labor dislocations, including an analysis of Technocracy's arguments during the Depression years. He argued that the number of workers had outstripped the need for them back in the Depression, and that only a series of bubbles, from the postwar consumer/suburbanization bubble to the financial bubble to the dot-com bubble to the credit bubble (and, eventually, the housing bubble), had provided the illusion of an economy that could provide enough jobs for everyone. Yet, he predicted, the drive for automation would continue unabated, eventually destroying the purchasing power needed to sustain all those bubbles and inexorably ushering in an era of permanent unemployment for the masses. Nearly every page could be quoted as relevant to our discussion, but I will offer only this snippet from chapter 1:

"While earlier industrial technologies replaced the physical power of human labor, substituting machines for body and brawn, the new computer-based technologies promise a replacement of the human mind itself, substituting thinking machines for human beings across the entire gamut of economic activity. The implications are profound and far-reaching. To begin with, more than 75 percent of the labor force in most industrial nations engage in work that is little more than simple repetitive tasks. Automated machinery, robots, and increasingly sophisticated computers can perform many if not most of these jobs. In the United States alone, that means that in the years ahead more than 90 million jobs in a labor force of 124 million are potentially vulnerable to replacement by machines. With current surveys showing that less than 5 percent of companies around the world have even begun to make the transition to the new machine culture, massive unemployment of a kind never before experienced seems all but inevitable in the coming decades. Reflecting on the significance of the transition taking place, the distinguished Nobel laureate economist Wassily Leontief has warned that with the introduction of increasingly sophisticated computers, 'the role of human as the most important factor of production is bound to diminish in the same way that the role of horses in agricultural production was first diminished and then eliminated by the introduction of tractors.'"
The End of Work, p. 6

The timing of Rifkin's message could not have been worse; the book came out right as America was in the throes of the dot-com boom, and mainstream economists proclaimed that the Internet was the long-promised "innovation" that would provide full employment for anyone who could learn HTML or become a Microsoft Certified Professional. Newly created Internet startups attracted billions in investment, and ordinary people were starting multi-million-dollar companies in their basements. It was a time of exuberant optimism. Of course, the dot-com bubble burst and made a mockery of such predictions. Internet companies folded left and right, and while a few survivors remained, billions of dollars were lost, with the resulting employment fallout. More importantly, the Internet allowed office work to be performed anywhere in the world, meaning even clerical and professional jobs could be done by low-cost labor half a world away. Outsourcing was now coming for the suburban white-collar worker.

But the nation bounced back quickly--the next bubble was just around the corner, and this one was the biggest yet. The housing bubble was built on nothing more than rising asset values, financial fraud, and easy credit. The jobs that were created were almost all related to finance, real-estate sales, and services to those who used their homes as cash machines. Then in 2008 it all came crashing down. All of the job gains of the previous decade vanished, leaving a decade of no net job growth despite an increasing population, massive immigration from Mexico, and numerous rounds of tax cuts. Unemployment soared, and even creating enough jobs for new entrants into the labor force became an impossible task. Even if job growth were to return to the levels of the 1990's, it would take a decade just to bring unemployment back down to its level before the housing crash, and no one seriously expected that to happen. Without another bubble, Americans were told that high unemployment was simply a "new normal" and that government was powerless to change the situation. Rifkin's stark predictions, made twelve years too early, were finally starting to hit the mark.

Wednesday, March 30, 2011

We've been heading this way for quite some time - almost the entire history of the Industrial Revolution, in fact. For most of human history, the vast majority of human beings have had to work the land in some way to provide adequate sustenance. The British Agricultural Revolution changed all that by drastically reducing the number of workers needed for agricultural production. These displaced workers filed into overcrowded and squalid cities, where they became the surplus labor for the newly emerging method of factory production, powered by England's vast coal deposits and inventions such as the steam engine and the power loom.

Factory work came to dominate the world in the nineteenth and early twentieth centuries. The earliest factories, like Richard Arkwright's cotton mills, were set up to produce textiles. The steam engine kicked off a race to develop more and more mechanical devices to do work, while the emergence of precision tooling and measurement meant that these inventions could be mass-produced on a huge scale by relatively few workers. The application of steam power, and eventually electric power, allowed a few workers with machines to do jobs that would once have taken years of human labor. It has been estimated that a single barrel of oil contains the equivalent of ten years of human labor. Henry Ford pioneered the application of mass production and interchangeable parts (both prior inventions) to sophisticated mechanical devices like automobiles, and engineers like F.W. Taylor strove to eliminate waste and maximize worker efficiency in the new science of industrial production. Inventions such as the electric light allowed factories to run 24 hours a day.

In the 1920's, the Technocracy movement pointed out that as workers became more efficient, fewer of them were needed. As the amount of goods per worker increased, small numbers of workers could produce massive amounts of goods: enough for everybody, and more than people could reasonably purchase. With such a surplus of goods, they argued, prices would have to fall, destroying the purchasing power needed to consume those goods. Their grim analysis seemed to be proven right when the nation's economy went off the rails in 1929, even as the United States was still flush with oil and coal and factories stretched from coast to coast. The Technocrats' radical solution was to plan economic production by putting engineers in charge, and to mass-produce goods based on available energy reserves rather than money capital, distributing them without the burden of a price system. The Technocrats noted that the price system only functions if food and goods are reasonably scarce, not abundant, and that business interests would prefer scarcity to abundance, as it leads to higher profits. They also pointed out that the incredible efficiency of the production line meant that there were far more workers looking for work than there were jobs available. It's worth noting that this analysis was made well before the advent of advanced robotics, bar codes, or digital computers, and when almost half of all Americans still lived on family farms.

While it had its supporters, the Technocrats' radical solution, with its quasi-socialistic undertones, alienated much of the general public. The American people were not willing to abandon democracy and put engineers in charge. Unfortunately, this also meant that their astute analysis of the wage and employment situation fell by the wayside. The economy's woes were blamed entirely on an errant financial market, with the underlying problems swept under the rug. The nation turned instead to Roosevelt's New Deal, which advocated government creation of jobs (based on the work of Keynes) to provide opportunities to the unemployed and pump purchasing power back into the economy. It did not address overproduction and fundamental labor surpluses, however, and the nation lingered on in the doldrums of the Depression for a decade.

We all know what happened next. World War II came along, and the government seized control of the economy and provided full employment. Essentially, what it did was similar to what the Technocrats advised, only it took a war to make it happen. Goods were rationed. Prices were no longer a problem, as the government was picking up the tab, and overproduction was not an issue, since the end result of all these products was to be blown up. Factories worked overtime, and full employment was achieved. The United States' massive weapons production was decisive in defeating the Axis powers, which it could vastly outproduce.

After the war was over, there was a very real concern that the economy would go right back into the depression it had emerged from. The solution proposed by the nation's political and corporate elites was to create a consumer economy - a large middle class paid well enough to afford the vast economic output produced by the nation's factories and generate enough economic activity to put everyone to work. The middle class would be encouraged to consume by advertisements beamed into their homes by television, and this was coupled with vast suburbanization, which required new house construction, automobiles, roads, furniture, household appliances, services, conveniences, schools, supermarkets, etc. The nation's factories, the only ones to emerge unscathed from the war, dominated world production and provided stable, well-paying employment for millions, with excellent benefits. The nation became dependent upon housing starts and automobile production, which still dominate economic discussion to this day.

Yet there was some unease. The novelist Kurt Vonnegut, who worked in the newly created field of corporate public relations for General Electric in the late 1940's, was given a tour of a factory where an early computer-operated milling machine was cutting rotor blades for jet engines and turbines, something extremely difficult for traditional machinists to accomplish. The experience prompted him to pen his first novel, Player Piano (1952), depicting a world where all labor was displaced by machines and conflict ensued between the engineers and managers who ran production and the displaced laborers who found themselves superfluous to society. The novel deals more with the moral issues presented by this society than with the practical ones. Indeed, Vonnegut's displaced workers are fairly well cared for by a generous welfare state that would be impossible to imagine in the America of today. Vonnegut's extreme scenario was consigned to the realm of speculative fiction. The American economy at the time was booming, and any displaced factory workers and machinists easily found new jobs as salesmen, office workers, carpenters, radio announcers, ad copywriters, franchise owners, truck drivers, and the like. During this time one income was sufficient to maintain a comfortable lifestyle, and concerns about overproduction and automation were quickly forgotten.

Tuesday, March 29, 2011

Economists tend to distinguish between cyclical unemployment, the temporary job losses that accompany downturns, and structural unemployment, a mismatch of skills between available workers and job openings. What they tend to ignore is technological unemployment. Anything that does not involve Bayesian utility maximizers operating in a rational market tending towards equilibrium is out of bounds. This has only become more pronounced as economics has dealt with ever more abstract mathematical models of the economy. Automation simply does not fit into their worldview - it's the stuff of science fiction, best left to engineers and physicists. They would much rather deal only with issues like interest rates, money supply, trade flows, and tax policy. As Martin Ford put it:

So far, I have not seen a great deal of deep thought given to how the future economy will work. Most people—and nearly all economists—make the obvious assumption about that: they assume the economy will essentially work the way it has always worked. The basic principles that govern the economy are seen as being relatively fixed and reliable. Economists look to history and find evidence that the free market economy has always adjusted to impacts from advancing technology and from resource and environmental constraints, and they assume that the same will always occur in the future. Crises and setbacks are temporary in nature: in the long run, the economy will rebalance itself and put us back on the path to prosperity.

One person who is taking it seriously is engineer Marshall Brain, who is often associated with the Singularity movement and is the founder of the popular Web site How Stuff Works (and should not be confused with Brain from Pinky and the Brain). He was one of the earliest to sound the alarm about automation's effects on the workforce. His Web site Robotic Nation has been around a long time, and it argues that robots will eventually become advanced enough to perform nearly all the tasks necessary to make society function. As he says in his FAQ:

I firmly believe that the rapid evolution of computer technology will bring us smart robots starting in a 2030 time frame. These robots will take over approximately 50% of the jobs in the U.S. economy over the course of just a decade or two. Something on the order of 50 million people will be unemployed.

The economy may adjust and invent new jobs for those 50 million unemployed workers, but it will not do so instantaneously. What we will have is a period of economic turmoil. All of those unemployed workers will be in a very bad spot. The economy as a whole will suffer from this turmoil and the downward economic spiral it causes. No one will benefit when this happens.

Brain has spoken and written extensively on these issues in a wide variety of forums. He has also written a speculative short story called Manna, detailing a possible version of a society where work is taken over by robots. He presents two scenarios. In the first, set in the United States, extensive automation produces a dystopia where workers are driven like dogs by automated "bosses" that run every conceivable business and monitor their every move. Employees who fail to produce adequately are digitally blacklisted and wind up in internment camps constructed of cheap foam materials, where robots keep watch to make sure they don't escape. Profits are maximized by a small class of fabulously wealthy business owners who control the entire economy. In the second scenario, set in Australia, robots preside over an egalitarian society providing every conceivable want. Goods are distributed equally via a credit system and based on available resources. Work is voluntary, and people are free to engage in whatever captures their interest. Everything is free. There are some "singularity" elements like space elevators and a virtual-reality internet plugged directly into people's nervous systems. While some of Brain's speculations may be over the top, his basic concerns are becoming more and more a reality.

Another lonely voice of concern is Silicon Valley engineer and entrepreneur Martin Ford. He has written a book called The Lights in the Tunnel detailing his concerns about the economy of the future, and his blog, Econfuture, is essential reading. Ford has also written about automation for The Huffington Post and the Atlantic Monthly. Like Brain, and unlike most economists, he does not believe that jobs displaced by technology will automatically be replaced in other areas:

"The biggest problem with the conventional wisdom is the number of jobs we are talking about. In the U.S. we have a workforce of around 140 million workers. The majority of these jobs are basically routine and repetitive in nature. At a minimum, tens of millions of jobs will be subject to automation, self-service technologies or offshoring. The automation process will never stop advancing: computer hardware and, perhaps most importantly, software will continue to relentlessly improve. Therefore, simply upgrading worker skills is not going to be a long-term solution; automation will eventually (and perhaps rapidly) catch up. If you are willing to look far enough into the future, the number of impacted jobs is potentially staggering."

One of the few economists to seriously study the issue is David Autor of MIT. In a paper published in 2009, Autor came to an ominous conclusion: technology is rendering middle-class jobs obsolete. Autor's work found that job growth in recent decades has been concentrated at the high and low ends of the wage scale, and his findings indicate that automation is a major cause of this phenomenon.

Economists group all such arguments under the term "Luddite Fallacy." The Luddites were groups of textile artisans opposed to the use of mechanized looms in factories, fearing the machines would cost them their occupations and cause mass unemployment. Beginning in 1811, they smashed the labor-saving knitting machines in protest, clashing with the British army in the process. Obviously, the Luddites lost, and industrialization proceeded apace. The vast economy produced by subsequent mechanization eventually provided employment for the displaced labor force. If we had listened to the Luddites, the argument goes, we would have neither the marvelous technological achievements of today nor the affluence we all enjoy.

The conventional economic view states that as you produce more output per worker, the cost of that output goes down. As costs go down, so do prices, increasing the demand for those goods and allowing them to be supplied to more markets. The increased demand causes more workers to be hired, so automation maintains or even increases employment. Furthermore, even if workers are lost from one sector, lower prices increase demand in other sectors, and those sectors will absorb the unemployed workers; you simply need to match labor with what the economy needs. While workers could be reduced or eliminated in certain operations such as steel milling or automobile manufacture, it would never be the case that the need for workers in the entire economy was diminished. Economists simply do not believe that automation causes unemployment, period. Economist Alex Tabarrok summarizes the thinking this way: "If the Luddite fallacy were true we would all be out of work because productivity has been increasing for two centuries." A commenter on Paul Krugman's blog put it more succinctly: "Humanity is a beehive, there will always be work." Note, however, that this is based on a series of assumptions, of which these are but a few:

1. Our demands are insatiable.
2. The future will be similar to the past.
3. There are jobs that cannot be automated.
4. Our resources are infinite.
5. All displaced workers will find employment somewhere else.
6. There will always be new industries to absorb population growth.

It appears that economic "science" is little more than taking what happened in the past and assuming, without any evidence, that it will necessarily happen in the future. After all, all those wool weavers found other jobs, didn't they? But are conditions really the same? Can these questions even be realistically addressed in the abstract economic models that are used to dismiss the arguments? What if the underlying assumptions of economics have fundamentally been altered? Martin Ford again:

Among economists and people who work in finance it seems to be almost reflexive to dismiss anyone who says "this time is different." I think that makes sense where we’re dealing with things like human behavior or market psychology. If you’re talking about asset bubbles, for example, then it’s most likely true: things will NEVER be different. But I question whether you can apply that to a technological issue. With technology things are ALWAYS different. Impossible things suddenly become possible all the time; that’s the way technology works. And it seems to me that the question of whether machines will someday out-compete the average worker is primarily a technological, not an economic, question.

Economists did not always ignore technological unemployment. John Maynard Keynes was perhaps the most influential economist of the twentieth century, and his General Theory was the guidepost for combating the Great Depression. In his 1930 essay Economic Possibilities for our Grandchildren, he wrote:

For the moment the very rapidity of these changes is hurting us and bringing difficult problems to solve. Those countries are suffering relatively which are not in the vanguard of progress. We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come–namely, technological unemployment. This means unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour.

This was in 1930, when a computer like Watson was beyond the imagination of all but speculative science fiction writers. Even though he acknowledged the impact that increasing mechanization and the resulting productivity gains had on workers, Keynes still believed that aggregate demand was the main problem affecting the economy, and his theories centered on stimulating demand. Despite this, Keynes clearly believed the future would be different from the past. In Economic Possibilities for our Grandchildren, he also put forward the claim that we would someday have the material abundance to abolish scarcity. He believed that productivity gains would eventually lead to more leisure time and greater affluence for all. He saw this leading to a profound social change, in which the commercial values of wealth acquisition and material goods would give way to a more general appreciation of art, scientific discovery, and social relationships, and he regarded this as a necessary outgrowth of increasing automation and efficiency.

Today, economists scoff at his naïveté. Keynes clearly did not foresee the automobile-driven, globalized consumer economy. We didn't slow down; we just bought more and found lots of new people to sell to. Consequently, his arguments concerning automation and productivity were dismissed in their entirety, along with his ideas about a future where increased productivity was traded for leisure. If we have not brought about Keynes' vision, it is not because it is impossible, but because of the choices we have made, choices specifically designed to increase scarcity and preserve the interests of the powerful. In fact, we work much harder for much less than we did just thirty years ago, despite the increasing aggregate wealth of our society.

It cannot be overstated that the mainstream economics profession entirely missed the crash of 2008 and the downturn we are continuing to experience, just as it missed the crash of 1929 and the Great Depression. In fact, mainstream economists have missed every downturn; otherwise, those downturns could theoretically have been prevented. Economists were assuring us that the economy was stable and the underlying fundamentals were sound up until the very eve of the crash! How much can we trust mainstream economists?

Monday, March 28, 2011

There is only one condition in which we can imagine managers not needing subordinates, and masters not needing slaves.

This condition would be that each (inanimate) instrument could do its own work, at the word of command or by intelligent anticipation, like the statues of Daedalus or the tripods made by Hephaestus, of which Homer relates that

"Of their own motion they entered the conclave of Gods on Olympus"

as if a shuttle should weave of itself, and a plectrum should do its own harp playing.

-Aristotle, Politics.

Any labor which competes with slave labor must accept the economic conditions of slave labor.

-Norbert Wiener, cybernetics pioneer.

1. I, Robot.

Recently my local library installed a series of automatic check-out stations. Instead of handing your books to a circulation aide (that's what they're technically called), you scan your card, enter a PIN, scan the items, and print a slip with the due dates. This comes many months after the grocery store I used to go to regularly changed all its express checkout lanes to self-scan lanes, where you scan the bar codes of items yourself, insert cash into the machine (or, more commonly, swipe a card), take your receipt, and bag your own groceries. Work that used to be done for you is now done either by you or by a computer, with the savings from eliminated workers' wages theoretically passed along to you in the form of lower prices.

At this point, it is worth noting that the concept is somewhat self-defeating: the process is currently so unfamiliar that in each of these situations staff were needed on hand just to walk customers through checking out successfully. Of course, it would have been simpler for these workers to check the customers out themselves, not to mention less stressful for the customers. So although there seem to be no real net savings from these machines for now, the corporations deploying them are betting that once the machines become commonplace enough, customers will no longer need such hand-holding, and impersonal, automated check-outs will simply become the standard way things are done.

And they're probably right; these machines, once rarities, are cropping up everywhere: groceries, hardware stores, libraries, gas stations. ATMs have been around for decades; you no longer need to interact with a teller to do financial transactions (and you pay extra for the privilege, making banking the only business that charges you more for using automation).
When I call customer support, an automated menu with a robotic voice reads me my options and asks me to speak my choice into the receiver. You no longer need to interact with an attendant at any of these businesses. At one time, vending machines were an oddity and service over the phone was unthinkable; now both are commonly accepted.

Some of these developments in self-service have been theoretically possible since the barcode was invented, but they have become much more visible lately. The recession has actually spurred the drive for automation; the previous recession, in 2000-2001, also saw such a wave of automation, and with each new wave the computers get more sophisticated. There has been a lot of discussion and hand-wringing over outsourcing, but scant attention has been paid to what is called technological unemployment, another horseman of the labor apocalypse.

But thankfully the discussion is finally starting. The prominent economist and New York Times columnist Paul Krugman wrote a column on March 7, 2011 entitled "Degrees and Dollars." In the column, Krugman noted an earlier story in the Times about new software that allows legal documents to be intelligently perused by a computer program for relevant information, eliminating the need for large teams of junior lawyers and paralegals to sift through copious documents on large, complex cases:

Computers, it turns out, can quickly analyze millions of documents, cheaply performing a task that used to require armies of lawyers and paralegals. In this case, then, technological progress is actually reducing the demand for highly educated workers.

And legal research isn’t an isolated example. As the article points out, software has also been replacing engineers in such tasks as chip design. More broadly, the idea that modern technology eliminates only menial jobs, that well-educated workers are clear winners, may dominate popular discussion, but it’s actually decades out of date.

The fact is that since 1990 or so the U.S. job market has been characterized not by a general rise in the demand for skill, but by “hollowing out”: both high-wage and low-wage employment have grown rapidly, but medium-wage jobs — the kinds of jobs we count on to support a strong middle class — have lagged behind.

In the article, Krugman argues that seeing education as a panacea for falling wages is out of date, as many medium-skill jobs are being automated out of existence as fast as low-skilled ones. Krugman also dealt with the "falling demand for brains" issue in a blog post. In it he mentions Watson, IBM’s “intelligent” computer that only weeks prior defeated some of the best contestants on the television game show “Jeopardy.” In order to compete on Jeopardy, a computer has to not merely sift through data, but “understand” ordinary questions, context, relevancy, even puns, slang and wordplay. Jeopardy champion Ken Jennings, one of the contestants bested by Watson, wrote a perceptive article for Slate magazine in which he said the following:

IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of "thinking" machines. "Quiz show contestant" may be the first job made redundant by Watson, but I'm sure it won't be the last.

Krugman still thinks that many low-wage jobs like janitors are less likely to be automated:

Most of the manual labor still being done in our economy seems to be of the kind that’s hard to automate. Notably, with production workers in manufacturing down to about 6 percent of U.S. employment, there aren’t many assembly-line jobs left to lose. Meanwhile, quite a lot of white-collar work currently carried out by well-educated, relatively well-paid workers may soon be computerized. Roombas are cute, but robot janitors are a long way off; computerized legal research and computer-aided medical diagnosis are already here.

His optimism over the creation of low-wage jobs is probably misplaced; they are not being created either. According to labor statistics, the unemployment rate for those earning over $150,000 a year is 3 percent, while it is 31 percent for the country's poorest households. In fact, janitors may not even be safe from self-cleaning automated toilets. Also that same week, Boing Boing posted a video of people riding in one of Google’s experimental self-driving cars navigating through an obstacle course. Goodbye, cab and truck drivers. A robot has been invented to fold laundry. Goodbye, housekeeping staff. Surely low-paid agricultural work is safe? Think again. The Institute of Agricultural Machinery at Japan’s National Agriculture and Food Research Organization, along with SI Seiko, has developed a robot with stereoscopic vision that can select and harvest strawberries based on their color. Future developments are predicted for tomatoes, grapes and other plants. Automation is already used extensively in dairy farming (milking machines, etc.), and M.I.T. is working on prototype bots that can monitor, feed and harvest tomato plants. According to one commentator:

“The automation of agriculture could prove to be a pivotal development in the early 21st century, akin to the adoption of combustion engines in the early 20th century. Just as horses were eventually replaced by tractors, humans may find themselves replaced by robots in the remaining realms of agricultural labor in which they still hold sway.”

Construction workers and the trades aren’t immune, either. New housing starts are seen as a barometer of economic health, yet there is a drive to construct entire houses in factories via automation, ensuring higher quality and lower costs than building houses ad hoc outdoors in unpredictable weather. BLDGBlog recently posted an article entitled “The Robot and the Architect are Friends,” exploring the use of robots to translate building designs into reality without teams of construction workers. The article stated:

Swiss architects Gramazio & Kohler "have a vision: architecture using robotics to take command of all aspects of construction. Liberated from the sidelines, the profession would be freed to unleash all its creative potential—all thanks to its obedient servants, the robots."

In fact, the New York Times is doing a whole series of reports chronicling recent artificial intelligence breakthroughs called “Smarter Than You Think.” It is sobering reading. It seems there is already no task that is beyond being performed by a computer. Even relatively complex tasks are not far out of reach. Having long ago mastered chess, computers are even playing poker against human opponents. NASA is already sending robotic astronauts into space. Highly paid television journalists are not immune, either. The Internet blogger Randall Parker at FuturePundit wrote an article based on the same New York Times piece in which he imaginatively describes a legal system almost entirely devoid of people. He goes further than Krugman in describing an automated workforce of the future. His article may seem a bit fanciful, yet decision-making software is a reality, and it is getting more sophisticated all the time. There's even a computer that composes beautiful music! I could go on and on. The sense throughout the last decades of the twentieth century was that automation only affected low-skilled blue-collar factory workers. College-educated workers thought, “if only those losers would simply go to school and hit the books, they wouldn’t have to worry. It’s their own damn fault.” Now we know that even many jobs requiring extensive education can be eliminated just as easily.

So we have a continually growing population, thanks to the fruits of industrialization, that can feed itself using a mere 2.7 percent of its laborers. This continually growing population needs to sell its labor in return for wages to survive, yet every business in the economy strives for maximum "efficiency," which means getting the most work done with the smallest amount of inputs possible. Closely related is productivity, which is the amount of work produced per worker. Any economist will tell you that rich societies are the ones with the highest productivity, and businesses are always striving to increase it. In the U.S., worker productivity has soared, especially since the advent of the computer (with the resulting gains not shared with the workers). Yet productivity gains also lead to a need for fewer workers. Every incentive in a money/wage economy is to reduce the amount of human labor! That's why there are so few people involved in agriculture, and why food is so (relatively) cheap. So you have an economy that needs to continually create more jobs for more and more people, yet provides every incentive to produce with fewer and fewer workers. And let's not forget that since the 1970's or so, both men and women are expected to work, and in many cases must work, so now we need to create enough jobs for all of them. The question is not so much "how can we make this system work?" as "how has this system worked for so long?" In the past, technology merely allowed workers to work more effectively and efficiently; it did not actually eliminate the need for workers themselves.
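The logic here is simple enough to sketch in a few lines. This is a purely illustrative toy calculation (the round numbers are made up, not real economic data): for a fixed amount of output that society demands, every gain in productivity shrinks the headcount needed to produce it.

```python
# Illustrative toy model: productivity = output per worker,
# so workers needed = output / productivity.
# All numbers are made-up round figures, not real statistics.

def workers_needed(output, productivity):
    """How many workers it takes to produce `output` at a given productivity."""
    return output / productivity

output = 1_000_000  # units of goods the economy demands (held fixed)

# As productivity doubles (and doubles again), required headcount halves:
for productivity in (100, 200, 400):
    print(productivity, workers_needed(output, productivity))
```

Unless demand for output grows as fast as productivity does, every business acting "efficiently" needs fewer people, which is the squeeze the paragraph above describes.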

Some have pointed out that automation does not eliminate the need for lawyers - even if legal research is done by computers, you still need lawyers, and even if medical diagnosis is done by computer, you still need doctors. As Krugman noted, computers excel at routine tasks, "cognitive and manual tasks that can be accomplished by following explicit rules.” But this misses the point. The fact that we need some doctors and lawyers and janitors doesn’t matter. The economy needs to be continually creating more jobs just to accommodate the people constantly entering the workforce. In fact, the economy needs to add nearly one and a half million jobs every year just to keep the unemployment rate from increasing! That's some 120,000 jobs every month, year in and year out. With automation constantly decreasing the need for “cognitive and manual tasks," this becomes more and more of a receding horizon. Even a slight shortfall in available jobs adds up after months and months. Every month we miss our jobs target, the number of unemployed workers grows, and making up for it becomes harder and harder. We're fighting a losing battle here.
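To see how fast the shortfall compounds, here is a back-of-the-envelope sketch. The 120,000-jobs-per-month target comes from the figure above; the "actual" job-creation numbers fed into it are hypothetical, just to show the arithmetic:

```python
# Back-of-the-envelope: cumulative jobs deficit when the economy
# persistently misses its monthly job-creation target.
# The target is from the text; the "actual" figures are hypothetical.

TARGET_PER_MONTH = 120_000  # jobs needed monthly just to absorb new workers

def cumulative_shortfall(actual_per_month, months):
    """Total jobs deficit after `months` of creating `actual_per_month` jobs."""
    return (TARGET_PER_MONTH - actual_per_month) * months

# Missing the target by a seemingly modest 20,000 jobs a month adds up:
print(cumulative_shortfall(100_000, 12))   # after one year
print(cumulative_shortfall(100_000, 60))   # after five years
```

A 20,000-a-month miss looks small next to the headline numbers, yet after five years it leaves over a million extra people without work, which is exactly the "receding horizon" problem.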

While we may not like repetitive jobs with clearly defined rules, the fact is that these jobs provide the majority of employment in our economy. They provide the entry-level positions that allow one to climb the career ladder. An economy without them amounts to a ladder with no rungs between the bottom and the top. The lawyers doing the grunt work today will be the senior associates in twenty years. The interns doing medical diagnosis now will be the future doctors. The architects building models now will eventually design entire buildings. At least, that’s how it used to work. Once those basic jobs are gone, who will get to be the doctors and lawyers of the future, and how will that be decided? The sad fact is that jobs requiring true thought and creativity are very limited, and competition for them is intense. It’s always been this way, whether we acknowledge it or not. Such jobs are highly desirable, and thus occupied by those with the requisite money and social connections. Competition for them is getting more and more intense, which is no doubt driving the stratospheric rise in the cost of education in both time and money. Competition to get into the best schools now begins before children are even five years old! In fact, there is so much competition for jobs in the design and entertainment industries that employees need to literally work for free just to get a foot in the door! Similar free internships are required at highly desirable workplaces like investment firms and media companies, where interns hope to make a killing or be discovered as the next celebrity journalist or on-air personality. Already, highly paid and creative positions are dominated by the children of the wealthy and privileged. People will simply graduate right to the top. It leads to an even more class-stratified caste system than we have today. Are we heading toward a new aristocracy? We have what I call a creative surplus, a concept I hope to explore in more detail. 
What it means is that our workforce has far more creative ideas than society has the ability to realistically implement at this time. Even if you are one of those intelligent, creative individuals, don’t count on making any money off of it. There is only a need for so many fashion designers, starchitects, or New York Times columnists. The massive amount of “free” creativity floating around the Internet (including this essay) is testament to that. Our workforce is far more skilled and educated now than in the nineteen fifties, yet unemployment is much higher.

So today I'm finally going to publish my series on automation. This was prompted by a few experiences - in particular the installation of automatic checkout terminals at my library, and a recent Paul Krugman column discussing automation. These things coming at a time of increasing unemployment really made me sit up and take notice. I had heard arguments about automation over the years, but I always dismissed the people who asserted that we'd all be replaced by robots as technophilic daydreamers - after all, weren't these the same people who said we'd be living in moon colonies in the year 2000 and riding around in flying cars? But, having done the research for this article, now I'm not so sure. I think we need to start this discussion. Anyway, this article was one of the main reasons I wanted to start blogging again. It was complicated by the fact that literally every day I came across new data and articles relevant to my thesis, and I kept adding more information and revising my arguments. I could probably go on doing that for months, but at some point you have to say it's good enough and put it out there. That's the downside of not having deadlines! Although I wrote it as one piece, due to its length, I'm publishing it as a series. So, I hope you enjoy it, and please do read the links; they are really essential notes for the topic, and hopefully educational as well.

I have a whole host of future topics; it's just a matter of finding the time to commit them to paper (digital paper, as it were). Many of them deal with the real sources of our problems, what kind of world we'll be living in, what the future may look like, and how we can cope with it. There will also hopefully be suggestions on how we can create a better world, or at least a better life for ourselves. I'm currently working on a post about Japan that I hope to have ready for next week. In the meantime, enjoy the current articles.

Sunday, March 27, 2011

If I were to tell you that I had invented a piece of technology that would make your life a whole lot more convenient, and even had the power to transform every aspect of our society, but it would cause the deaths of tens of thousands of people each year, would you allow it to be implemented? What if I told you that this technology would change life as we know it, and that everyone would want it, but one in every thousand people would have to die to make it a reality, with millions more hurt or maimed, or suffering corollary diseases like asthma? If it were put to a popular vote, would you vote for it or against it? What if I told you a million people were going to die every year around the world because of this technology, with another 50 million hurt or maimed? Would you change your vote? Be honest.

Of course, such a technology already exists. You probably already have it at your house. It's called an automobile. A simple search of statistics will tell you that somewhere between roughly 40,000 and 50,000 people have died on America's roadways each and every year. Similarly, there are over a million deaths worldwide every year related to cars, according to the WHO. Do you now support banning cars? Why not?
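To keep the scale of this tradeoff in view, a quick back-of-the-envelope calculation using the rough figures above (illustrative round numbers, not precise statistics):

```python
# Rough scale of the automobile's toll, using the approximate
# figures cited above (round numbers for illustration only).
us_road_deaths_per_year = 40_000        # low end of the 40-50,000 range
world_road_deaths_per_year = 1_000_000  # WHO's rough worldwide figure

decade_us = us_road_deaths_per_year * 10
decade_world = world_road_deaths_per_year * 10
print(decade_us)     # US road deaths in a single decade
print(decade_world)  # worldwide road deaths over the same period
```

Even at the low end, that is hundreds of thousands of deaths per decade in one country alone, a toll we accept without a second thought.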

Almost everyone knows someone who has been seriously hurt or injured in an accident. There are over 10 million collisions every year. Some of those result in severe injury or paralysis. They are so common that lawyers in every city get rich off of litigating such cases.

I think that's the context against which we must view the nuclear plant meltdown in Japan. While almost everyone knows someone who has been affected by a car crash, directly or indirectly (I know I do), not many of us know anyone who has been affected by a nuclear disaster. The deaths in even the worst-case scenario in Japan pale in comparison to the number of people killed worldwide by cars. Yet we do not give up our cars. We even freely choose to use them when safer public transportation options are available, even though our chances of death are much, much higher. The number of drivers worldwide is increasing dramatically, with a resulting increase in deaths, yet there is none of the worry that there is over developing nuclear power.

The behavioral economist Dan Ariely tells us that after the September 11 attacks, people turned away from flying out of fear and chose to drive to their destinations instead. The resulting uptick in driving caused an increase in traffic fatalities, ultimately killing more people than the attacks themselves.

I say this not to endorse nuclear power in any way, but only to remind people of the choices and tradeoffs that we make for the use of technology, which are all too often forgotten. As with many issues, it's all how we think about it.

Saturday, March 26, 2011

Several years ago Milwaukee voters approved an ordinance requiring employers to provide sick days. These are fundamental rights that workers in nearly every country in the world outside the United States enjoy. It's a sad fact that local municipalities have to pass these regulations, as higher levels of government have been purchased entirely by corporate interests, with an eye to running government to ensure maximum corporate profits rather than citizen well-being. After a two-year challenge, the courts upheld the law. Yet giving workers rights is deeply offensive to the Tea Party Republicans who now control every branch of Wisconsin state government. They immediately leaped into action. From the Milwaukee 9 to 5 press release:

The group called on state Assembly Members to vote against the bill that would strip local municipalities of some of their legislative power. The Sick Days Scam (AB41) would preempt local governments and voters from enacting the Milwaukee paid sick day legislation, and in doing so, open the door for the State Legislature to overturn a range of legislation passed in towns and cities throughout Wisconsin. The state Senate already passed the bill with no debate when the Democratic senators were still absent in early March.

“This bill is inconsistent with Wisconsin’s tradition of local municipalities having discretion, whether through direct legislation or their own power, to shape these matters,” said Kathleen Dolan, Professor of Political Science at the University of Wisconsin-Milwaukee. “It is also inconsistent with the Republican ideology that says, ‘Leave the states alone, one size does not fit all, top down is not always the best thing.’ Here they are trying to impose a position on localities who may want to determine their own needs.”

Wisconsin has a rich history of local governance, in which municipalities enact legislation that best meets the needs of their communities. In 2008, nearly 70% of Milwaukee voters approved a law to provide paid sick days for workers in the city. The law would provide 120,000 Milwaukee families who do not have paid sick days of the freedom to take care of ill family members without fear of losing their jobs or a paycheck.

So they can ignore millions of children without health insurance, but they immediately leap into action when workers are actually given rights. Remember, these are theoretically public servants.

But that's not even the most interesting thing here. As the press release notes, one of the cornerstones of the Republican party is supporting "local control." Traditionally, Republicans wanted to keep power local without interference, the idea being that local governments are closer to the people and have a better idea of what needs to be done than distant governments. It's a philosophy that theoretically makes sense, and one that I can agree with. The Milwaukee ordinance is entirely in keeping with this.

Yet, suddenly, local control is no good, apparently. Republicans are perfectly happy with larger units of government overruling local control if it gives corporations and the rich more power over citizens. Their actions are the very picture of big-government interference in local decision-making. Yet I'm sure all the "small government" Republicans will be all for it. It turns out big government is actually just fine if it gives corporations more power; it is only a problem if it attempts to restrain corporate power. This just shows what some of us already know - there is no core governing philosophy of the Republican party except to increase the power of the rich and corporations to make profits on the backs of the rest of us.

What's amazing is that for Republicans, organizations that fight for workers' rights are demonized, while legislators in the pockets of big corporate interests like the Koch brothers are considered the good guys, even if those Republicans are workers themselves. Do they really believe these Republicans have their best interests at heart? In their ideal world, we are all working for minimum wage. I think a lot of these people are naive. They may have sick days in their cubicle paradise in the suburbs, but do they honestly think that sick days aren't eventually going to be stripped from them as well? Eventually, all workers will be affected by the race to the bottom, even suburban Republicans who work for a paycheck.

Note: sorry for all the damn political posts, but there are just so many outrages on a daily basis it's hard to keep up. These are also quick to write. Soon, I'll finally be getting to more of the philosophical discussions I hope to have here.

Thursday, March 24, 2011

One would think that in a country where Wall Street bankers and executives crashed the economy through widespread fraud, where the 400 wealthiest people have as much wealth as half the citizenry, where 40 million people have no health insurance and 1 in 5 children live in poverty, and where corporations sit on the highest profits in history yet the unemployment rate keeps increasing, there would be some kind of animosity. And you'd be right - there is intense animosity toward working civil servants, the poor, and unions. Just in the past few days, we're seeing the results of the most full-throated, bald-faced attacks on the working classes since before the Great Depression. What's amazing is how coordinated these attacks are, and how little coverage they are receiving in the mainstream (i.e., corporate-owned) press.

In Congress, the Republicans have proposed legislation that says going on strike will make a worker's family - including children - ineligible for food stamps:

Lawmakers seek limits on wages (via Engineering News Record):
The union rights controversy isn’t confined to Wisconsin or teachers. Newly seated Republican majorities in several budget-strapped states have swung legislative wrecking balls at some of the pillars of the building trades, including prevailing wages and project labor agreements.

In Ohio, where newly elected Gov. John Kasich (R) has pledged to cut costly regulations, new Republican lawmakers have provided a substantial majority in the state Senate. A bill originating in the Ohio House of Representatives would prohibit state funding on any local government project built under a project labor agreement. On prevailing wages, open-shop contractors are “working with the governor on extending a 1997 ban on prevailing wages for K-12 schools to universities,” says Bryan Williams, government affairs director of the Associated Builders and Contractors of Ohio.

"Maine Gov. Paul LePage has ordered the removal of a 36-foot mural depicting the state's labor history from the lobby of the Department of Labor headquarters building in Augusta…. Don Berry, President of the Maine AFL-CIO, issued a statement… 'It's a spiteful, mean-spirited move by the Governor that does nothing to create jobs or improve the Maine economy.'"

Looks like the Tea Party Revolution is marching on. The question is, why is anyone but the richest 10 percent voting for it? Does anyone seriously believe the problem with America is people get paid too much and have too many benefits? If so, you may want to reread my first paragraph.

Wednesday, March 23, 2011

Yesterday, I noted the contradiction between two New York Times articles - one highlighting the attempts by the Obama administration to produce 8 million more college graduates by 2020, the other describing the inability of young college graduates to find work. I concluded that we are merely attempting to increase the education level of the unemployed.

Tuesday, March 22, 2011

Sometimes the placement of articles in the newspaper forms the very definition of irony. Two of the most popular stories on the New York Times Web site on March 22 provide a case in point.

An article entitled "Incentives Offered to Raise College Graduation Rates" describes efforts by the Obama administration to offer "incentives" like grants to raise the college graduation rate in the U.S. According to the article, the idea is to crank out 8 million more college graduates by 2020. What these graduates are going to do, nobody knows. Somehow, educating more people will magically create demand for them, I guess. It hasn't worked so far - our workforce is more educated than at any time in history, yet there are between four and six applicants for every job opening! It's a Red Queen's race - running faster and faster just to remain in place. No, all it does is ensure that our future unemployed workers are better educated than our unemployed workers of today.

And if you need proof, look no further than the article by Matthew Klein, published the same day, entitled "Educated, Unemployed and Frustrated." Klein eloquently makes the point that young graduates with advanced degrees today cannot find jobs! Klein, a research associate at the Council on Foreign Relations, draws an analogy between the educated youths who cannot find suitable employment in the Middle East who are fomenting revolutions and the large cohort of young, unemployed college graduates in America and Europe. He writes:

About one-fourth of Egyptian workers under 25 are unemployed, a statistic that is often cited as a reason for the revolution there. In the United States, the Bureau of Labor Statistics reported in January an official unemployment rate of 21 percent for workers ages 16 to 24. My generation was taught that all we needed to succeed was an education and hard work. Tell that to my friend from high school who studied Chinese and international relations at a top-tier college. He had the misfortune to graduate in the class of 2009, and could find paid work only as a lifeguard and a personal trainer. Unpaid internships at research institutes led to nothing. After more than a year he moved back in with his parents.

Millions of college graduates in rich nations could tell similar stories. In Italy, Portugal and Spain, about one-fourth of college graduates under the age of 25 are unemployed. In the United States, the official unemployment rate for this group is 11.2 percent, but for college graduates 25 and over it is only 4.5 percent.

I wrote an extensive article on the effects of automation on the workforce. That article was in part inspired by Paul Krugman's column, "Degrees and Dollars." In it, Krugman points to automation, specifically the automation of routine mid-range legal work, as one reason that job creation even for those with college degrees has stalled. He points out that automation is now eliminating jobs that used to require college graduates, while low-end jobs like janitor and maid are still in demand because they cannot be automated as easily (thus far). And that's not even bringing into the discussion the offshoring of work to places like Russia, Brazil and India, which also disproportionately affects middle-class, college-level jobs. The point of Krugman's article is that more education is not going to magically solve our unemployment problems. As he put it, there is a "falling demand for brains." Why should people go heavily into debt to attend college when the only jobs our economy is creating are sales clerks and short-order cooks? Message to Arne Duncan: no "incentives" are going to matter unless that changes.

So what's with the incentives to get people into college? That's been our only job creation strategy since the end of the Second World War. The G.I. Bill made college affordable to everyone in the hopes of stimulating the economy, and the Vietnam War caused students to enroll as a way of avoiding the draft. Suddenly college became a requirement for every job, no matter how simple or routine. All this did was dilute the value of a college degree from an "advanced" level of knowledge to a "basic" level of knowledge. Once college became a requirement for everybody, from secretaries to call-center workers, colleges could name their price, raising tuition every passing year far in excess of inflation. That money was plowed into sheer waste like administrative bloat, sports teams, oversized salaries for tenured professors and high-level administrators, and palatial buildings by famous architects. It also created a major profit center for banks, as going into debt for higher education became standard and students mortgaged their future salaries. College debt has surpassed credit card debt, and cannot be discharged even in bankruptcy. It has turned the U.S. workforce into indentured servants. Maybe this is the real reason for getting everyone into college.

We still have this idea that all you need to get a job is "more education," as if there are jobs just floating around that people are too dumb to fill. There are already more educated people than there are jobs for them. We have no sufficient outlets for the amount of creativity in our society. There are all sorts of good ideas out there, but no money to implement them. What we have is a creative surplus. There are all sorts of problems that require creative and innovative solutions: global warming, peak oil, aquifer depletion, failing agricultural yields, pollution, urban sprawl, political conflicts; the list goes on and on. Yet our advanced knowledge and learning is not put toward solving our real problems. This is the tragedy of our time. "More college" is not going to fix this. It goes back to our cardinal religion that the invisible hand will magically sort it all out. News flash: the invisible hand is invisible because it doesn't exist.