It could be the most significant economic puzzle of our time: As a declining share of economic growth goes to our workers, are people becoming less valuable to businesses? And why?
The mysterious and growing divide between the rich and the rest in just about every wealthy country on Earth, including the U.S., is really two mysteries wrapped in one. The first mystery is why real wage growth has sped up at the top and slowed down for everybody else. But the second, more recent, and more fascinating problem is why labor's share of the winnings in developed economies has been in decline. It's not just that middle-class wages are falling behind the rich. Overall wages are falling behind something else -- capital.

People are becoming less valuable to companies. Why?

Simply put, the world shrank. Two seemingly unrelated inventions -- the microprocessor and the shipping container -- conspired to create a global market for all assets, including people. A century of achievements in computing power and shipping ushered in an era of global trade so expansive that it completely disaggregated the process of doing business (especially in manufacturing), allowing firms to treat finished goods as a bundle of globally sourced components and services.

As more than a billion new workers flooded a global labor market open to multinational companies, capable workers became less scarce, and therefore, less valuable. In order to understand how all of this happened, we have to dig into recent economic history, exploring the rise of digital technology and global trade.

THE UNIVERSAL LANGUAGE

Mathematicians of the 20th century were probably aware of the potential for computers to disrupt the market for human labor. It is, however, extremely unlikely that anyone could have predicted the convoluted manner in which that potential was realized. The obvious candidate would have been artificial intelligence -- if machines were able to mimic or perhaps exceed human intelligence, scores of people would suddenly find themselves useless. Though the specter of true AI still hangs over the 21st century, the great technological disruption of the 20th century was the advent of digital technology. The power of digital technology to reliably store and reproduce information led to a sea change in the way human beings communicate, making worldwide communication cheap, reliable, and instantaneous.

Although no single invention can take full credit for the current ubiquity of digital devices, the microprocessor is generally credited as the core driver of smaller, more powerful digital devices. A microprocessor is a computational engine that does the physical work of computation: it takes information as input, temporarily stores and operates on it, and transmits the altered information as output. The surface of a microprocessor is covered in tiny electrical switches called transistors -- the workhorses of the microprocessor. A microprocessor's speed depends both on how quickly its transistors can switch and on how many of them fit on the chip. In the early days of computation, transistors were large and computers were slow.

Transistor technology improved rapidly, leading to microscopic transistors and smaller yet vastly more powerful microprocessors. This greater power expanded the scope of tasks that computers could perform, placing computers at the heart of a variety of tasks previously performed by human beings, including, ironically, the design and manufacture of microprocessors. In addition to controlling complex manufacturing processes, modern computers are able to store, process, and reproduce complex forms of sensory information, such as sounds and moving images.
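The pace of miniaturization described above is often summarized as Moore's law -- the empirical observation that the number of transistors on a chip roughly doubles about every two years. A minimal sketch of that arithmetic (the Intel 4004 figure is a well-known reference point, not taken from this article):

```python
# Illustrative sketch of Moore's law: transistor counts roughly
# double every two years.
def transistor_count(start_count, start_year, end_year, doubling_period=2):
    """Project a chip's transistor count forward under Moore's law."""
    doublings = (end_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# The Intel 4004 (1971) had roughly 2,300 transistors; thirty years
# of doubling takes that into the tens of millions.
projected = transistor_count(2300, 1971, 2001)
print(f"Projected 2001 count: {projected:,.0f}")
```

Fifteen doublings over thirty years multiplies the count by 2^15, or 32,768 -- the exponential growth that turned room-sized machines into consumer devices.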

This dramatic reduction in scale, coupled with a new ability to realistically reproduce sensory information, transformed the computer from a tool of industry and academia into a consumer item. So while the notion of a "bit" was the fruit of a mathematical inquiry, its ubiquity is perhaps better understood as the product of market forces -- the demand for a common unit of information. This demand from consumers and businesses alike for the capacity to store and transmit bits prompted the development of an omnipresent network of payments, phone calls, and electronic messages that fundamentally changed the way human beings conduct business and go about their daily lives. Suddenly, a single global network could transmit payment information, photographs, scientific data, and current events on a nearly real-time basis, creating geographically independent access to an ocean of transactions, human knowledge, culture, and experience.

THE BOX

Just as computers led to the proliferation of a common unit of information, world trade has been transformed by a common unit of shipping capacity -- the container.

Prior to containerized shipping, each piece of cargo had to be individually loaded onto and unloaded from vessels by hand, forcing ships to spend substantial amounts of time idle in port. Every second a ship spends idle in port is a second it is not doing what ships are meant to do: shipping goods and generating revenue for the ship's owner. The key to containerized shipping is that goods are packed once, at the point of production, into a standardized "box" that stays sealed until it reaches its final destination. This allows containerized goods to be transferred from ship to rail to road and back without much manual labor. By eliminating the lag caused by manually packing and unpacking cargo, containerization dramatically reduced the time it takes to load and unload ships, raising the share of a ship's life spent at sea from around 50% to around 90%. Containerized volumes have grown rapidly since the 1980s, at about 10% per year -- three times faster than total seaborne trade.
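The gain from cutting idle time can be worked out directly from the figures above -- a back-of-the-envelope sketch that ignores any changes in ship speed or capacity:

```python
# Back-of-the-envelope: a ship only earns revenue while at sea, so
# raising its time at sea raises its effective annual capacity
# proportionally.
def effective_capacity_gain(sea_share_before, sea_share_after):
    """Relative cargo delivered per year after reducing idle time in port."""
    return sea_share_after / sea_share_before

# Figures from the passage: ~50% of a ship's life at sea before
# containerization, ~90% after.
gain = effective_capacity_gain(0.50, 0.90)
print(f"Capacity multiplier: {gain:.1f}x")
```

In other words, the same fleet delivers nearly twice as much cargo per year simply by spending less time tied up at the dock, before counting any savings in dockworker labor.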

Though the overall volume of global trade has grown over the last few decades, the share driven by developed economies like the U.S. has decreased. Today, developing nations account for half of global trade and half of its growth.

The infrastructures of global trade and digital communication broke the shackles of geography, turning finished goods into complex assemblages generated by a network of component products and services -- e.g., the iPhone consists of several independently manufactured components sourced from the U.S., Italy, Taiwan, and Japan, assembled in China, and then marketed globally. Since the points of production and service along a supply chain are now geographically independent, the pool of labor that can be utilized to perform any task along the chain is global, effectively increasing the supply of labor -- and decreasing the value of workers in developed economies.
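The supply effect in the paragraph above can be made concrete with a toy model (my illustration, with hypothetical parameters, not the article's): under a downward-sloping labor demand curve, a larger labor pool clears the market at a lower wage.

```python
# Toy linear labor-demand curve: the wage w at which firms are
# willing to employ L workers is w = a - b * L (a, b are
# hypothetical parameters chosen for illustration).
def clearing_wage(labor_pool, a=100.0, b=0.5):
    """Wage that clears the market when the whole pool seeks work."""
    return max(a - b * labor_pool, 0.0)

# Opening a market to global sourcing effectively enlarges the pool,
# pushing the clearing wage down.
print(clearing_wage(100))  # domestic pool only -> 50.0
print(clearing_wage(150))  # pool enlarged by trade -> 25.0
```

The model is deliberately crude, but it captures the article's mechanism: nothing about any individual worker changed, yet each became less scarce, and therefore less valuable.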

THE RACE FOR INTELLECTUAL CAPITAL

The new infrastructure of global trade made it economical to deploy capital across the world to utilize enormous pools of labor in developing nations. The flow of money into developing nations operated like a magnet, causing enormous populations to leave farmlands for cities and transition from peasantry to paid labor. More than one billion non-farm jobs have been created globally since the 1980s, according to the McKinsey Global Institute's report, "The World at Work: Jobs, pay and skills for 3.5 billion people." Nine out of every ten jobs were created in developing nations, and 44 percent were created in China and India alone. Now that wages are rising in both China and India, capital is moving elsewhere, chasing cheaper labor in South Asia and Africa.

The freedom of capital to move throughout the world in search of labor has fundamentally changed the balance of power between labor and capital, and technology and trade continue to expand the scope of tasks that can be performed at a distance, although truly local labor (construction, hair-cutting, locksmithing) probably won't move overseas. This dynamic is already exerting sizable downward pressure on the value of labor relative to capital in developed nations. And despite more efficient, more complete markets, labor remains inherently divided into skill sets, placing a fundamental limit on its fungibility and leading to growing disparities between the wages of high-skill and low-skill workers.

Global trade would be impossible without the human beings that do the physical and intellectual work of everyday business. Like the transistors that blanket a microprocessor, human beings are scattered about the surface of the Earth, operating like tiny economic switches, moving and consuming objects, building and abandoning relationships, and ultimately, both deliberately and inadvertently contributing to a collective engine that determines the distribution of our planet's resources and our labors. But unlike the static abilities of a transistor, human beings are malleable, and can be cultivated to generate transformational, physical power.

Over the coming decades, as developing markets grow and mature, and new markets develop, there will be unprecedented demand for power of all varieties, from combustion to computation, and with it, unprecedented demand for the types of highly developed human intelligences that can unlock and utilize these powers. As a result, there will likely be a global shortage of high skill workers and a global surplus of low skill workers. This imbalance in the supply of skill sets is likely to exacerbate the power imbalance created by the prevailing dominance of technology and capital over labor, leading to even greater wage disparities between high earners and low earners, and further decreases in the overall value of labor relative to capital in developed nations. Nations that cultivate the brainpower of their populations will be rewarded with funds channeled from a global pool of capital aggressively searching for the brightest minds across the entirety of the human species. Nations that don't will be punished, finding themselves saddled with populations that simply cannot compete in this new, remarkably complex, and dynamic world.

THE DEEPEST BENCH

The United States is the world's preeminent economic engine, generating a remarkably large and diverse set of products and services used all over the world. Nonetheless, we are subject to the same market forces gradually reshaping economic realities across the developed world. As the home of household brands like Intel, Facebook, and Google, and the world's greatest scientific research institutions, we are, arguably, uniquely positioned to exploit, rather than be exploited by, these forces.

So far, however, we have done a terrible job.

The wage gap between high-skill and low-skill labor in the U.S. mirrors the broader developed world. Similarly, the value of capital relative to labor mirrors trends observed across developed economies, with the relative share of income generated by labor in the U.S. declining over the last several decades. As noted above, these trends will probably continue and become more pronounced as greater imbalances in the global supply of labor develop over the coming decades.

If, however, the supply of labor is truly global, then there is in theory nothing keeping the U.S. from dominating the market for high-skill labor -- not only fulfilling its own demand but generating excess and exporting its intellectual capital abroad to fill the global deficit. There are, however, substantial practical barriers, in particular our education and immigration policies. The U.S. is simply not producing enough college graduates in science, technology, engineering, and mathematics ("STEM"). We can produce more STEM graduates domestically, encourage more immigration of STEM graduates from abroad, or combine the two. Doing nothing is not an option, unless we are willing to decay into an increasingly polarized and poorer society -- one that will likely see even greater political instability as the primary revenue source of the U.S. government, the taxation of labor, becomes inherently less lucrative.

Change is an inevitable aspect of the human experience, but progress is not. Things can always get worse. And unless we take action to adapt to the new reality that surrounds us, we will be swept away by it. If, however, we are honest about the risks we face and the actions we can take to address them, we can instead opportunistically exploit the disruptions brought about by technology and global trade, make a conscious decision to produce the type of population that will succeed in this new reality, and, perhaps, inadvertently incubate the types of scientific revolutions that led the United States to economic, military, and political dominance in the first instance.
