The truly revolutionary impact of the Information Revolution is just beginning to be felt. But it is not "information" that fuels this impact. It is not "artificial intelligence." It is not the effect of computers and data processing on decision-making, policymaking, or strategy. It is something that practically no one foresaw or, indeed, even talked about ten or fifteen years ago: e-commerce—that is, the explosive emergence of the Internet as a major, perhaps eventually the major, worldwide distribution channel for goods, for services, and, surprisingly, for managerial and professional jobs. This is profoundly changing economies, markets, and industry structures; products and services and their flow; consumer segmentation, consumer values, and consumer behavior; jobs and labor markets. But the impact may be even greater on societies and politics and, above all, on the way we see the world and ourselves in it.

At the same time, new and unexpected industries will no doubt emerge, and fast. One is already here: biotechnology. And another: fish farming. Within the next fifty years fish farming may change us from hunters and gatherers on the seas into "marine pastoralists"—just as a similar innovation some 10,000 years ago changed our ancestors from hunters and gatherers on the land into agriculturists and pastoralists.

It is likely that other new technologies will appear suddenly, leading to major new industries. What they may be is impossible even to guess at. But it is highly probable—indeed, nearly certain—that they will emerge, and fairly soon. And it is nearly certain that few of them—and few industries based on them—will come out of computer and information technology. Like biotechnology and fish farming, each will emerge from its own unique and unexpected technology.

Of course, these are only predictions. But they are made on the assumption that the Information Revolution will evolve as several earlier technology-based "revolutions" have evolved over the past 500 years, since Gutenberg's printing revolution, around 1455. In particular the assumption is that the Information Revolution will be like the Industrial Revolution of the late eighteenth and early nineteenth centuries. And that is indeed exactly how the Information Revolution has been during its first fifty years.

The Railroad

The Information Revolution is now at the point at which the Industrial Revolution was in the early 1820s, about forty years after James Watt's improved steam engine (first installed in 1776) was first applied, in 1785, to an industrial operation—the spinning of cotton. And the steam engine was to the first Industrial Revolution what the computer has been to the Information Revolution—its trigger, but above all its symbol. Almost everybody today believes that nothing in economic history has ever moved as fast as, or had a greater impact than, the Information Revolution. But the Industrial Revolution moved at least as fast in the same time span, and had probably an equal impact if not a greater one. In short order it mechanized the great majority of manufacturing processes, beginning with the production of the most important industrial commodity of the eighteenth and early nineteenth centuries: textiles. Moore's Law asserts that the price of the Information Revolution's basic element, the microchip, drops by 50 percent every eighteen months. The same was true of the products whose manufacture was mechanized by the first Industrial Revolution. The price of cotton textiles fell by 90 percent in the fifty years spanning the start of the nineteenth century. The production of cotton textiles increased at least 150-fold in Britain alone in the same period. And although textiles were the most visible product of its early years, the Industrial Revolution mechanized the production of practically all other major goods, such as paper, glass, leather, and bricks. Its impact was by no means confined to consumer goods. The production of iron and ironware—for example, wire—became mechanized and steam-driven as fast as did that of textiles, with the same effects on cost, price, and output. By the end of the Napoleonic Wars the making of guns was steam-driven throughout Europe; cannons were made ten to twenty times as fast as before, and their cost dropped by more than two thirds. By that time Eli Whitney had similarly mechanized the manufacture of muskets in America and had created the first mass-production industry.
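The comparison above can be made concrete with a back-of-the-envelope calculation. The sketch below (the function and the assumption of a smooth, constant-rate decline are mine, not the essay's) converts both price collapses into implied annual rates—chips halving every eighteen months versus cotton cloth losing 90 percent of its price over fifty years:

```python
def annual_decline_rate(total_drop, years):
    """Implied constant annual price decline, given a total drop over a period.

    total_drop: fraction of the price lost over the whole period (0.90 = 90%)
    years: length of the period in years
    """
    remaining = 1.0 - total_drop          # fraction of the price left at the end
    return 1.0 - remaining ** (1.0 / years)

# Moore's Law: microchip prices halve every eighteen months.
moore_annual = annual_decline_rate(0.50, 1.5)    # about 37% per year

# Cotton textiles: prices fell roughly 90 percent over fifty years.
textile_annual = annual_decline_rate(0.90, 50)   # about 4.5% per year

print(f"Microchips:      ~{moore_annual:.0%} per year")
print(f"Cotton textiles: ~{textile_annual:.0%} per year")
```

The annual rates differ greatly, but over the fifty-year span the essay is comparing, a sustained 4.5 percent yearly decline compounds into the same order of collapse—which is Drucker's point about the two revolutions moving comparably fast in comparable time spans.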

These forty or fifty years gave rise to the factory and the "working class." Both were still so few in number in the mid-1820s, even in England, as to be statistically insignificant. But psychologically they had come to dominate (and soon would politically also). Before there were factories in America, Alexander Hamilton foresaw an industrialized country in his 1791 Report on Manufactures. A decade later, in 1803, a French economist, Jean-Baptiste Say, saw that the Industrial Revolution had changed economics by creating the "entrepreneur."

The social consequences went far beyond factory and working class. As the historian Paul Johnson has pointed out, in A History of the American People (1997), it was the explosive growth of the steam-engine-based textile industry that revived slavery. Considered to be practically dead by the Founders of the American Republic, slavery roared back to life as the cotton gin—soon steam-driven—created a huge demand for low-cost labor and made breeding slaves America's most profitable industry for some decades.

The Industrial Revolution also had a great impact on the family. The nuclear family had long been the unit of production. On the farm and in the artisan's workshop husband, wife, and children worked together. The factory, almost for the first time in history, took worker and work out of the home and moved them into the workplace, leaving family members behind—whether spouses of adult factory workers or, especially in the early stages, parents of child factory workers.

Indeed, the "crisis of the family" did not begin after the Second World War. It began with the Industrial Revolution—and was in fact a stock concern of those who opposed the Industrial Revolution and the factory system. (The best description of the divorce of work and family, and of its effect on both, is probably Charles Dickens's 1854 novel Hard Times.)

But despite all these effects, the Industrial Revolution in its first half century only mechanized the production of goods that had been in existence all along. It tremendously increased output and tremendously decreased cost. It created both consumers and consumer products. But the products themselves had been around all along. And products made in the new factories differed from traditional products only in that they were uniform, with fewer defects than existed in products made by any but the top craftsmen of earlier periods.

There was only one important exception, one new product, in those first fifty years: the steamboat, first made practical by Robert Fulton in 1807. It had little impact until thirty or forty years later. In fact, until almost the end of the nineteenth century more freight was carried on the world's oceans by sailing vessels than by steamships.

Then, in 1829, came the railroad, a product truly without precedent, and it forever changed economy, society, and politics.

In retrospect it is difficult to imagine why the invention of the railroad took so long. Rails to move carts had been around in coal mines for a very long time. What could be more obvious than to put a steam engine on a cart to drive it, rather than have it pushed by people or pulled by horses? But the railroad did not emerge from the cart in the mines. It was developed quite independently. And it was not intended to carry freight. On the contrary, for a long time it was seen only as a way to carry people. Railroads became freight carriers thirty years later, in America. (In fact, as late as the 1870s and 1880s the British engineers who were hired to build the railroads of newly Westernized Japan designed them to carry passengers—and to this day Japanese railroads are not equipped to carry freight.) But until the first railroad actually began to operate, it was virtually unanticipated.

Within five years, however, the Western world was engulfed by the biggest boom history had ever seen—the railroad boom. Punctuated by the most spectacular busts in economic history, the boom continued in Europe for thirty years, until the late 1850s, by which time most of today's major railroads had been built. In the United States it continued for another thirty years, and in outlying areas—Argentina, Brazil, Asian Russia, China—until the First World War.

The railroad was the truly revolutionary element of the Industrial Revolution, for it not only created a new economic dimension but also rapidly changed what I would call the mental geography. For the first time in history human beings had true mobility. For the first time the horizons of ordinary people expanded. Contemporaries immediately realized that a fundamental change in mentality had occurred. (A good account of this can be found in what is surely the best portrayal of the Industrial Revolution's society in transition, George Eliot's 1871 novel Middlemarch.) As the great French historian Fernand Braudel pointed out in his last major work, The Identity of France (1986), it was the railroad that made France into one nation and one culture. It had previously been a congeries of self-contained regions, held together only politically. And the role of the railroad in creating the American West is, of course, a commonplace in U.S. history.

Routinization

Like the Industrial Revolution two centuries ago, the Information Revolution so far—that is, since the first computers, in the mid-1940s—has only transformed processes that were here all along. In fact, the real impact of the Information Revolution has not been in the form of "information" at all. Almost none of the effects of information envisaged forty years ago have actually happened. For instance, there has been practically no change in the way major decisions are made in business or government. But the Information Revolution has routinized traditional processes in an untold number of areas.

The software for tuning a piano converts a process that traditionally took three hours into one that takes twenty minutes. There is software for payrolls, for inventory control, for delivery schedules, and for all the other routine processes of a business. Drawing the inside arrangements of a major building (heating, water supply, sewerage, and so on) such as a prison or a hospital formerly took, say, twenty-five highly skilled draftsmen up to fifty days; now there is a program that enables one draftsman to do the job in a couple of days, at a tiny fraction of the cost. There is software to help people do their tax returns and software that teaches hospital residents how to take out a gall bladder. The people who now speculate in the stock market online do exactly what their predecessors in the 1920s did while spending hours each day in a brokerage office. The processes have not been changed at all. They have been routinized, step by step, with a tremendous saving in time and, often, in cost.

The psychological impact of the Information Revolution, like that of the Industrial Revolution, has been enormous. It has perhaps been greatest on the way in which young children learn. Beginning at age four (and often earlier), children now rapidly develop computer skills, soon surpassing their elders; computers are their toys and their learning tools. Fifty years hence we may well conclude that there was no "crisis of American education" in the closing years of the twentieth century—there was only a growing incongruence between the way twentieth-century schools taught and the way late-twentieth-century children learned. Something similar happened in the sixteenth-century university, a hundred years after the invention of the printing press and movable type.

But as to the way we work, the Information Revolution has so far simply routinized what was done all along. The only exception is the CD-ROM, invented around twenty years ago to present operas, university courses, a writer's oeuvre, in an entirely new way. Like the steamboat, the CD-ROM did not immediately catch on.

The Meaning of E-Commerce

E-commerce is to the Information Revolution what the railroad was to the Industrial Revolution—a totally new, totally unprecedented, totally unexpected development. And like the railroad 170 years ago, e-commerce is creating a new and distinct boom, rapidly changing the economy, society, and politics.

One example: A mid-sized company in America's industrial Midwest, founded in the 1920s and now run by the grandchildren of the founder, used to have some 60 percent of the market in inexpensive dinnerware for fast-food eateries, school and office cafeterias, and hospitals within a hundred-mile radius of its factory. China is heavy and breaks easily, so cheap china is traditionally sold within a small area. Almost overnight this company lost more than half of its market. One of its customers, a hospital cafeteria where someone went "surfing" on the Internet, discovered a European manufacturer that offered china of apparently better quality at a lower price and shipped cheaply by air. Within a few months the main customers in the area shifted to the European supplier. Few of them, it seems, realize—let alone care—that the stuff comes from Europe.

In the new mental geography created by the railroad, humanity mastered distance. In the mental geography of e-commerce, distance has been eliminated. There is only one economy and only one market.

One consequence of this is that every business must become globally competitive, even if it manufactures or sells only within a local or regional market. The competition is not local anymore—in fact, it knows no boundaries. Every company has to become transnational in the way it is run. Yet the traditional multinational may well become obsolete. It manufactures and distributes in a number of distinct geographies, in which it is a local company. But in e-commerce there are neither local companies nor distinct geographies. Where to manufacture, where to sell, and how to sell will remain important business decisions. But in another twenty years they may no longer determine what a company does, how it does it, and where it does it.

At the same time, it is not yet clear what kinds of goods and services will be bought and sold through e-commerce and what kinds will turn out to be unsuitable for it. This has been true whenever a new distribution channel has arisen. Why, for instance, did the railroad change both the mental and the economic geography of the West, whereas the steamboat—with its equal impact on world trade and passenger traffic—did neither? Why was there no "steamboat boom"?

Equally unclear has been the impact of more-recent changes in distribution channels—in the shift, for instance, from the local grocery store to the supermarket, from the individual supermarket to the supermarket chain, and from the supermarket chain to Wal-Mart and other discount chains. It is already clear that the shift to e-commerce will be just as eclectic and unexpected.

Here are a few examples. Twenty-five years ago it was generally believed that within a few decades the printed word would be dispatched electronically to individual subscribers' computer screens. Subscribers would then either read text on their computer screens or download it and print it out. This was the assumption that underlay the CD-ROM. Thus any number of newspapers and magazines, by no means only in the United States, established themselves online; few, so far, have become gold mines. But anyone who twenty years ago predicted the business of Amazon.com and barnesandnoble.com—that is, that books would be sold on the Internet but delivered in their heavy, printed form—would have been laughed off the podium. Yet Amazon.com and barnesandnoble.com are in exactly that business, and they are in it worldwide. The first order for the U.S. edition of my most recent book, Management Challenges for the 21st Century (1999), came to Amazon.com, and it came from Argentina.

Another example: Ten years ago one of the world's leading automobile companies made a thorough study of the expected impact on automobile sales of the then emerging Internet. It concluded that the Internet would become a major distribution channel for used cars, but that customers would still want to see new cars, to touch them, to test-drive them. In actuality, at least so far, most used cars are still being bought not over the Internet but in a dealer's lot. However, as many as half of all new cars sold (excluding luxury cars) may now actually be "bought" over the Internet. Dealers only deliver cars that customers have chosen well before they enter the dealership. What does this mean for the future of the local automobile dealership, the twentieth century's most profitable small business?

Another example: Traders in the American stock-market boom of 1998 and 1999 increasingly buy and sell online. But investors seem to be shifting away from buying electronically. The major U.S. investment vehicle is mutual funds. And whereas almost half of all mutual funds a few years ago were bought electronically, it is estimated that the figure will drop to 35 percent next year and to 20 percent by 2005. This is the opposite of what "everybody expected" ten or fifteen years ago.

The fastest-growing e-commerce in the United States is in an area where there was no "commerce" until now—in jobs for professionals and managers. Almost half of the world's largest companies now recruit through Web sites, and some two and a half million managerial and professional people (two thirds of them not even engineers or computer professionals) have their résumés on the Internet and solicit job offers over it. The result is a completely new labor market.

This illustrates another important effect of e-commerce. New distribution channels change who the customers are. They change not only how customers buy but also what they buy. They change consumer behavior, savings patterns, industry structure—in short, the entire economy. This is what is now happening, and not only in the United States but increasingly in the rest of the developed world, and in a good many emerging countries, including mainland China.

Luther, Machiavelli, and the Salmon

The railroad made the Industrial Revolution an accomplished fact. What had been revolution became establishment. And the boom it triggered lasted almost a hundred years. The technology of the steam engine did not end with the railroad. It led in the 1880s and 1890s to the steam turbine, and in the 1920s and 1930s to the last magnificent American steam locomotives, so beloved by railroad buffs. But the technology centered on the steam engine, and on manufacturing operations generally, ceased to be central. Instead the dynamics of the technology shifted to totally new industries that emerged almost immediately after the railroad was invented, not one of which had anything to do with steam or steam engines. The electric telegraph and photography were first, in the 1830s, followed soon thereafter by optics and farm equipment. The new and different fertilizer industry, which began in the late 1830s, in short order transformed agriculture. Public health became a major and central growth industry, with quarantine, vaccination, the supply of pure water, and sewers, which for the first time in history made the city a more healthful habitat than the countryside. At the same time came the first anesthetics.

With these major new technologies came major new social institutions: the modern postal service, the daily paper, investment banking, and commercial banking, to name just a few. Not one of them had much to do with the steam engine or with the technology of the Industrial Revolution in general. It was these new industries and institutions that by 1850 had come to dominate the industrial and economic landscape of the developed countries.

This is very similar to what happened in the printing revolution—the first of the technological revolutions that created the modern world. In the fifty years after 1455, when Gutenberg had perfected the printing press and movable type he had been working on for years, the printing revolution swept Europe and completely changed its economy and its psychology. But the books printed during the first fifty years, the ones called incunabula, contained largely the same texts that monks, in their scriptoria, had for centuries laboriously copied by hand: religious tracts and whatever remained of the writings of antiquity. Some 7,000 titles were published in those first fifty years, in 35,000 editions. At least 6,700 of these were traditional titles. In other words, in its first fifty years printing made available—and increasingly cheap—traditional information and communication products. But then, some sixty years after Gutenberg, came Luther's German Bible—thousands and thousands of copies sold almost immediately at an unbelievably low price. With Luther's Bible the new printing technology ushered in a new society. It ushered in Protestantism, which conquered half of Europe and, within another twenty years, forced the Catholic Church to reform itself in the other half. Luther used the new medium of print deliberately to restore religion to the center of individual life and of society. And this unleashed a century and a half of religious reform, religious revolt, religious wars.

At the very same time, however, that Luther used print with the avowed intention of restoring Christianity, Machiavelli wrote The Prince (1513), the first Western book in more than a thousand years that contained not one biblical quotation and no reference to the writers of antiquity. In no time at all The Prince became the "other best seller" of the sixteenth century, and its most notorious but also most influential book. In short order there was a wealth of purely secular works, what we today call literature: novels and books in science, history, politics, and, soon, economics. It was not long before the first purely secular art form arose, in England—the modern theater. Brand-new social institutions also arose: the Jesuit order, the Spanish infantry, the first modern navy, and, finally, the sovereign national state. In other words, the printing revolution followed the same trajectory as did the Industrial Revolution, which began 300 years later, and as does the Information Revolution today.

What the new industries and institutions will be, no one can say yet. No one in the 1520s anticipated secular literature, let alone the secular theater. No one in the 1820s anticipated the electric telegraph, or public health, or photography.

The one thing (to say it again) that is highly probable, if not nearly certain, is that the next twenty years will see the emergence of a number of new industries. At the same time, it is nearly certain that few of them will come out of information technology, the computer, data processing, or the Internet. This is indicated by all historical precedents. But it is true also of the new industries that are already rapidly emerging. Biotechnology, as mentioned, is already here. So is fish farming.

Twenty-five years ago salmon was a delicacy. The typical convention dinner gave a choice between chicken and beef. Today salmon is a commodity, and is the other choice on the convention menu. Most salmon today is not caught at sea or in a river but grown on a fish farm. The same is increasingly true of trout. Soon, apparently, it will be true of a number of other fish. Flounder, for instance, which is to seafood what pork is to meat, is just going into oceanic mass production. This will no doubt lead to the genetic development of new and different fish, just as the domestication of sheep, cows, and chickens led to the development of new breeds among them.

But probably a dozen or so technologies are at the stage where biotechnology was twenty-five years ago—that is, ready to emerge.

There is also a service waiting to be born: insurance against the risks of foreign-exchange exposure. Now that every business is part of the global economy, such insurance is as badly needed as was insurance against physical risks (fire, flood) in the early stages of the Industrial Revolution, when traditional insurance emerged. All the knowledge needed for foreign-exchange insurance is available; only the institution itself is still lacking.

The next two or three decades are likely to see even greater technological change than has occurred in the decades since the emergence of the computer, and also even greater change in industry structures, in the economic landscape, and probably in the social landscape as well.

The Gentleman Versus the Technologist

The new industries that emerged after the railroad owed little technologically to the steam engine or to the Industrial Revolution in general. They were not its "children after the flesh"—but they were its "children after the spirit." They were possible only because of the mind-set that the Industrial Revolution had created and the skills it had developed. This was a mind-set that accepted—indeed, eagerly welcomed—invention and innovation. It was a mind-set that accepted, and eagerly welcomed, new products and new services. It also created the social values that made possible the new industries. Above all, it created the "technologist." Social and financial success long eluded the first major American technologist, Eli Whitney, whose cotton gin, in 1793, was as central to the triumph of the Industrial Revolution as was the steam engine. But a generation later the technologist—still self-taught—had become the American folk hero and was both socially accepted and financially rewarded. Samuel Morse, the inventor of the telegraph, may have been the first example; Thomas Edison became the most prominent. In Europe the "businessman" long remained a social inferior, but the university-trained engineer had by 1830 or 1840 become a respected "professional."

By the 1850s England was losing its predominance and beginning to be overtaken as an industrial economy, first by the United States and then by Germany. It is generally accepted that neither economics nor technology was the major reason. The main cause was social. Economically, and especially financially, England remained the great power until the First World War. Technologically it held its own throughout the nineteenth century. Synthetic dyestuffs, the first products of the modern chemical industry, were invented in England, and so was the steam turbine. But England did not accept the technologist socially. He never became a "gentleman." The English built first-rate engineering schools in India but almost none at home. No other country so honored the "scientist"—and, indeed, Britain retained leadership in physics throughout the nineteenth century, from Michael Faraday and James Clerk Maxwell all the way to Ernest Rutherford. But the technologist remained a "tradesman." (Dickens, for instance, showed open contempt for the upstart ironmaster in his 1853 novel Bleak House.)

Nor did England develop the venture capitalist, who has the means and the mentality to finance the unexpected and unproved. A French invention, first portrayed in Balzac's monumental La Comédie humaine, in the 1840s, the venture capitalist was institutionalized in the United States by J. P. Morgan and, simultaneously, in Germany and Japan by the universal bank. But England, although it invented and developed the commercial bank to finance trade, had no institution to finance industry until two German refugees, S. G. Warburg and Henry Grunfeld, started an entrepreneurial bank in London, just before the Second World War.

Bribing the Knowledge Worker

What might be needed to prevent the United States from becoming the England of the twenty-first century? I am convinced that a drastic change in the social mind-set is required—just as leadership in the industrial economy after the railroad required the drastic change from "tradesman" to "technologist" or "engineer."

What we call the Information Revolution is actually a Knowledge Revolution. What has made it possible to routinize processes is not machinery; the computer is only the trigger. Software is the reorganization of traditional work, based on centuries of experience, through the application of knowledge and especially of systematic, logical analysis. The key is not electronics; it is cognitive science. This means that the key to maintaining leadership in the economy and the technology that are about to emerge is likely to be the social position of knowledge professionals and social acceptance of their values. For them to remain traditional "employees" and be treated as such would be tantamount to England's treating its technologists as tradesmen—and likely to have similar consequences.

Today, however, we are trying to straddle the fence—to maintain the traditional mind-set, in which capital is the key resource and the financier is the boss, while bribing knowledge workers to be content to remain employees by giving them bonuses and stock options. But this, if it can work at all, can work only as long as the emerging industries enjoy a stock-market boom, as the Internet companies have been doing. The next major industries are likely to behave far more like traditional industries—that is, to grow slowly, painfully, laboriously.

The early industries of the Industrial Revolution—cotton textiles, iron, the railroads—were boom industries that created millionaires overnight, like Balzac's venture bankers and like Dickens's ironmaster, who in a few years grew from a lowly domestic servant into a "captain of industry." The industries that emerged after 1830 also created millionaires. But they took twenty years to do so, and it was twenty years of hard work, of struggle, of disappointments and failures, of thrift. This is likely to be true of the industries that will emerge from now on. It is already true of biotechnology.

Bribing the knowledge workers on whom these industries depend will therefore simply not work. The key knowledge workers in these businesses will surely continue to expect to share financially in the fruits of their labor. But the financial fruits are likely to take much longer to ripen, if they ripen at all. And then, probably within ten years or so, running a business with (short-term) "shareholder value" as its first—if not its only—goal and justification will have become counterproductive. Increasingly, performance in these new knowledge-based industries will come to depend on running the institution so as to attract, hold, and motivate knowledge workers. When this can no longer be done by satisfying knowledge workers' greed, as we are now trying to do, it will have to be done by satisfying their values, and by giving them social recognition and social power. It will have to be done by turning them from subordinates into fellow executives, and from employees, however well paid, into partners.


His paranoid style paved the road for Trumpism. Now he fears what’s been unleashed.

Glenn Beck looks like the dad in a Disney movie. He’s earnest, geeky, pink, and slightly bulbous. His idea of salty language is bullcrap.

The atmosphere at Beck’s Mercury Studios, outside Dallas, is similarly soothing, provided you ignore the references to genocide and civilizational collapse. In October, when most commentators considered a Donald Trump presidency a remote possibility, I followed audience members onto the set of The Glenn Beck Program, which airs on Beck’s website, theblaze.com. On the way, we passed through a life-size replica of the Oval Office as it might look if inhabited by a President Beck, complete with a portrait of Ronald Reagan and a large Norman Rockwell print of a Boy Scout.

“Well, you’re just special. You’re American,” remarked my colleague, smirking from across the coffee table. My other Finnish coworkers, from the school in Helsinki where I teach, nodded in agreement. They had just finished critiquing one of my habits, and they could see that I was on the defensive.

I threw my hands up and snapped, “You’re accusing me of being too friendly? Is that really such a bad thing?”

“Well, when I greet a colleague, I keep track,” she retorted, “so I don’t greet them again during the day!” Another chimed in, “That’s the same for me, too!”

Unbelievable, I thought. According to them, I’m too generous with my hellos.

When I told them I would do my best to greet them just once every day, they told me not to change my ways. They said they understood me. But the thing is, now that I’ve viewed myself from their perspective, I’m not sure I want to remain the same. Change isn’t a bad thing. And since moving to Finland two years ago, I’ve kicked a few bad American habits.

Why the ingrained expectation that women should desire to become parents is unhealthy

In 2008, Nebraska decriminalized child abandonment. The move was part of a "safe haven" law designed to address increased rates of infanticide in the state. Like other safe-haven laws, parents in Nebraska who felt unprepared to care for their babies could drop them off in a designated location without fear of arrest and prosecution. But legislators made a major logistical error: They failed to implement an age limitation for dropped-off children.

Within just weeks of the law passing, parents started dropping off their kids. But here's the rub: None of them were infants. A couple of months in, 36 children had been left in state hospitals and police stations. Twenty-two of the children were over 13 years old. A 51-year-old grandmother dropped off a 12-year-old boy. One father dropped off his entire family -- nine children from ages one to 17. Others drove from neighboring states to drop off their children once they heard that they could abandon them without repercussion.

Why did Trump’s choice for national-security advisor perform so well in the war on terror, only to find himself forced out of the Defense Intelligence Agency?

How does a man like retired Lieutenant General Mike Flynn—who spent his life sifting through information and parsing reports, separating rumor and innuendo from actionable intelligence—come to promote conspiracy theories on social media?

Perhaps it’s less Flynn who’s changed than that the circumstances in which he finds himself—thriving in some roles, and flailing in others.

In diagnostic testing, there’s a basic distinction between sensitivity, or the ability to identify positive results, and specificity, the ability to exclude negative ones. A test with high specificity may avoid generating false positives, but at the price of missing many diagnoses. One with high sensitivity may catch those tricky diagnoses, but also generate false positives along the way. Some people seem to sift through information with high sensitivity, but low specificity—spotting connections that others can’t, and perhaps some that aren’t even there.

The president-elect has chosen Andrew Puzder, a vocal critic of minimum-wage hikes and new overtime rules.

Updated on December 9, 2016

President-Elect Donald Trump announced Thursday evening that he picked Andrew Puzder, the CEO of CKE Restaurants, which owns fast-food chains Carl’s Jr. and Hardee’s, to lead the U.S. Department of Labor. Puzder—like several of Trump’s other nominees—is a multi-millionaire and Washington outsider who served as an adviser and fundraiser during the presidential campaign. While there’s no political record to indicate how Puzder thinks about the labor market, his remarks as a business executive give some indication of the stances he’ll take on several important labor issues.

If confirmed, Puzder will likely take a pro-business, anti-labor, approach to steering the federal agency tasked with protecting American workers and their jobs, which clashes with Trump’s populist campaign message of fighting for blue-collar workers. Puzder has been a vocal defender of Trump’s economic policies, including lowering the corporate-tax rate, and has opposed Obamacare and certain business regulations, such as a higher minimum wage. Puzder has argued against raising the minimum wage and offering paid leave and health insurance to employees. Efforts to increase the minimum wage, he writes, will hurt everyone, especially low-skilled workers, because “businesses will have to figure out the best way to deal with the high labor costs.” Those changes, he says, will lead to price increases, more efficient labor management, and automation.

Since the end of World War II, the most crucial underpinning of freedom in the world has been the vigor of the advanced liberal democracies and the alliances that bound them together. Through the Cold War, the key multilateral anchors were NATO, the expanding European Union, and the U.S.-Japan security alliance. With the end of the Cold War and the expansion of NATO and the EU to virtually all of Central and Eastern Europe, liberal democracy seemed ascendant and secure as never before in history.

Under the shrewd and relentless assault of a resurgent Russian authoritarian state, all of this has come under strain with a speed and scope that few in the West have fully comprehended, and that puts the future of liberal democracy in the world squarely where Vladimir Putin wants it: in doubt and on the defensive.

Democrats who have struggled for years to sell the public on the Affordable Care Act are now confronting a far more urgent task: mobilizing a political coalition to save it.

Even as the party reels from last month’s election defeat, members of Congress, operatives, and liberal allies have turned to plotting a campaign against repealing the law that, they hope, will rival the Tea Party uprising of 2009 that nearly scuttled its passage in the first place. A group of progressive advocacy groups will announce on Friday a coordinated effort to protect the beneficiaries of the Affordable Care Act and stop Republicans from repealing the law without first identifying a plan to replace it.

They don’t have much time to fight back. Republicans on Capitol Hill plan to set repeal of Obamacare in motion as soon as the new Congress opens in January, and both the House and Senate could vote to wind down the law immediately after President-elect Donald Trump takes the oath of office on the 20th.

Trinidad has the highest rate of Islamic State recruitment in the Western hemisphere. How did this happen?

This summer, the so-called Islamic State published issue 15 of its online magazine Dabiq. In what has become a standard feature, it ran an interview with an ISIS foreign fighter. “When I was around twenty years old I would come to accept the religion of truth, Islam,” said Abu Sa’d at-Trinidadi, recalling how he had turned away from the Christian faith he was born into.

At-Trinidadi, as his nom de guerre suggests, is from the Caribbean island of Trinidad and Tobago (T&T), a country more readily associated with calypso and carnival than the “caliphate.” Asked if he had a message for “the Muslims of Trinidad,” he condemned his co-religionists at home for remaining in “a place where you have no honor and are forced to live in humiliation, subjugated by the disbelievers.” More chillingly, he urged Muslims in T&T to wage jihad against their fellow citizens: “Terrify the disbelievers in their own homes and make their streets run with their blood.”