This is the basic problem with the Chris Anderson-anchored Wired cover story, "The Web is Dead." If you think about technology as a series of waves, each displacing the last, perhaps the rise of mobile apps would lead you to conclude that the browser-based web is a goner.

But the browser-based web is not a goner. It's still experiencing substantial growth -- as BoingBoing's Rob Beschizza showed with his excellent recasting of Wired's data -- and that should be one big clue that the technological worldview that says, "The new inevitably destroys the old," is fundamentally flawed.

My objection is not to the idea that the web could become relatively less important at some point in the future. That could happen, sure. And maybe magazines will end up making (a big chunk of) their money through closed-system apps instead of on the wild-and-woolly Internet. We don't have much evidence to support that thesis yet, but we know the web is a tough place to do business, so maybe apps will end up being how a very particular kind of content gets packaged. That would certainly make Conde Nast and Wired (if not Wired.com, where I used to work) happy.

The problem is Anderson's assumption about the way technology works. Serious technology scholars long ago discarded the idea that tech was just a series of increasingly awesomer things that successively displace each other. Australian historian Carroll Pursell, in reviewing Imperial College London professor David Edgerton's The Shock of the Old, summarized the academic thinking nicely:

An obsession with 'innovation' leads to a tidy timeline of progress, focusing on iconic machines, but an investigation of 'technology in use' reveals that some 'things' appear, disappear, and reappear...

Edgerton has the same flair for the flashy stat that Anderson does. For example, to illustrate the point that newer and older technologies happily coexist, he notes that the Germans used more horses in World War II than the British did in World War I. More prosaically, some of the electricity for your latest gadget was probably made in a power plant that's decades old. Many ways to bind pieces of paper -- staplers, binders, paper clips, etc -- remain in common usage ("The Paperclip Is Dead!"). World War I pilots used to keep homing pigeons tucked inside their cockpits as communication tools. People piloting drones and helicopters fight wars against people who use machetes and forty-year-old Soviet machine guns; all these tools can kill effectively, and they all exist right now together.

But that's not how Anderson presents technology in this article. Instead, technologies rise up and destroy each other. And there's nothing you or I can do to change the course of these wars. This is the nature of technology and capitalism, and there is not much room for individual decision-making or social influence in the algorithm.

"This was all inevitable. It is the cycle of capitalism. The story of industrial revolutions, after all, is a story of battles over control," Chris Anderson writes.
"A technology is invented, it spreads, a thousand flowers bloom, and
then someone finds a way to own it, locking out others. It happens every
time."

He mentions that the electric power industry consolidated in this way, but doesn't mention that the US government encouraged and protected the oligopoly as industry fought public power companies tooth and nail. Or that other countries do things differently, and the structure of their power industries [pdf] reflects that. Or that in states like California, smaller independent power producers have been the ones building the plants, thanks to regulatory changes. Or that in the future, it's possible that smaller-scale, lower-carbon energy sources will generate increasing amounts of power.

I wonder how many historians of technology would agree with him. It sure seems suspiciously like a "tidy timeline of progress," tinged with a little libertarian cynicism. I don't think that scholars represented in journals like Technology and Culture and by Edgerton, Pursell, David Nye, Thomas Hughes, and Eric Schatzberg would agree that these things happen "every time." Too much scholarship has shown that technologies and systems are (messily) shaped by social movements, events, governments, political ideas, and freak accidents. The kind of logic that says, "This was all inevitable," is impossible to sustain with that data in your hands.

From the vantage point of the present, it may seem that technologies are deterministic. But this view is incorrect, no matter how plausible it may seem. Cultures select and shape technologies, not the other way around, and some societies have rejected or ignored even the gun or the wheel. For millennia, technology has been an essential part of the framework for imagining and moving into the future, but the specific technologies chosen have varied. As the variety of human cultures attests, there have always been multiple possibilities, and there seems no reason to accept a single vision of the future.

In the details of the history, we see all the possibilities for other futures. We see the dead-ends and the false predictions, all the "inevitabilities" that never came to pass. We see the variety of systems that have existed in different places and similar ones that have existed at different times.

This is the fundamental value of having a historical sense about technology. It leads you away from making grand sweeping statements about how things must go. In July's Technology and Culture, Leo Marx traced the rise of the word 'technology' as a way of understanding what technology has come to mean in modern society. He pinpoints exactly what makes the Andersonian worldview so compelling -- and so fraught with peril.

We have made [technology] an all-purpose agent of change. As compared with other means of reaching our social goals, the technological has come to seem the most feasible, practical, and economically viable. It relieves the citizenry of onerous decision-making obligations and intensifies their gathering sense of political impotence. The popular belief in technology as a--if not the--primary force shaping the future is matched by our increasing reliance on instrumental standards of judgment, and a corresponding neglect of moral and political standards, in making judgments about the direction of society.

If something is inevitable, if technologies want things, if destruction must occur, then there is no use in trying to preserve the things about our lives that we love. Technology (capital T) is just going to bulldoze them, no matter what.

"The delirious chaos of the open Web was an adolescent phase subsidized
by industrial giants groping their way in a new world," Anderson concludes. "Now they're doing
what industrialists do best -- finding choke points. And by the looks of
it, we're loving it."

But what if you don't? What if you love the open, appless web? Too bad! You're on the wrong side of the future, buddy.

But there is no such thing as the wrong side of the future. We collectively choose the world that we want, not just as consumers, but as people who have and promote ideas.

And the great irony is that with this article, Anderson has done a masterful job of showing exactly how and why human beings try to shape the technological narrative of their worlds. We make arguments for personal and intellectual reasons based on our experience, desires, and ideological leanings.

Anderson doesn't work on, or believe in, the economics of content on the web, and so while he's making his case against the web generally, he's also making the specific point that print and tablet editions of Wired make sense, but its website (which he doesn't edit) does not.

That's certainly an argument that can be made, but it's impossible not to notice -- if you worked at Wired.com like I did -- that Anderson's inevitable technological path happens to run perfectly through the domains (print/tablet) he controls at Wired, and away from the one that he doesn't.
