This is my last post for the Urban Planet series, and in a way I feel as if we’re just getting started. For me, thinking about cities — particularly at a time when they are so central to the future of the planet — is an endlessly fascinating topic, in part because cities are such hotbeds of innovation. Just in the past few years, we’ve seen some extraordinary developments in the design and technology of city life: New York’s 311 service; London’s congestion pricing; eco-city planning in China; and a long list of “big urbanist” projects, in New York and elsewhere, that combine the ambition of Robert Moses with the sidewalk nuances of Jane Jacobs.

Serious problems still exist, and not just in the megacities of the developing world. (See the portrait of Baltimore on HBO’s “The Wire,” a city narrative as complex and compelling as anything Dickens ever wrote.) But we have largely shed the sense of malaise and obsolescence that dominated so much of the public discourse about cities in the 1970s and ’80s.

The transformation of New York City over the last three decades may be the most dramatic example of recent urban renewal. Think of the city circa 1975: teetering on the brink of insolvency, overrun with crime and garbage, with whole neighborhoods emptying out. Today, it’s as clean and safe as it has been since the ’50s; the population is growing; there is remarkable diversity with very little ethnic or religious conflict; and dozens of major new parks and public spaces are either being planned or built. (Arguably the city’s biggest problem now is that it has become too desirable as a place to live, resulting in runaway real estate prices.) To have achieved so much while simultaneously weathering the worst terrorist attack in the history of the country is truly an amazing success story, one we would do well to remind ourselves of more frequently.

I think of that success every time I walk around my neighborhood in Brooklyn around drop-off or pick-up time at school, and see the overwhelming number of strollers clogging the sidewalks. Park Slope is notorious for its stroller traffic jams, as well as its slightly manic parents, but whenever I see all those young children on the sidewalk, I think of how many parents have opted to buck the trends of the past 50 years and raise their families in urban neighborhoods.

They know they could buy a McMansion in the suburbs for what they’re paying for a floor-through here, and they know they could have a real backyard. And yet they’ve decided to stay all the same, for the camaraderie and energy and diversity of city neighborhoods. These are virtues we were close to giving up on 30 years ago. That they are ascendant again is good news for all of us.

I began these posts with a look back at the squalor and terror of London 150 years ago, a city literally drowning in its own filth, ravaged by disease, and haunted by a scavenger class living off the refuse of the city — a group so large in number that had they broken off and formed their own city, it would have been the fifth largest in England. My trip to London last week brought yet another reminder of the immense progress the city — like most other cities in the developed world — has made in a relatively short time. The air and water are far cleaner; the killer epidemics of the Victorian age have been vanquished; life expectancies have doubled; and overall standards of living are significantly higher than they were in the 1850s.

But something else has changed since then. London was the largest city on the planet back in 1854, but now it is on the smaller side, as world cities go. (It ranks in the mid-teens, depending on how you define its borders.) Many of the cities that now top the charts — Mexico City, Sao Paulo, Mumbai — have as much in common with Victorian London as they do with the modern version. London’s hundred thousand scavengers are a mere footnote compared to the massive shantytowns that have exploded at the margins of today’s megacities.

These squatter communities have been built on land that is, technically speaking, illegally occupied — without official title deeds, electricity, running water or waste removal systems. (Underground economies providing all these services have started to develop, however.) In such places, the waterborne diseases — including cholera — that plagued the Victorians are still rampant, thanks to miserable public health and sanitation resources.

The squatters worldwide now add up to a billion people, and some experts project that by 2030, a quarter of the world’s population will live in shantytowns. This is not entirely reason for despair. As the writer Robert Neuwirth argues in his extraordinary book “Shadow Cities,” shantytowns are places of dynamic economic innovation and creativity. Some of the oldest ones — the Rocinha area in Rio de Janeiro, Squatter Colony in Mumbai — have already matured into fully functioning urban areas with most of the comforts we’ve come to expect in the developed world. Improvised wood shacks have given way to steel and concrete, electricity, running water, even cable television.

On Monday, in London, I had a public conversation with the musician and artist Brian Eno on the past and future of cities. The event was sponsored by The Long Now Foundation, which Eno helped create several years ago. Long Now is a wonderful — and wonderfully ambitious — organization that aims, according to its Web site, “to provide counterpoint to today’s ‘faster/cheaper’ mind set and promote ‘slower/better’ thinking [and] to creatively foster responsibility in the framework of the next 10,000 years.”

Cities are a natural subject for long now thinking. They are among the few products of human culture that reliably last for centuries, sometimes millennia. London itself is a kind of supra-organism that has been evolving for more than a thousand years. And cities play a crucial role in some of the most pressing problems facing our planet over the next few centuries. Perhaps the most important of these is the population explosion we’ve been worrying about since Thomas Malthus’s “Essay on the Principle of Population” more than 200 years ago.

Many people are under the impression that global population growth is still on a runaway, unstoppable course that will inevitably cause human needs to outstrip the planet’s natural food supply. This dire projection was most famously outlined in Paul R. Ehrlich’s 1968 bestseller, “The Population Bomb,” which forecast mass famines due to overpopulation. Ehrlich’s predictions proved wrong on a number of counts (global food production, for one, increased at a much sharper rate than he imagined). But the most dramatic deviation from his doomsday scenario has only recently become apparent. Demographers now believe that the earth’s population will peak in the middle of this century, at somewhere around 8 billion people, and then start decreasing.

Cities turn out to be a driving force behind that correction. Keep in mind that we have shifted, in just two centuries, from a planet where 3 percent of people lived in cities to a planet where 50 percent of us do. When people move to cities, they have significantly fewer children for several reasons: women tend to work, birth control is more readily available, and space limitations make children an economic liability. (In rural life, in contrast, children who help with farm labor are an economic asset.) Nations like Italy, whose populations have long settled in cities, have seen their birth rates drop below replacement levels. And in recent decades, even the developing world has seen a dramatic decrease.

In my last post I asked: if cities inevitably create opportunities for “asymmetric warfare,” what can we do about it? There are good reasons, as I’ve discussed over the past few weeks, to preserve urban density. But that density creates an opportunity for terrorism; it’s not an accident that the most high-profile European and American attacks of the past few years have all targeted emblems of metropolitan living: subways, commuter trains and office buildings. So what are our options?

The first approach is the easiest to formulate and the most difficult to execute: create a global environment that eliminates, as much as possible, terrorists and the conditions that encourage their development. I realize this is impossibly vague and includes everything from ending the war in Iraq to the much more nuanced — and I think ultimately productive — diplomatic approach advocated by people like Robert Wright of Slate magazine. I don’t want to get into the details of such proposals, since our focus here is on cities. But let it be admitted that if we can figure out a way to reduce the overall supply of terrorists, that will greatly help the situation in our cities.

The second option is to return cities to one of their original functions, to serve as fortresses. One of the advantages of dense city cores — from the point of view of fighting terrorism — is that they take up a relatively small amount of space and have a finite number of inroads. It is functionally impossible to patrol the borders of an entire nation, but city limits are a much more manageable proposition. (As an island, Manhattan is potentially the ultimate fortress city, which is why it worked so well as a prison in “Escape From New York.”) Obviously we wouldn’t want to try to transform our city peripheries into oversized airport-security lines, or build medieval-style walls, but we could take a cue from London’s congestion-pricing system, which uses a network of cameras to track automobiles entering the city center, and use advanced technology to create invisible barricades.

The most troubling terrorist threat involves radioactive material — either in the form of dirty bombs or actual nuclear devices — which emits gamma rays that can be detected yards away from the original source by high-tech Geiger counters. (I wrote about this concept several years ago in a piece for Wired magazine.) A virtual wall made of radiation sensors ringing a major metropolitan area, effectively invisible to ordinary commuters, could likely be built for a small fraction of the money that the military now spends developing new nuclear weapons.

Two comments from readers – responding to my first two columns – nicely lead us to one of the darker problems on our urban planet: the threat of terrorism.

Yoshi wrote: “How do urban areas ‘cultivate’ terrorism? This seems like a dangerous, possibly reactionary conclusion if left unsubstantiated.”

And Bill wrote that “although true that terrorists target cities, it is not simply because they are icons of American culture. It also happens to be where a concentration of people are to attack.”

Yoshi has a point. “Cultivate” suggests that cities play an active role in fomenting terrorism, which isn’t quite what I meant to say. A better way to say it would be: cities attract terrorism; they create an environment where terrorism grows more powerful. Why? Partly because large cities take on a certain iconic status in the geopolitical imagination – and because they contain structures, like the twin towers, that literally are icons of global power. Destroying an icon gets you a wider audience, thereby increasing your ability to terrorize.

But as Bill suggests, cities also supply something else that’s crucial to the terrorist: larger body counts. Ever since Sept. 11, we have heard a great deal about the threats of “asymmetric warfare,” in which small groups or individuals can have a disproportionate impact on large nations, thanks to increasingly accessible military technology and the amplified networks of modern media. The increased density of cities plays an important role in allowing that asymmetry. Of all the things exploited by Al Qaeda on Sept. 11, the one that caused the most loss of life was the complex of technologies that enabled 50,000 people to crowd into two buildings.

Even if you could somehow time-travel back to the mid-19th century with two Boeing 7-series aircraft, you’d be hard pressed to find a place anywhere on the planet where you could kill 2,500 people by deliberately crashing the planes. But thanks to the mass migration to cities that we’ve seen over the past 150 years, there is now a functionally limitless supply of targets around the world where such a concentration of victims can be found. If terrorists could manage to get hold of a suitcase nuke, instead of an airplane, they could easily kill a million people. A death toll of that magnitude might well derail the urbanization trend of the past two centuries, at least for a decade or two. Several urban detonations might call into question the underlying premise of metropolitan living altogether.

In the 1800s, the dense settlements of industrializing cities provided an environment where the bacterium that causes cholera could prosper – and kill – with an unprecedented force. Today’s cities offer a comparable opportunity for terrorists, an opportunity that is also predicated on population density. So the question is: what can we do about it? I’ll explain some of my thinking on this question in my next post, but in the meantime, I’d like to open the floor for your thoughts on potential solutions.

Earlier this month, Thomas Friedman began his column in The New York Times with a story about being chauffeured from Paris’s Charles de Gaulle Airport by a young, French-speaking African driver who chatted on his mobile phone the entire trip, while simultaneously watching a movie on the dashboard. Friedman, for his part, was writing a column on his laptop and listening to Stevie Nicks on his iPod.

Friedman wrote, “There was only one thing we never did: Talk to each other. . . . I relate all this because it illustrates something I’ve been feeling more and more lately – that technology is dividing us as much as uniting us. Yes, technology can make the far feel near. But it can also make the near feel very far.”

This is the lament of iPod Nation: we’ve built elaborate tools to connect us to our friends – and introduce us to strangers – who are spread across the planet, and at the same time, we’ve embraced technologies that help us block out the people we share physical space with, technologies that give us the warm cocoon of the personalized soundtrack. We wear white earbuds that announce to the world: whatever you’ve got to say, I can’t hear it.

Cities stand to suffer disproportionately from these trends, since cities historically have produced public spaces where diverse perspectives can engage with each other – on sidewalks and subways, in bars and, yes, in taxicabs. Thirty years ago, the typical suburban commuter driving solo to work was already listening to his own private soundtrack on the car radio. (If anything, cell phones have made car-centric communities more social.) But for the classic vision of sidewalk urbanism articulated by Jane Jacobs, the activist and author, the bubble of permanent connectivity poses a real threat. There can be no Speaker’s Corner if everyone’s listening to his own private podcast.

I take these threats seriously, but let me suggest two reasons I am a bit less worried than Friedman is about the social disconnection of the connected age. One has to do with the past, the other the future.

The problem with all the talk about Americans living in a country divided between red and blue states lies not in the idea of division itself. The problem lies in framing the whole issue around states. We do live in a divided nation, but states are not the organizing principle we should use in thinking about the split. We are divided between the blue city and the red country.

Consider the breakdown of the past election: rural areas voted Republican by a small margin. The suburbs were evenly divided between the two parties. But 70 percent of Americans living in cities with more than 500,000 people voted for Democrats. Blue states are generally not blue because Democratic voters are evenly distributed statewide. They’re blue because their population base is concentrated in cities. James Carville once observed that, demographically speaking, Pennsylvania was Pittsburgh and Philadelphia with Alabama between them. We think of voting blocs in terms of states because that’s how the electoral map is drawn. But as Carville’s line suggests, voters in Pittsburgh have much more in common politically with voters in San Francisco than they do with their fellow Pennsylvanians living in rural areas.

Surrendering the big cities to the Democrats was a sensible strategy in the 1970s and ’80s, when most urban areas seemed headed for a permanent decline. But thanks to the revitalization of many urban cores and the rising tide of immigration, many American cities are growing again. More importantly for national politics, the battleground states of the American West – places like New Mexico or Idaho – are shifting from largely rural zones to states with growing, ethnically diverse urban centers. With the city-centric states of the coasts overwhelmingly Democratic, the urbanization of the mountain states does not bode well for the Republicans in the long run.

One of the reasons the Republicans have so thoroughly lost the urban vote is that they have spent the last 30 years demonizing the culture of big cities – from Reagan’s welfare queens to the recent scaremongering about San Franciscan Nancy Pelosi becoming speaker of the House. City dwellers, we’re told, are not part of “real America.” No doubt this division made more sense in the early days of the Republic, when the U.S. was more than 90 percent rural. But today, only 20 percent of Americans live in rural areas. And whatever you think about the culture of urban life, it is an undeniable fact that the big cities are footing the bill for the residents of so-called “real America.” Blue states consistently pay more in taxes than they receive in federal assistance; the opposite is true for the red states. Why? Because cities like New York or Los Angeles or San Francisco, despite their welfare queens, are tremendous engines of wealth creation. The right wing might still evoke gay marriage and beatniks when it slurs the “radical” Bay Area, but in terms of tax revenues – not to mention global brands – Apple and Google are much more representative of Bay Area values.

In late August of 1854, in London’s crowded working-class neighborhood of Soho, a 5-month-old girl fell ill with cholera, and unleashed a chain of events that ultimately helped shape the world we live in today. The girl — known only as “Baby Lewis” — lived with her parents, Sarah and Thomas Lewis, at 40 Broad Street, across from a public water pump known throughout Soho for its reliably clean and cool water. When Sarah Lewis emptied out the water she had used to clean her child’s soiled linens, a small amount of that waste found its way into the well beneath the Broad Street pump, thanks to decaying brickwork that separated the well from the cesspool in the Lewises’ basement.

Within 36 hours, one of the most explosive outbreaks of cholera in the history of London erupted throughout the neighborhood. By the end, some two weeks later, 10 percent of the Lewises’ neighbors were dead, and far more would have perished had so many residents not fled in terror.

Those fateful days in the late summer of 1854 are in many ways a reminder of the progress we’ve made since then, in the developed world at least, where cholera is no longer a threat to metropolitan centers, and where amazingly comprehensive systems providing public health services, waste management and clean water have been established to combat the threats that were visible that summer on Broad Street.

But the story of the Broad Street epidemic is not just an historical tale of urban terror and devastation, because the outbreak turned out to be crucial to solving the mystery of cholera itself. The dominant theory at the time held that cholera was a disease caused by the inhalation of poisoned air, a model known generally as the “miasma” theory of disease. This misguided belief — cholera is in fact transmitted by contaminated water — led to some tragically inept public health interventions, including the Nuisances Act of 1848, which emptied many of the city’s cesspools into the Thames, all in the name of combating the dangerous smells in the city streets.