Saturday, June 30, 2012

It seems there have been a lot of articles about the future of food lately. Inevitably, these articles in glossy magazines and on major media outlets' websites tout high-tech methods as the solution to producing enough food for the projected nine billion-plus people on the planet by 2050. Many of these ideas are already beyond the developmental stage.

In February of this year, as reported by Treehugger, an outfit called Plantagon broke ground on a "vertical farm" in Sweden, a multi-story "plantscraper" designed to grow plants in an urban environment. Vertical farms have become the darlings of the high-tech green movement. Essentially, proponents propose building towering concrete, steel and glass skyscrapers in dense urban areas around the world not to house people or businesses, but to grow plants.

Meanwhile in the Netherlands, scientists are growing meat in high-tech laboratories using cells from animals. Here's a description of the process:

It’s made by taking cells from pigs, adding horse fetal serum in petri dishes, and growing it into transparently-thin strips. Petri meat is “fed” a muckture of sugars, fats, amino acids, and other “nutrients”. The sources of these are, presumably, also made in labs. The color is gray, as there is no blood involved, and the texture…well, let’s not go there.

Since the color is gray, artificial coloring has to be added to make it look like "real" meat. According to one of the scientists, "In the beginning it will taste bland. I think we will need to work on the flavour." The cost of a single hamburger? Two hundred thousand pounds (311,200 dollars)! Here are more details:

Professor Post's group at Maastricht University in the Netherlands has grown small pieces of muscle about 2cm long, 1cm wide and about a mm thick. They are off-white and resemble strips of calamari in appearance. These strips will be mixed with blood and artificially grown fat to produce a hamburger by the autumn. The cost of producing the hamburger will be £200,000 but Professor Post says that once the principle has been demonstrated, production techniques will be improved and costs will come down.

This technique has the high-tech agriculturalists agog, and has been written up everywhere from Slate to Wired. In a similar vein, proponents tout genetically engineered salmon, nanotechnology, computer chips embedded to track plant growth, food pills, and robots to harvest food in a world already awash in surplus labor.

So these sophisticated high-tech and energy-intensive "solutions" are designed to do what man has been doing literally since the stone age using no fossil fuels or technology - grow edible plants and produce meat. And this is what the "experts" are touting as the future of food? This brings to mind only one question:

Have we lost our minds?

There's another way to grow meat. It's called an animal. There's a bright glowing ball in the sky that will provide all the light you need to grow plants for free. Want to grow them indoors? Try a greenhouse. Honestly, are these "innovations" when we're using expensive and complex technologies to do things that even the lowliest peasant could accomplish relatively easily since the Neolithic revolution? Really? Is this progress? For example, we need to turn to fish farming because we've toxified, polluted and overfished our oceans. We need to start businesses and design artificial systems to produce what nature used to give us for free. Is this something to get excited about? These ideas are like something out of a bad 1970s science fiction flick. What's next, Soylent Green?

Seriously, how technologically obsessed have we become as a society that these are the only solutions we can envision? These two examples are the best illustrations I can think of for what James Howard Kunstler calls techno-narcissism - the idea that we must solve every problem with new high-tech whiz-bang solutions dependent on large quantities of energy and sophisticated technological know-how to keep the status quo going, rather than asking if the status quo is a good idea in the first place. We refuse to consider overhauling or reforming our dysfunctional systems of economics, politics, business, agriculture, urban design, and pretty much everything else, preferring instead to invent new technology to prop up the current system at all costs rather than pursue simpler, less complex solutions that favor human needs, human scale, and the needs of the environment.

Are we surprised this takes place in a country where people drive their SUVs from work to the gym to spend an hour on the stationary bike? Where electric can openers are commonly found in the kitchen? Where office buildings are torn down to build parking garages? Where freeways are continually being expanded to alleviate congestion, only to cause more congestion? Where trees are chopped down to give better views of billboards? After all, this is the country that invented the solar-powered tanning salon.

Most of the intelligentsia got that way by climbing the ladder in today's technophilic modern economy, so their biases are toward the gee-whiz novelty factor of high-tech farming and meat grown in a lab (even if they've never met, much less talked to, an actual 'regular' farmer). These technical people know the latest Android model, but don't know the first thing about agriculture – their food comes from the supermarket (and is probably purchased via an iPhone app). If a machine were invented that scrubbed carbon from the atmosphere and turned it into useful food, construction materials, animal feedstocks, and fibers, all while rebuilding topsoil, it would be on the cover of every tech magazine in the world and its inventor would be a celebrity millionaire. Yet I've never seen a tree on the cover of Wired.

These technologies have inherent biases embedded in them – centralized control, specialized expertise, control of nature through artificial means, complexity, etc. Are not our problems with agriculture caused by these very biases? Why, then, do we assume that the solutions will come out of them? Perhaps it is the bias itself which is the problem, and inventing new technologies along the same lines will not fix anything but merely cause more problems, both environmental and social. But we cannot look beyond them, so locked are we into this world view, so unable are we to see alternatives outside of it. The fundamental problems with agriculture have been caused by an over-reliance on technology, so it is unreasonable to believe that they will be mitigated by still more of it.

What these ‘solutions’ have in common is that they require advanced technological expertise and lots of capital, as opposed to lower-tech alternatives, and this is why they are preferred. It is not that there is a lack of alternatives; rather, the alternatives are often low-tech, distributed, decentralized, work with natural processes, and do not require large amounts of capital or advanced technical knowledge. So when they say this is the future of food, this is the future that the multinational corporations, big finance and governments want for us; it is not better for us or the planet. And the centralized authorities will spend a lot of money, as in the Slate article, convincing us that this is the ‘answer’. The answer, of course, depends on the question you are asking. If the question is how to preserve our dysfunctional society, failing systems, wealth concentration and environmental degradation, then yes, perhaps these are the answers. But if we ask different questions, we come up with very different answers. Are these solutions really about solving the problem, or about keeping productive land and wealth concentrated in the hands of the few?

We tend to think of innovation solely in terms of technology. Concepts like Permaculture, agroforestry, biochar production, holistic management, food forests, planned rotational grazing and composting are not given the attention they deserve. Similarly, solutions to other problems, like denser urban communities, shorter working hours, or energy conservation, are not considered – not because they won't work, but because they do not fit the needs of the elites. I'd rather have forty million farmers than one vertical farm.

"I'm as concerned about food as the next guy — scratch that, I'm more concerned about food than the next guy — which is why I find it somewhat dismaying to see a serious and complicated set of issues turned into a sort of fetish. I really don't know what other word to use to describe the notion of spending "hundreds of millions" of dollars to build weird, poorly sited temples of food production in areas much better suited to dense, green residential and retail space. Brooklyn was once one of the most agriculturally productive regions in the United States. Manhattan was once home to innumerable factories. There's a reason that farms and factories decamped to more suitable locations. Using urban real estate in this manner is incredibly wasteful: bad for the economy and bad for the environment. Local food has its merits, but that's what New Jersey is for."

"Although the concept has provided opportunities for architecture students and others to create innovative, sometimes beautiful building designs, it holds little practical potential for providing food."

For obvious reasons, no one has ever proposed stacking solar photovoltaic panels one above the other. For the same reasons, crop fields cannot be layered one above the other without providing a substitute for the sunlight that has been cut off....As a result, the lion's share of a vertical farm's lighting would have to be supplied artificially, consuming resource-intensive electricity rather than free sunlight. This led us to wonder, "What would be the consequences of a vertical-farming effort large enough to allow us to remove from the landscape, say, the United States' 53 million acres of wheat?"...Our calculations, based on the efficiency of converting sunlight to plant matter, show that just to meet a year's U.S. wheat production with vertical farming would, for lighting alone, require eight times as much electricity as all U.S. utilities generate in an entire year.

The solution to soil and water degradation is not to strip food-producing plants from the landscape only to grow them, deprived of sunlight, in vertical factory farms. Instead, we have to address the Achilles heel of agriculture itself: that it has displaced, on a massive scale, diverse stands of natural perennial vegetation (such as prairies, savannahs, and forests) with monocultures of ephemeral, weakly rooted, soil-damaging annual crops such as corn, soybean, and wheat. So far, the weaknesses of the current food-production system have been compensated for agronomically through greater and greater inputs of fossil fuels and other resources, worsening the ecological impact; vertical farming would extend that trend.
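The electricity claim quoted above can be roughly reconstructed with a back-of-envelope calculation. Every constant below is my own assumed round number (2012-era ballpark figures for US wheat tonnage, the energy content of wheat, photosynthetic and lamp efficiencies, and US electricity generation), not a figure from the article:

```python
# Back-of-envelope check on lighting an all-indoor US wheat crop.
# All constants are assumed round numbers, not sourced figures.

US_WHEAT_KG = 60e9           # ~60 million tonnes of US wheat per year
FOOD_ENERGY_J_PER_KG = 15e6  # ~15 MJ/kg energy content of wheat
PHOTOSYNTHETIC_EFF = 0.02    # ~2% of delivered light ends up as biomass
LAMP_EFF = 0.35              # ~35% of electricity becomes usable light
US_GENERATION_TWH = 4100     # ~4,100 TWh of US electricity per year

J_PER_TWH = 3.6e15

# Light the plants must receive, then the electricity to produce it.
light_needed_j = US_WHEAT_KG * FOOD_ENERGY_J_PER_KG / PHOTOSYNTHETIC_EFF
electricity_twh = light_needed_j / LAMP_EFF / J_PER_TWH

print(f"Electricity for lighting: {electricity_twh:,.0f} TWh")
print(f"Multiple of US generation: {electricity_twh / US_GENERATION_TWH:.1f}x")
```

Under these assumed efficiencies the lighting bill comes out to just under nine times annual US generation, the same order of magnitude the authors report.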

[Vertical farm advocate Dickson] Despommier asserts that his system will require “no herbicides, pesticides, or fertilizers”. Perhaps he has never seen a fungal infestation in a greenhouse. And what does he expect the plants to grow on: water and air alone? He also insists that there will be “no need for fossil-fueled machinery”, which suggests that he intends to farm a 30-storey building without pumps, heating or cooling systems.

His idea, he says, is an antidote to “intensive industrial farming, carried out by an ever decreasing number of highly mechanized farming consortia” but then he calls on Cargill, Monsanto, Archer Daniels Midland and IBM to fund it. He suggests that “locally grown would become the norm”, but fails to explain why such businesses wouldn’t seek the most lucrative markets for their produce, regardless of locality. He expects, in other words, all the usual rules of business, economics, physics, chemistry and biology to be suspended to make way for his idea.

But the real issue is scarcely mentioned in his essays on the subject: light. Last week one of my readers, the film maker John Russell, sent me his calculations for the artificial lighting Despommier’s towers would require. ... They show that the light required to grow the 500 grammes of wheat that a loaf of bread contains would cost, at current prices, £9.82. (The current farm gate price for half a kilo of wheat is 6p.) That’s just lighting: no inputs, interest, rents, rates, or labour. Somehow this minor consideration – that plants need light to grow and that they aren’t going to get it except on the top storey – has been overlooked by the scheme’s supporters. I won’t bother to explain the environmental impacts.
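Monbiot's own figures make the economics easy to check: £9.82 of lighting against a 6p farm-gate price means artificial light alone costs more than 160 times what the entire half-kilo of wheat fetches in the field.

```python
# Figures quoted above: lighting cost vs. farm-gate price for the
# ~500 g of wheat in one loaf of bread (2012 UK prices).
lighting_cost_gbp = 9.82   # artificial light alone, per loaf's worth of wheat
farm_gate_gbp = 0.06       # what the same wheat sells for at the farm gate

ratio = lighting_cost_gbp / farm_gate_gbp
print(f"Lighting costs {ratio:.0f}x the farm-gate value of the wheat")
```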

And Lloyd Alter points out that: ... the [vertical farm] idea only makes sense if you think of farming as a no-holds-barred battle to the death and when you think of soil as nothing more than a mechanism to hold a plant up. Sami [Grover] has written that "there are more organisms in one teaspoon of soil than there have ever been humans on this planet." Others are trying to build biodynamic, organic, regenerative, or ecological farming communities, where food is grown naturally and is actually good for the soil instead of destroying it. It is a much more attractive and probably better tasting future of food.

As an architect, I have some idea of the difficulty of constructing large buildings; of the legions of experts, the engineering and design, the manpower, the planning, the staging, the legal restrictions, the complex funding instruments and revenue streams. These buildings, as works of civil engineering, are among the largest undertakings we attempt as a species. The advanced systems of construction, erection, and waterproofing are all based on fossil fuels. The creation and application of materials for skyscrapers are based on fossil fuels. The maintenance is based on fossil fuels. The systems that make them accessible and inhabitable are based on fossil fuels. And yet these are touted as a solution to fossil fuel use in agriculture?

Buildings require extensive maintenance. If future revenues for maintenance do not materialize, they fall apart, something that is already happening even to newer buildings in debt-ridden municipalities all over the country. Planning and maintenance must be taken into account, as must future expansion. This alternative proposal claims to make vertical farming “work” from an energy standpoint, but only at the cost of an extremely complex, interconnected, highly technical system requiring delicate balancing and extensive maintenance, and vulnerable to breakdown. It aims to mimic nature by recycling resources as much as possible. Why not just use nature itself, instead of trying to duplicate it? This is a complex solution, not a simple one.

As for frankenmeat, it takes a lot of sophisticated technical expertise and access to advanced technology to grow meat in a laboratory. It does not require as much to feed and slaughter a pig or chicken. And one can always breed more pigs and chickens, whereas a meat-growing lab or vertical farm nurtures dependence. The problem is not meat eating. It is too many people and the way we have used our land.

Meat consumption is a complex issue and too involved to go into much detail here. But it is known that a truly natural form of organic agriculture cannot be devoid of animals, and that some regions are much better suited to animal pasturing than to annual crops. Proper use of animals in systems like planned rotational grazing has been shown to reverse desertification and build topsoil. For thousands of years it was known that farming could not be done without animals. Yet we rely on chemical fertilizers while crowding animals into concentrated animal feeding operations (CAFOs) that effectively torture living creatures, producing meat of lower quality as well as environmental destruction from effluents, antibiotic-resistant bacteria, and a host of other ills. If promoting resilience and reducing food miles is the answer, animals must be part of the solution. Joel Salatin’s diversified farm model provides abundant food sustainably, yet Salatin’s methods are considered impossible to implement because farming can only be "profitable" by farming millions of acres of monocrops with machines. As Sharon Astyk writes:

Without petroleum based and industrial fertilizers...there is no such thing as a viable agriculture without animal production. David Montgomery’s wonderful book _Dirt: The Erosion of Civilizations_ will give anyone a clear overview of the radical difference between societies that manure ground and those that didn’t – such an agriculture has a long and important history.

Are there places where cows are being raised right this minute that shouldn’t have cows on them? Absolutely. Are there places right now that are being tilled to grow corn and soybeans that shouldn’t be tilled? Absolutely. It is impossible to speak in general terms about what one should do with each piece of arable land – in fact, the difficult and emergent task is to do WHAT IS BEST on each one – best for the land, best for the people who depend on it, best for the wildlife and other creatures who inhabit it.

What are these “right” and “wrong” ways of producing both meat and plant foods? For me, they are most succinctly summed up in Aldo Leopold’s land ethic: “A thing is right when it tends to preserve the integrity, stability and beauty of the biotic community. It is wrong when it tends otherwise.” While studying agroecology at Prescott College in Arizona, I was convinced that if what you are trying to achieve with an “ethical” diet is the least destructive impact on life as a whole on this planet, then in some circumstances, like living among dry, scrubby grasslands in Arizona, eating meat, is, in fact, the most ethical thing you can do other than subsist on wild game, tepary beans and pinyon nuts. A well-managed, free-ranged cow is able to turn the sunlight captured by plants into condensed calories and protein with the aid of the microorganisms in its gut. Sun > diverse plants > cow > human. This in a larger ethical view looks much cleaner than the fossil-fuel-soaked scheme of tractor-tilled field > irrigated soy monoculture > tractor harvest > processing > tofu > shipping > human.

Yes, natural meat production is more resource intensive than growing plants, which are lower on the food chain. But it needs to be location-specific. No, it does not make sense to chop down the Amazon to grow beef cattle for McDonald's. Yes, CAFOs are an abomination. But it makes more sense to raise dairy cows than to grow lentils in Wisconsin, and it makes more sense to pasture bison than to grow soybeans in Nebraska. We should not resort to reflexive oversimplifications like "eat less meat" without taking into account what the best type of agriculture for a specific region is and how an integrated approach can be more sustainable in some cases. So yes, perhaps relying less on meat in our diets is part of the solution. But let's make sure we're growing our food in the right places, and integrating animals in an ecologically sustainable manner into our agriculture. Grasslands are enormous biomes that sequester carbon. They also stabilize soil (as the United States found out during the Dust Bowl). Grazing animals on these lands can be a boon. Meat eating doesn't have to be destructive. Shouldn't we fix this first before breaking out the test tubes? Integrating animals in the appropriate places in an environmentally sensitive manner is far more beneficial and less energy intensive than growing meat in a lab.

Perplexingly, vertical farming and lab-grown meat are seen as solutions to the problems with conventional farming, but isn’t the answer for correcting our misuse and abuse of the land to stop misusing and abusing the land, rather than building a skyscraper to grow plants or growing meat in a petri dish? How exactly do they ‘solve’ anything? Backyard gardens like those of the Victory Garden movement during the Second World War grew large amounts of food. The city of Paris grew most of its vegetables locally in the nineteenth century using a series of innovative techniques based in empty lots in the city. French Intensive Gardening used horse manure on raised beds, mulches, human labor, cold frames, and even bell-shaped glass jars to grow food almost year-round at a northern latitude. What about vacant lots? What about rooftop gardens? What about backyard chickens? What about corporate control over the food supply and land consolidation? Maybe we should stop building over our most productive farmland and think about why we build in the first place and how we inhabit the landscape. These are just a few of the alternatives to the high-tech solutions proposed by the technophilists. Many cities, even in the age of sprawl, are surrounded by productive farmland that is underutilized. Transporting food into the cities is hardly a crisis that needs to be solved with skyscrapers; bicycles, or even boats, would do in a pinch. How can we be proposing things like this when small farmers are still going bankrupt and throwing in the towel?

The other unstated assumption is that vertical farms are necessary to continue the global experiment of kicking farmers off the land and forcing them into overcrowded and unsanitary slums, to eke out a living making consumer goods for rich countries in sweatshops or by whatever irregular work they can find. Maybe we should reconsider an economic system relentlessly devoted to sidelining labor in the name of efficiency and driving down costs no matter the consequences. Perhaps it is this policy that is in need of reconsideration, rather than claiming that vertical farms and lab-grown meat are the answer. We’re breathlessly told that these are the only alternatives to looming disaster, without asking how we got to this disaster point in the first place. Perhaps we should question the underlying assumptions that brought us here before plowing ahead with new techno-fixes.

To be clear, I'm not against acquiring new knowledge. I think it's great that we know how to do these things, and I fully support continued research in these areas. Every bit of knowledge is a potential tool, whether we end up using it or not, and I'm glad scientists are working on these things. No, what I'm criticizing are our priorities. We constantly hear laments about how we need more engineers. But why don't we hear as much about needing more farmers? Maybe we should give money to small-scale farmers and soil scientists before investing it in plantscrapers and meat labs.

Friday, June 29, 2012

[...] Rousseau was not the kind of man with whom one would share one's picnic. He was the worst kind of hypochondriac – one who really is always ill – and that most dangerous of paranoiacs – one who really is persecuted. Even so, at the heart of an 18th-century Enlightenment devoted to reason and civilisation, this maverick intellectual spoke up for sentiment and nature. He was not, to be sure, as besotted by the notion of the noble savage as some have considered. But he was certainly a scourge of the idea of civilisation, which struck him for the most part as exploitative and corrupt.

In this, he was a notable precursor of Karl Marx. Private property, he wrote, brings war, poverty and class conflict in its wake. It converts "clever usurpation into inalienable right". Most social order is a fraud perpetrated by the rich on the poor to protect their privileges. The law, he considered, generally backs the strong over the weak; justice is largely a weapon of violence and domination, while culture, science, the arts and religion are harnessed to the task of preserving the status quo. The institution of the state has "bound new fetters on the poor and given new powers to the rich". For the benefit of a few ambitious men, he comments, "the human race has been subjected to labour, servitude and misery".

He was not, as it happens, opposed to private property as such. His outlook was that of the petty-bourgeois peasant, clinging to his hard-won independence in the face of power and privilege. He sometimes writes as though any form of dependence on others is despicable. Yet he was a radical egalitarian in an age when such thinkers were hard to find. Almost uniquely for his age, he also believed in the absolute sovereignty of the people. To bow to a law one did not have a hand in creating was a recipe for tyranny. Self-determination lay at the root of all ethics and politics. Human beings might misuse their freedom, but they were not truly human without it.

What would this giant of Geneva have thought of Europe 300 years on from his birth? He would no doubt have been appalled by the drastic shrinking of the public sphere. His greatest work, The Social Contract, speaks up for the rights of the citizenry in the teeth of private interests. He would also be struck by the way the democracy he cherished so dearly is under siege from corporate power and a manipulative media. Society, he taught, was a matter of common bonds, not just a commercial transaction. In true republican fashion, it was a place where men and women could flourish as ends in themselves, not as a set of devices for promoting their selfish interests.

The same, he thought, should be true of education. Rousseau ranks among the great educational theorists of the modern era, even if he was the last man to put in charge of a classroom. Young adults, he thought, should be allowed to develop their capabilities in their distinctive way. They should also delight in doing so as an end in itself. In the higher education systems of today's world, this outlandish idea is almost dead on its feet. It is nearly as alien as the notion that the purpose of education is to serve the empire. Universities are no longer educational in any sense of the word that Rousseau would have recognised. Instead, they have become unabashed instruments of capital. Confronted with this squalid betrayal, one imagines he would have felt sick and oppressed. As, indeed, he usually did.

Basing the entire economy around debt is starting to look like a bad idea:

Surprised local taxpayers from Stockton, Calif., to Scranton, Pa., are finding themselves obligated for parking garages, hockey arenas and other enterprises that can no longer pay their debts. Officials have signed them up unknowingly to backstop the bonds of independent authorities, the special bodies of government that run projects like toll roads and power plants.

The practice, meant to save governments money, has been gaining popularity without attracting much notice, and is creating problems for a small but growing number of cities.

Data from Thomson Reuters suggests that local taxpayers are backing so-called enterprise debt at five times the rate they did 10 years ago. The resulting municipal bonds are sometimes called “double barreled,” because they are backed by both the future revenue of a project and some sort of taxpayer backstop. The exact wording and mechanics can vary.

With many cities now preoccupied with other crushing costs — pension obligations, retiree health care, accumulated unpaid bills — a sudden call to honor a long-forgotten bond guarantee can be a bolt from the blue, precipitating a crisis. The obligations mostly lurk in the dark. State laws requiring voter pre-approval of bonds don’t generally apply to guarantees. Local governments typically don’t include them in their own financial statements or set aside reserves to honor them.

The Governmental Accounting Standards Board on Monday released a draft of a new standard that would require governments to disclose details of the guarantees they have issued for other entities’ debts. The board started working on the new rules last year, after seeing more and more little-known guarantees coming to light.

The board’s research also showed that some guarantees were very large. The State of Texas, for one, has guaranteed the debts of all the school districts in that state, to the tune of a total of $50 billion. Texas has set aside about $25 billion, however, which analysts consider adequate. A number of states have also guaranteed venture capital projects.

Scranton’s version of a debt crisis began when a local parking authority said it couldn’t make a bond payment coming due in June, calling on the city’s guarantee. The authority had issued bonds to finance parking garages that the city had used in a campaign to woo Hilton Hotels and Resorts to operate a conference center downtown.

Each time the authority issued more bonds, the city backed them with a powerful “full faith and credit” guarantee. But by 2008 the authority had $54 million in bonds outstanding, and was spending about 60 percent of its budget on debt service — so much that it could not cut parking rates to compete with cheaper parking lots nearby.

There’s not a big, identifiable problem that’s driving Stockton, Calif., into bankruptcy. That’s why other cities are worried about its example.

Many of the high-profile public-sector bankruptcies over the past few years were triggered by massive projects that went bad, like the $3 billion sewer bond problem in Jefferson County, Ala., or Harrisburg, Pa.’s big incinerator. That has allowed municipal mavens to argue that forecasts of waves of bankruptcies from the likes of analyst Meredith Whitney are overblown.

The central California valley town of Stockton, however, has no such signature failure. Its woes are the result of a combination of many different factors.

Sitting beyond what was once considered the outer edge of the Bay Area, Stockton has been particularly hard hit by foreclosures. Its retirement benefits have been on the generous side -- people who worked for the city for as little as a month could count on health coverage for life. It also borrowed money during boom times for some high-profile projects along its waterfront. The city made some poor decisions, but nothing hundreds of other places haven’t done.

Because its recipe for disaster was not unusual, many other California cities are nervous. City officials from Fresno to San Jose are using Stockton as a warning of what could happen if they aren’t able to trim pensions and make other changes to what has been standard operating procedure.

“Stockton’s predicament varies only in degree from the woes faced by many other local governments in the state that acted as if the good times would last forever,” editorialized the San Francisco Chronicle. “It may be the largest to face bankruptcy, but it likely won’t be the last.”

Thursday, June 28, 2012

Mainstream economists see the 2008–09 global economic recession and near-collapse of the international financial system as a bump in the road, albeit an unusually big one, before a return to growth as usual. Projections of economic growth, whether by the World Bank, Goldman Sachs, or Deutsche Bank, typically show the global economy expanding by roughly 3 percent a year. At this rate the 2010 economy would easily double in size by 2035. With these projections, economic growth in the decades ahead is more or less an extrapolation of the growth of recent decades.

But natural scientists see that as the world economy expanded some 20-fold over the last century, it has revealed a flaw—a flaw so serious that if it is not corrected it will spell the end of civilization as we know it. At some point, what had been excessive local demands on environmental systems when the economy was small became global in scope.

A study by a team of scientists led by Mathis Wackernagel aggregates the use of the earth’s natural assets, including carbon dioxide overload in the atmosphere, into a single indicator—the ecological footprint. The authors concluded that humanity’s collective demands first surpassed the earth’s regenerative capacity around 1980. By 2007, global demands on the earth’s natural systems exceeded sustainable yields by 50 percent. Stated otherwise, it would take 1.5 Earths to sustain our current consumption. If we use environmental indicators to evaluate our situation, then the global decline of the economy’s natural support systems—the environmental decline that will lead to economic decline and social collapse—is well under way.
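The arithmetic in the passage above is easy to verify: compounding at 3 percent a year from a 2010 baseline, the economy slightly more than doubles by 2035. A quick check (the 3 percent rate and the 2010–2035 window are taken from the passage; everything else is basic compound-growth math):

```python
import math

# Figures from the passage above: ~3% annual global growth, 2010 baseline.
RATE = 0.03

# Size of the 2010 economy by 2035, compounding annually.
growth_25yr = (1 + RATE) ** (2035 - 2010)
print(f"2035 economy vs. 2010: {growth_25yr:.2f}x")

# Doubling time at this rate (the "rule of 70" shortcut gives 70/3, about 23).
doubling_years = math.log(2) / math.log(1 + RATE)
print(f"Doubling time at 3%: {doubling_years:.1f} years")
```

So "easily double in size by 2035" is, if anything, slightly understated: at a steady 3 percent the doubling takes under 24 years.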

No civilization can destroy its resource base and survive:

The archeological record indicates that civilizational collapse does not come suddenly out of the blue. Archeologists analyzing earlier civilizations talk about a decline-and-collapse scenario. Economic and social collapse was almost always preceded by a period of environmental decline.

For past civilizations it was sometimes a single environmental trend that was primarily responsible for their decline. Sometimes it was multiple trends. For Sumer, rising salt concentrations in the soil, as a result of an environmental flaw in the design of their otherwise extraordinary irrigation system, led to a decline in wheat yields. The Sumerians then shifted to barley, a more salt-tolerant crop. But eventually barley yields also began to decline. The collapse of the civilization followed.

For the Mayans, it was deforestation and soil erosion. As more and more land was cleared for farming to support the expanding empire, soil erosion undermined the productivity of their tropical soils. A team of scientists from the National Aeronautics and Space Administration has noted that the extensive land clearing by the Mayans likely also altered the regional climate, reducing rainfall. In effect, the scientists suggest, it was the convergence of several environmental trends, some reinforcing others, that led to the food shortages that brought down the Mayan civilization.

Although we live in a highly urbanized, technologically advanced society, we are as dependent on the earth’s natural support systems as the Sumerians and Mayans were. If we continue with business as usual, civilizational collapse is no longer a matter of whether but when. We now have an economy that is destroying its natural support systems, one that has put us on a decline and collapse path.

...what is approaching completion over on London's South Bank is almost the perfect metaphor for how the capital is being transformed – for the worse. The skyscraper both encapsulates and extends the ways in which London is becoming more unequal and dangerously dependent on hot money.

Consider again the story of the Shard. This is a high-rise that has been imposed on London Bridge despite protests from residents, conservation groups and a warning from Unesco that it may compromise the world-heritage status of the nearby Tower of London. What's more, its owners and occupiers will have very little to do with the area, which for all its centrality is also home to some of the worst deprivation and unemployment in the entire city. The building is 95% owned by the government of Qatar and its developer, Irvine Sellar, talks of it as a "virtual town", comprising a five-star hotel and Michelin-starred restaurants.

It will also have 10 flats that are on sale for between £30m and £50m, from which on a clear day it will be easier to gaze out on to the North Sea, 44 miles away, than at the beetle-sized locals 65 floors below. "We won't really market these apartments," the PR man cheerily told me. "At this level of the market, there are probably only 25 to 50 possible buyers in the world. The agents will simply phone them up."

So one of London's most identifiable buildings will have almost nothing to do with the city itself. Even the office space rented out at the bottom is intended for hedge funds and financiers wanting more elbow room than they can afford in the City or Mayfair. The only working-class Londoners will presumably bus in at night from the outskirts to clean the bins. Otherwise, to all intents and purposes, this will be the Tower of the 1%.

Perhaps the most remarkable thing about the Shard is that it simply exemplifies a number of trends. First, it merely confirms how far the core of London is becoming, in industrial terms, a one-horse town. Finance, which began in the Square Mile, has now spread to Docklands to the east, to Mayfair in the west and now to the South Bank.

Second, it proves that buildings are no longer merely premises owned by businesses, but are now chips for investment. What's more those chips are increasingly owned by people who barely ever set foot in the country. A study from Cambridge University last year, Who Owns the City?, found that 52% of the City's offices are now in the hands of foreign investors – up from just 8% in 1980. What's more, foreigners are piling into London property at an ever-increasing rate, as they look for relatively safe havens from the global financial turmoil.

Architecture always reflects the society that builds it. Indeed a shard - a sharp, jutting, disjointed fragment cut off from a larger whole - is a perfect metaphor for our modern times. It looks aggressive - like it's piercing the sky. The constructivist feel, sheer glass cladding, and monumental scale reinforce these views. Its pyramidal shape echoes our social structure, much like its ancient antecedents.

Wednesday, June 27, 2012

... But the fact is, in all of these cases and many others, the over-activity of the Court, the Fed and the White House has less to do with a Masonic desire to reorder the constitutional status quo than the fact that one of the three vital organs of our nation is intellectually and morally dead.

Which branch? Let me give you a hint: It has 100 blow-dried millionaires living off our taxes in six-year increments, and another 435 cosmetically enhanced egotists whose overriding priority is ensuring they will win the right to another two years at the trough.

The black hole that is Congress has shut down, like a liver ravaged by years of abuse, and the rest of the body is doing all it can to sustain life.

Congress likes to hold hearings about threats to national security – al Qaeda, Iranian nukes, stumpy North Korean dictators, etc., etc. This is truly theater of the absurd at this stage.

No threat foreign or domestic can bring this nation to its knees as quickly as this dysfunctional legislative farce. Sometimes, military strategists are fond of saying, the best course of action is to do nothing.

This ain’t one of those times.

When a person is blinded, or loses their hearing, medical science tells us that this invariably leads to a sharpening of the other senses.

Something very similar is going on in Washington, and you may not like what you hear, see or smell from these hyper-active institutions, but the problem is clear: We must reform Congress, because its dysfunction is now a true national emergency. It forces activism and the bending of constitutional norms just to prevent disaster. It empowers those inside government who view the legislative branch’s abdication of its responsibility as an invitation to fill the vacuum.

If you don't like the Court meddling in health care or abortion, DEMAND that your representatives in Congress deal with the question. If you don't like the Fed unleashing QE3, demand that Congress does something about the anemia affecting our economy.

Vote. Them. Out.

Polls show most Americans abhor Congress. We should be sure to abhor the vacuum its irresponsible leadership has created, too.

How do we fix this? I’ve seen a few good templates – and I’ve suggested a few myself, in particular doing away with mid-term elections and synchronizing our voting to encourage maximum turnout. Former Texas Rep. Mickey Cantor has a good six-point plan for bringing Congress into the 20th Century (which would be progress even if the 21st remains too far a reach).

But the first step is to recognize that Congress and its 19th-century dictates have become the primary source of our national emergency. Where is the candidate willing to press this agenda?

In the 19th century, robber barons started their own private universities when they were not satisfied with those already available. But Leland Stanford never assumed his university should be run like his railroad empire. Andrew Carnegie did not design his institute in Pittsburgh to resemble his steel company. The University of Chicago, John D. Rockefeller’s dream come true, assumed neither his stern Baptist values nor his monopolistic strategies. That’s because for all their faults, Stanford, Carnegie, and Rockefeller knew what they didn’t know.

In the 21st century, robber barons try to usurp control of established public universities to impose their will via comical management jargon and massive application of ego and hubris. At least that’s what’s been happening at one of the oldest public universities in the United States—Thomas Jefferson’s dream come true, the University of Virginia.

On Thursday night, a hedge fund billionaire, self-styled intellectual, “radical moderate,” philanthropist, former Goldman Sachs partner, and general bon vivant named Peter Kiernan resigned abruptly from the foundation board of the Darden School of Business at the University of Virginia. He had embarrassed himself by writing an email claiming to have engineered the dismissal of the university president, Teresa Sullivan, ousted by a surprise vote a few days earlier.

The events at UVA raise important questions about the future of higher education, the soul of the academic project, and the way we fund important public services.

Kiernan, who earned his MBA at Darden and sent his children to the university, has been a longtime and generous supporter of both the business school and the College of Arts and Sciences, where I work as a professor. Earlier this year he published a book called—I am not making this up—Becoming China’s Bitch. It purports to guide America through its thorniest problems, from incarceration to education to foreign policy. The spectacle of a rich man telling us how to fix our country was irresistible to the New York Times, which ran a glowing profile of Kiernan and his book on Feb. 29.

At some point in recent American history, we started assuming that if people are rich enough, they must be experts in all things. That’s why we trust Mark Zuckerberg to save Newark schools and Bill Gates to rid the world of malaria. Expertise is so 20th century.

We Americans take these institutions for granted. We assume that private enterprise generates what is so casually called “innovation” all by itself. It does not. The Web browser you are using to read this essay was invented at the University of Illinois at Urbana-Champaign. The code that makes this page possible was invented at a publicly funded academic research center in Switzerland. That search engine you use many times a day, Google, was made possible by a grant from the National Science Foundation to support Stanford University. You didn’t get polio in your youth because of research done in the early 1950s at Case Western Reserve. California wine is better because of the University of California at Davis. Hollywood movies are better because of UCLA. And your milk was not spoiled this morning because of work done at the University of Wisconsin at Madison.

These things did not just happen because someone saw a market opportunity and investors and inventors rushed off to meet it. That’s what happens in business-school textbooks. In the real world, we roll along, healthy and strong, in the richest nation in the world because some very wise people decided decades ago to invest in institutions that serve no obvious short-term purpose. The results of the work we do can take decades to matter—if at all. Most of what we do fails. Some succeeds. The system is terribly inefficient. And it’s supposed to be that way.

We hear every day from higher-education pundits who can’t seem to express themselves in anything other than jargon and buzzwords that American higher education is “unsustainable.” No. It’s just not adequately sustained. There is a big difference. We could choose to invest in people. We could choose to invest in culture. We could choose to invest in science and technology. We choose instead to imagine that there are quick technological fixes or commercial interventions that can “transform” universities into digital diploma mills. Pundits blame professors for fighting “change.” But they ignore the fact that universities are the chief site of innovation and experimentation in digital teaching and research and that professors might actually know what works and what does not.

Instead of holding up their responsibility, states are divesting themselves of the commitment to help their young people achieve social mobility. States are rigging the system so that only the wealthy can compete for slots in the best universities. States shift the cost of higher education from taxpayers—all of whom benefit from living in a wiser, more creative society—to the students themselves. Yet students keep coming, desperate to enter the privileged classes, unable to imagine a different way through a cruel economy that has no use for the uneducated any more.

Universities are supposed to be special places where we let young people imagine a better world. They are supposed to be able to delay the pressures of the daily grind for a few years. They are supposed to be able to aspire to greatness and inspire each other. A tiny few will aspire to be poets. Many more will aspire to be engineers. Some will become both. Along the way they will bond with friends, meet lovers, experience hangovers, make mistakes, and read some mind-blowing books.

Does that sound wasteful? Does that sound inefficient? Nostalgic? Out-of-sync with the times? Damn right it does. But if we don’t want young people of all backgrounds to experiment with ideas and identities because it seems too expensive to support, we have to ask ourselves what sort of society we are trying to become.

A Much Higher Education (Slate): an excellent article on the UVA putsch, worth reading in full. What institutions aren't the rich trying to corrupt? Case in point:

In support of their political strategy, global megabanks also run a highly sophisticated disinformation/propaganda operation, with the goal of creating at least a veneer of respectability for the subsidies that they receive. This is where universities come in.

At a recent Commodity Futures Trading Commission roundtable, the banking-sector representative sitting next to me cited a paper by a prominent Stanford University finance professor to support his position against a particular regulation. The banker neglected to mention that the professor was paid $50,000 for the paper by the Securities Industry and Financial Markets Association, SIFMA, a lobby group. (The professor, Darrell Duffie, disclosed the size of this fee and donated it to charity.)

Why should we take such work seriously – or any more seriously than other paid consulting work, for example, by a law firm or someone else working for the industry?

The answer presumably is that Stanford University is very prestigious. As an institution, it has done great things. And its faculty is one of the best in the world. When a professor writes a paper on behalf of an industry group, the industry benefits from – and is, in a sense, renting – the university’s name and reputation. Naturally, the banker at the CFTC roundtable stressed “Stanford” when he cited the paper. (I’m not criticizing that particular university; in fact, other Stanford faculty, including Anat Admati, are at the forefront of pushing for sensible reform.)

Ferguson believes that this form of academic “consulting” is generally out of control. I agree, but reining it in will be difficult as long as the universities and “too big to fail” banks remain so intertwined.

In this context, I was recently disappointed to read in The Wall Street Journal an interview with Lee Bollinger, President of Columbia University. Bollinger is a "class C" director of the Federal Reserve Bank of New York – appointed by the Board of Governors of the Federal Reserve System to represent the public interest.

In what was apparently his first-ever interview or public statement on banking-reform issues (or even finance), Bollinger’s main point was that Dimon should continue to serve on the board of the New York Fed. He used surprisingly nonacademic language – stating that “foolish” people who suggest that Dimon should resign or be replaced have a “false understanding” of how the system really works.

Tuesday, June 26, 2012

WASHINGTON (AP) — When it comes to the economy, half of Americans in a new poll say it won't matter much whether Barack Obama or Mitt Romney wins — even though the presidential candidates have staked their chances on which would be better at fixing the economic mess.

People are especially pessimistic about the future president's influence over jobs, according to the Associated Press-GfK poll. Asked how much impact the November winner will have on unemployment, 6 in 10 gave answers ranging from slim to none.

Yet the candidates, the polls and the pundits agree — the economy is the issue of 2012. Can either man convince voters that he would set things right?

A retired policeman, Gray plans to vote for Romney and thinks the Republican might win. But he doesn't have much hope that would improve things for people like him, living on a fixed income. "Every time you go to the grocery store the prices have gone up," he said.

Stories like this one fill me with so many conflicting emotions. Let's just start at the beginning, shall we?

SEWANEE, Tenn. - As Robin Layman, a mother of two who has major health troubles but no insurance, arrived at a free clinic here, she had a big personal stake in the Supreme Court's imminent decision on the new national health care law.

Not that she realized that. "What new law?" she said. "I've not heard anything about that."

[...]

Layman was hardly the only patient unaware that the law aims to help people like her, by expanding health insurance beginning in 2014. And this gets to the heart of the political dilemma for Democrats: Despite spending tremendous political capital to pass the law, the party is unlikely to win many votes from the law's future beneficiaries, most of whom live in Republican-dominated states in the South and West. In fact, many at the clinic said they don't vote at all.

Oh, lord have mercy. My knee-jerk reaction is to scream, "Pay some fucking attention, people! Christ on a cracker, what the hell is wrong with you?" But I already know the answer to that. It's not that they're too busy or lazy or uneducated to pay attention -- there may be an element of that, but that's not the crux of the issue.

The real problem is that these are people who have given up. They've decided -- with good reason -- that our institutions were not created for them. Things like caring about who goes to Washington and what's happening in the news are for other people. Robin Layman has already been told she doesn't matter, so what's the point of civic crap like voting? What's it going to get her?

It's really hard to argue with that. Turn on any news broadcast -- Fox News, MSNBC, CNN, it doesn't matter -- and tell me who's talking about people like Robin Layman? No one. We had one presidential candidate in the past decade who did that, and he ended up being a scoundrel. The worst thing John Edwards did wasn't having a baby with his mistress, it was bringing the plight of the working poor onto the national stage and then dropping it like a hot potato when he got tripped up by ambition and his penis.

The real issue is surely turnout. In the US it has been low for a long time: between 50-60% for presidential elections and 30-45% for mid-term congressionals since the second world war. In the UK it has slipped dramatically: from 84% in 1950 to 65% in 2010. An analysis by the Institute for Public Policy Research shows that the collapse has occurred largely among younger and poorer people. "Older people and richer or better educated people … are now much more influential at the ballot box".

The major reason, the institute says, is the "'low-stakes' character of recent elections": the major parties "fought on quite similar platforms". The biggest decline in recent political history – from 1997 to 2001 – lends weight to this contention. In 1997 the young and the poor believed they faced a real political and economic choice. By 2001, Blair had moved Labour so far to the right that there was scarcely a choice to be made.

Corporate behavior is nothing new; it's just more powerful thanks to technology:

The Netherlands United East India Company, or Voc, was the world's first multinational corporation. And just as corporations today seek to monopolise plant genes in the developing world, the Voc set about seizing total control of spice production.

In 1652, after displacing the Portuguese and Spanish, the Dutch introduced a policy known as extirpatie: extirpation. All clove trees not controlled by the Voc were uprooted and burned. Anyone caught growing, stealing or possessing clove plants without authorisation faced the death penalty. On the Banda Islands, to the south - the world's only source of nutmeg - the Dutch used Japanese mercenaries to slaughter almost the entire male population.

Like Opec today, the Voc also limited supply to keep prices high. Only 800-1,000 tonnes of cloves were exported per year. The rest of the harvest was burned or dumped in the sea. Somehow, Afo managed to slip through the net. A rogue clove. A guerrilla plant waging a secret war of resistance. Afo would eventually bring down the Dutch monopoly on cloves.

In 1770, a Frenchman, appropriately named Poivre, stole some of Afo's seedlings. This Monsieur Pepper took them to France, then the Seychelles Islands and, eventually, Zanzibar, which is today the world's largest producer of cloves.

Sunday, June 24, 2012

Brilliant column by Lord Robert Skidelsky, British economist, member of parliament and noted biographer of fellow countryman John Maynard Keynes. It may well sum up the economic arguments I've been making on this blog since the beginning, so I've included the entire thing:

LONDON – As people in the developed world wonder how their countries will return to full employment after the Great Recession, it might benefit us to take a look at a visionary essay that John Maynard Keynes wrote in 1930, called “Economic Possibilities for our Grandchildren.”

Keynes’s General Theory of Employment, Interest, and Money, published in 1936, equipped governments with the intellectual tools to counter the unemployment caused by slumps. In this earlier essay, however, Keynes distinguished between unemployment caused by temporary economic breakdowns and what he called “technological unemployment” – that is, “unemployment due to the discovery of means of economizing the use of labor outrunning the pace at which we can find new uses for labor.”

Keynes reckoned that we would hear much more about this kind of unemployment in the future. But its emergence, he thought, was a cause for hope, rather than despair. For it showed that the developed world, at least, was on track to solving the “economic problem” – the problem of scarcity that kept mankind tethered to a burdensome life of toil.

Machines were rapidly replacing human labor, holding out the prospect of vastly increased production at a fraction of the existing human effort. In fact, Keynes thought that by about now (the early twenty-first century) most people would have to work only 15 hours a week to produce all that they needed for subsistence and comfort.

Developed countries are now about as rich as Keynes thought they would be, but most of us work much longer than 15 hours a week, though we do take longer holidays, and work has become less physically demanding, so we also live longer. (note: but are much less healthy overall, obesity, etc. - CH) But, in broad terms, the prophecy of vastly increased leisure for all has not been fulfilled. Automation has been proceeding apace, but most of us who work still put in an average of 40 hours a week. In fact, working hours have not fallen since the early 1980’s.

At the same time, “technological unemployment” has been on the rise. Since the 1980’s, we have never regained the full employment levels of the 1950’s and 1960’s. If most people still work a 40-hour week, a substantial and growing minority have had unwanted leisure thrust upon them in the form of unemployment, under-employment, and forced withdrawal from the labor market. And, as we recover from the current recession, most experts expect this group to grow even larger.

What this means is that we have largely failed to convert growing technological unemployment into increased voluntary leisure. The main reason for this is that the lion’s share of the productivity gains achieved over the last 30 years has been seized by the well-off.

Particularly in the United States and Britain since the 1980’s, we have witnessed a return to the capitalism “red in tooth and claw” depicted by Karl Marx. The rich and very rich have gotten very much richer, while everyone else’s incomes have stagnated. So most people are not, in fact, four or five times better off than they were in 1930. It is not surprising that they are working longer than Keynes thought they would.

But there is something else. Modern capitalism inflames through every sense and pore the hunger for consumption. Satisfying it has become the great palliative of modern society, our counterfeit reward for working irrational hours. Advertisers proclaim a single message: your soul is to be discovered in your shopping.

Aristotle knew of insatiability only as a personal vice; he had no inkling of the collective, politically orchestrated insatiability that we call economic growth. The civilization of “always more” would have struck him as moral and political madness.

And, beyond a certain point, it is also economic madness. This is not just or mainly because we will soon enough run up against the natural limits to growth. It is because we cannot go on for much longer economizing on labor faster than we can find new uses for it. That road leads to a division of society into a minority of producers, professionals, supervisors, and financial speculators on one side, and a majority of drones and unemployables on the other.

Apart from its moral implications, such a society would face a classic dilemma: how to reconcile the relentless pressure to consume with stagnant earnings. So far, the answer has been to borrow, leading to today’s massive debt overhangs in advanced economies. Obviously, this is unsustainable, and thus is no answer at all, for it implies periodic collapse of the wealth-producing machine.

The truth is that we cannot go on successfully automating our production without rethinking our attitudes toward consumption, work, leisure, and the distribution of income. Without such efforts of social imagination, recovery from the current crisis will simply be a prelude to more shattering calamities in the future.

As long as the means of production, technology and labour included, continue to be privately owned, their functioning will respond only to productivity pressures. Under these conditions, unemployment will continue to increase, and that will add further pressure to extract a surplus from both labour and technology investments. Unless society recovers the ownership of labour (through measures such as the implementation of a universal basic income) and of technology, we will be chained to perpetuating these dynamics.

Saturday, June 23, 2012

Yet more "benefits" of modern industrial society revealed as drawbacks with serious unintended consequences:

OVER 7,000 strong and growing, community farmers’ markets are being heralded as a panacea for what ails our sick nation. The smell of fresh, earthy goodness is the reason environmentalists approve of them, locavores can’t live without them, and the first lady has hitched her vegetable cart crusade to them. As health-giving as those bundles of mouthwatering leafy greens and crates of plump tomatoes are, the greatest social contribution of the farmers’ market may be its role as a delivery vehicle for putting dirt back into the American diet and in the process, reacquainting the human immune system with some “old friends.”

Increasing evidence suggests that the alarming rise in allergic and autoimmune disorders during the past few decades is at least partly attributable to our lack of exposure to microorganisms that once covered our food and us. As nature’s blanket, the potentially pathogenic and benign microorganisms associated with the dirt that once covered every aspect of our preindustrial day guaranteed a time-honored co-evolutionary process that established “normal” background levels and kept our bodies from overreacting to foreign bodies. This research suggests that reintroducing some of the organisms from the mud and water of our natural world would help avoid an overreaction of an otherwise healthy immune response that results in such chronic diseases as Type 1 diabetes, inflammatory bowel disease, multiple sclerosis and a host of allergic disorders.

In a world of hand sanitizer and wet wipes (not to mention double tall skinny soy vanilla lattes), we can scarcely imagine the preindustrial lifestyle that resulted in the daily intake of trillions of helpful organisms. For nearly all of human history, this began with maternal transmission of beneficial microbes during passage through the birth canal — mother to child. However, the alarming increase in the rate of Caesarean section births means a potential loss of microbiota from one generation to the next. And for most of us in the industrialized world, the microbial cleansing continues throughout life. Nature’s dirt floor has been replaced by tile; our once soiled and sooted bodies and clothes are cleaned almost daily; our muddy water is filtered and treated; our rotting and fermenting food has been chilled; and the cowshed has been neatly tucked out of sight. While these improvements in hygiene and sanitation deserve applause, they have inadvertently given rise to a set of truly human-made diseases.

People are risking their health by working on smartphones, tablets and laptops after they have left the office, according to the Chartered Society of Physiotherapy. It says people have become "screen slaves" and are often working while commuting or after they get home. The society said poor posture in these environments could lead to back and neck pain.

Unions said people needed to learn to switch off their devices.

An online survey, of 2,010 office workers by the Society found that nearly two-thirds of those questioned continued working outside office hours. The organisation said people were topping up their working day with more than two hours of extra screentime, on average, every day. The data suggested that having too much work and easing pressure during the day were the two main reasons for the extra workload.

There is the old joke in Houston about how you define a pedestrian: a person looking for their car. People don't do a lot of walking in the heat; perhaps that's why McAllen-Edinburg-Mission in Texas is the most obese region in America and Boulder, Colo. is the least.
But there may be a more important reason than the driving; it may be biological. A study by David Allison of the University of Alabama at Birmingham found that air conditioning might make you fat:

One of the most intriguing factors listed in the study is the "reduction in variability of ambient temperature." The widespread use of central heating and air conditioning means that most homes and offices are now kept at a relatively constant temperature year-round. Allison's group found evidence that this causes the body to expend less energy, because it does not have to work to warm up or cool down, potentially leading to increased fat stores. In the South, where obesity rates are the highest in the nation, homes with central air increased from 37 to 70 percent between 1978 and 1990.

One doctor wasn't so sure, telling ABC: "Since people stay thin in all different climates, it is unlikely [air conditioning] plays much of a role." But that's not much of an answer: people in the southern United States are fat, and people in Italy or France generally are not. In Italy, people often live in apartments with thick walls that resist the heat, and as seen in the photo I took in Milan last month, everybody has external shutters pulled down to keep the heat out. Not many people have air conditioning because they know how to keep cool. Few of the people I saw there were obese.

Friday, June 22, 2012

There's a fascinating new book out by Peter Watson called The Great Divide, exploring why the parallel human experiments of the New and Old Worlds turned out the way that they did. He takes up where Jared Diamond left off in Guns, Germs and Steel. As Diamond did, he covers the essential differences in climate, geography, geology, diseases, flora and fauna, etc., between the New World and the Old. Unlike Diamond, however, he puts the role of ideologies front and center. He argues that the above factors shaped differing ideologies in the Old and New Worlds, that the Old World had a greater diversity of ideologies, and that's why history unfolded as it did.

To oversimplify (based on reviews; I have not read the book), he argues that the thinking of the New World was shaped by unpredictable weather and geological phenomena (monsoons, tsunamis, earthquakes, volcanoes) and by the presence of a large number of hallucinogenic plants. This gave rise to a philosophy that saw nature as irregular and unpredictable, rather than as something that could be dominated. Phenomena like human sacrifice were attempts to control the capriciousness of nature through mysticism. And hallucinogens like peyote and ayahuasca led to a more supernatural view of the world, including man's place in it. This view is typified by the shaman.

The Old World, by contrast, was shaped by the regularity of the seasons and of the flooding of the great river valleys. The Old World had many domesticable plant and animal species giving rise to the idea of nature as something that could be controlled and exploited for profit. It's no wonder the idea of money as interest-bearing debt arose here; plant a seed and harvest a hundred; breed a calf and get several in return (until drought and disease set in, that is). Metals are more present here, and the wheel, pastoral nomadism, waterways, and the uniform latitude of the major civilizations gave rise to numerous trading networks and empires. This view is typified by the shepherd.

Of course we know which views won out - it's the reason we use the terms New and Old World as we do. The book looks to be one of the must-reads of this year. Here are some reviews, courtesy of Marginal Revolution:

Ransacking the specialist literature from a collection of disparate fields – cosmology, climatology, geology, palaeontology, mythology, botany, archaeology and volcanology – Watson considers how ecology, broadly construed, shaped the evolution of human civilisation. He owes a considerable debt here to Jared Diamond, whose book Guns, Germs and Steel: A Short History of Everybody for the last 13,000 years, was an unlikely blockbuster in the late 1990s and started a trend for big picture histories that look at long-term climatic shifts as decisive factors of historical change.

Watson doesn’t have Diamond’s catchy three-word formulation, but he similarly argues that geography, climate and weather are inextricably bound to destiny. “The physical world,” Watson writes, “which early people inhabited – the landscape, the vegetation, the non-human animal life, plus the dominant features of the climate, of latitude and the relation of the land to the sea – determined the ideology of humans, their beliefs, their religious practices, their social structure, their commercial and industrial activities, and that ideology, in turn, once it had emerged and cohered, determined the further characteristic interaction between humans and the environment.”

Watson sees broad climatic factors as shaping forces of culture in each hemisphere. The dominant feature of the Old World was the “weakening monsoon”, which brought drying trends to the Eurasian land mass. This, in turn, led to seasonal fluctuations, which provoked the rise of fertility cults.

The Old World gave rise to the cultivation of cereal grasses; domesticable animals were used to plough fields and transport goods. Pastoral nomadism spread language and technology. As Watson notes, Eurasia is geographically orientated on an east-west axis. Climates are less varied there, so animals and goods could move around with ease. Watson correlates religious mythology with natural events; the great Biblical flood, for example, might be traced to rising sea levels around 6000BC.

In the Americas, there were few such animals; primary foodstuffs grew year round, in marked distinction to cyclical Old World cereal crops. The land mass of the Americas was orientated on a north-south axis, with its major civilisations – Chavin, Moche, Olmec, Maya, Inca, Aztec – concentrated in the tropics. Violent weather was brought about by El Niño, which unleashed freakish storms and winds on Mesoamerica. Central America is also at the juncture of several tectonic plates; earthquakes and volcanoes also wrought great damage. Gods were invoked to stave off the devastation, with little success.

Watson argues that the major civilisations of the New World were typified by a “more vivid religion” marked by shamanism and the use of psychoactive drugs to produce visionary hallucinations (the Aztecs used a mushroom called teonanacatl to produce temixoch, the “flowery dream”). Watson writes: “the sheer vividness, and the fearsome nature of some of the transformations experienced in trance, the overwhelming psychological intensity of altered states of consciousness induced by hallucinogens, would, among other things, have made New World religious experiences far more convincing and therefore more resistant to change than those of the Old World.”

Anthropologists and archaeologists, as Watson points out, have generally preferred to emphasise the similarities between the various human cultures that have developed since the last Ice Age; but Watson himself is altogether more intrigued by the contrasts. Between 15,000BC, when the first humans crossed into Alaska, and 1492, when Columbus arrived in the Caribbean, there were two distinct populations of Homo sapiens developing in parallel, each utterly unaware of the other. This constituted, in Watson's words, "the greatest natural experiment the world has seen" – and it is his attempt to trace it, and to draw apposite conclusions from it about "how nature and human nature interact", that constitutes the meat of this fascinating, ambitious and yet ultimately frustrating book.

The broad thrust of his argument, that civilisation in both the New and Old Worlds has been shaped above all by environmental factors, will be familiar to anyone who has read Jared Diamond's Guns, Germs and Steel (1997). The "Great Divide", in Watson's pithy summation, was between shepherds and shamans. The plentiful availability in Eurasia of animals just waiting to be domesticated ultimately led to the invention of the plough, the chariot, the wool industry and the pork pie. Meanwhile, what the peoples of the New World might have lacked in terms of horses or cattle was compensated for by a quite prodigious supply of naturally occurring hallucinogens. While the great intellects of Eurasia were busy inventing monotheism and the water-mill, their counterparts in the Americas were off their faces on drugs. This, combined with the fact that the New World is much more prone to extremes of weather and seismic activity than the Old, resulted in gods that were scarily in people's faces. "In the New World," so Watson argues, "the existence of a supernatural world was altogether more convincing."

Watson's technique is to explore the connection between myths and historical and natural events. In the case of the expulsion from the Garden of Eden (onset of farming) and the Flood (probably the sea level rise of around 6000BC) the connection seems well-founded. In South America, he attempts to explain pre-Columbian ritual killing: "Only extraordinary events can explain what is to us the barbarity yet universality of human sacrifice".

Effectively, the book is a history of the world from 15,000 BC to 1500 AD, and there is much that is truly illuminating. The more we know about the emergence of agriculture, the more plausible the story of Adam and Eve seems as an allegory of the change from the hunter-gatherer life to settled farming. A passage from Genesis ("I will greatly multiply your sorrow and your conception; In pain you shall bring forth children") reflects the big changes that came with sedentism. Farming and a settled life allowed women to have babies every two years instead of the four years usual in hunter-gatherer societies. There is evidence that the female pelvis is narrower in modern humans than in hunter-gatherers.

Watson's rationale for the different religions that developed in the Middle East and South America does carry conviction. In the Old World, the regularity of the natural cycles meant that supplicative religions could be said to work. If your existence depends on the Nile flooding every year or the arrival of the monsoon, and you engage in rituals to implore these life-giving waters to return, and they do – the ritual is consolidated. But in the New World, the climate was extreme, with terrifying unpredictable events such as volcanic eruptions, tsunamis, earthquakes, violent storms. The gods were unappeasable and human sacrifice became the last resort. A reinforcing factor was the abundance of hallucinogens in South America; thanks to them, "the existence of the supernatural world was... more convincing".

Patterson’s rousing stump speech for sprawl is emblematic of how we as a culture are far too invested in a vision of the American dream that doesn’t make sense in the 21st century. Over the past 30 years we’ve stripped away the supporting mechanisms of sprawl but have continued to create it.

We’ve built more houses than we’ve needed — and built them farther away from jobs. This has led to longer commutes, which has created more traffic. In response, we built more highways, increasing fuel consumption and, as transportation planners acknowledge, doing little if anything to reduce traffic. It’s a vicious, seemingly endless cycle, and at its core is the notion that the American dream can exist only within the framework of the single-family home on a large lot.

Indeed, we’ve become so fixated on this as the sole delivery mechanism of that American dream that we’ve spent a disproportionate amount of our collective energies (home-) improving it without considering meaningful alternative visions — or devoting at least a smidgen of attention to what’s outside the front door or down the block. Everything in our culture today reinforces this idea of home as castle (or fortress) rather than home as part of a larger whole (i.e., neighborhood). We need to find our way to the latter view, and part of that means finding a better way to talk about it.

Houses were too big, too isolated, too generic, too hard to maintain. Or they were designed for the quintessential nuclear family that exists more in our cultural imagination than in reality. Few homes offered options for aging in place, for returning college kids or elderly parents, or even decent home office space. Would-be residents lamented the lack of amenities like a café or a playground within walking distance in master-planned communities of 5,000, 10,000 or even 40,000 homes (!), an absence often explained away with “a community of this size couldn’t support it.”

People have begun to wake up to the fact that more time spent in the car means poorer health and less time with their families — and they’re seeking shorter commutes. They’re interested in smaller homes that are easier to maintain (and less expensive to heat and cool). Young millennials and older baby boomers are also showing less and less interest in car ownership and a corresponding greater interest in public transit, walking and biking. And again, it’s likely that we’re all less interested in continuing to discuss “urban” and “suburban” as dueling polar opposites — and more interested in recognizing there’s mutual benefit to some overlap.

The aforementioned changes point to the fact that a paradigmatic shift in our concept of the American dream is underway.

And yet … there are still those who are having none of it. And they are a vocal and often breathtakingly well-funded minority. For them, the sprawl that characterized the years leading up to the financial crisis remains a dream to strive for. Any threat to the McMansion of yore is equated to “feudal socialism” (I kid you not). And these opponents not only excel at mobilizing the troops but at mastering the message. Take a look at the rhetoric of, say, the Texas Republican party, which recently passed “Resist 21” in opposition to Agenda 21, the United Nations’ sustainable communities strategy adopted in 1992. Taken together, proclaims Resist 21, those strategies aspire to “the comprehensive control of all our population and its reduction to sustainable levels and the socialization of all activities by their relocation to highly restricted urban settlement centers.”

"...We as a culture are far too invested in a vision of the American dream that doesn’t make sense in the 21st century." Boy, if that isn't our epitaph.

I particularly like her description of the American house as a fortress. So you can sit in your oversized plywood castle and have everything delivered to you in your digital fortress like a modern-day mad prince. What kind of culture builds its homes as fortresses against the outside world? Certainly not a culture that one would expect to hold together in hard times. And our balkanization along income lines has led not only to the lowest social mobility in the developed world, but to a society in which classes never even see one another, and so can be played off against each other by the elites while they grab what's left. Truly sad.

Wednesday, June 20, 2012

... We like to think that the reason we enjoy our high standards of living is because we have been so clever at figuring out how to use the world's available resources. But we should not dismiss the possibility that there may also have been a nontrivial contribution of simply having been quite lucky to have found an incredibly valuable raw material that for a century and a half or so was relatively easy to obtain. Optimists may expect the next century and a half to look like the last. Benes and coauthors are suggesting that instead we should perhaps expect the next decade to look like the last.

China has warned that the decline in its rare earth reserves in major mining areas is "accelerating", as most of the original resources are depleted. In a policy paper, China's cabinet blamed excessive exploitation and illegal mining for the decline. China accounts for more than 90% of the world's rare earth supplies, but has just 23% of global reserves. It has urged those with reserves to boost production of the elements, which are used to make electrical goods. "After more than 50 years of excessive mining, China's rare earth reserves have kept declining and the years of guaranteed rare earth supply have been reducing," China's cabinet said in the paper on the rare earth industry published by the official Xinhua news agency.

"It really will be a new world, biologically, at that point," warns Anthony Barnosky, professor of integrative biology at the University of California, Berkeley, and lead author of a review paper appearing in the June 7 issue of the journal Nature. "The data suggests that there will be a reduction in biodiversity and severe impacts on much of what we depend on to sustain our quality of life, including, for example, fisheries, agriculture, forest products and clean water. This could happen within just a few generations."