The Archdruid Report

Wednesday, February 25, 2015

I've commented more than once in these essays about the
cooperative dimension of writing: the
way that even the most solitary of writers inevitably takes part in what
Mortimer Adler used to call the Great Conversation, the flow of ideas and
insights across the centuries that’s responsible for most of what we call
culture. Sometimes that conversation takes place second- or third-hand—for
example, when ideas from two old books collide in an author’s mind and give
rise to a third book, which will eventually carry the fusion to someone else
further down the stream of time—but sometimes it’s far more direct.

Last week’s post here brought an example of the latter kind.
My attempt to cut through the ambiguities surrounding that slippery word
“progress” sparked a lively discussion on the comments page of my blog about
just exactly what counted as progress, what factors made one change
“progressive” while another was denied that label. In the midst of it all, one
of my readers—tip of the archdruidical hat to Jonathan—proposed an unexpected
definition: what makes a change qualify
as progress, he suggested, is that it increases the externalization of costs.

I’ve been thinking about that definition since Jonathan
proposed it, and it seems to me that it points up a crucial and mostly
unrecognized dimension of the crisis of our time. To make sense of it, though,
it’s going to be necessary to delve briefly into economic jargon.

Economists use the term “externalities” to refer to the
costs of an economic activity that aren’t paid by either party in an exchange,
but are pushed off onto somebody else. You won’t hear a lot of talk about
externalities these days; in many circles, it's considered impolite to mention them,
but they’re a pervasive presence in contemporary life, and play a very large
role in some of the most intractable problems of our age. Some of those
problems were discussed by Garrett Hardin in his famous essay on the tragedy of
the commons, and more recently by Elinor Ostrom in her studies of how that
tragedy can be avoided; still, I’m not sure how often it’s recognized that the
phenomenon they discussed applies not just to commons systems, but to societies
as a whole—especially to societies like ours.

An example may be useful here. Let’s imagine a blivet
factory, which turns out three-prong, two-slot blivets in pallet loads for
customers. The blivet-making process, like manufacturing of every other kind,
produces waste as well as blivets, and we’ll assume for the sake of the example
that blivet waste is moderately toxic and causes health problems in people who
ingest it. The blivet factory produces one barrel of blivet waste for every
pallet load of blivets it ships. The cheapest option for dealing with the
waste, and thus the option that economists favor, is to dump it into the river
that flows past the factory.

Notice what happens as a result of this choice. The blivet
manufacturer has maximized his own benefit from the manufacturing process, by
avoiding the expense of finding some other way to deal with all those barrels
of blivet waste. His customers also benefit, because blivets cost less than
they would if the cost of waste disposal was factored into the price. On the
other hand, the costs of dealing with the blivet waste don’t vanish like so
much twinkle dust; they are imposed on the people downstream who get their
drinking water from the river, or from aquifers that receive water from the
river, and who suffer from health problems because there’s blivet waste in
their water. The blivet manufacturer is externalizing the cost of waste
disposal; his increased profits are being paid for at a remove by the increased
health care costs of everyone downstream.
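The arithmetic of the blivet example can be put in a few lines of code. This is a minimal sketch, with every number invented purely for illustration:

```python
# Toy model of the blivet example (all figures invented for illustration).
PRODUCTION_COST = 100.0   # cost to the factory of making one pallet of blivets
DISPOSAL_COST = 20.0      # cost of disposing of one barrel of waste properly
HEALTH_COST = 60.0        # downstream health costs if the barrel is dumped

def pallet_price(internalize_disposal: bool) -> float:
    """Cost per pallet as the manufacturer sees it."""
    return PRODUCTION_COST + (DISPOSAL_COST if internalize_disposal else 0.0)

# The dumping manufacturer undercuts the honest one...
assert pallet_price(False) < pallet_price(True)

# ...but the total cost per pallet, counted across everyone, is higher:
social_cost_honest = PRODUCTION_COST + DISPOSAL_COST  # paid by maker and buyers
social_cost_dumper = PRODUCTION_COST + HEALTH_COST    # partly paid downstream
```

The point the sketch makes is simply that the cost doesn't vanish when it's externalized; it moves off the manufacturer's books and onto someone else's.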

That’s how externalities work. Back in the days when people
actually talked about the downsides of economic growth, there was a lot of
discussion of how to handle externalities, and not just on the leftward end of
the spectrum. I recall a thoughtful book
titled TANSTAAFL—that’s an acronym, for those who don’t know their
Heinlein, for “There Ain’t No Such Thing As A Free Lunch”—which argued, on
solid libertarian-conservative grounds, that the environment could best be
preserved by making sure that everyone paid full sticker price for the
externalities they generated. Today’s crop of pseudoconservatives, of course,
turned their back on all this a long time ago, and insist at the top of their
lungs on their allegedly God-given right to externalize as many costs as they
possibly can. This is all the more
ironic in that most pseudoconservatives claim to worship a God who said some
very specific things about “what ye do to the least of these,” but that’s a
subject for a different post.

Economic life in the industrial world these days can be
described, without too much inaccuracy, as an arrangement set up to allow a
privileged minority to externalize nearly all their costs onto the rest of
society while pocketing as much of the benefits as possible themselves. That's
come in for a certain amount of discussion in recent years, but I’m not sure
how many of the people who’ve participated in those discussions have given any
thought to the role that technological progress plays in facilitating the
internalization of benefits and the externalization of costs that drive today’s
increasingly inegalitarian societies. Here again, an example will be helpful.

Before the invention of blivet-making machinery, let’s say,
blivets were made by old-fashioned blivet makers, who hammered them out on iron
blivet anvils in shops that were to be found in every town and village. Like
other handicrafts, blivet-making was a living rather than a ticket to wealth;
blivet makers invested their own time and muscular effort in their craft, and
turned out enough in the way of blivets to meet the demand. Notice also the
effect on the production of blivet waste. Since blivets were being made one at
a time rather than in pallet loads, the total amount of waste was smaller; the
conditions of handicraft production also meant that blivet makers and their
families were more likely to be exposed to the blivet waste than anyone else,
and so had an incentive to invest the extra effort and expense to dispose of it
properly. Since blivet makers were ordinary craftspeople rather than
millionaires, furthermore, they weren’t as likely to be able to buy exemption
from local health laws.

The invention of the mechanical blivet press changed that
picture completely. Since one blivet
press could do as much work as fifty blivet makers, the income that would have
gone to those fifty blivet makers and their families went instead to one
factory owner and his stockholders, with as small a share as possible set aside
for the wage laborers who operate the blivet press. The factory owner and
stockholders had no incentive to pay for the proper disposal of the blivet
waste, either—quite the contrary, since having to meet the disposal costs cut
into their profit, buying off local governments was much cheaper, and if the
harmful effects of blivet waste were known, you can bet that the owner and
shareholders all lived well upstream from the factory.

Notice also that a blivet manufacturer who paid a living
wage to his workers and covered the costs of proper waste disposal would have
to charge a higher price for blivets than one who did neither, and thus would
be driven out of business by his more ruthless competitor. Externalities aren’t
simply made possible by technological progress, in other words; they’re the
inevitable result of technological progress in a market economy, because
externalizing the costs of production is in most cases the most effective way
to outcompete rival firms, and the firm that succeeds in externalizing the
largest share of its costs is the most likely to prosper and survive.
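The competitive trap just described can be sketched as a toy calculation; the base cost and externalization fractions below are invented, and the firms are hypothetical:

```python
# Toy price competition: the firm that externalizes the largest share of its
# production costs can set the lowest price (all numbers invented).
BASE_COST = 100.0  # true cost of producing one unit

def asking_price(externalized_fraction: float) -> float:
    """Price covering only the costs the firm actually pays itself."""
    return BASE_COST * (1.0 - externalized_fraction)

firms = {"honest": 0.0, "cuts corners": 0.5, "ruthless": 0.9}
prices = {name: asking_price(fraction) for name, fraction in firms.items()}
winner = min(prices, key=prices.get)  # the ruthless firm wins the market
```

In a market where buyers compare only sticker prices, the firm externalizing the most always comes out on top, which is the essay's point in miniature.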

Each further step in the progress of blivet manufacturing,
in turn, tightened the same screw another turn. Today, to finish up the
metaphor, the entire global supply of blivets is made in a dozen factories
in distant Slobbovia, where sweatshop
labor under ghastly working conditions and the utter absence of environmental
regulations make the business of blivet fabrication more profitable than
anywhere else. The blivets are as shoddily made as possible; the entire blivet
supply chain from the open-pit mines worked by slave labor that provide the raw
materials to the big box stores with part-time, poorly paid staff selling
blivetronic technology to the masses is a human and environmental
disaster. Every possible cost has been
externalized, so that the two multinational corporations that dominate the
global blivet industry can maintain their profit margins and pay absurdly high
salaries to their CEOs.

That in itself is bad enough, but let’s broaden the focus to
include the whole systems in which blivet fabrication takes place: the economy
as a whole, society as a whole, and the biosphere as a whole. The impact of
technology on blivet fabrication in a market economy has predictable and well
understood consequences for each of these whole systems, which can be summed up
precisely in the language we’ve already used. In order to maximize its own
profitability and return on shareholder investment, the blivet industry
externalizes costs in every available direction. Since nobody else wants to
bear those costs, either, most of them end up being passed onto the whole
systems just named, because the economy, society, and the biosphere have no
voice in today’s economic decisions.

Like the costs of dealing with blivet waste, though, the
other externalized costs of blivet manufacture don’t go away just because
they’re externalized. As externalities increase, they tend to degrade the whole
systems onto which they’re dumped—the economy, society, and the biosphere. This
is where the trap closes tight, because blivet manufacturing exists within those
whole systems, and can’t be carried out unless all three systems are
sufficiently intact to function in their usual way. As those systems degrade,
their ability to function degrades also, and eventually one or more of them
breaks down—the economy plunges into a depression, the society disintegrates
into anarchy or totalitarianism, the biosphere shifts abruptly into a new mode
that lacks adequate rainfall for crops—and the manufacture of blivets stops
because the whole system that once supported it has stopped doing so.

Notice how this works out from the perspective of someone
who’s benefiting from the externalization of costs by the blivet industry—the
executives and stockholders in a blivet corporation, let’s say. As far as
they’re concerned, until very late in the process, everything is fine and
dandy: each new round of technological improvements in blivet fabrication
increases their profits, and if each such step in the onward march of progress
also means that working class jobs are eliminated or offshored, democratic
institutions implode, toxic waste builds up in the food chain, or what have
you, hey, that’s not their problem—and after all, that’s just the normal
creative destruction of capitalism, right?

That sort of insouciance is easy for at least three reasons.
First, the impacts of externalities on whole systems can pop up a very long way
from the blivet factories. Second, in a
market economy, everyone else is externalizing their costs as enthusiastically
as the blivet industry, and so it’s easy for blivet manufacturers (and everyone
else) to insist that whatever's going wrong is not their fault. Third, and most crucially, whole systems as
stable and enduring as economies, societies, and biospheres can absorb a lot of
damage before they tip over into instability. The process of externalization of
costs can thus run for a very long time, and become entrenched as a basic
economic habit, long before it becomes clear to anyone that continuing along
the same route is a recipe for disaster.

Even when externalized costs have begun to take a visible
toll on the economy, society, and the biosphere, furthermore, any attempt to
reverse course faces nearly insurmountable obstacles. Those who profit from the
existing order of things can be counted on to fight tooth and nail for the
right to keep externalizing their costs: after all, they have to pay the full
price for any reduction in their ability to externalize costs, while the
benefits created by not imposing those costs on whole systems are shared among all
participants in the economy, society, and the biosphere respectively. Nor is it
necessarily easy to trace back the causes of any given whole-system disruption
to specific externalities benefiting specific people or industries. It’s rather
like loading hanging weights onto a chain; sooner or later, as the amount of
weight hung on the chain goes up, the chain is going to break, but the link
that breaks may be far from the last weight that pushed things over the edge,
and every other weight on the chain made
its own contribution to the end result.
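The chain analogy lends itself to a short sketch; the capacity and weights here are arbitrary numbers chosen for illustration:

```python
# Weights loaded one at a time onto a chain of fixed capacity: the chain fails
# on some particular weight, but every earlier weight contributed to the break.
CAPACITY = 100.0

def breaking_index(weights):
    """Index of the weight whose addition first exceeds capacity, else None."""
    total = 0.0
    for i, weight in enumerate(weights):
        total += weight
        if total > CAPACITY:
            return i
    return None

# Twenty identical weights of 7 units each: the chain breaks on the fifteenth
# weight (index 14), which is no heavier than any of the fourteen before it.
assert breaking_index([7.0] * 20) == 14
```

The weight that breaks the chain is indistinguishable from its neighbors, which is why tracing a whole-system failure back to any single externality is so difficult.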

A society that’s approaching collapse because too many
externalized costs have been loaded onto the whole systems that support it
thus shows certain highly distinctive symptoms. Things are going wrong with the
economy, society, and the biosphere, but nobody seems to be able to figure out
why; the measurements economists use to determine prosperity show contradictory
results, with those that measure the profitability of individual corporations
and industries giving much better readings than those that measure the performance
of whole systems; the rich are convinced that everything is fine, while outside
the narrowing circles of wealth and privilege, people talk in low voices about
the rising spiral of problems that beset them from every side. If this doesn’t
sound familiar to you, dear reader, you probably need to get out more.

At this point it may be helpful to sum up the argument I’ve
developed here:

a) Every increase in technological complexity tends also to
increase the opportunities for externalizing the costs of economic activity;

b) Market forces make the externalization of costs mandatory
rather than optional, since economic actors that fail to externalize costs will
tend to be outcompeted by those that do;

c) In a market economy, as all economic actors attempt to
externalize as many costs as possible, externalized costs will tend to be
passed on preferentially and progressively to whole systems such as the
economy, society, and the biosphere, which provide necessary support for
economic activity but have no voice in economic decisions;

d) Given unlimited increases in technological complexity,
there is no necessary limit to the loading of externalized costs onto whole
systems short of systemic collapse;

e) Unlimited increases in technological complexity in a
market economy thus necessarily lead to the progressive degradation of the
whole systems that support economic activity;

f) Technological progress in a market economy is therefore self-terminating, and ends in
collapse.

Now of course there are plenty of arguments that could be
deployed against this modest proposal. For example, it could be argued that
progress doesn’t have to generate a rising tide of externalities. The
difficulty with this argument is that externalization of costs isn’t an
accidental side effect of technology but an essential aspect—it’s not a bug,
it’s a feature. Every technology is a means of externalizing some cost that
would otherwise be borne by a human body. Even something as simple as a hammer
takes the wear and tear that would otherwise affect the heel of your hand,
let’s say, and transfers it to something else: directly, to the hammer;
indirectly, to the biosphere, by way of the trees that had to be cut down to
make the charcoal to smelt the iron, the plants that were shoveled aside to get
the ore, and so on.

For reasons that are ultimately thermodynamic in nature, the
more complex a technology becomes, the more costs it generates. In order to
outcompete a simpler technology, each more complex technology has to
externalize a significant proportion of its additional costs, in order to
compete against the simpler technology. In the case of such contemporary
hypercomplex technosystems as the internet, the process of externalizing costs
has gone so far, through so many tangled interrelationships, that it’s
remarkably difficult to figure out exactly who’s paying for how much of the
gargantuan inputs needed to keep the thing running. This lack of transparency
feeds the illusion that large systems are cheaper than small ones, by making
externalities of scale look like economies of scale.

It might be argued instead that a sufficiently stringent
regulatory environment, forcing economic actors to absorb all the costs of
their activities instead of externalizing them onto others, would be able to
stop the degradation of whole systems while still allowing technological
progress to continue. The difficulty here is that increased externalization of
costs is what makes progress profitable. As just noted, all other things being
equal, a complex technology will on average be more expensive in real terms
than a simpler technology, for the simple fact that each additional increment
of complexity has to be paid for by an investment of energy and other forms of
real capital.

Strip complex technologies of the subsidies that transfer
some of their costs to the government, the perverse regulations that transfer
some of their costs to the rest of the economy, the bad habits of environmental
abuse and neglect that transfer some of their costs to the biosphere, and so
on, and pretty soon you’re looking at hard economic limits to technological
complexity, as people forced to pay the full sticker price for complex
technologies maximize their benefits by choosing simpler, more affordable
options instead. A regulatory environment sufficiently strict to keep
technology from accelerating to collapse would thus bring technological
progress to a halt by making it unprofitable.

Notice, however, the flipside of the same argument: a
society that chose to stop progressing technologically could maintain itself
indefinitely, so long as its technologies weren’t dependent on nonrenewable
resources or the like. The costs imposed by a stable technology on the economy,
society, and the biosphere would be more or less stable, rather than increasing
over time, and it would therefore be much easier to figure out how to balance
out the negative effects of those externalities and maintain the whole system
in a steady state. Societies that
treated technological progress as an option rather than a requirement, and
recognized the downsides to increasing complexity, could also choose to reduce
complexity in one area in order to increase it in another, and so on—or they
could just raise a monument to the age of progress, and go do something else
instead.

The logic suggested here requires a comprehensive rethinking of most of the contemporary world’s notions about technology, progress, and the good society. We’ll begin that discussion in future posts—after, that is, we discuss a second dimension of progress that came out of last week’s discussion.

Wednesday, February 18, 2015

Last week’s post here on The Archdruid Report appears
to have hit a nerve. That didn’t come as any sort of a surprise,
admittedly. It's one thing to point out
that going back to the simpler and less energy-intensive technologies of earlier
eras could help extract us from the corner into which industrial society has
been busily painting itself in recent decades; it’s quite another to point out
that doing this can also be great fun, more so than anything that comes out of
today’s fashionable technologies, and in a good many cases the results include
an objectively better quality of life as well.

That’s not one of the canned speeches that opponents of
progress are supposed to make. According to the folk mythology of modern
industrial culture, since progress always makes things better, the foes of
whatever gets labeled as progress are supposed to put on hair shirts and insist
that everyone has to suffer virtuously from a lack of progress, for some reason
based on sentimental superstition. The Pygmalion effect being what it is, it’s
not hard to find opponents of progress who say what they’re expected to say,
and thus fulfill their assigned role in contemporary culture, which is to stand
there in their hair shirts bravely protesting until the steamroller of progress
rolls right over them.

The grip of that particular bit of folk mythology on the
collective imagination of our time is tight enough that when somebody brings up
some other reason to oppose “progress”—we’ll get into the ambiguities behind
that familiar label in a moment—a great many people quite literally can’t
absorb what’s actually being said, and respond instead to the canned speeches
they expect to hear. Thus I had several people attempt to dispute the comments
on last week’s post, castigating my readers with varying degrees of wrath and
profanity for thinking that they had to sacrifice the delights of today’s
technology and go creeping mournfully back to the unsatisfying lifestyles of an
earlier day.

That was all the more ironic in that none of the readers who
were commenting on the post were saying anything of the kind. Most of them were
enthusiastically talking about how much more durable, practical, repairable,
enjoyable, affordable, and user-friendly older technologies are compared to the
disposable plastic trash that fills the stores these days. They were discussing
how much more fun it is to embrace the delights of outdated technologies than
it would be to go creeping mournfully back—or forward, if you prefer—to the
unsatisfying lifestyles of the present time. That heresy is far more than the
alleged open-mindedness and intellectual diversity of our age is willing to
tolerate, so it’s not surprising that some people tried to pretend that nothing
of the sort had been said at all. What was surprising to me, and pleasantly so,
was the number of readers who were ready to don the party clothes of some
earlier time and join in the Butlerian carnival.

There are subtleties to the project of deliberate
technological regress that may not be obvious at first glance, though, and it
seems sensible to discuss those here before we proceed. It's important, to begin with, to remember
that when talking heads these days babble about technology in the singular, as
a uniform, monolithic thing that progresses according to some relentless
internal logic of its own, they're spouting balderdash. In the real world, there's no such monolith;
instead, there are technologies in the plural, a great many of them, clustered
more or less loosely in technological suites which may or may not have any
direct relation to one another.

An example might be useful here. Consider the technologies
necessary to build a steel-framed bicycle. The metal parts require the
particular suite of technologies we use to smelt ores, combine the resulting
metals into useful alloys, and machine and weld those into shapes that fit
together to make a bicycle. The tires, inner tubes, brake pads, seat cushion,
handlebar grips, and paint require a different suite of technologies drawing on
various branches of applied organic chemistry, and a few other suites also have
a place: for example, the one that's
needed to make and apply lubricants. The
suites that make a bicycle have other uses; if you can build a bicycle, as
Orville and Wilbur Wright demonstrated, you can also build an aircraft, and a
variety of other interesting machines as well; that said, there are other
technologies—say, the ones needed to manufacture medicines, or precision
optics, or electronics—that require very different technological suites. You
can have everything you need to build a bicycle and still be unable to make a
telescope or a radio receiver, and vice versa.

Strictly speaking, therefore, nothing requires the project
of deliberate technological regress to move in lockstep to the technologies of
a specific past date and stay there. It would be wholly possible to dump
certain items of modern technology while keeping others. It would be just as
possible to replace one modern technological suite with an older equivalent
from one decade, another with an equivalent from a different decade and so on.
Imagine, for example, a future America in which solar water heaters (worked out
by 1920) and passive solar architecture (mostly developed in the 1960s and
1970s) were standard household features, canal boats (dating from before 1800)
and tall ships (ditto) were the primary means of bulk transport, shortwave
radio (developed in the early 20th century) was the standard long-range
communications medium, ultralight aircraft (largely developed in the 1980s)
were still in use, and engineers crunched numbers using slide rules (perfected
around 1880).

There’s no reason why such a pastiche of technologies from
different eras couldn’t work. We know this because what passes for modern
technology is a pastiche of the same kind, in which (for example) cars whose
basic design dates from the 1890s are gussied up with onboard computers
invented a century later. Much of modern technology, in fact, is old technology
with a new coat of paint and a few electronic gimmicks tacked on, and it’s old
technology that originated in many different eras, too. Part of what
differentiates modern technology from older equivalents, in other words, is
mere fashion. Another part, though, moves into more explosive territory.

In the conversation that followed last week’s post, one of
my readers—tip of the archdruid’s hat to Cathy—recounted the story of the one
and only class on advertising she took at college. The teacher invited a
well-known advertising executive to come in and talk about the business, and
one of the points he brought up was the marketing of disposable razors. The
old-fashioned steel safety razor, the guy admitted cheerfully, was a much
better product: it was more durable, less expensive, and gave a better shave
than disposable razors. Unfortunately, it didn’t make the kind of profits for
the razor industry that the latter wanted, and so the job of the advertising
company was to convince shavers that they really wanted to spend more money on
a worse product instead.

I know it may startle some people to hear a luxuriantly
bearded archdruid talk about shaving, but I do have a certain amount of
experience with the process—though admittedly it’s been a while. The executive
was quite correct: an old-fashioned safety razor gives better shaves than a
disposable. What’s more, an old-fashioned safety razor combined with a shaving
brush, a cake of shaving soap, a mug and a bit of hot water from the teakettle
produces a shaving experience that’s vastly better, in every sense, than what
you’ll get from squirting cold chemical-laced foam out of a disposable can and
then scraping your face with a disposable razor; the older method, furthermore,
takes no more time, costs much less on a per-shave basis, and has a drastically
smaller ecological footprint to boot.
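A back-of-envelope calculation makes the per-shave arithmetic concrete. All the prices below are invented placeholders, not actual market figures:

```python
def per_shave_cost(upfront: float, consumable_price: float,
                   shaves_per_consumable: int, total_shaves: int) -> float:
    """Average cost per shave over the lifetime of the kit."""
    consumables_used = total_shaves / shaves_per_consumable
    return (upfront + consumables_used * consumable_price) / total_shaves

# Safety razor: a durable handle bought once, plus cheap double-edge blades.
safety = per_shave_cost(upfront=30.0, consumable_price=0.25,
                        shaves_per_consumable=5, total_shaves=1000)

# Disposable razor: no durable part; the whole razor is the consumable.
disposable = per_shave_cost(upfront=0.0, consumable_price=1.50,
                            shaves_per_consumable=5, total_shaves=1000)

assert safety < disposable  # the durable kit wins on a per-shave basis
```

Even with generous assumptions in the disposable's favor, the upfront cost of the durable kit is amortized away within a few hundred shaves.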

Notice also the difference in the scale and complexity of
the technological suites needed to maintain these two ways of shaving. To shave
with a safety razor and shaving soap, you need the metallurgical suite that
produces razors and razor blades, the very simple household-chemistry suite
that produces soap, the ability to make pottery and brushes, and some way to
heat water. To shave with a disposable razor and a can of squirt-on shaving
foam, you need fossil fuels for plastic feedstocks, chemical plants to
manufacture the plastic and the foam, the whole range of technologies needed to
manufacture and fill the pressurized can, and so on—all so that you can count
on getting an inferior shave at a higher price, and the razor industry can
boost its quarterly profits.

That’s a small and arguably silly example of a vast and far
from silly issue. These days, when you see the words “new and improved” on a
product, rather more often than not, the only thing that’s been improved is the
bottom line of the company that’s trying to sell it to you. When you hear
equivalent claims about some technology that’s being marketed to society as a
whole, rather than sold to you personally, the same rule applies at least as often.
That’s one of the things that drove the enthusiastic conversations on this
blog’s comment page last week, as readers came out of hiding to confess that
they, too, had stopped using this or that piece of cutting-edge, up-to-date,
hypermodern trash, and replaced it with some sturdy, elegant, user-friendly
device from an earlier decade which works better and lacks the downsides of the
newer item.

What, after all, defines a change as “progress”? There’s a
wilderness of ambiguities hidden in that apparently simple word. The popular
notion of progress presupposes that there’s an inherent dynamic to history,
that things change, or tend to change, or at the very least ought to change,
from worse to better over time. That
presupposition then gets flipped around into the even more dubious claim that
just because something’s new, it must be better than whatever it replaced. Move
from there to specific examples, and all of a sudden it’s necessary to deal
with competing claims—if there are two hot new technologies on the market, is
option A more progressive than option B, or vice versa? The answer, of course,
is that whichever of them manages to elbow the other aside will be
retroactively awarded the coveted title of the next step in the march of
progress.

That was exactly the process by which the appropriate tech
of the 1970s was shoved aside and buried in the memory hole of our culture. In
its heyday, appropriate tech was as cutting-edge and progressive as anything
you care to name, a rapidly advancing field pushed forward by brilliant young
engineers and innovative startups, and it saw itself (and presented itself to
the world) as the wave of the future. In the wake of the Reagan-Thatcher
counterrevolution of the 1980s, though, it was retroactively stripped of its
erstwhile status as an icon of progress and consigned to the dustbin of the
past. Technologies that had been lauded in the media as brilliantly innovative
in 1978 were thus being condemned in the same media as Luddite throwbacks by
1988. If that abrupt act of redefinition reminds any of my readers of the way
history got rewritten in George Orwell’s 1984—“Oceania has never been
allied with Eurasia” and the like—well, let’s just say the parallel was noticed
at the time, too.

The same process on a much smaller scale can be traced with
equal clarity in the replacement of the safety razor and shaving soap with the
disposable razor and squirt-can shaving foam. In what sense is the latter,
which wastes more resources and generates more trash in the process of giving
users a worse shave at a higher price, more progressive than the former? Merely
the fact that it’s been awarded that title by advertising and the media. If
razor companies could make more money by reintroducing the Roman habit of
scraping beard hairs off the face with a chunk of pumice, no doubt that would
quickly be proclaimed as the last word in cutting-edge, up-to-date
hypermodernity, too.

Behind the mythological image of the relentless and
inevitable forward march of technology-in-the-singular in the grand cause of
progress, in other words, lies a murky underworld of crass commercial motives
and no-holds-barred struggles over which of the available technologies will get
the funding and marketing that will define it as the next great step in progress.
That’s as true of major technological programs as it is of shaving supplies.
Some of my readers are old enough, as I am, to remember when supersonic
airliners and undersea habitats were the next great steps in progress, until
all of a sudden they weren’t. We may not
be all that far from the point at which space travel and nuclear power will go
the way of Sealab and the Concorde.

In today’s industrial societies, we don’t talk about that.
It’s practically taboo these days to mention the long, long list of waves of
the future that abruptly stalled and rolled back out to sea without delivering
on their promoters’ overblown promises. Remind people that the same rhetoric
currently being used to prop up faith in space travel, nuclear power, or any of
today’s other venerated icons of the religion of progress was lavished just as
thickly on these earlier failures, and you can pretty much expect to have that
comment shouted down as an irrelevancy if the other people in the conversation
don’t simply turn their backs and pretend that they never heard you say
anything at all.

They have to do something of the sort, because the
alternative is to admit that what we call “progress” isn’t the impersonal,
unstoppable force of nature that industrial culture’s ideology insists it must
be. Pay attention to the grand technological projects that failed, compare them
with those that are failing now, and it’s impossible to keep ignoring certain
crucial if hugely unpopular points. To begin with, technological progress is a
function of collective choices—do we fund Sealab or the Apollo program?
Supersonic transports or urban light rail? Energy conservation and appropriate
tech or an endless series of wars in the Middle East? No impersonal force makes
those decisions; individuals and institutions make them, and then use the
rhetoric of impersonal progress to cloak the political and financial agendas
that guide the decision-making process.

What’s more, even if the industrial world chooses to invest
its resources in a project, the laws of physics and economics determine whether
the project is going to work. The Concorde is the poster child here, a
technological success but an economic flop that never even managed to cover its
operating costs. Like nuclear power, it was only viable given huge and
continuing government subsidies, and since the strategic benefits Britain and
France got from having Concordes in the air were nothing like so great as those
they got from having an independent source of raw material for nuclear weapons,
it’s not hard to see why the subsidies went where they did.

That is to say, when something is being lauded as the next
great step forward in the glorious march of progress leading humanity to a
better world, those who haven’t drunk themselves tipsy on folk mythology need
to keep four things in mind. The first is that the next great step forward in the glorious march of progress (etc.) might
not actually work when it’s brought down out of the billowing clouds of
overheated rhetoric into the cold hard world of everyday life. The second is
that even if it works, the next great step forward (etc.) may be a white
elephant in economic terms, and survive only so long as it gets propped up by
subsidies. The third is that even if it does make economic sense, the next great
step (etc.) may be an inferior product, and do a less effective job of meeting
human needs than whatever it’s supposed to replace. The fourth is that when it
comes right down to it, to label something as the next great (etc.) is just a
sales pitch, an overblown and increasingly trite way of saying “Buy this
product!”

Those necessary critiques, in turn, are all implicit in the
project of deliberate technological regress. Get past the thought-stopping
rhetoric that insists “you can’t turn back the clock”—to rephrase a comment of
G.K. Chesterton’s, most people turn back the clock every fall, so that’s hardly
a valid objection—and it becomes hard not to notice that “progress” is just a
label for whatever choices happen to have been made by governments and corporations,
with or without input from the rest of us. If we don’t like the choices that
have been made for us in the name of progress, in turn, we can choose something
else.

Now of course it’s possible to stuff that sort of thinking
back into the straitjacket of progress, and claim that progress is chugging
along just fine, and all we have to do is get it back on the proper track, or
what have you. This is a very common sort of argument, and one that’s been used
over and over again by critics of this or that candidate for the next (etc.).
The problem with that argument, as I see it, is that it may occasionally win
battles but it pretty consistently loses the war; by failing to challenge the
folk mythology of progress and the agendas that are enshrined by that
mythology, it guarantees that no matter what technology or policy or program
gets put into place, it’ll end up leading to the same place as all the others
before it, because it questions the means but forgets to question the goals.

That’s the trap hardwired into the contemporary faith in
progress. Once you buy into the notion that the specific choices made by
industrial societies over the last three centuries or so are something more
than the projects that happened to win out in the struggle for wealth and
power, once you let yourself believe that there’s a teleology to it all—that
there’s some objectively definable goal called “progress” that all these
choices did a better or worse job of furthering—you’ve just made it much harder
to ask where this thing called “progress” is going. The word “progress,”
remember, means going further in the same direction, and it’s precisely
questions about the direction that industrial society is going that most need
to be asked.

I’d like to suggest, in fact, that going further in the
direction we’ve been going isn’t a particularly bright idea just now. It isn’t even necessary to point to the more
obviously self-destructive dimensions of business as usual. Look at any trend
that affects your life right now, however global or local that trend may be,
and extrapolate it out in a straight line indefinitely; that’s what going
further in the same direction means. If that appeals to you, dear reader, then
you’re certainly welcome to it. I have
to say it doesn’t do much for me.

It’s only from within the folk mythology of progress that we have no choice but to accept the endless prolongation of current trends. Right now, as individuals, we can choose to shrug and walk away from the latest hypermodern trash, and do something else instead. Later on, on the far side of the crisis of our time, it may be possible to take the same logic further, and make deliberate technological regress a recognized policy option for organizations, communities, and whole nations—but that will depend on whether individuals do the thing first, and demonstrate to everyone else that it’s a workable option. In next week’s post, we’ll talk more about where that strategy might lead.

Wednesday, February 11, 2015

Over the last week or so, I’ve heard from a remarkable
number of people who feel that a major crisis is in the offing. The people in
question don’t know each other, many of them have even less contact with the
mass media than I do, and the sense they’ve tried to express to me is inchoate
enough that they’ve been left fumbling for words, but they all end up reaching
for the same metaphors: that something in the air just now seems reminiscent of
the American colonies in 1775, France in 1789, America in 1860, Europe in 1914,
or the world in 1939: a sense of being poised on the brink of convulsive
change, with the sound of gunfire and marching boots coming ever more clearly
from the dimly seen abyss ahead.

It’s not an unreasonable feeling, all things considered. In
Washington DC, Obama’s flunkies are beating the war drums over Ukraine,
threatening to send shipments of allegedly “defensive” weapons to join the
mercenaries and military advisors we’ve already not-so-covertly got over there.
Russian officials have responded to American saber-rattling by stating flatly
that a US decision to arm Kiev will be the signal for all-out war. The current
Ukrainian regime, installed by a US-sponsored coup and backed by NATO, means to
Russia precisely what a hostile Canadian government installed by a
Chinese-sponsored coup and backed by the People’s Liberation Army would mean to
the United States; if Obama’s trademark cluelessness leads him to ignore that
far from minor point and decide that the Russians are bluffing, we could be
facing a European war within weeks.

Head south and west from the fighting around Donetsk, and
another flashpoint is heating up toward an explosion of its own just now. Yes,
that would be Greece, where the new Syriza government has refused to back down
from the promises that got it into office: promises that center on the
rejection of the so-called “austerity” policies that have all but destroyed the
Greek economy since they were imposed in 2009. This shouldn’t be news to anyone; those same policies, though they’ve
been praised to the skies by neoliberal economists for decades now as a
guaranteed ticket to prosperity, have had precisely the opposite effect in
every single country where they’ve been put in place.

Despite that track record of unbroken failure, the EU—in
particular, Germany, which has benefited handsomely from the gutting of
southern European economies—continues to insist that Greece must accept what
amounts to a perpetual state of debt peonage. The Greek defense minister noted
in a recent speech, in response, that if Europe isn’t willing to cut a deal,
other nations might well do so. He’s quite correct; it’s probably a safe bet
that cold-eyed men in Moscow and Beijing are busy right now figuring out how
best to step through the window of opportunity the EU is flinging open for
them. If they do so—well, I’ll leave it to my readers to consider how the US is
likely to respond to the threat of Russian air and naval bases in Greece, which
would be capable of projecting power anywhere in the eastern and central
Mediterranean basin. Here again, war is a likely outcome; I hope that the Greek
government is braced for an attempt at regime change.

That is to say, the decline and fall of industrial
civilization is proceeding in the normal way, at pretty much the normal pace.
The thermodynamic foundations tipped over into decline first, as stocks of
cheap abundant fossil fuels depleted steadily and the gap had to be filled by
costly and much less abundant replacements, driving down net energy; the
economy went next, as more and more real wealth had to be pulled out of all
other economic activities to keep the energy supply more or less steady, until
demand destruction cut in and made that increasingly frantic effort moot; now a
global political and military superstructure dependent on cheap abundant fossil
fuels, and on the economic arrangement that all of that surplus energy made
possible, is cracking at the seams.

One feature of times like these is that the number of people
who can have an influence on the immediate outcome declines steadily as crisis
approaches. In the years leading up to 1914, for example, a vast number of
people contributed to the rising spiral of conflict between the aging British
Empire and its German rival, but the closer war came, the narrower the circle
of decision-makers became, until a handful of politicians in Germany, France,
and Britain had the fate of Europe in their hands. A few more bad decisions,
and the situation was no longer under anybody’s control; thereafter, the only
option left was to let the juggernaut of the First World War roll mindlessly
onward to its conclusion.

In the same way, as recently as the 1980s, many people in
the United States and elsewhere had some influence on how the industrial age
would end; unfortunately most of them backed politicians who cashed in the
resources that could have built a better future on one last round of absurd
extravagance, and a whole landscape of possibilities went by the boards. Step
by step, as the United States backed itself further and further into a morass
of short-term gimmicks with ghastly long-term consequences, the number of
people who have had any influence on the trajectory we’re on has narrowed
steadily, and as we approach what may turn out to be the defining crisis of our
time, a handful of politicians in a handful of capitals are left to make the
last decisions that can shape the situation in any way at all, before the tanks
begin to roll and the fighter-bombers rise up from their runways.

Out here on the fringes of the collective conversation of
our time, where archdruids lurk and heresies get uttered, the opportunity to
shape events as they happen is a very rare thing. Our role, rather, is to set
agendas for the future, to take ideas that are unthinkable in the mainstream
today and prepare them for their future role as the conventional wisdom of eras
that haven’t dawned yet. Every phrase on the lips of today’s practical men of
affairs, after all, was once a crazy notion taken seriously only by the lunatic
fringe—yes, that includes democracy, free-market capitalism, and all the other
shibboleths of our age.

With that in mind, while we wait to see whether today’s
practical men of affairs stumble into war the way they did in 1914, I propose
to shift gears and talk about something else—something that may seem whimsical,
even pointless, in the light of the grim martial realities just discussed. It’s
neither whimsical nor pointless, as it happens, but the implications may take a
little while to dawn even on those of my readers who’ve been following the last
few years of discussions most closely. Let’s begin with a handful of data
points.

Item: Britain’s largest bookseller recently noted that sales
of the Kindle e-book reader have dropped
like a rock in recent months, while sales of old-fashioned printed
books are up. Here in the more gizmocentric USA, e-books retain more of their
erstwhile popularity, but the bloom is off the rose; among the young and hip,
it’s not hard at all to find people who got rid of their book collections in a
rush of enthusiasm when e-books came out, regretted the action after it was too
late, and now are slowly restocking their bookshelves while their e-book
readers collect cobwebs or, at best, find use as a convenience for travel and
the like.

Item: more generally, a good many of the hottest new trends
in popular culture aren’t new trends at all—they’re old trends revived, in many
cases, by people who weren’t even alive to see them the first time around. Kurt
B. Reighley’s lively guide The
United States of Americana was the first, and remains the best,
introduction to the phenomenon, one that embraces everything from burlesque
shows and homebrewed bitters to backyard chickens and the revival of Victorian
martial arts. One pervasive thread that runs through the wild diversity of this
emerging subculture is the simple recognition that many of these older things
are better, in straightforwardly measurable senses, than their shiny modern
mass-marketed not-quite-equivalents.

Item: within that subculture, a small but steadily growing
number of people have taken the principle to its logical extreme and adopted
the lifestyles and furnishings of an earlier decade wholesale in
their personal lives. The 1950s are a common target, and so far as I know,
adopters of 1950s culture are the furthest along the process of turning into a
community, but other decades are increasingly finding the same kind of welcome
among those less than impressed by what today’s society has on offer.
Meanwhile, the reenactment scene has expanded spectacularly in recent years
from the standard hearty fare of Civil War regiments and the neo-medievalism of
the Society for Creative Anachronism to embrace almost any historical period you
care to name. These aren’t merely dress-up games; go to a
buckskinner’s rendezvous or an outdoor SCA event, for example, and you’re as
likely as not to see handspinners turning wool into yarn with drop spindles, a
blacksmith or two laboring over a portable forge, and the like.

Other examples of the same broad phenomenon could be added
to the list, but these will do for now. I’m well aware, of course, that most
people—even most of my readers—will have dismissed the things just listed as
bizarre personal eccentricities, right up there with the goldfish-swallowing
and flagpole-sitting of an earlier era. I’d encourage those of my readers who
had that reaction to stop, take a second look, and tease out the mental
automatisms that make that dismissal so automatic a part of today’s
conventional wisdom. Once that’s done, a third look might well be in order, because
the phenomenon sketched out here marks a shift of immense importance for our
future.

For well over two centuries now, since it first emerged as
the crackpot belief system of a handful of intellectuals on the outer fringes
of their culture, the modern ideology of progress has taken it as given that
new things were by definition better than whatever they replaced. That assumption stands at the heart of
contemporary industrial civilization’s childlike trust in the irreversible
cumulative march of progress toward a future among the stars. Finding ways to
defend that belief even when it obviously wasn’t true—when the latest, shiniest
products of progress turned out to be worse in every meaningful sense than the
older products they elbowed out of the way—was among the great growth
industries of the 20th century; even so, there were plenty of cases where
progress really did seem to measure up to its billing. Given the steady
increases of energy per capita in the world’s industrial nations over the last
century or so, that was a predictable outcome.

The difficulty, of course, is that the number of cases where
new things really are better than what they replace has been shrinking steadily
in recent decades, while the number of cases where old products are quite
simply better than their current equivalents—easier to use, more effective,
more comfortable, less prone to break, less burdened with unwanted side effects
and awkward features, and so on—has been steadily rising. Back behind the myth
of progress, like the little man behind the curtain in The Wizard of Oz,
stand two unpalatable and usually unmentioned realities. The first is that
profits, not progress, determine which products get marketed and which get
round-filed; the second is that making a cheaper, shoddier product and using
advertising gimmicks to sell it anyway has been the standard marketing strategy
across a vast range of American businesses for years now.

More generally, believers in progress used to take it for
granted that progress would sooner or later bring about a world where everyone
would live exciting, fulfilling lives brimful of miracle products and
marvelous experiences. You still hear that sort of talk from the faithful now
and then these days, but it’s coming to sound a lot like all that talk about
the glorious worker’s paradise of the future did right around the time the Iron
Curtain came down for good. In both cases, the future that was promised didn’t
have much in common with the one that actually showed up. The one we got
doesn’t have some of the nastier features of the one the former Soviet Union
and its satellites produced—well, not yet, at least—but the glorious consumer’s
paradise described in such lavish terms a few decades back got lost on the way
to the spaceport, and what we got instead was a bleak landscape of decaying
infrastructure, abandoned factories, prostituted media, and steadily declining
standards of living for everyone outside the narrowing circle of the
privileged, with the remnants of our once-vital democratic institutions hanging
above it all like rotting scarecrows silhouetted against a darkening sky.

In place of those exciting, fulfilling lives mentioned
above, furthermore, we got the monotony and stress of long commutes, cubicle
farms, and would-you-like-fries-with-that for the slowly shrinking fraction of
our population who can find a job at all. The Onion, with its usual
flair for packaging unpalatable realities in the form of deadpan humor, nailed
it a few days ago with a faux health-news article announcing that the
best thing office workers could do for their health is stand up at their desk,
leave the office, and never go back. Joke or not, it’s not bad
advice; if you have a full-time job in today’s America, the average medieval
peasant had a less stressful job environment and more days off than you do; he
also kept a larger fraction of the product of his labor than you’ll ever see.

Then, of course, if you’re like most Americans, you’ll numb
yourself once you get home by flopping down on the sofa and spending most of
your remaining waking hours staring at little colored pictures on a glass
screen. It’s remarkable how many people get confused about what this action
really entails. They insist that they’re experiencing distant places, traveling
in worlds of pure imagination, and so on through the whole litany of
self-glorifying drivel the mass media likes to employ in its own praise. Let us
please be real: when you watch a program about the Amazon rain forest, you’re
not experiencing the Amazon rain forest; you’re experiencing colored pictures
on a screen, and you’re only getting as much of the experience as fits through
the narrow lens of a video camera and the even narrower filter of the
production process. The difference between experiencing something and watching
it on TV or the internet, that is to say, is precisely the same as the
difference between making love and watching pornography; in each case, the
latter is a very poor substitute for the real thing.

For most people in today’s America, in other words, the
closest approach to the glorious consumer’s paradise of the future they can
expect to get is eight hours a day, five days a week of mindless, monotonous
work under the constant pressure of management efficiency experts, if they’re
lucky enough to get a job at all, with anything up to a couple of additional
hours commuting and any off-book hours the employer happens to choose to demand
from them thrown into the deal, in order to get a paycheck that buys a little less
each month—inflation is under control, the government insists, but prices
somehow keep going up—of products that get more cheaply made, more likely to be
riddled with defects, and more likely to pose a serious threat to the health
and well-being of their users, with every passing year. Then they can go home
and numb their nervous systems with those little colored pictures on the
screen, showing them bland little snippets of experiences they will never have,
wedged in there between the advertising.

That’s the world that progress has made. That’s the shining
future that resulted from all those centuries of scientific research and
technological tinkering, all the genius and hard work and sacrifice that have
gone into the project of progress. Of course there’s more to the consequences
of progress than that; progress has saved quite a few children from infectious
diseases, and laced the environment with so many toxic wastes that childhood
cancer, all but unheard of in 1850, is a routine event today; it’s made
impressive contributions to human welfare, while flooding the atmosphere with
greenhouse gases that will soon make far more impressive contributions to human
suffering and death—well, I could go on along these lines for quite a while.
True believers in the ideology of perpetual progress like to insist that all
the good things ought to be credited to progress while all the bad things ought
to be blamed on something else, but that’s not so plausible an article of faith
as it once was, and it bids fair to become a great deal less common as the
downsides of progress become more and more difficult to ignore.

The data points I noted earlier in this week’s post, I’ve
come to believe, are symptoms of that change, the first stirrings of wind that
tell of the storm to come. People searching for a better way of living than the
one our society offers these days are turning to the actual past, rather than
to some imaginary future, in that quest. That’s the immense shift I mentioned
earlier. What makes it even more momentous is that by and large, it’s not being
done in the sort of grim Puritanical spirit of humorless renunciation that
today’s popular culture expects from those who want something other than what
the consumer economy has on offer. It’s being done, rather, in a spirit of
celebration.

One of my readers responded to my
post two weeks ago on deliberate
technological regress by suggesting that I was proposing a Butlerian
jihad of sorts. (Those of my readers who don’t get the reference should pick up
a copy of Frank Herbert’s iconic SF novel Dune and read it.) I demurred,
for two reasons. First, the Butlerian jihad in Herbert’s novel was a revolt
against computer technology, and I see no need for that; once the falling cost
of human labor intersects the rising cost of energy and technology, and it
becomes cheaper to hire file clerks and accountants than to maintain the
gargantuan industrial machine that keeps computer technology available,
computers will go away, or linger as a legacy technology for a narrowing range
of special purposes until the hardware finally burns out.

The second reason, though, is the more important. I’m not a
fan of jihads, or of holy wars of any flavor; history shows all too well that
when you mix politics and violence with religion, any actual religious content
vanishes away, leaving its castoff garments to cover the naked rule of force and
fraud. If you want people to embrace a new way of looking at things,
furthermore, violence, threats, and abusive language don’t work, and it’s even
less effective to offer that new way as a ticket to virtuous misery, along the
lines of the Puritanical spirit noted above. That’s why so much of the
green-lifestyle propaganda of the last thirty years has done so little good—so
much of it has been pitched as a way to suffer self-righteously for the good of
Gaia, and while that approach appeals to a certain number of wannabe martyrs,
that’s not a large enough fraction of the population to matter.

The people who are ditching their Kindles and savoring books
as physical objects, brewing their own beer and resurrecting other old arts and
crafts, reformatting their lives in the modes of a past decade, or spending
their spare time reconnecting with the customs and technologies of an earlier
time—these people aren’t doing any of those things out of some passion for
self-denial. They’re doing them because these things bring them delights that
the shoddy mass-produced lifestyles of the consumer economy can’t match. What
these first stirrings suggest to me is that the way forward isn’t a Butlerian
jihad, but a Butlerian carnival—a sensuous celebration of the living world
outside the cubicle farms and the glass screens, which will inevitably draw
most of its raw materials from eras, technologies, and customs of the past,
which don’t require the extravagant energy and resource inputs that the modern
consumer economy demands, and so will be better suited to a future defined by
scarce energy and resources.

The Butlerian carnival isn’t the only way to approach the deliberate technological regression we need to carry out in the decades ahead, but it’s an important one. In upcoming posts, I’ll talk more about how this and other avenues to the same goal might be used to get through the mess immediately ahead, and start laying foundations for a future on the far side of the crises of our time.

Wednesday, February 04, 2015

I was saddened to learn a few days ago, via a phone call
from a fellow author, that William R. Catton Jr. died early last month, just
short of his 89th birthday. Some of my readers will have no idea who he was;
others may dimly recall that I’ve mentioned him and his most important book, Overshoot,
repeatedly in these essays. Those who’ve taken the time to read the book just
named may be wondering why none of the sites in the peak oil blogosphere has
put up an obituary, or even noted the man’s passing. I don’t happen to know the
answer to that last question, though I have my suspicions.

I encountered Overshoot for the first time in a
college bookstore in Bellingham, Washington in 1983. Red letters on a stark
yellow spine spelled out the title, a word I already knew from my classes in
ecology and systems theory; I pulled it off the shelf, and found the future
staring me in the face. This is what’s on the front cover below the title:

carrying capacity: maximum permanently supportable load.

cornucopian myth: euphoric belief in limitless resources.

drawdown: stealing resources from the future.

cargoism: delusion that technology will always save us from

overshoot: growth beyond an area’s carrying capacity, leading to

crash: die-off.

If you want to know where I got the core ideas I’ve been
exploring in these essays for the last eight-going-on-nine years, in other
words, now you know. I still have that copy of Overshoot; it’s sitting
on the desk in front of me right now, reminding me yet again just how many
chances we had to turn away from the bleak future that’s closing in around us
now, like the night at the end of a long day.

Plenty of books in the 1970s and early 1980s applied the
lessons of ecology to the future of industrial civilization and picked up at
least part of the bad news that results. Overshoot was arguably the best
of the lot, but it was pretty much guaranteed to land even deeper in the memory
hole than the others. The difficulty was that Catton’s book didn’t pander to
the standard mythologies that still beset any attempt to make sense of the
predicament we’ve made for ourselves; it provided no encouragement to what he
called cargoism, the claim that technological progress will inevitably allow us
to have our planet and eat it too, without falling off the other side of the
balance into the sort of apocalyptic daydreams that Hollywood loves to make
into bad movies. Instead, in calm, crisp, thoughtful prose, he explained how
industrial civilization was cutting its own throat, how far past the point of
no return we’d already gone, and what had to be done in order to salvage
anything from the approaching wreck.

As I noted in
a post here in 2011, I had the chance to meet Catton at an ASPO
conference, and tried to give him some idea of how much his book had meant to
me. I did my best not to act like a fourteen-year-old fan meeting a rock star,
but I’m by no means sure that I succeeded. We talked for fifteen minutes over
dinner; he was very gracious; then things moved on, each of us left the
conference to carry on with our lives, and now he’s gone. As the old song says,
that’s the way it goes.

There’s much more that could be said about William Catton,
but that task should probably be left for someone who knew the man as a
teacher, a scholar, and a human being. I didn’t; except for that one
fifteen-minute conversation, I knew him solely as the mind behind one of the
books that helped me make sense of the world, and then kept me going on the
long desert journey through the Reagan era, when most of those who claimed to
be environmentalists over the previous decade cashed in their ideals and waved
around the cornucopian myth as their excuse for that act. Thus I’m simply going
to urge all of my readers who haven’t yet read Overshoot to do so as
soon as possible, even if they have to crawl on their bare hands and knees over
abandoned fracking equipment to get a copy. Having said that, I’d like to go on
to the sort of tribute I think he would have appreciated most: an attempt to
take certain of his ideas a little further than he did.

The core of Overshoot, which is also the core of the
entire world of appropriate technology and green alternatives that got shot
through the head and shoved into an unmarked grave in the Reagan years, is the
recognition that the principles of ecology apply to industrial society just as
much as they do to other communities of living things. It’s odd, all things
considered, that this is such a controversial proposal. Most of us have no
trouble grasping the fact that the law of gravity affects human beings the same
way it affects rocks; most of us understand that other laws of nature really do
apply to us; but quite a few of us seem to be incapable of extending that same
sensible reasoning to one particular set of laws, the ones that govern how
communities of living things relate to their environments.

If people treated gravity the way they treat ecology, you
could visit a news website any day of the week and read someone insisting with
a straight face that while it’s true that rocks fall down when dropped, human
beings don’t—no, no, they fall straight up into the sky, and anyone who thinks
otherwise is so obviously wrong that there’s no point even discussing the
matter. That degree of absurdity appears every single day in the American
media, and in ordinary conversations as well, whenever ecological issues come
up. Suggest that a finite planet must by definition contain a finite amount of
fossil fuels, that dumping billions of tons of gaseous trash into the air every
single year for centuries might change the way that the atmosphere retains
heat, or that the law of diminishing returns might apply to technology the way
it applies to everything else, and you can pretty much count on being shouted
down by those who, for all practical purposes, might as well believe that the
world is flat.

Still, as part of the ongoing voyage into the unspeakable in
which this blog is currently engaged, I’d like to propose that, in fact, human
societies are as subject to the laws of ecology as they are to every other
dimension of natural law. That act of intellectual heresy implies certain
conclusions that are acutely unwelcome in most circles just now; still, as my regular
readers will have noticed long since, that’s just one of the services this blog
offers.

Let’s start with the basics. Every ecosystem, in
thermodynamic terms, is a process by which relatively concentrated energy is
dispersed into diffuse background heat. Here on Earth, at least, the
concentrated energy mostly comes from the Sun, in the form of solar
radiation—there are a few ecosystems, in deep oceans and underground, that get
their energy from chemical reactions driven by the Earth’s internal heat instead.
Ilya Prigogine showed some decades back that the flow of energy through a
system of this sort tends to increase the complexity of the system; Jeremy
England, an MIT physicist, has recently shown that the same process accounts
neatly for the origin of life itself. The steady flow of energy from source to
sink is the foundation on which everything else rests.

The complexity of the system, in turn, is limited by the
rate at which energy flows through the system, and this in turn depends on the
difference in concentration between the energy that enters the system, on the
one hand, and the background into which waste heat diffuses when it leaves the
system, on the other. That shouldn’t be a difficult concept to grasp. Not only
is it basic thermodynamics, it’s basic physics—it’s precisely equivalent, in
fact, to pointing out that the rate at which water flows through any section of
a stream depends on the difference in height between the place where the water
flows into that section and the place where it flows out.

Simple as it is, it’s a point that an astonishing number of
people—including some who are scientifically literate—routinely miss. A
while back on this blog, for example, I noted that one of the core
reasons you can’t power a modern industrial civilization on solar energy is
that sunlight is relatively diffuse as an energy source, compared to the
extremely concentrated energy we get from fossil fuels. I still field rants
from people insisting that this is utter hogwash, since photons have exactly
the same amount of energy they did when they left the Sun, and so the energy
they carry is just as concentrated as it was when it left the Sun. You’ll
notice, though, that if this were the only variable that mattered, Neptune would
be just as warm as Mercury, since each of the photons hitting the one planet
packs on average the same energetic punch as those that hit the other.

It’s hard to think of a better example of the blindness to
whole systems that’s pandemic in today’s geek culture. Obviously, the
difference between the temperatures of Neptune and Mercury isn’t a function of
the energy of individual photons hitting the two worlds; it’s a function of
differing concentrations of photons—the number of them, let’s say, hitting a
square meter of each planet’s surface. This is also one of the two figures that
matter when we’re talking about solar energy here on Earth. The other? That’s
the background heat into which waste energy disperses when the system, eco- or
solar, is done with it. On the broadest scale, that’s deep space, but
ecosystems don’t funnel their waste heat straight into orbit, you know. Rather,
they diffuse it into the ambient temperature at whatever height above or below
sea level, and whatever latitude closer or further from the equator, they
happen to be—and since that’s heated by the Sun, too, the difference between
input and output concentrations isn’t very substantial.
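The arithmetic behind the Mercury and Neptune comparison is simple to check. Here's a minimal sketch using the inverse-square law; the solar constant and orbital distances are standard reference values, rounded for illustration:

```python
# Solar flux falls off with the square of distance from the Sun.
# Reference values (approximate): solar constant at Earth ~1361 W/m^2,
# mean orbital distances in astronomical units (AU).
SOLAR_CONSTANT_AT_EARTH = 1361.0  # W/m^2 at 1 AU

def solar_flux(distance_au):
    """Watts of sunlight crossing one square meter, face-on, at a given distance."""
    return SOLAR_CONSTANT_AT_EARTH / distance_au ** 2

mercury = solar_flux(0.39)   # roughly 8,950 W/m^2
neptune = solar_flux(30.1)   # roughly 1.5 W/m^2

print(f"Mercury: {mercury:,.0f} W/m^2")
print(f"Neptune: {neptune:.1f} W/m^2")
print(f"Ratio: {mercury / neptune:,.0f} to 1")
```

The individual photons are the same; only their concentration per square meter differs, by a factor of roughly six thousand between the two planets.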

Nature has done astonishing things with that very modest
difference in concentration. People who insist that photosynthesis is horribly
inefficient, and of course we can improve its efficiency, are missing a crucial
point: something like half the energy that reaches the leaves of a green plant
from the Sun is put to work lifting water up from the roots by an ingenious
form of evaporative pumping, in which water sucked out through the leaf pores
as vapor draws up more water through a network of tiny tubes in the plant’s
stems. Another few per cent goes into the manufacture of sugars by
photosynthesis, and a variety of minor processes, such as the chemical
reactions that ripen fruit, also depend to some extent on light or heat from
the Sun; all told, a green plant is probably about as efficient in its total
use of solar energy as the laws of thermodynamics will permit.

What’s more, the Earth’s ecosystems take the energy that
flows through the green engines of plant life and put it to work in an
extraordinary diversity of ways. The water pumped into the sky by what
botanists call evapotranspiration—that’s the evaporative pumping I mentioned a
moment ago—plays critical roles in local, regional, and global water cycles.
The production of sugars to store solar energy in chemical form kicks off an
even more intricate set of changes, as the plant’s cells are eaten by
something, which is eaten by something, and so on through the lively but
precise dance of the food web. Eventually all the energy the original plant
scooped up from the Sun turns into diffuse waste heat and permeates slowly up
through the atmosphere to its ultimate destiny warming some corner of deep
space a bit above absolute zero, but by the time it gets there, it’s usually
had quite a ride.

That said, there are hard upper limits to the complexity of
the ecosystem that these intricate processes can support. You can see that
clearly enough by comparing a tropical rain forest to a polar tundra. The two
environments may have approximately equal amounts of precipitation over the course
of a year; they may have an equally rich or poor supply of nutrients in the
soil; even so, the tropical rain forest can easily support fifteen or twenty
thousand species of plants and animals, and the tundra will be lucky to support
a few hundred. Why? The same reason Mercury is warmer than Neptune: the rate at
which photons from the sun arrive in each place per square meter of surface.

Near the equator, the Sun’s rays fall almost
vertically. Close to the poles, since
the Earth is round, the Sun’s rays come in at a shallow angle, and thus are
spread out over more surface area. The ambient temperature’s quite a bit warmer
in the rain forest than it is on the tundra, but because the vast heat engine
we call the atmosphere pumps heat from the equator to the poles, the difference
in ambient temperature is not as great as the difference in solar input per
square meter. Thus ecosystems near the equator have a greater difference in
energy concentration between input and output than those near the poles, and the
complexity of the two systems varies accordingly.
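The angle effect is plain trigonometry: at equinox, the sunlight striking a square meter of level ground scales with the cosine of the latitude. A rough sketch, ignoring atmospheric absorption and axial tilt (both simplifying assumptions), with a round illustrative figure for face-on surface flux:

```python
import math

NOON_FLUX_FACE_ON = 1000.0  # W/m^2 reaching level ground face-on; a round illustrative figure

def noon_insolation(latitude_deg):
    """Approximate noon flux on level ground at equinox, scaled by the cosine of latitude."""
    return NOON_FLUX_FACE_ON * math.cos(math.radians(latitude_deg))

for place, lat in [("equatorial rain forest", 0), ("arctic tundra", 70)]:
    print(f"{place} (lat {lat}): ~{noon_insolation(lat):.0f} W/m^2")
```

At 70 degrees north the same sunlight is spread over roughly three times the ground area it covers at the equator, which is the concentration differential the two ecosystems have to work with.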

All this should be common knowledge. Of course it isn’t,
because the industrial world’s notions of education consistently ignore what
William Catton called “the processes that matter”—that is, the fundamental laws
of ecology that frame our existence on this planet—and approach a great many of
those subjects that do make it into the curriculum in ways that encourage the
most embarrassing sort of ignorance about the natural processes that keep us
all alive. Down the road a bit, we’ll be discussing that in much more detail.
For now, though, I want to take the points just made and apply them
systematically, in much the way Catton did, to the predicament of industrial
civilization.

A human society is an ecosystem. Like any other ecosystem, it depends for its
existence on flows of energy, and as with any other ecosystem, the upper limit
on its complexity depends ultimately on the difference in concentration between
the energy that enters it and the background into which its waste heat
disperses. (This last point is a corollary of White’s Law, one of the
fundamental principles of human ecology, which holds that a society’s economic
development is directly proportional to its consumption of energy per capita.) Until the beginning of the industrial
revolution, that upper limit was not much higher than the upper limit of
complexity in other ecosystems, since human ecosystems drew most of their
energy from the same source as nonhuman ones: sunlight falling on green plants. As human societies figured out how to tap
other flows of solar energy—windpower to drive windmills and send ships
coursing over the seas, water power to turn mills, and so on—that upper limit
crept higher, but not dramatically so.

The discoveries that made it possible to turn fossil fuels
into mechanical energy transformed that equation completely. The geological
processes that stockpiled half a billion years of sunlight into coal, oil, and
natural gas boosted the concentration of the energy inputs available to
industrial societies by an almost unimaginable factor, without warming the
ambient temperature of the planet more than a few degrees, and the huge
differentials in energy concentration that resulted drove an equally
unimaginable increase in complexity. Choose any measure of complexity you
wish—number of discrete occupational categories, average number of human beings
involved in the production, distribution, and consumption of any given good or
service, or what have you—and in the wake of the industrial revolution, it
soared right off the charts. Thermodynamically, that’s exactly what you’d
expect.

The difference in energy concentration between input and
output, it bears repeating, defines the upper limit of complexity. Other
variables determine whether or not the system in question will achieve that
upper limit. In the ecosystems we call human societies, knowledge is one of
those other variables. If you have a highly concentrated energy source and
don’t yet know how to use it efficiently, your society isn’t going to become as
complex as it otherwise could. Over the three centuries of industrialization,
as a result, the production of useful knowledge was a winning strategy, since
it allowed industrial societies to rise steadily toward the upper limit of
complexity defined by the concentration differential. The limit was never
reached—the law of diminishing returns saw to that—and so, inevitably,
industrial societies ended up believing that knowledge all by itself was
capable of increasing the complexity of the human ecosystem. Since there’s no
upper limit to knowledge, in turn, that belief system drove what Catton called
the cornucopian myth, the delusion that there would always be enough resources
if only the stock of knowledge increased quickly enough.

That belief only seemed to work, though, as long as the
concentration differential between energy inputs and the background remained
very high. Once easily accessible fossil fuels started to become scarce, and
more and more energy and other resources had to be invested in the extraction
of what remained, problems started to crop up. Tar sands and oil shales in
their natural form are not as concentrated an energy source as light sweet
crude—once they’re refined, sure, the differences are minimal, but a whole
system analysis of energy concentration has to start at the moment each energy
source enters the system. Take a cubic yard of tar sand fresh from the pit
mine, with the sand still in it, or a cubic yard of oil shale with the oil
still trapped in the rock, and you’ve simply got less energy per unit volume
than you do if you’ve got a cubic yard of light sweet crude fresh from the
well, or even a cubic yard of good permeable sandstone with light sweet crude
oozing out of every pore.
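The difference can be put in rough numbers. In the sketch below, both the energy densities and the extraction overheads are order-of-magnitude placeholders chosen to illustrate the whole-system point, not measured data:

```python
# Rough, illustrative energy content per cubic meter of raw material,
# before any processing (orders of magnitude only, not precise figures):
energy_density_gj = {
    "light sweet crude": 38.0,  # GJ/m^3, straight from the well
    "raw tar sand": 4.0,        # GJ/m^3, sand still in it, ~10% bitumen assumed
}

# Assume, purely for illustration, that extraction and upgrading consume
# 30% of the contained energy for tar sand versus 5% for conventional crude.
extraction_overhead = {"light sweet crude": 0.05, "raw tar sand": 0.30}

for source, gross in energy_density_gj.items():
    net = gross * (1 - extraction_overhead[source])
    print(f"{source}: gross {gross} GJ/m^3, net ~{net:.1f} GJ/m^3")
```

Whatever the exact figures, the structure of the calculation is the point: the energy spent digging, separating, and upgrading comes off the top before the resource delivers anything to the rest of the economy.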

It’s an article of faith in contemporary culture that such
differences don’t matter, but that’s just another aspect of our cornucopian
myth. The energy needed to get the sand out of the tar sands or the oil out of
the shale oil has to come from somewhere, and that energy, in turn, is not
available for other uses. The result, however you slice it conceptually, is
that the upper limit of complexity begins moving down. That sounds abstract,
but it adds up to a great deal of very concrete misery, because as already noted,
the complexity of a society determines such things as the number of different
occupational specialties it can support, the number of employees who are
involved in the production and distribution of a given good or service, and so
on. There’s a useful phrase for a sustained contraction in the usual measures
of complexity in a human ecosystem: “economic depression.”

The economic troubles that are shaking the industrial world
more and more often these days, in other words, are symptoms of a disastrous
mismatch between the level of complexity that our remaining concentration
differential can support, and the level of complexity that our preferred
ideologies insist we ought to have. As those two things collide, there’s no
question which of them is going to win. Adding to our total stock of knowledge
won’t change that result, since knowledge is a necessary condition for economic
expansion but not a sufficient one: if the upper limit of complexity set by the
laws of thermodynamics drops below the level that your knowledge base would
otherwise support, further additions to the knowledge base simply mean that
there will be a growing number of things that people know how to do in theory,
but that nobody has the resources to do in practice.

Knowledge, in other words, is not a magic wand, a surrogate
messiah, or a source of miracles. It can open the way to exploiting energy more
efficiently than otherwise, and it can figure out how to use energy resources
that were not previously being used at all, but it can’t conjure energy out of
thin air. Even if the energy resources are there, for that matter, if other
factors prevent them from being used, the knowledge of how they might be used
offers no consolation—quite the contrary.

That latter point, I think, sums up the tragedy of William Catton’s career. He knew, and could explain with great clarity, why industrialism would bring about its own downfall, and what could be done to salvage something from its wreck. That knowledge, however, was not enough to make things happen; only a few people ever listened, while most promptly plugged their ears and started chanting “La, la, la, I can’t hear you” once Reagan made that fashionable, and the actions that might have spared all of us a vast amount of misery never happened. When I spoke to him in 2011, he was perfectly aware that his life’s work had done essentially nothing to turn industrial society aside from its rush toward the abyss. That’s got to be a bitter thing to contemplate in your final hours, and I hope his thoughts were on something else last month as the night closed in at last.

Wednesday, January 28, 2015

All things considered, 2015 just isn’t shaping up to be a
good year for believers in business as usual. Since last week’s post here on
The Archdruid Report, the anti-austerity party Syriza has
swept the Greek elections, to the enthusiastic cheers of similar parties all
over Europe and the discomfiture of the Brussels hierarchy. The latter have no
one to blame for this turn of events but themselves; for more than a decade
now, EU policies have effectively put sheltering banks and bondholders from the
healthy discipline of the market ahead of all other considerations, including
the economic survival of entire nations. It should be no surprise to anyone
that this wasn’t an approach with a long shelf life.

Meanwhile, the fracking bust continues unabated. The number
of drilling rigs at work in American oilfields continues to drop vertically
from week to week, layoffs in the nation’s various oil patches are picking up
speed, and the price of oil remains down at levels that make further fracking a
welcome mat for the local bankruptcy judge. Those media pundits who are still
talking the fracking industry’s book keep insisting that the dropping price of
oil proves that they were right and those dratted heretics who talk of peak oil
must be wrong, but somehow those pundits never get around to explaining why
iron ore, copper, and most other major commodities are dropping in price even
faster than crude oil, nor why demand for petroleum products here in the US has
been declining steadily as well.

The fact of the matter is that an industrial economy built
to run on cheap conventional oil can’t run on expensive oil for long without
running itself into the ground. Since 2008, the world’s industrial nations have
tried to make up the difference by flooding their economies with cheap credit,
in the hope that this would somehow make up for the sharply increased amounts
of real wealth that have had to be diverted from other purposes into the
struggle to keep liquid fuels flowing at their peak levels. Now, though, the
laws of economics have called their bluff; the wheels are coming off one
national economy after another, and the price of oil (and all those other
commodities) has dropped to levels that won’t cover the costs of fracked oil,
tar sands, and the like, because all those frantic attempts to externalize the
costs of energy production just meant that the whole global economy took the
hit.

Now of course this isn’t how governments and the media are
spinning the emerging crisis. For that matter, there’s no shortage of people
outside the corridors of power, or of punditry, who ignore the
general collapse of commodity prices, fixate on oil outside of the broader
context of resource depletion in general, and insist that the change in the
price of oil must be an act of economic warfare, or what have you. It’s a logic
that readers of this blog will have seen deployed many times in the past:
whatever happens, it must have been decided and carried out by human beings. An
astonishing number of people these days seem unable to imagine the possibility
that such wholly impersonal factors as the laws of economics, geology, and
thermodynamics could make things happen all by themselves.

The problem we face now is precisely that the unimaginable is
now our reality. For just that little bit too long, too many people have
insisted that we didn’t need to worry about the absurdity of pursuing limitless
growth on a finite and fragile planet, that “they’ll think of something,” or
that chattering on internet forums about this or that or the other piece of
technological vaporware was doing something concrete about our species’
imminent collision with the limits to growth. For just that little bit too
long, not enough people were willing to do anything that mattered, and now
impersonal factors have climbed into the driver’s seat, having mugged all seven
billion of us and shoved us into the trunk.

As I noted in
last week’s post, that puts hard limits on what can be done in the
short term. In all probability, at this stage of the game, each of us will be
meeting the oncoming wave of crisis with whatever preparations we’ve made,
however substantial or insubstantial those happen to be. I’m aware that a
certain subset of my readers are unhappy with that suggestion, but that can’t
be helped; the future is under no obligation to wait patiently while we get
ready for it. A few years back, when I posted an essay here whose title sums
up the strategy I’ve been proposing, I probably should have put more
stress on the most important word in that slogan: now.
Still, that’s gone wherever might-have-beens spend their time.

That doesn’t mean the world is about to end. It means that
in all probability, beginning at some point this year and continuing for
several years after that, most of my readers will be busy coping with the
multiple impacts of a thumping economic crisis on their own lives and those of
their families, friends, communities, and employers, at a time when political
systems over much of the industrial world have frozen up into gridlock, the
simmering wars in the Middle East and much of the Third World seem more than
usually likely to boil over, and the twilight of the Pax Americana is pushing
both the US government and its enemies into an ever greater degree of
brinksmanship. Exactly how that’s going to play out is anyone’s guess, but no
matter what happens, it’s unlikely to be pretty.

While we get ready for the first shocks to hit, though, it’s
worth talking a little bit about what comes afterwards. No matter how long a train of financial
dominoes the collapse of the fracking bubble sets toppling, the last one will
fall eventually, and within a few years things will have found a “new normal,”
however far down the slope of contraction that turns out to be. No matter how
many proxy wars, coups d’etat, covert actions, and manufactured insurgencies
get launched by the United States or its global rivals in their struggle for
supremacy, most of the places touched by that conflict will see a few years at
most of actual warfare or the equivalent, with periods of relative peace before
and after. The other driving forces of collapse act in much the same way;
collapse is a fractal process, not a linear one.

Thus there’s something on the far side of crisis besides
more of the same. The discussion I’d like to start at this point centers on
what might be worth doing once the various masses of economic, political, and
military rubble stop bouncing. It’s not too early to begin planning for that.
If nothing else, it will give readers of this blog something to think about
while standing in bread lines or hiding in the basement while riot police and
insurgents duke it out in the streets. That benefit aside, the sooner we start
thinking about the options that will be available once relative stability
returns, the better chance we’ll have of being ready to implement them, in our
own lives or on a broader scale, when that time comes.

One of the interesting consequences of crisis, for that
matter, is that what was unthinkable before a really substantial crisis may not
be unthinkable afterwards. Read Barbara Tuchman’s brilliant The Proud
Tower and you’ll see how many of the unquestioned certainties of 1914
were rotting in history’s compost bucket by the time 1945 rolled around, and
how many ideas that had been on the outermost fringes before the First World
War had become plain common sense after the Second. It’s a common
phenomenon, and I intend to get ahead of the curve here by proposing, as raw
material for reflection if nothing else, something that’s utterly unthinkable
today but may well be a matter of necessity ten or twenty or forty years from
now.

What do I have in mind? Intentional technological regression
as a matter of public policy.

Imagine, for a moment, that an industrial nation were to
downshift its technological infrastructure to roughly what it was in 1950. That
would involve a drastic decrease in energy consumption per capita, both
directly—people used a lot less energy of all kinds in 1950—and
indirectly—goods and services took much less energy to produce then, too. It
would involve equally sharp decreases in the per capita consumption of most
resources. It would also involve a sharp increase in jobs
for the working classes—a great many things currently done by robots were done
by human beings in those days, and so there were a great many more paychecks
going out of a Friday to pay for the goods and services that ordinary consumers
buy. Since a steady flow of paychecks to the working classes is one of the
major things that keep an economy stable and thriving, this has certain obvious
advantages, but we can leave those alone for now.

Now of course the change just proposed would involve certain
changes from the way we do things. Air travel in the 1950s was extremely
expensive—the well-to-do in those days were called “the jet set,” because
that’s who could afford tickets—and so everyone else had to put up with fast,
reliable, energy-efficient railroads when they needed to get from place to
place. Computers were rare and expensive, which meant once again that more
people got hired to do jobs, and also meant that when you called a utility or a
business, your chance of getting a human being who could help you with whatever
problem you might have was considerably higher than it is today.

Lacking the internet, people had to make do instead with
their choice of scores of AM and shortwave radio stations, thousands of general
and specialized print periodicals, and full-service bookstores and local
libraries bursting at the seams with books—in America, at least, the 1950s were
the golden age of the public library, and most small towns had collections you
can’t always find in big cities these days. Oh, and the folks who like looking
at pictures of people with their clothes off, and who play a large and usually
unmentioned role in paying for the internet today, had to settle for naughty
magazines, mail-order houses that shipped their products in plain brown
wrappers, and tacky stores in the wrong end of town. (For what it’s worth, this
didn’t seem to inconvenience them any.)

As previously noted, I’m quite aware that such a project is
utterly unthinkable today, and we’ll get to the superstitious horror that lies
behind that reaction in a bit. First, though, let’s talk about the obvious
objections. Would it be possible? Of course. Much of it could be done by simple
changes in the tax code. Right now, in the United States, a galaxy of perverse
regulatory incentives penalize employers for hiring people and reward them for
replacing employees with machines. Change those so that spending money on
wages, salaries and benefits up to a certain comfortable threshold makes more
financial sense for employers than using the money to automate, and you’re
halfway there already.

A revision in trade policy would do most of the rest of
what’s needed. What’s jokingly called
“free trade,” despite the faith-based claims of economists, benefits the rich
at everyone else’s expense, and would best be replaced by sensible tariffs to
support domestic production against the sort of predatory export-driven
mercantilism that dominates the global economy these days. Add to that high
tariffs on technology imports, and strip any technology beyond the 1950 level
of the lavish subsidies that fatten the profit margins of the welfare-queen
corporations in the Fortune 500, and you’re basically there.

What makes the concept of technological regression so
intriguing, and so workable, is that it doesn’t require anything new to be
developed. We already know how 1950 technology worked, what its energy and
resource needs are, and what the upsides and downsides of adopting it would be;
abundant records and a certain fraction of the population who still remember
how it worked make that easy. Thus it would be an easy thing to pencil out
exactly what would be needed, what the costs and benefits would be, and how to
minimize the former and maximize the latter; the sort of blind guesses and
arbitrary assumptions that have to go into deploying a brand new technology
need not apply.

So much for the first objection. Would there be downsides to
deliberate technological regression? Of course. Every technology and every set
of policy options has its downsides. A
common delusion these days claims, in effect, that it’s unfair to take the downsides
of new technologies or the corresponding upsides of old ones into consideration
when deciding whether to replace an older technology with a newer one. An even
more common delusion claims that you’re not supposed to decide at all; once a
new technology shows up, you’re supposed to run bleating after it like everyone
else, without asking any questions at all.

Current technology has immense downsides. Future
technologies are going to have them, too—it’s only in sales brochures and
science fiction stories, remember, that any technology is without them. Thus
the mere fact that 1950 technology has problematic features, too, is not a
valid reason to dismiss technological retrogression. The question that needs to
be asked, however unthinkable it might be, is whether, all things considered,
it’s wiser to accept the downsides of 1950 technology in order to have a
working technological suite that can function on much smaller per capita inputs
of energy and resources, and thus a much better chance of getting through the
age of limits ahead, than to cling to today’s far more extravagant and brittle
technological infrastructure.

It’s probably also necessary to talk about a particular
piece of paralogic that comes up reliably any time somebody suggests
technological regression: the notion that if you return to an older technology,
you have to take the social practices and cultural mores of its heyday as well.
I fielded a good many such comments last year when I
suggested steam-powered Victorian technology powered by solar energy
as a form the ecotechnics of the future might take. An astonishing number of
people seemed unable to imagine that it was possible to have such a technology
without also reintroducing Victorian habits such as child labor and sexual
prudery. Silly as that claim is, it has deep roots in the modern imagination.

No doubt, as a result of those deep roots, there will be
plenty of people who respond to the proposal just made by insisting that the
social practices and cultural mores of 1950 were awful, and claiming that those
habits can’t be separated from the technologies I’m discussing. I could point
out in response that 1950 didn’t have a single set of social practices and
cultural mores; even in the United States, a drive from Greenwich Village to
rural Pennsylvania in 1950 would have met with remarkable cultural diversity
among people using the same technology.

The point could be made even more strongly by noting that
the same technology was in use that year in Paris, Djakarta, Buenos Aires,
Tokyo, Tangiers, Novosibirsk, Guadalajara, and Lagos, and the social practices
and cultural mores of 1950s middle America didn’t follow the technology around
to these distinctly diverse settings, you know. Pointing that out, though, will
likely be wasted breath. To true believers in the religion of progress, the
past is the bubbling pit of eternal damnation from which the surrogate messiah
of progress is perpetually saving us, and the future is the radiant heaven into
whose portals the faithful hope to enter in good time. Most people these days
are no more willing to question those dubious classifications than a medieval
peasant would be to question the miraculous powers that supposedly emanated from
the bones of St. Ethelfrith.

Nothing, but nothing, stirs up shuddering superstitious
horror in the minds of the cultural mainstream these days as effectively as the
thought of, heaven help us, “going back.” Even if the technology of an earlier
day is better suited to a future of energy and resource scarcity than the
infrastructure we’ve got now, even if the technology of an earlier day actually
does a better job of many things than what we’ve got today, “we can’t go back!”
is the anguished cry of the masses. They’ve been so thoroughly bamboozled by
the propagandists of progress that they never stop to think that, why, yes,
they can, and there are valid reasons why they might even decide that it’s the
best option open to them.

There’s a very rich irony in the fact that alternative and
avant-garde circles tend to be even more obsessively fixated on the dogma of
linear progress than the supposedly more conformist masses. That’s one of the
sneakiest features of the myth of progress; when people get dissatisfied with
the status quo, the myth convinces them that the only option they’ve got is to
do exactly what everyone else is doing, and just take it a little further than
anyone else has gotten yet. What starts off as rebellion thus gets coopted into
perfect conformity, and society continues to march mindlessly along its current
trajectory, like lemmings in a Disney nature film, without ever asking the
obvious questions about what might be waiting at the far end.

That’s the thing about progress; all the word means is
“continued movement in the same direction.” If the direction was a bad idea to
start with, or if it’s passed the point at which it still made sense,
continuing to trudge blindly onward into the gathering dark may not be the best
idea in the world. Break out of that mental straitjacket, and the range of
possible futures broadens out immeasurably.

It may be, for example, that technological regression to the
level of 1950 turns out to be impossible to maintain over the long term. If the
technologies of 1920can be supported on
the modest energy supply we can count on getting from renewable sources, for
example, something like a 1920 technological suite might be maintained over the
long term, without further regression. It might turn out instead that something
like the solar steampower I mentioned earlier, an ecotechnic equivalent of 1880
technology, might be the most complex technology that can be supported on a
renewable basis. It might be the case, for that matter, that something like the
technological infrastructure the United States had in 1820, with windmills and
water wheels as the prime movers of industry, canal boats as the core domestic
transport technology, and most of the population working on small family farms
to support very modest towns and cities, is the fallback level that can be
sustained indefinitely.

Does that last option seem unbearably depressing? Compare it to another very likely scenario—what will happen if the world’s industrial societies gamble their survival on a great leap forward to some unproven energy source, which doesn’t live up to its billing, and leaves billions of people twisting in the wind without any working technological infrastructure at all—and you may find that it has its good points. If you’ve driven down a dead-end alley and are sitting there with the front grille hard against a brick wall, it bears remembering, shouting “We can’t go back!” isn’t exactly a useful habit. In such a situation—and I’d like to suggest that that’s a fair metaphor for the situation we’re in right now—going back, retracing the route as far back as necessary, is the one way forward.

About JMG

John Michael Greer is the Grand Archdruid of the Ancient Order of Druids in America and the author of more than thirty books on a wide range of subjects, including peak oil and the future of industrial society. He lives in Cumberland, MD, an old red brick mill town in the north central Appalachians, with his wife Sara.
