Posts Tagged ‘archaic tech’

I keep thinking about the difference between pre-human and fully human — between Neanderthal, say, and us — and it’s occurred to me that the most significant distinction may be in terms of conscious control.

This idea grows out of a notion I’ve been toying with for years — that much of evolution has been a process of incremental internalization.

Hundreds of millions of years ago, for example, the hot new trend among primitive life-forms involved the development of bodily organs to take over functions that had previously been left up to the environment. Instead of passively allowing the sea to wash through them, they developed a digestive system that could pull in nutrients and eject the waste, and a circulatory system to carry oxygen and nutrients to every cell. As time passed, they added shells or skeletons for stability, limbs for propulsion, and various appendages for grasping.

For higher-order creatures, biological evolution was eventually supplemented by behavioral evolution — but the goal was still to replace chance with control. Instead of depositing their eggs in the sand like a turtle and leaving the hatchlings to fend for themselves, they might feed and protect their babies like a dinosaur or mammal. Or they might create a specialized mini-environment to provide greater safety and comfort: a nest, a burrow, or an ant hill.

Among our own human lineage, however, there has been a further step — where taking control becomes a matter not of more advanced instincts but of conscious thought.

A few months ago, I discussed recent findings on the ancient use of fire to produce a kind of glue which was used to attach stone axe heads to wooden handles. Now another early example of the sophisticated use of fire in tool-making has been described — and like the glue, it comes from South Africa and has been dated to around 72,000 years ago.

The technique in question involves heat-treating a yellowish stone called silcrete, which is not well-adapted to tool-making, so that it turns a deep, glossy red and is very easily flaked.

It occurred to me after doing the previous entry that someone with a skeptical eye might object that I seem to be cherry-picking my examples in order to make a case for the prehistoric origins of mystical beliefs and practices.

In fact, from my point of view, it’s quite the opposite. Until recently, I shared the prevailing assumption that sophisticated intellectual and philosophical systems go back to not much earlier than 500 BC and that human knowledge before then was relatively unsystematic, intuitive, or “mythopoeic.”

It’s been only with the emergence of geek culture over the last couple of years that I’ve become convinced that geeks as a personality type have existed since the origins of modern humanity. (I doubt there were Neanderthal geeks — there’s certainly no sign of them in the archaeological record — which may be why we’re still here, for all our flaws, and they’re not.)

And it’s in the nature of geeks to mess around with stuff, try to make sense of it, create intellectual systems of dizzying complexity to explain it, exchange wild metaphysical speculations with their fellow geeks, and generally geek out to the max at any opportunity.

Consider the Mayan calendar as an example of geekitude run amok. Or the I Ching. Or the pyramids. Geeks just can’t help themselves. Intellectual complexity mated to metaphysical subtlety is what they do. It’s the water in which they swim.

So, no, I’m not cherry-picking my examples. I’m just being struck by the fact that there are signs saying “Geeks at Work” in big flashing letters all over the archaeological record.

According to a story at Wired this week, “Researchers who reverse-engineered an ancient superglue have found that Stone Age people were smarter than we thought. Making the glue, originally used on 70,000-year-old composite tools, clearly required high-level cognitive powers.”

That’s pretty neat in itself — even though this whole “smarter than we thought” business does tend to inspire kind of a “What You Mean ‘We,’ Kimosabe” reaction.

But the really interesting part is how this superglue was created. It seems that when the researchers tried to use acacia gum — of which they’d found traces on the ancient stone tools — to attach their replicas to wooden handles, it didn’t work. It wasn’t until they added in the iron-rich pigment of which they’d also found traces that everything held together.

“Making the glue required much more than simple mixing,” the Wired article continues. “It demanded careful and sustained attention. Keeping the fire at the right temperature required certain types of wood, with a certain degree of moisture content. If glues were mixed too close to the fire, they contained air bubbles. If too dry, they weren’t cohesive; if too wet, they were weak. The Sibudu Cave’s Stone Age inhabitants, wrote the researchers, were ‘competent chemists, alchemists and pyrotechnologists.’”

Yeah — alchemists. Their word, not mine. But it was very aptly chosen.

The standard twentieth century story would have us believe that very little happened between the development of agriculture and the rise of civilization — that it was a time of simple peasant villages, with few interests beyond securing the next harvest and few inventions besides ingenious domestic devices, like the technology of churning butter or spinning flax.

However, it is now becoming clear that the period from roughly 7000 to 4000 BC gave rise to one of humanity’s greatest intellectual breakthroughs — the first scientific cosmology. The movements of the sun and stars were closely tracked and were found to mark out an astonishingly precise four-fold partition of both space (the cardinal directions — north, south, east, and west) and time (the winter and summer solstices and the spring and fall equinoxes).

This recognition of a profound cosmic order underlying the flux of everyday life completely revolutionized art and religion, transforming the domain of human consciousness from a world of whim and accident to one of order and predictability.

This electrifying discovery has generally been attributed to simple farmers seeking clues on when to plant their crops. It has been described as an inevitable insight once people had settled down to the land and could watch the rising point of the sun move back and forth along the same mountain year after year.

But for those who lived through it, this intellectual revolution was far from obvious. It took careful long-term observation and analysis, carried out over many lifetimes, and the radical alteration in perspective which resulted was as dramatic and unexpected in its own time as those resulting from the Copernican system in the 16th century or Einstein’s relativity in the 20th.

The origins of agriculture could be explained to the satisfaction of twentieth century materialists as resulting from a series of accidental discoveries, refined by natural selection. However, other achievements of the Neolithic, like the construction of cities and the rise of complex states, were not so easily dismissed as unintended adaptations to circumstances. It is very hard to build a city by accident.

In order to complete their mechanistic model, the archaeologists were thus forced to turn from Darwin to Karl Marx.

In Marx’s theory of historical materialism, all social change starts with changes in the means of production. Everything else, from government to religion, is merely a cultural superstructure erected upon the hard foundation of economics.

From this point of view, once the “Neolithic Revolution” had altered the way in which people met their basic needs, a whole array of other changes became inevitable, including all aspects of the “Urban Revolution.” It was just that simple.

However, this ultra-deterministic view of historical causality, which always involved a certain amount of hand-waving, has now been completely undermined by new data. It is becoming obvious that people were already living in villages before they began taming wild plants, and that even a few fair-sized cities were built by people who practiced relatively little agriculture. It is beginning to seem as though subjective factors, such as an active desire for the advantages of urban life, may have preceded and been the cause of the shift to farming, rather than its outcome.

When modern humans first ventured out of East Africa some eighty or a hundred thousand years ago, they were few and the world was very large. For tens of thousands of years, they were free to wander at will, always seeking the next horizon. Driven by curiosity and a spirit of adventure, they spread over the entire planet with amazing speed.

Eventually Homo sapiens filled every corner of the Earth, from England to Tierra del Fuego — and at that point things started to get crowded. Suddenly people were having to deal with nearby neighbors, who might even be competitors, and they could no longer just pull up stakes and move to the next valley. Instead, they had to apply their ingenuity and make do with what was available.

The archaeological evidence shows a far more intensive exploitation of resources starting in the late Ice Age. People at that time learned how to spear fish and snare birds, they increased their level of cooperative interaction with dogs, and they worked hard at finding useful new food plants.

Most important of all, they began to take increased control of their environment. They became gardeners, altering local conditions to encourage the growth of plants which they favored. They developed the habit of bringing home interesting samples to replant and of herding animals to keep them close at hand. And they devised novel ways of processing plants which had previously been unpalatable, or even toxic, and turning them into healthy and delicious meals.

As long as prehistoric hunter-gatherers were regarded as simple-minded savages, it was difficult to imagine how they could have come up with the radical innovations that marked the onset of the Neolithic. The typical response of twentieth century archaeologists to this problem was to deny that any genuine creativity had been involved. Instead, they did their best to reduce the profound social and technological transformations of the early Neolithic to an almost entirely automatic process, driven by impersonal environmental forces and requiring little or no actual thought or planning.

The primary model on which archaeologists based this analysis was that of Darwinian evolution. Each small step towards agriculture was considered as a kind of random mutation which one hunting group or another could have stumbled onto by accident. In accordance with the principle of survival of the fittest, those groups which adopted practices that increased their food supply would have prospered at the expense of those which did not. In the course of time, the natural superiority of farming would have ensured its dominance over hunting.

This scenario may have seemed convincing to twentieth century materialists, but there were any number of problems with it, not least the delicate question of just what constitutes evolutionary fitness. Recent studies of both ancient and contemporary hunters and farmers have shown that farmers work harder, have a less nutritious diet, and die younger than hunter-gatherers. Rather than taking the superiority of farming for granted, archaeologists are now struggling to answer the question of why hunters would have voluntarily given up their freedom and leisure in order to become peasants bound to the soil.

But perhaps the most profound difficulty with the standard twentieth century account of the Neolithic was the way it cast this dramatic retooling of human society as being the work of terminally clueless idiots. Farming, for example, was supposed to have resulted when some half-bright caveman noticed useful crops flourishing on the trash-heap of a former campsite and got the daring notion that they had grown there from discarded seeds. Pottery was similarly supposed to have been discovered when a lump of clay accidentally fell in the fire.