Tag Archives: Plato

‘It lifts and separates’ is a slogan that will be familiar to those of my generation – it was advertised as the chief virtue of the Playtex ‘Cross Your Heart’ bra. However, it also serves as a memorable illustration of my theory concerning the origin of what we think of as Language.

The conventional account presents Language, as we have it now, as evolved speech, i.e. its origins go back to our first utterance, with the acquisition of a written form for transcribing it a logical development that occurs in due course – around five thousand years ago – but only after speech has held sway as the primary means of human communication for a couple of hundred thousand years.

However, I think that is not what happened; in particular, the notion that speech was the original and primary means of human communication, occupying a position analogous to what we call Language (both spoken and written) today, is an erroneous backwards projection, based on the status that speech only now enjoys.

The conventional account could be summed up briefly thus: ‘First, we learned to speak, and that is what made us human and marked us out as special; then we learned to write down what we said in order to give it permanent form, and that enabled us to store the knowledge and wisdom which has enabled us to achieve our present pre-eminence.’

However, there are good reasons to suppose that the eminence currently enjoyed by speech actually results from the invention of writing and its impact on human expression – what I would call the Playtex Moment, because the effect of that impact was to lift speech above the rest of human expression, and separate it.

Prior to the invention of writing, and indeed for a good time after it, since its impact was far from immediate, speech was, I would say, simply one aspect of human expression, and by no means the most important. By ‘human expression’ I mean the broad range of integrated activity – facial expression, gesture, posture and bodily movement, and a range of sounds, including speech – which human beings use to express their thoughts and feelings. Bear in mind that up to the invention of writing (and for a good time after it) speech was always part of some larger activity, to which it contributed, but did not (I would assert) dominate.

My ground for supposing this is that it is only through the effort to give speech a written form (which probably did not start to happen till writing had been around for a thousand years) that we come to study it closely, and to analyse it. I suggest there are two reasons for this – the first is that it was not possible to study speech till it was given a permanent, objective form; the second is that the need to analyse speech is part and parcel of the process of giving it written form. Crucially, it is only in writing that the notion of the word as a component of speech arises; speech naturally presents as an uninterrupted flow – rhythm and emphasis are of significance, but word separation is not. Word separation – which not every writing system uses – is a feature of writing, not speech.

In the same way, the whole analysis of speech in terms of the relations between words – grammar – arises from writing (for the good reason that it is only through writing that we can become aware of it). It is the understanding of Language that arises through the development of writing as a tool to transcribe speech that elevates and separates speech from the other forms with which it has hitherto been inseparably bound up.

The notion that we invented writing expressly to transcribe speech does not bear examination*: it was invented for the lesser task of making lists and inventories, as a way of storing information. It was only very gradually that we began to realise its wider potential (the earliest instance of anything we might call literature occurs a good thousand years after the first appearance of writing). Rather than writing being a by-product of speech, speech – as we now know it, our primary mode of communication and expression – is a by-product of writing.

And that is why writing is like a Playtex bra: it lifts and separates speech from all the other forms of human expression – but also (to push the analogy to its limits, perhaps) offers a degree of support that is only bought at the expense of containment and subjugation.

The interesting corollary is that if our present mode of thinking is Language-based – in the sense of ‘Language’ that is used here, a fusion of writing and speech – then that, too, is a relatively recent development**; however much it might seem second nature to us, it is just that – second nature: our natural mode of thought – instinctive and intuitive, developed by our ancestors over several hundred thousand years – must be something quite other, with a different foundation (which is, I would suggest, metaphor).

*if you doubt this, examine it: ask how such an idea would first have occurred, that things should be written down and given a permanent form – to remember them? People had been remembering without the aid of writing for thousands of years – why should they suddenly feel the need to devise an elaborate system to do something they could do perfectly well already? Ask also why, of all things, they would select speech as the one thing to make a record of – only if it were the sort of speech we have now – very much formed and influenced by writing – would it seem the obvious thing to record. Finally, ask how they would go about it – devising a script for the purpose of recording speech requires the sort of analysis of speech we can only acquire through already having devised such a script.

**do not underestimate what I mean by this: it is more than using words to think with. It is the complete model of the world as an objective reality existing independently and governed by logic and reason and all that stems from that; all that can be shown to derive from Language, which in turn arises from the impact of writing on human expression, a process that is initiated about two and a half thousand years ago, in classical Greece, by Plato and Aristotle.

‘The sun is actually white, it just appears yellow to us through the Earth’s atmosphere.’

This is a line that appeared on Facebook a while ago, courtesy of my friend Else Cederborg, who posts all sorts of curious and interesting things.

It is a common form of argument that most will readily understand and generally accept without thinking too hard about it. Yet the sun is not actually white, nor does it just appear yellow, facts you can easily check for yourself.

Depending on the time of day and the atmospheric conditions, the sun is variously deep blood red, orange, dusky pink and (behind a veil of mist or thin cloud) a sort of milky white; much of the time it has an incandescent brilliance which can hardly be called a colour since we cannot bear to look at it directly. Though it may appear yellow, the usual place to encounter a yellow sun is in a representation of it: a child’s drawing, for instance (against a blue sky above the square white house, surrounded by green grass, with its four four-paned windows and red triangular roof with single chimney belching smoke) or else in the icons used in weather forecasting.

How we come to accept an argument that runs counter to all experience, and perversely insists on a condition for actuality (being seen outside the Earth’s atmosphere) which few of us will ever experience, is a case worth examining.

At the heart of it is that curious human thing, a convention (I say ‘human’ though it might be that other creatures have conventions, but that is for another time). A convention, effectively, is an agreement to treat things as other than they are – it is a form of pretence. This definition might seem surprising, since what is usually emphasised in conventions is their arbitrary character – the typical example being which side of the road we drive on, which varies from country to country.

However, what makes the rule of the road a convention is that we endow it with force, a force that it does not actually have: we say that you must drive on the left (or the right); that you have to – even though any one of us can show this not to be the case. Of course, if you engage in such a demonstration, you might find yourself liable to a range of legal sanctions, or worse still, involved in an accident. The fact that the rule has to be backed by force proves that it has no intrinsic force; on the other hand, the risk of accident shows that the pretence is no mere whim, but sound common sense: conventions are there because they are useful.

A great deal of human behaviour is conventional if you look at it closely: things are deemed to be the case which are not, actually (international borders are a good example – again, these are arbitrary, but it is the power with which they are endowed that makes them conventions). In this regard, it is worth recalling a remark that Wittgenstein makes somewhere (Philosophical Investigations, I think) to the effect that ‘explanation has to come to an end somewhere’. In other words, despite what we tell children, ‘just because’ is an answer. Conventions are so because we say they are so.

So what colour is the sun?

That takes us to the boldest and most fundamental of conventions, the one that underpins our standard way of thinking about the world, for which we have Plato to thank. Plato insists at the outset on discrediting the senses, saying that they deceive us, giving us Appearance only, not Reality.

This is a move of breathtaking boldness, since effectively it dismisses all experience as ‘unreal’ – as a starting position for philosophy, it ought to be quite hopeless, since if we cannot draw on experience, where are we to begin? If reality is not what we actually experience, then what can it be?

However, there is a different angle we can consider this from, which makes it easier to understand how it has come to be almost universally adopted. What Plato is proposing (though he does not see this himself) is that we should view the world in general terms: that we should not allow ourselves to be distracted by the individual and particular, but should see things as multiple instances of a single Idea or Form; or, as Aristotle develops it, as members of a class which can be put under a single general heading: trees, flowers, cats, horses, at one level, plants and quadrupeds at another, and so on.

Implied in this is the elimination of the subject: the world is not as it appears to me or to you or to any particular individual (since particular individuals exist only at the specific level); the world is to be considered as objective, as it is ‘in itself’, i.e. as a general arrangement that exists independently of any observation.

This is a very useful way of looking at things even though it involves a contradiction. Its utility is demonstrated by the extraordinary progress we have made in the 2,500 years or so since we invented it. It is the basis of science and the foundation of our systems of law and education.

Effectively, it starts by accepting the necessary plurality of subjective experience: we see things differently at different times – the sun is sometimes red, sometimes pink, sometimes incandescent; different people have different experiences: one man’s meat is another man’s poison. In the event of dispute, who is to have priority? That problem looks insoluble (though the extent to which it is an actual problem might be questioned).

So, we must set subjective experience and all that goes with it to one side, and rule it out as the basis of argument. Instead, we must suppose that the world has a form that exists independently of us and is not influenced by our subjective observation: we posit a state of ‘how things actually are in themselves’ which is necessarily the same and unchanging, and so is the same for everyone regardless of how it might appear.

This is how we arrive at the position stated at the outset, that the sun is ‘actually’ white, and that its ‘actual state’ should be taken as ‘how it appears from space,’ even though we live on earth and seldom leave it.

There is a confusion here, which has surfaced from time to time in the history of philosophy since Plato’s day. The strict Platonist would dismiss the whole question of the sun’s true colour in terms akin to Dr Johnson’s, on being asked which of two inferior poets (Derrick or Smart) was the greater: ‘Sir, there is no settling the point of precedency between a louse and a flea.’ Colour belongs to the world of Appearance, not the world of Forms or Ideas.

However, as the insertion of the argument about the earth’s atmosphere shows, we feel the need of a reason to dismiss the evidence of our own eyes (and this is where that little sleight-of-mind about ‘just appearing yellow’ comes in – so that we are not tempted to look too closely at the sun as it actually appears, we are fobbed off with a conventional representation of it that we have been accepting since childhood). The argument is that the atmosphere acts as a filter, much as if we looked at a white rose through coloured glass and variously made it appear green or red or blue: so just as we agree the rose is ‘actually’ white, so too the sun’s ‘actual’ colour must be as it appears unfiltered.

This, however, is specious. Leaving aside the fact that the rose is certainly not white (as you will soon discover if you try to paint a picture of it) at least our choice of default colour can be justified by adding ‘under normal conditions of light’ which most people will accept; but ‘as it appears outside the earth’s atmosphere’ can hardly be called ‘normal conditions of light.’

What has happened here is that the subjective ‘filter’ which Plato wished to circumvent by eliminating the subject – ‘let us ignore the fact that someone is looking, and consider only what he sees’ – has reappeared as an actual filter, so that the whole appearance/reality division has been inserted into the objective world by an odd sort of reverse metaphor. The need to give priority to one of the many possible states of the sun is still felt, and it is solved by arbitrarily selecting ‘the sun as it appears from space’ as its actual state, because that seems logical (though in fact it is not).

The subjectivity of colour in particular, and the subject-dependence of all perception in general, is like a submerged reef that appears at various periods in the sea of philosophy. Locke is troubled by colour, which he wants to class as a ‘secondary quality’ since it evidently does not inhere in the object, as he supposes the primary qualities – solidity, extension, motion, number and figure – to do. Colour is ranked with taste, smell and sound as secondary, which is just a restatement of Plato’s rejection of the senses in other terms.

Berkeley, however, sees that this is a specious distinction and gets to the heart of the matter with his axiom esse est percipi – ‘to be is to be perceived’. Everything we know requires some mind to perceive it: a point of view is implicit in every account we give. There can be no objective reality independent of a subject (as the terms themselves imply, since each is defined in terms of the other).

The response to this is illuminating. Berkeley’s dictum is rightly seen as fatal to the notion of an objectively real world, but instead of accepting this and downgrading Plato’s vision to a conventional and partial account employed for practical purposes, every effort is made to preserve it as the sole account of how things are, the one true reality.

Berkeley’s own position is to suppose that only ideas in minds exist, but that everything exists as an idea in the mind of God, so there is a reality independent of our minds, though not of God’s (a neat marriage of philosophy and orthodox Christianity – Berkeley did become a bishop in later life; the Californian university town is named after him).

Kant’s solution is to assert that there is an independent reality – the ding-an-sich, or ‘thing-in-itself’ – but that logically it must be inaccessible to us: we know it is there (presumably as an article of faith) but we cannot know what it is like.

Schopenhauer’s solution is the most ingenious, and to my mind the most satisfying. He agrees that we cannot know the ding-an-sich as a general rule, but there is one notable exception: ourselves. We are objects in the world and so are known like everything else, via the senses – this is Plato’s world of Appearance, which for Schopenhauer is the World as Representation (since it is re-presented to our minds via our senses). But because we are conscious and capable of reflecting, we are also aware of ourselves as subjects – not via the senses (that would make us objects) but directly, by being us. (To put it in terms of pronouns: you know me and I know you, but I am I; I am ‘me’ to you (and in the mirror) but I am ‘I’ to myself.)

And what we are aware of, Schopenhauer says, is our Will; or rather, that our inner nature, as it were, is will: the will to exist, the urge to be, and that this is the inner nature of all things – the elusive ding-an-sich: they are all objective manifestations of a single all-pervasive Will (which happens in us uniquely to have come to consciousness).

This idea is not original to Schopenhauer but is borrowed from Eastern, specifically Hindu and Buddhist, philosophy, which belongs to a tradition separate from (and possibly earlier than) Platonic thought (it is interesting to note that a very similar way of looking at things has likewise been borrowed by evolutionary biologists, such as Richard Dawkins, with the notion of ‘the selfish gene’ taking the place of Schopenhauer’s ‘Will’ as the blind and indifferent driving force behind all life).

I find Schopenhauer’s account satisfactory (though not wholly so) because it is a genuine attempt to give an account of the World as we experience it, one that reconciles all its elements (chiefly, us as subjects with our objective perceptions), rather than the conventional account of Plato (and all who have followed him), which proceeds by simply discounting the subject altogether, effectively dismissing personal experience and reducing us to passive observers. Although the utility of this convention cannot be denied (in certain directions, at any rate), its inherent limitations make it inadequate for considering many of the matters that trouble us most deeply, such as those that find expression in religion and art. And if, like those who wish to tell us that the sun is ‘actually’ white, we mistake the conventional account for an actual description of reality*, we end up dismissing the things that trouble us most deeply as ‘merely subjective’ and ‘not real’ – which is deleterious, since they continue to trouble us deeply, but that trouble has now been reclassified as an imaginary ailment.

*perhaps I should say ‘the world’ or ‘what there is’. There is a difficulty with ‘reality’ and ‘real’ since they are prejudiced in favour of the convention of an objective world: this becomes clearer when you consider that they could be translated as ‘thingness’ and ‘thing-like’. The paradigm for ‘reality’ then becomes the stone that Dr Johnson kicked with such force that he rebounded from it, i.e. any object in the world, so that the subject and the subjective aspect of experience is already excluded.

‘Like, yet unlike,’ is Merry’s comment in The Lord of the Rings when he first sees Gandalf and Saruman together: Gandalf, returned from the dead, has assumed the white robes formerly worn by Saruman, who has succumbed to despair and been corrupted by evil and is about to be deposed. So we have two people who closely resemble one another yet are profoundly different in character.

Scene: a school classroom. Enter an ancient shuffling pedagogue. He sets on his desk two items. The first is a picture depicting a scene from the days of empire, with a khaki-clad officer of the Camel Corps holding a horde of savage Dervishes at bay, armed only with a service revolver.

Teacher (in cracked wheezing voice): The sand of the desert is sodden red,—
Red with the wreck of a square that broke; —
The Gatling’s jammed and the Colonel dead,
And the regiment blind with dust and smoke.
The river of death has brimmed his banks,
And England’s far, and Honour a name,
But the voice of a schoolboy rallies the ranks:
‘Play up! play up! and play the game!’

Cackling to himself, he unveils his second prop, a glass case in which a stuffed domestic tabby cat – now rather moth-eaten, alas! – has been artfully disguised to give it the appearance of a (rather small) African lion.

Teacher (as before): The lion, the lion
he dwells in the waste –
he has a big head,
and a very small waist –
but his shoulders are stark
and his jaws they are grim:
and a good little child
will not play with him!

Once recovered from his self-induced paroxysm of mirth, almost indistinguishable from an asthma attack, he resumes what is evidently a familiar discourse.

Teacher: We remember, children, that whereas the simile (put that snuff away, Hoyle, and sit up straight) says that one thing is like another, the metaphor says that one thing is another, in this case that the soldier was a lion in the fight. Now in what respects was he a lion? It can scarcely be his appearance, though I grant that his uniform has a tawny hue not dissimilar to the lion’s pelt; certes, he has no shaggy mane (did I say something amusing, Williams? stop smirking, boy, and pay attention) and instead of claws and teeth he has his Webley .45 calibre revolver. Nonetheless, he displays a fearless courage in the face of great odds that is precisely the quality for which the King of Beasts is renowned, so that is why we are justified in calling him a lion. What is that, Hoyle? Why do we not just say he is like a lion? Ha – hum – well, you see, it makes the comparison stronger, you see, more vivid.

Hoyle does not see, but dutifully notes it down, and refrains from suggesting that ‘metaphor’ is just a long Greek word for a lie, since he knows that will get him six of the belt in those unenlightened days.

[curtain]

But young Hoyle the snuff-taker has a point. Aristotle, it will be recalled, writing in his Poetics, says that the poet ‘above all, must be a master of metaphor’, which he defines as ‘the ability to see the similarity in dissimilar things’. But this definition is as problematic as the teacher’s explanation: why is a comparison between two things whose most striking feature is their dissimilarity made stronger and more vivid by saying that they are actually the same?

The best that people seem able to manage in answer to this is that the literary metaphor has a kind of shock value. To illustrate the point, they generally allude to the conceits of the metaphysical poets, such as Donne, where what strikes us first as outrageous, is – once explained – redeemed by wit and ingenuity:

Oh stay, three lives in one flea spare,
Where we almost, nay more than married are.
This flea is you and I, and this
Our marriage bed, and marriage temple is;
Though parents grudge, and you, w’are met,
And cloistered in these living walls of jet.

The best metaphor, it seems, is one where the dissimilarity is more striking than the resemblance.

But mention of the metaphysical poets recalls a different definition of metaphor, one provided by Vita Sackville-West in her book on Andrew Marvell:

‘They saw in it [metaphor] an opportunity for expressing … the unknown … in terms of the known concrete.’

That is in the form that I was wont to quote in my student days, when it made a nice pair with the Aristotle quoted above; but I think now that I did Vita Sackville-West a disservice by truncating it. Here it is in full:

‘The metaphysical poets were intoxicated—if one may apply so excitable a word to writers so severely and deliberately intellectual—by the potentialities of metaphor. They saw in it an opportunity for expressing their intimations of the unknown and the dimly suspected Absolute in terms of the known concrete, whether those intimations related to philosophic, mystical, or intellectual experience, to religion, or to love. They were ‘struck with these great concurrences of things’; they were persuaded that,

Below the bottom of the great abyss
There where one centre reconciles all things,
The World’s profound heart pants,

and no doubt they believed that if they kept to the task with sufficient determination, they would succeed in catching the world’s profound heart in the net of their words.’

If I had my time again (for indeed that ancient pedagogue described above is me) and wished to illustrate this, I would go about it rather differently.

Let us suppose a scene where a child cowers behind her mother’s skirts while on the other side a large and overbearing man, an official of some sort, remonstrates with the mother demanding she surrender the child to his authority. Though she is small and without any looks or glamour – a very ordinary, even downtrodden sort – the woman stands up boldly to the man and defies him to his face with such ferocity that he retreats. I am witness to this scene and the woman’s defiance sends a thrill of excitement and awe coursing through me. In recounting it to a friend, I say ‘In that moment, I seemed to glimpse her true nature – I felt as if I was in the presence of a tiger, defending her cubs.’

This is a very different account of metaphor. It is no longer a contrived comparison for (dubious) literary effect between two external things that are quite unlike, in which I play no part save as a detached observer; instead, I am engaged, involved: the metaphor happens in me: the identity is not between the external objects, but in the feeling they evoke, which is the same, so that the sight before me (the woman) recalls a very different one (the tiger) which felt exactly the same.

The first point to note is that the contradiction implicit in Aristotle’s account has disappeared. There is no puzzle in trying to work out how a woman can be a tiger, because the unity of the two lies in the feeling they evoke. And as long as my response is typically human and not something unique to me, then others, hearing my account, will feel it too, and being stirred in the same way, will recognise the truth expressed by saying ‘I felt I was in the presence of a tiger.’

Further, the very point that seemed problematic at first – the dissimilarity – is a vital element now. It is the fact that the woman appears as unlike a tiger as it is possible to be that gives the incident its force: this is an epiphany, a showing-forth, one of those ‘great concurrences of things’ that seem like a glimpse of some reality beyond appearance, ‘the World’s profound heart’.

Yet that description – ‘some reality beyond appearance’ – is just what pulled me up short, and made me think of the Tolkien quote I have used as a heading. Is not this the very language of Plato, whose world of Forms or Ideas is presented as the Reality that transcends Appearance?

Yet the world as presented by Plato is essentially the same as that of Aristotle, which has become, as it were, our own default setting: it is a world of objective reality that exists independently of us; it is a world where we are detached observers, apprehending Reality intellectually as something that lies beyond the deceptive veil of Appearance. It is the world we opened with, in which metaphor is a contradiction and a puzzle, perhaps little better than a long Greek word for a lie.

Though both accounts – the Platonic-Aristotelian world on one hand, and Vita Sackville-West’s version on the other – seem strikingly similar (both have a Reality that lies beyond Appearance and so is to some extent secret, hidden), there are crucial differences in detail; like Gandalf and Saruman, they are like, yet unlike in the fundamentals that matter.

The Platonic world is apprehended intellectually. What does that mean? Plato presents it in physical terms, as a superior kind of seeing – the intellect, like Superman’s x-ray vision, penetrates the veil of Appearance to see the Reality that lies beyond. But the truth of it is less fanciful. What Plato has really discovered (and Aristotle then realises fully) is the potential of general terms. A Platonic Idea is, in fact, a general term: the Platonic Idea of ‘Horse’ is the word ‘horse’, of which every actual horse can be seen as an instance or embodiment. Thus, to apprehend the World of Forms is to view the actual world in general terms, effectively through the medium of language.

This can be imagined as being like a glass screen inserted between us and the landscape beyond, on which we write a description of the landscape in general terms, putting ‘trees’ where there is a forest, ‘mountains’ for mountains, and so on. By attending to the screen we have a simplified and more manageable version of the scene beyond, yet one that preserves its main elements in the same relation, much as a sketch captures the essential arrangement of a detailed picture.

But the Sackville-West world is not mediated in this way: we confront it directly, and engage with it emotionally: we are in it and of it. And our apprehension of a different order of reality is the opposite of that presented by Plato; where his is static, a world of unchanging and eternal certainties (which the trained intellect can come to know and contemplate), hers is dynamic, intuitive, uncertain: it is something glimpsed, guessed at, something wonderful and mysterious which we strive constantly (and never wholly successfully) to express, in words, music, dance, art.

The resemblance between the two is no accident. Plato has borrowed the guise of the ancient intuited world (which we can still encounter in its primitive form in shamanic rituals and the like) and used it to clothe his Theory of Forms so that the two are deceptively alike; and when you read Plato’s account as an impressionable youth (as I did) you overlay it with your own intimations of the unknown and the dimly suspected Absolute and it all seems to fit – just as it did for the Christian neoPlatonists (in particular, St Augustine of Hippo) seeking a philosophical basis for their religion.

I do not say Plato did this deliberately and consciously. On the contrary, since he was operating on the frontier of thought and in the process of discovering a wholly new way of looking at the world, the only tools available to express it were those already in use: thus we have the famous Simile of the Cave, as beguiling an invitation to philosophy as anyone ever penned, and the Myth of Er, which Plato proposes as the foundation myth for his new Republic.

And beyond this there is Plato’s own intuition of a secret, unifying principle beyond immediate appearance, ‘the World’s profound heart’, which we must suppose him to have, since it is a persistent human trait: is it not likely that when he had his vision of the World of Forms, he himself supposed (just as those who came after him did) that the truth had been revealed to him, and he was able to apprehend steadily what had only been glimpsed before?

It would explain the enchantment that has accompanied Plato’s thought down the ages, which no-one ever attached to that of his pupil Aristotle (‘who is so very nice and dry,’ as one don remarked) even though Aristotelianism is essentially Plato’s Theory of Forms developed and shorn of its mysterious presentation.

So there we have it: a new explanation of metaphor that links it to a particular vision of the world, and an incidental explanation of the glamour that attaches to Plato’s Theory of Forms.

Its eye a dark pool
in which Sirius glitters
and never goes out.
Its melody husky
as though with suppressed tears.
Its bill is the gold
one quarries for amid
evening shadows. Do not despair
at the stars’ distance. Listening
to blackbird music is
to bridge in a moment chasms
of space-time, is to know
that beyond the silence
which terrified Pascal
there is a presence whose language
is not our language, but who has chosen
with peculiar clarity the feathered
creatures to convey the austerity
of his thought in song.

– R.S. Thomas

St Anselm was Archbishop of Canterbury and lived from 1033 to 1109 at the start of the intellectual renaissance that the High Middle Ages brought to Western Europe. It was a period of great intellectual ferment, an Age of both Faith and Reason, when the best minds of the day applied with passionate curiosity the learning they were rediscovering to the big topic of the day: God.

It takes some effort of the imagination in this secular age to realise that for the mediaeval mind, Theology was the Queen of Sciences, as exciting in its day as quantum physics is now. ‘What is God?’ is the question that the greatest of the mediaevals – one of the greatest intellects ever, Thomas Aquinas – asked at an early age and pursued the rest of his life.

The learning they were rediscovering had two principal strands, both of which had been kept alive elsewhere, since the Eastern Empire, centred on Constantinople, continued after the Western one, centred on Rome, had fallen, though increasingly encroached upon latterly by a new intellectual and religious power to the East and South: Islam.

The most immediately accessible strand, because it was written in Latin, was the neoPlatonism of the late Roman period, whose most notable exponent was Augustine of Hippo. Platonism, with its notion of a transcendent Reality composed of eternal, immutable Forms and a vision of Truth as a brilliant sun that is the source of all wisdom, is a good fit for Christianity – so little is needed to reconcile them that Plato (with the Christ-like Socrates as his literary mouthpiece) can seem almost a pagan prophet of Christianity.

The second strand was more difficult, because it took a circuitous route from the Greek-speaking Eastern empire through the Arabic of Islamic scholars (Avicenna and Averroes, principally) before being translated into Latin where the two cultures met in Spain. This second strand centred chiefly on the writings of Plato’s pupil, one of the greatest minds of any age, Aristotle.

It was Aquinas who met the challenge of reconciling this new influx of pagan (and heretic) thought into catholic teaching and did so with such effect that he remains to this day the chief philosopher of the catholic church, with his Summa Theologica his principal work*.

This period marks the second beginning of Western thought; its first beginning had been some fifteen centuries previously with the Classical Age of Greece, and the two giants, Plato and Aristotle. It is important to realise that what might seem at first glance a recovery of ancient wisdom was in reality nothing of the sort: it was the rediscovery of a new and startling way of looking at things, one that displaced and subjugated the traditionally accepted way of understanding our relation to the world that had held since time immemorial.

What made this new way of thought possible was the written word. For the first time, it was possible to separate one of the elements of human expression, speech, from the larger activity of which it was part, and give it what appeared to be an independent and objective form. This did not happen at once; indeed, it took about three thousand years from the invention of writing, around 5500 years ago, to the realisation of its potential in Classical Greece.

The word written on the page is the precondition of the relocation of meaning: from being a property of situations, inseparable from human activity and conveyed by a variety of methods, such as facial expression, gesture, bodily posture, with speech playing a minor role, meaning now becomes the property of words, and is deemed, by implication, to exist independently and objectively, and to be more or less fixed.

This one change is the foundation of modern thought: it is what allows Plato, with breathtaking audacity, to reverse the relation between the intellect and the senses and proclaim that what the senses tell us is mere Appearance, and that Reality is apprehended by the intellect – and consists of the world viewed from a general aspect: effectively, through the medium of language. It is the beginning of a world-view that casts us as detached spectators of an independent objective reality, a world-view that cannot be acquired naturally and instinctively, but only through a prolonged process of education, based on literacy.

When, some fifteen centuries later, Anselm devises his ‘ontological proof’ for the existence of God, it is squarely within this intellectual framework erected by Plato and Aristotle:
‘[Even a] fool, when he hears of … a being than which nothing greater can be conceived … understands what he hears, and what he understands is in his understanding.… And assuredly that, than which nothing greater can be conceived, cannot exist in the understanding alone. For suppose it exists in the understanding alone: then it can be conceived to exist in reality; which is greater.… Therefore, if that, than which nothing greater can be conceived, exists in the understanding alone, the very being, than which nothing greater can be conceived, is one, than which a greater can be conceived. But obviously this is impossible. Hence, there is no doubt that there exists a being, than which nothing greater can be conceived, and it exists both in the understanding and in reality.’

This is straightforward enough, if you take your time and attend to the punctuation: the expression ‘that than which nothing greater can be conceived’ is Anselm’s definition of God, and even a simpleton, he says, can understand it; but to exist in reality is better than to exist merely in the imagination, so a God that exists in reality is greater than one which exists only in the imagination; and if God is that than which nothing greater can be conceived, then God must exist in Reality (because that leaves no room, as it were, to conceive of anything greater).

Much has been said and written about this argument since it was first made over 900 years ago, but I want to concentrate on a single aspect of it, which is the continuity it implies between the human understanding and reality. To use an image, if we conceive the intellect as a skyscraper, then by taking the lift to its utmost height and climbing, so to speak, onto the roof, we arrive at Reality, the only thing that is higher than the height of our understanding.

This is what leads us to suppose – via the notion that we are created in the image and likeness of God – that God must be the perfection of all that is best in us; and if we esteem our intellectual faculties above all else (as, in the ‘West’, we seem to do) then God must be the supreme intellect.

This presents a problem, one that has considerable force in arguments against the existence of God: though a lesser intellect cannot fully comprehend a greater one, they share a great deal of common ground, and the greater intellect can certainly attune itself to the capacity of the lesser: this is a familiar case (though not always!) between adult and child, teacher and pupil. Why, then, does God not deal directly with us at our intellectual level? Why doesn’t God speak our language? He surely would, if he could; yet he must be able to, since he is God – so the fact that he does not makes him appear either perverse (like a parent playing a cruel sort of game where he pretends not to be there, and does not answer when his child calls out to him, though he may do something that indirectly suggests his presence, like throwing a ball or making the bushes move) or absent, since he would speak if he could, but does not.

Thomas’s poem is an answer to this conundrum, though it is not a comfortable one. Perhaps our assumption that reality is at the top of the skyscraper is an error: maybe it is outside, at ground level. Maybe God speaks to us all the time, but we do not recognise the fact, because ‘God’ is quite other than we suppose, and cannot be contained in the intellectual framework that Plato and Aristotle have bequeathed to us.

This would explain on the one hand why religion – in its broadest sense – is bound up with immemorial ritual (which belongs to the world before Plato and Aristotle) and on the other, why, in an age that puts its confidence in intellect and reason – the ‘new thinking’ that Plato and Aristotle invented, not so very long ago in terms of our earthly existence – God is proving increasingly difficult to find.

*In the context of this piece, it is worth recalling that Aquinas on his deathbed said that his work now seemed ‘all so much straw.’

‘What are you thinking of? What thinking? What?
I never know what you are thinking.’
– Eliot, The Waste Land

‘He’s the sort that you never know what he’s thinking’ defines a recognisable character but carries a curious implication. There is a strong suggestion of duplicity, of inner workings at odds with outer show. Even among long-time married couples you will sometimes hear it said (in exasperated tones) ‘all these years we’ve been married and I still have no idea what goes on in that head of yours’.

But that exasperated tone indicates the same curious implication as the first case – namely, that we expect to know what people are thinking; that not to know is what is considered remarkable, the exception that proves the rule. C. Auguste Dupin, a notable precursor of Sherlock Holmes created by Edgar Allan Poe, makes a striking demonstration of this in The Murders in the Rue Morgue:

‘One night we were walking down one of Paris’s long and dirty streets. Both of us were busy with our thoughts. Neither had spoken for perhaps fifteen minutes. It seemed as if we had each forgotten that the other was there, at his side. I soon learned that Dupin had not forgotten me, however. Suddenly he said:

“You’re right. He is a very little fellow, that’s true, and he would be more successful if he acted in lighter, less serious plays.”

“Yes, there can be no doubt of that!” I said.

At first I saw nothing strange in this. Dupin had agreed with me, with my own thoughts. This, of course, seemed to me quite natural. For a few seconds I continued walking, and thinking; but suddenly I realized that Dupin had agreed with something which was only a thought. I had not spoken a single word.’

Dupin’s explanation of his apparent mind-reading runs to another page and three-quarters [you can read it here] and though something of a virtuoso performance, it is based on sound principles – Dupin observes his friend’s actions and expressions closely, and is able to follow his train of thought by skilful inference, both from what he sees, and what he already knows.

The incident starts when, in evading a hurrying fruit-seller, his companion stubs his toe on an ill-laid paving stone:

‘You spoke a few angry words to yourself, and continued walking. But you kept looking down, down at the cobblestones in the street, so I knew you were still thinking of stones.

‘Then we came to a small street where they are putting down street stones which they have cut in a new and very special way. Here your face became brighter and I saw your lips move. I could not doubt that you were saying the word stereotomy, the name for this new way of cutting stones.’
….
‘Later I felt sure that you would look up to the sky. You did look up. Now I was certain that I had been following your thoughts as they had in fact come into your mind.’
….
‘I saw you smile, remembering that article and the hard words in it.

‘Then I saw you stand straighter, as tall as you could make yourself. I was sure you were thinking of Chantilly’s size, and especially his height.’

Two things are worth noting here, I think. The first is Dupin’s attention to such things as the direction of his friend’s gaze, the expression on his face, and his whole bodily posture, all of which he reads as indicative of thought – I was going to say ‘accompaniments of thought’ but that would be the wrong word, I think, for reasons I will come to presently. The second thing is one particular detail – ‘I saw your lips move’ – and the observation the narrator makes at the end of the episode – ‘Dupin was right, as right as he could be. Those were in fact my thoughts, my unspoken thoughts’.

These highlight two important points about thought that are often overlooked: that it has a physical aspect, and that it is closely connected to speech. We use the expression ‘difficult to read’ of people like the man cited at the start, the ‘sort that you never know what he’s thinking’, and this reminds us that we do rely to a great extent on non-verbal physical indications of ‘mental’ activity.

Indeed, it is interesting to consider just how contrary to everyday experience is the notion that mental activity and thought are hidden, private processes that take place ‘in our heads’ so that only we ‘have access to them’. I put those expressions in quotes because I think they are misleading, in the same way that it is misleading to speak of facial expression etc. as ‘accompaniments’ to thought – I would say they are better considered as an integral part of thinking. We see this from the expression ‘I learned to hide my thoughts’ which is connected with controlling – indeed, suppressing – these external manifestations of thought.

The fact that we must make a conscious effort to conceal thought suggests that it is far from the ‘hidden process’ it is often supposed to be and calls into question the whole range of terms we use that suggest it is – such as the notion of thoughts being ‘in our head’ and our having ‘private access to them’ alluded to above. The implication there is that the head (or brain, or mind) is a sort of space in which our thoughts are stored (and where other mental activity takes place); furthermore, it is a private space, a sort of secret room to which we alone have access. (In this connection, consider the various fictional representations of telepathy and mind-reading, which often involve clutching the head, pressing the temples etc., either in an effort to keep its contents from being rifled, or in the attempt to pilfer them – thoughts are seen as something contained which can, by certain means, be extracted.)

In St Ambrose’s day (c. 340–397) it was considered remarkable that he could read without moving his lips, from which we infer that most people then did so. I believe that this is now termed ‘subvocalisation’ and it appears to have been studied extensively in connection with reading, but less so with thought. I am conscious that a great deal of my own thought consists of articulating sentences ‘in my head’, a process that I consider the same as speaking in all but the final act of voicing the words aloud (an interpretation supported by the fact that sometimes I do actually speak my thoughts aloud) – hence my interest in the expression Poe uses above, ‘my unspoken thoughts.’

It would be interesting to know whether the late Romans of St Ambrose’s day moved their lips when thinking, or indeed habitually spoke their thoughts aloud, openly or in an undertone. Even now, this is more common than we might suppose – people often blurt out their thoughts without meaning to, and most of us are familiar with the expression ‘did I just say that aloud?’ (and the feeling that accompanies it) when we say what might have been better kept to ourselves. There are also people who have the habit of framing their thoughts as spoken questions, which can be disconcerting till you realise that they are not actually seeking an answer from you, personally: it is just another form of saying ‘I wonder if…’.

So it would seem that, just as we have learned for the most part to read without moving our lips, so we have also gradually shed (or learned to suppress) the more obvious physical manifestations of what we now consider ‘mental’ activities, such as thinking, imagining, remembering etc., though my guess (as with subvocalisation in relation to reading) is that there is probably still a fair bit that could be detected in muscle movement, brain activity and the like (though it would be an interesting experiment to see if these too – the brain activity in particular – can be controlled).

From the effort we must make to conceal thought, and our varying success in doing so, it is reasonable to infer that the ‘natural’ mode of thought is holistic, involving the body (at least) as much as the brain: consider, for instance, two rather different examples. One is the domestic cat, and how it is transformed on spying a bird that might be its prey or an intruder on its territory: its intentions can be read very clearly from its bodily posture and movement. The other is the recent emergence in sport – particularly at the highest level – of the practice of ‘visualisation’, which is rather more than simply picturing what you want to happen; it is a full-scale physical anticipation of it, typified by the rituals with which Jonny Wilkinson used to precede his kicking attempts in rugby.

It is interesting to set all this alongside the long-standing tradition in philosophy that regards mental activity as private, personal and inaccessible to others, which has led some to the extreme of solipsism, the doctrine that your own self is the only being you can confidently believe to exist. Much blame for this can be laid at the door of Descartes, often seen as the herald of the modern era in philosophy, though the mind-body dualism generally attributed to him can be dated back to Plato (much as his most noted dictum, cogito ergo sum – ‘I think, therefore I am’ – can be traced back to St Augustine a thousand years before). Descartes makes the classic error of supposing that because we are deceived in some cases, it is possible that we might be deceived in every case – overlooking the fact that such a state of affairs would render ‘being deceived’ and its allied concepts of mistake, illusion, hallucination and the like incomprehensible: if we were deceived in all things, we would not be aware of it; the fact that we have a word for it demonstrates that, in most cases, we are not deceived, and that we also recognise the special and generally temporary circumstances in which we are.

If we go back to Plato, I think we can find the real root of the notion that thoughts are private. It is bound up with what I consider the relocation of meaning that takes place around the time of Classical Greece, about 25 centuries ago, and is made possible by the invention of writing. Only once a word can be written on a page does it become possible to consider it apart from the milieu in which it naturally occurs, human activity involving speech. Such activity (what Wittgenstein calls ‘forms of life’ and ‘language games’) is the ultimate source of meaning (cp. Wittgenstein again, ‘the meaning of a word is its use in the language’). Prior to the invention of writing, there was neither the means nor indeed any reason to consider speech apart from the larger activity of which it formed a part; indeed it is doubtful whether people would even have the concept of words as components of speech, which presents itself as a rhythmic flow, rather than a concatenation of smaller elements.

With writing, all that changes. For the first time, language can be studied and analysed at leisure. A sentence written on a page is comprehensible in itself, or so it appears, without reference to who wrote it or in what context. From this it is an easy step to the notion that meaning is an inherent property of words, rather than situations (what we overlook in this, of course, is that we are able to supply the context in which the words have meaning; but as such instances as the Phaistos Disc remind us, that ability can be lost, so that the marks we see have no more meaning for us than if they were random scratches made in play).

This relocation of meaning is fundamental to Plato’s Theory of Forms (or Ideas), in which he argues that the senses are deceived by the world of Appearance and that only the intellect can apprehend the true nature of Reality, the transcendent and immutable Forms. As I have argued elsewhere, there is a strong case to be made that Platonic Ideas are in fact words – specifically, general and abstract terms – so the Platonic Ideas of ‘Cat’, ‘Table’, ‘Justice’ and ‘Good’ are the words cat, table, justice, good, which stand for these ‘things’ (i.e. the general idea of ‘cat’, the abstract idea of ‘justice’) just as a specific name stands for the actual thing it denotes. (Though Plato pictures the words pointing to the transcendent Form or Idea, in actual fact the words themselves, allied to the use we make of them, are all that is needed.)

It is this objectification of general and abstract ideas that leads to the notion of mental processes as private and inaccessible to others. We can point to something as an act of justice or goodness, but once we acquire the notion of justice as an idea, we introduce a new class of objects, those which can be apprehended only by the intellect. Strictly speaking, ‘object’ is used metaphorically here, but with Plato’s insistence that the Forms are the true Reality, this gets overlooked, and we start to think of thoughts, memories, ideas, impressions and the like as ‘mental objects’ that exist ‘in our minds’ or our ‘imaginations’ which we conceive as a kind of space, a sort of private viewing room.

The point to note here is that the metaphor preserves the Subject-Object relation, which is easily grasped in relation to physical objects – I know what it is to look at a tree, a cat or indeed another person: I am here and it is there. However, a degree of mystery seeps in when this is extended to ideas, thoughts and suchlike, particularly as philosophy develops the account it gives of them. Thus by Hume’s time we no longer simply see a tree: we form a mental impression of one, of which we can then make a copy, which he calls an idea – and this copy is what we use in remembering or imagining a tree, ‘calling it to mind’. This development clearly goes hand in hand with a growing understanding of light and optics and the physiology of the eye, but it is facilitated by having the notion of ‘mental space’ and regarding ideas as objects.

However, what is of most interest is how this alters our view of the Subject. From being a holistic notion which makes no distinction between mind and body – ‘I am this person looking at that tree’ – the subject begins to retreat in what becomes an infinite regress: the tree that we see out the window becomes a representation of a tree – to use Schopenhauer’s term, or the impression of a tree, to use Hume’s – which is now ‘in the mind’ but is still, somehow, seen. And if we have memory of that tree – an idea, to use Hume’s term – or the thought of a tree, or the mental image of one, then that, too, seems to be an object which we somehow apprehend – so the seeing, knowing or thinking subject – ourself – is forever edging out of the picture, never able – as subject – to become itself the object of consideration.

This is what leads the earlier Wittgenstein to suppose, in the Tractatus, that the subject is the boundary of experience, that it does not exist in the world but somehow outside or on the edge of it. Others have suggested that the Subject is a temporary manifestation generated (not unlike an electrical charge) by the combination of our brain and body and nervous system: it exists while we are alive (perhaps only when we are awake) and simply ceases when the physiology that generated it dies.

Yet all this, I would argue, is simply the result of philosophy’s having painted itself into a corner by adopting the way of thinking about the world that starts out with Plato. By dismissing the objects of sense as mere Appearance, and substituting the objects of intellectual apprehension as Reality, we reduce the Subject from an active participant in the world to a passive, detached observer: Wittgenstein’s boundary of experience. Reality is redefined as solely objective, and there is no room in it for the subject: ‘objectivity’ is praised while the subjective (often qualified by ‘merely’) is dismissed as unreliable, partial, mere ‘personal opinion’.

But let us step back, go back indeed to where we started, with Dupin, and the notion of thinking as a holistic activity which involves us as a totality, which is both physical and ‘mental’ (if indeed that distinction can be made at all). The view mentioned earlier, that the Subject (which can be identified with consciousness) is a kind of transitory by-product of our physiology, seems to be supported by the latest developments in brain-imaging, which allow us to observe electrical activity in the neural networks of the brain: there is a correlation between certain activities and the part of the brain that ‘lights up’ when we are engaged in them. This has even led some to say that what brain imaging shows us are our actual thoughts – that all they are is these patterns of electrical activity.

But I wonder. It has been demonstrated that people can lower their blood pressure aided by an index of it in the form of a display; likewise, people can be trained to suppress the physiological symptoms which polygraph tests – so-called ‘lie detectors’ – depend on for their evidence. It would be interesting to see if the lighting-up of neural networks is something that can be similarly controlled or disguised – for if we can learn to ‘hide our thoughts’ by controlling outward appearances, why should we suppose that we cannot do likewise with other physical manifestations of them, once we are aware of them?

It is illuminating to look at this from the other side: not only can we suppress or disguise the physical manifestations of thought, we can also imitate them – that is what actors do. And of course a standard acting technique is to have a store of memories that move us, which can be called to mind when the requisite emotion is called for – so if I wish to portray a character stricken by grief, I conjure a memory of a time when I myself grieved, and my outward aspect will conform, much as does the player’s in Hamlet, who

But in a fiction, in a dream of passion,
Could force his soul so to his own conceit
That from her working all his visage wann’d,
Tears in his eyes, distraction in’s aspect,
A broken voice, and his whole function suiting
With forms to his conceit

Wittgenstein asks somewhere how we know that we are imitating someone’s expression, and adds that it is not by studying our face in a mirror. Our response to a smile, from a very early age, is a smile; but we are not imitating what we see – after all, we do not know what our own face looks like. What guides us is rather the feeling that goes with the smile. The best way I can think to put this is that, as human beings, we know what an expression feels like from the inside.

And I would add a note of caution here: do not import the model of cause and effect that we use in analysing the objective world. The joy we feel within does not cause the smile; it is not prior to it – the two are aspects of the same thing. I am reminded of an expression I learned as a boy doing my catechism – ‘an outward sign of inward grace’. There is a range of things that we know, not through becoming acquainted with them, but by doing them, by being them. And although we speak of ‘seeing’, ‘hearing’ and the rest of the senses separately, we cannot actually turn them on and off, but do them all at once and all the time; what we vary is the attention we give each one, and for most of us sight predominates, with hearing next and the rest a good way behind, except when they force themselves on our attention.

What we actually experience, unanalysed, is not simply ‘the world’ – that is only half the story; what we experience is ‘being in the world’. All experience has this dual aspect: we know it from the inside and the outside at the same time. That is what makes communication possible, what understanding, properly understood, consists of. It is what in art, in all its forms – music, painting, sculpture, poetry, dance – enables us to ‘get it’: by considering the outward sign, we experience what it is like from inside, we recognise the feeling it expresses as something we, too, have felt.

The clever model that Plato and Aristotle invented, that underpins all Western thought, has enabled us to achieve remarkable things, but only at the considerable expense of ignoring one half of our experience and pretending that it does not matter.

Perhaps what Descartes should have said is not cogito ergo sum, nor even sum ergo sum (since it is not something we know by deduction) but simply sum – I am.

‘See better, Lear!’ is the admonition Kent gives his King after he has petulantly banished his youngest daughter, Cordelia, because she ‘lacks that glib and oily art’ to flatter him as her false sisters have done. Sight and blindness is a central theme in King Lear, as is its corollary, deception, both of others and oneself.

Kent’s words came to me when I was ruminating on my latest occupation, drawing shiny things.

One of the things that drawing teaches is seeing better, and that indeed is a large part of my reason for pursuing it recently, as a kind of philosophical experiment (since February I have been drawing a monkey a day, in response to a challenge by a friend).

The status of colour crops up in philosophical discussions at various periods – it is Locke, I think, who argues that colours are not ‘primary qualities’ (such as shape, extension and solidity) but only ‘secondary’ in that they involve an interaction between eye and object and cannot be said to inhere in the object itself as the primary qualities are supposed to do – but it is really a subset of a larger argument that takes us back (as always) to Plato.

Plato, it will be recalled, dismisses the world brought to us via the senses as deceptive Appearance, maintaining that the true nature of the world – Reality – can only be apprehended by the intellect: it is the world of Forms or Ideas. As I have argued elsewhere (‘In the beginning was the Word’) what Plato has really discovered is the power of general terms – the Platonic Idea or Form ‘table’ is not something that lies beyond the word ‘table’, to which it points, it is in fact the word ‘table’ itself – which can be used in thought to stand for any table, because – unlike a picture – it does not resemble any particular table.

This introduces a whole new way of thinking about the world, where it is no longer seen directly, through the despised senses, but apprehended by the intellect through the medium of language. And there is no better way of appreciating this than to try and draw something shiny.

What colour is the car? Why, black, of course – with some shiny bits. That is how it was described on the official documentation – Daimler DR450, Black. But what about all those other colours, then? Ah, now, that’s just reflections of one thing and another – you can ignore them; the car’s real colour is black (and its radiator grille etc aren’t coloured at all, they’re shiny chrome plate).

What trying to draw it teaches you is not only that you can’t ignore the many other colours that are there (if you want your picture to be any good at all) but it also brings home to you that your regular habit (or at least mine) is to dismiss a great deal of what your eyes tell you and pretend it isn’t there, that it doesn’t count: ‘that is just light reflected off a polished surface; that is just a reflection; that’s just a shadow.’

And that is Platonism in action: the intellect overrides the senses, reserves judgement to itself – and it does it through words: ‘light’ conveniently labels – and so keeps you from looking at – something that is very difficult to render faithfully in a drawing. You find that reflective surfaces, far from being bright, are often dark and dull; a tiny patch left uncoloured on a white page becomes a gleam of light when surrounded by greys and blues, even black. And your mind, on seeing the drawing, converts it back to an image of a plated surface – perhaps the most interesting part of the process.

It is as if we erect a glass screen between ourselves and the world, and on the screen we write the words that correspond to the things beyond – ‘mountains, trees, clouds, house, road, cars, people’ – and most of the time what we see is not what is in front of us, but only the words on the screen that give us the simplified general picture, at once a tool of immense power (enabling rapid thought unencumbered by distracting detail) and a great impoverishment of our experience – it inserts a carapace between us and the world.

Reflecting on the origin of words leads us into interesting territory. I do not mean the origin of particular words, though that can be interesting too; I mean the notion of words as units, as building blocks into which sentences can be divided.

How long have we had words? The temptation is to say ‘as long as we have had speech’ but when you dig a bit deeper, you strike an interesting vein of thought.

As I have remarked elsewhere [see ‘The Muybridge Moment’] it seems unlikely that there was any systematic analysis of speech till we were able to write it down, and perhaps there was no need of such analysis. Certainly a great many of the things that we now associate with language only become necessary as a result of its having a written form: formal grammar, punctuation, spelling – the three things that probably generate the most unnecessary heat – are all by-products of the introduction of writing.

The same could be said of words. Till we have to write them down, we have no need to decide where one word ends and another begins: the spaces between words on a page do not reflect anything that is found in speech, where typically words flow together except where we hesitate or pause for effect. We are reminded of this in learning a foreign language, where we soon realise that listening out for individual words is a mistaken technique; the ear needs to attune itself to rhythms and patterns and characteristic constructions.

So were words there all along, just waiting to be discovered? That is an interesting question. Though ‘discovery’ and ‘invention’ effectively mean the same, etymologically (both have the sense of ‘coming upon’ or ‘uncovering’), we customarily make a useful distinction between them – ‘discovery’ implies pre-existence, so we discover buried treasure, ancient ruins, lost cities – whereas ‘invention’ is reserved for things we have brought into being, that did not previously exist, like bicycles and steam engines (an idea also explored in ‘Three Misleading Oppositions, Three Useful Axioms’).

So are words a discovery or an invention?

People of my generation were taught that Columbus ‘discovered’ America, though even in my childhood the theory that the Vikings got there earlier had some currency; but of course in each case they found a land already occupied, by people who (probably) had arrived there via a land-bridge from Asia, or possibly by island-hopping, some time between 42,000 and 17,000 years ago. In the same way, Dutch navigators ‘discovered’ Australia in the early 17th century, though in British schools the credit is given to Captain Cook in the late 18th century, who actually only laid formal claim in the name of the British Crown to a territory that Europeans had known about for nearly two centuries – and its indigenous inhabitants had lived in for around five hundred centuries.

In terms of discovery, the land-masses involved predate all human existence, so they were there to be ‘discovered’ by whoever first set foot on them, but these later rediscoveries and colonisations throw a different light on the matter. The people of the Old World were well used to imperial conquest as a way of life, but that was a matter of the same territory changing hands under different rulers; the business of treating something as ‘virgin territory’ – though it quite plainly was not, since they found people well-established there – is unusual, and I think it is striking where it comes in human, and particularly European, history. It implies an unusual degree of arrogance and self-regard on the part of the colonists, and it is interesting to ask where that came from.

Since immigration has become such a hot topic, there have been various witty maps circulating on social media, such as one showing ‘North America prior to illegal immigration’.

The divisions, of course, show the territories of the various peoples who lived there before the Europeans arrived, though there is an ironic tinge lent by the names by which they are designated, which for the most part are anglicised. Here we touch on something I have discussed before [in Imaginary lines: bounded by consent] – the fact that any political map is a work of the imagination, denoting all manner of territories and divisions that have no existence outside human convention.

Convention could be described as our ability to project or impose our imagination on reality; as I have said elsewhere [The Lords of Convention] it strikes me as a version of the game we play in childhood, ‘let’s pretend’ or ‘make-believe’ – which is not to trivialise it, but rather to indicate the profound importance of the things we do in childhood, by natural inclination, as it were.

Are words conventions, a form we have imposed on speech much as we impose a complex conventional structure on a land-mass by drawing it on a map? The problem is that the notion of words is so fundamental to our whole way of thinking – may, indeed, be what makes it possible – that it is difficult to set them aside.

That is what I meant by my comment about the arrogance and self-regard implied in treating America and Australia as ‘virgin territory’ – it seems to me to stem from a particular way of thinking, and that way of thinking, I suggest, is bound up with the emergence of words into our consciousness, which I think begins about two and a half thousand years ago, and (for Europeans at least) with the Greeks.

I would like to offer a model of it which is not intended to be historical (though I believe it expresses an underlying truth) but is more a convenient way of looking at it. The years from around 470 to 322 BC span the lives of three men: the first, Socrates, famously wrote nothing, but spoke in the market place to whoever would listen; we know of him largely through his pupil, Plato. It was on Plato’s pupil, Aristotle, that Dante bestowed the title ‘maestro di color che sanno’ – master of those that know.

This transition, from the talking philosopher to the one who laid the foundations of all European thought, is deeply symbolic: it represents the transition from the old way of thought and understanding, which was inseparable from human activity – conversation, or ‘language games’ and ‘forms of life’ as Wittgenstein would say – to the new, which is characteristically separate and objective, existing in its own right, on the written page.

The pivotal figure is the one in the middle, Plato, who very much has a foot in both camps – or perhaps more accurately, is standing on the boundary of one world, looking over into another, newly discovered. The undoubted power of his writing is derived from the old ways – he uses poetic imagery and storytelling (the simile of the cave, the myth of Er) to express an entirely new way of looking at things, one that will eventually subjugate the old way entirely; and at the heart of his vision is the notion of the word.

Briefly, Plato’s Theory of Forms or Ideas can be expressed like this: the world has two aspects, Appearance and Reality; Appearance is what is made known to us by the senses, the world we see when we look out the window or go for a walk. It is characterised by change and impermanence – nothing holds fast, everything is always in the process of changing into something else, a state of affairs for which the Greeks seem to have had a peculiar horror; in the words of the hymn, ‘change and decay in all around I see’.

Reality surely cannot be like that: Truth must be absolute, immutable (it is important to see the part played in this by desire and disgust: the true state of the world surely could not be this degrading chaos and disorder where nothing lasts). So Plato says this: Reality is not something we can apprehend by the senses, but only by the intellect. And what the intellect grasps is that beyond Appearance, transcending it, is a timeless and immutable world of Forms or Ideas. Our senses make us aware of many tables, cats, trees; but our intellect sees that these are but instances of a single Idea or Form – Table, Cat, Tree – which somehow imparts to them the quality that makes them what they are, imbues them with ‘tableness’, ‘catness’ and ‘treeness’.

This notion beguiled me when I first came across it, aged fourteen. It has taken me rather longer to appreciate the real nature of Plato’s ‘discovery’, which is perhaps more prosaic (literally) but no less potent. Briefly, I think that Plato has discovered the power of general terms, and he has glimpsed in them – as an epiphany, a sudden revelation – a whole new way of looking at the world; and it starts with being able to write a word on a page.

Writing makes possible the relocation of meaning: from being the property of a situation, something embedded in human activity (‘the meaning of a word is its use in the language’) meaning becomes the property of words, these new things that we can write down and look at. The icon of a cat or a tree resembles to some extent an actual cat or tree but the word ‘cat’ looks nothing like a cat, nor ‘tree’ like a tree; in order to understand it, you must learn what it means – an intellectual act. And what you learn is more than just the meaning of a particular word – it is the whole idea of how words work, that they stand for things and can, in many respects, be used in their stead, just as the beads on an abacus can be made to stand for various quantities. What you learn is a new way of seeing the world, one where its apparently chaotic mutability can be reduced to order.

Whole classes of things that seem immensely varied can now be subsumed under a single term: there is a multiplicity of trees and cats, but the one word ‘tree’ or ‘cat’ can be used to stand for all or any of them indifferently. Modelled on that, abstract ideas such as ‘Justice’, ‘Truth’ and ‘The Good’ can be seen as standing for some immutable, transcendent form that imbues all just acts with justice, and so on. Plato’s pupil Aristotle discarded the poetic clothing of his teacher’s thought, but developed the idea of generalisation to the full: it is to him that we owe the system of classification by genus and species and the invention of formal logic, which could be described as the system of general relations; and these are the very foundation of all our thinking.

In many respects, the foundations of the modern world are laid here, so naturally these developments are usually presented as one of mankind’s greatest advances. However, I would like to draw attention to some detrimental aspects. The first is that this new way of looking at the world, which apprehends it through the intellect, must be learned. Thus, at a stroke, we render natural man stupid (and ‘primitive’ man, to look ahead to those European colonisations, inferior, somewhat less than human). We also establish a self-perpetuating intellectual elite – those who have a vested interest in maintaining the power that arises from a command of the written word – and simultaneously exclude and devalue those who struggle to acquire that command.

The pernicious division into ‘Appearance’ and ‘Reality’ denigrates the senses and all natural instincts, subjugating them to the intellect, which it exalts; and along with that goes the false dichotomy of Heart and Head, where the Head is seen as the Seat of Reason – calm, objective, detached – which should properly rule the emotional, subjective, passionate and too-easily-engaged Heart.

This, in effect, is the marginalising of the old way of doing things that served us well till about two and a half thousand years ago, which gave a central place to those forms of expression and understanding which we now divide and rule as the various arts, each in its own well-designed box: poetry, art, music, etc. (a matter discussed in fable form in ‘Plucked from the Chorus Line’).

So what am I advocating? That we undo all this? No, rather that we take a step to one side and view it from a slightly different angle. Plato could only express his new vision of things in the old way, so he presents it as an alternative world somewhere out there beyond the one we see, a world of Ideas or Forms, which he sees as the things words stand for, what they point to – and in so doing, makes the fatal step of discarding the world we live in for an intellectual construct. But the truth of the matter is that words do not point to anything beyond themselves; they are the Platonic Forms or Ideas: the Platonic Idea of ‘Horse’ is the word ‘Horse’. What Plato has invented is an Operating System; his mistake is in thinking he has discovered the hidden nature of Reality.

What he glimpsed, and Aristotle developed, and we have been using ever since, is a way of thinking about the world that is useful for certain purposes, but one that has its limitations. We need to take it down a peg or two, and put it alongside those other, older operating systems that we are all born with, which we developed over millions of years. After all, the rest of the world – animal and vegetable – seems to have the knack of living harmoniously; we are the ones who have lost it, and now threaten everyone’s existence, including our own; perhaps it is time to take a fresh look.