Monthly Archives: November 2015

The memorable Eadweard Muybridge invented a number of things, including his own name – he was born Edward Muggeridge in London in 1830. He literally got away with murder in 1872, when he travelled some seventy-five miles to shoot dead his wife’s lover (prefacing the act with ‘here’s the answer to the letter you sent my wife’) but was acquitted by the jury (against the judge’s direction) on the grounds of ‘justifiable homicide’. He is best known for the sequence of pictures of a galloping horse, shot in 1878 at the behest of Leland Stanford, former Governor of California, to resolve the question of whether the horse ever has all four feet off the ground (it does, though not at the point people imagined). To capture the sequence, Muybridge used multiple cameras, and devised a means of showing the results which he called a zoopraxiscope, thereby inventing stop-motion photography and the cinema projector and laying the foundations of the motion-picture industry.

Muybridge’s achievement was to take a movement that was too fast for the human eye to comprehend and freeze it so that each phase of motion could be analysed. It was something that he set out to do – as deliberately and methodically as he set out to shoot Major Harry Larkyns, his wife’s lover.

It is interesting to consider that something similar to Muybridge’s achievement happened a few thousand years ago, entirely by accident and over a longer span of time, but with consequences so far-reaching that they could be said to have shaped the modern world.

We do not know what prompted the invention of writing between five and six thousand years ago, but it was not a desire to transcribe speech and give it a permanent form; most likely it began, alongside numbering, as a means of listing things, such as the contents of storehouses – making records for tax purposes, perhaps, or of the ruler’s wealth – and from there it might have developed as a means of recording notable achievements in battle and setting down laws.

We can be confident that transcribing speech was not the primary aim, because that is not something anyone would have felt the need to do. For us, realising that may take some effort of imagination, not least because we live in an age obsessed with making permanent records of almost anything and everything, perhaps because it is so easy to do so – it is a commonplace to observe that people now seem to go on holiday not to enjoy seeing new places at first hand, but in order to make a record of them that they can look at once they return home.

The Latin tag verba volant, scripta manent – ‘spoken words fly away, what is written remains’ – could be said to serve as propaganda for the written word and emphasise how vital it is to write things down. One of the tricks of propaganda is to take your major weakness and brazenly pass it off as a strength (‘we care what you think’, ‘your opinion matters to us’, ‘we’re listening!’ as banks and politicians say) and that is certainly the case with this particular Latin tag: it is simply not true that the spoken word is lost – people have been remembering speech from time immemorial (think of traditional stories and songs passed from one generation to the next); it is reasonable to suppose that retaining speech is as natural to us as speaking.

If anything, writing was devised to record what was not memorable. Its potential beyond that was only slowly realised: it took around a thousand years for anyone to use it for something we might call ‘literature’. It is not till the classical Greek period – a mere two and a half millennia ago (Homo sapiens is reckoned at 200,000 years old, the genus Homo at 2.8 million) – that the ‘Muybridge moment’ arrives, with the realisation that writing allows us to ‘freeze’ speech just as his pictures ‘froze’ movement, and so, crucially, to analyse it.

When you consider all that stems from this, a considerable degree of ‘unthinking’ is required to imagine how things must have been before writing came along. I think the most notable thing would have been that speech was not seen as a separate element but rather as part of a spectrum of expression, nigh-inseparable from gesture and facial expression. A great many of the features of language which we think fundamental would have been unknown: spelling and punctuation – to which some people attach so much importance – belong exclusively to writing and would not have been thought of at all; even the idea of words as a basic unit of language, the building blocks of sentences, is a notion that only arises once you can ‘freeze’ the flow of speech like Muybridge’s galloping horse and study each phase of its movement; before then, the ‘building blocks’ would have been complete utterances, a string of sounds that belonged together, rather like a phrase in music, and these would invariably have been integrated, not only with gestures and facial expressions, but with some wider activity of which they formed part (and possibly not the most significant part).

As for grammar, the rules by which language operates and to which huge importance is attached by some, it is likely that no-one had the least idea of it; after all, speech is even now something we learn (and teach) by instinct, though that process is heavily influenced and overlaid by all the ideas that stem from the invention of writing; but then we have only been able to analyse language in that way for a couple of thousand years; we have been expressing ourselves in a range of ways, including speech, since the dawn of humanity.

When I learned grammar in primary school – some fifty years ago – we did it by parsing and analysis. Parsing was taking a sentence and identifying the ‘parts of speech’ of which it was composed – not just words, but types or categories of word, defined by their function: Noun, Verb, Adjective, Adverb, Pronoun, Preposition, Conjunction, Article.

Analysis established the grammatical relations within the sentence, in terms of the Subject and Predicate. The Subject, confusingly, was not what the sentence was about – which puzzled me at first – but rather ‘the person or thing that performs the action described by the verb’ (though we used the rough-and-ready method of asking ‘who or what before the verb?’). The Predicate was the remainder of the sentence, what was said about (‘predicated of’) the Subject, and could generally be divided into Verb and Object (‘who or what after the verb’ was the rough and ready method for finding that).
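That rough-and-ready classroom method – find the verb, take ‘who or what before the verb’ as Subject and ‘who or what after the verb’ as Object – is mechanical enough to sketch in a few lines of code. What follows is a toy illustration only, assuming a made-up lookup table of verbs (the `KNOWN_VERBS` set is hypothetical), not a real parser:

```python
# Toy sketch of the schoolroom 'rough-and-ready' method described above:
# find the verb, then take 'who or what before the verb' as Subject
# and 'who or what after the verb' as Object.

KNOWN_VERBS = {"chases", "eats", "sees"}  # hypothetical lookup table

def analyse(sentence):
    """Split a simple sentence into Subject, Verb and Object."""
    words = sentence.rstrip(".").split()
    for i, word in enumerate(words):
        if word.lower() in KNOWN_VERBS:
            subject = " ".join(words[:i])      # who or what before the verb
            verb = word
            obj = " ".join(words[i + 1:])      # who or what after the verb
            return subject, verb, obj
    raise ValueError("no known verb found")

print(analyse("The dog chases the cat."))
# ('The dog', 'chases', 'the cat')
```

Even this caricature shows why the method is ‘rough and ready’: it collapses the moment the sentence is passive, or the verb is one the table does not know – which is roughly the point at which real grammar begins.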

It was not till I went to university that I realised that these terms – in particular, Subject and Predicate – derived from mediaeval Logic, which in turn traced its origin back to Aristotle (whom Dante called maestro di color che sanno – master of those that know) in the days of Classical Greece.

Aristotle is the third in a chain of great teachers, each the pupil of his predecessor: he was a student of Plato, who was a student of Socrates. It is fitting that Aristotle’s most notable pupil was not a philosopher but a king: Alexander the Great, who had conquered much of the known world and created an empire that stretched from Macedonia to India by the time he was 30.

That transition (in a period of no more than 150 years) from Socrates to the conquest of the world neatly embodies the impact of Classical Greek thought, which I would argue stems from the ‘Muybridge Moment’ when people began to realise the full potential of the idea of writing down speech. Socrates, notably, wrote nothing: his method was to hang around the marketplace and engage in conversation with whoever would listen; we know him largely through the writings of Plato, who uses him as a mouthpiece for his own ideas. Aristotle wrote a great deal, and what he wrote conquered the subsequent world of thought to an extent and for a length of time that puts Alexander in eclipse.

In the Middle Ages – a millennium and a half after his death – he was known simply as ‘The Philosopher’ and quoting his opinion sufficed to close any argument. Although the Renaissance was to a large extent a rejection of Aristotelian teaching as it had developed (and ossified) in the teachings of the Schoolmen, the ideas of Aristotle remain profoundly influential, and not just in the way I was taught grammar as a boy – the whole notion of taxonomy, classification by similarity and difference, genus and species – we owe to Aristotle, to say nothing of Logic itself, from which not only my grammar lessons but rational thought were derived.

I would argue strongly that the foundations of modern thought – generalisation, taxonomy, logic, reason itself – are all products of that ‘Muybridge Moment’, made possible only by the ability that writing gives us to ‘freeze’ language and then analyse it.

It is only when you begin to think of language as composed of individual words (itself a process of abstraction) and how those words relate to the world and to each other, that these foundations are laid. Though Aristotle makes most use of it, the discovery of the power of generalisation should really be credited to his teacher, Plato: for what else are Plato’s Ideas or Forms but general ideas, and indeed (though Plato did not see this) those ideas as embodied in words? Thus, the Platonic idea or form of ‘table’ is the word ‘table’ – effectively indestructible and eternal, since it is immaterial, apprehended by the intellect rather than the senses, standing indifferently for any or all particular instances of a table – it fulfils all the criteria*.

Which brings us to Socrates: what was his contribution? He taught Plato, of course; but I think there is also a neat symbolism in his famous response to being told that the Oracle at Delphi had declared him ‘the wisest man in Greece’ – ‘my only wisdom is that while these others (the Sophists) claim to know something, I know that I know nothing.’ As the herald of Plato and Aristotle, Socrates establishes the baseline, clears the ground, as it were: at this point, no-one knows anything; but the construction of the great edifice of modern knowledge in which we still live today was just about to begin.

However, what interests me most of all is what constituted ‘thinking’ before the ‘Muybridge Moment’, before the advent of writing – not least because, whatever it was, we had been doing it for very much longer than the mere two and a half millennia that we have made use of generalisation, taxonomy, logic and reason as the basis of our thought.

How did we manage without them? And might we learn something useful from that?

I think so.

*seeing that ideas are actually words also solves the problem Hume had concerning general ideas: if ideas are derived from impressions, then is the general idea ‘triangle’ isosceles, equilateral or scalene, or some impossible combination of them all? – no, it is just the word ‘triangle’. Hume’s mistake was in supposing that an Idea was a ‘(faint) copy’ of an impression; actually, it stands for it, but does not resemble it.

Meaning matters. It is bound up with so many things: understanding and misunderstanding, doubt and certainty, to say nothing of philosophy, poetry, music and art; so it is worth considering the mechanism by which it operates. ‘Mechanism’ is a useful image here: when mechanisms are hidden – as they generally are – their effects can seem mysterious, even magical (as in the marvels of the watchmaker or the stage magician); yet when they are revealed, they offer reassurance: the point of a mechanism is that, unless it is impaired or interfered with, it will go on working in the same way.

The problem with the mechanism of meaning is that the popular notion of it is misleading: we speak of meaning as something conveyed, like a passenger in a car, or transmitted, like a radio message; we also speak of it as being embodied or contained in things that have it, whether they are sentences, poems, works of art or the like. These two usages combine to suggest that meaning exists independently in some form, and that the business of ‘meaning’ and ‘understanding’ consists of inserting it into and extracting it from whatever is said to have it. That seems like common sense, but as we shall see, when scrutinised it proves problematic.

Wittgenstein points us in another direction with his observation that ‘the meaning of a word is its use in the language’; alongside it, when discussing meaning, he deploys the terms ‘language game’ and ‘form of life’ to denote the wider activity of which language forms a part and from which it derives its meaning.

It strikes me that the basic mechanism of meaning lies in connection: meaning is only found where a connection is made, and that connection is made in the mind of an observer, the one who ascribes meaning. In other words, meaning is not a fixed property of things: a thing in itself, on its own, does not have meaning. But we must be careful here: this is a stick that some will readily grasp the wrong end of – to suggest that a tree or a person (say) ‘has no meaning’ is liable to provoke outrage and earnest outpourings about the inestimable value of trees and people. That is because ‘meaningless’ is a pejorative term, properly used in cases where we expect meaning but do not find it; it might be compared to our use of ‘flightless’, which we apply to certain birds that are exceptions to the general rule; we would not apply it to pigs or gorillas.

We can gain some insight into how meaning works – its mechanism – by considering an allied concept, purpose. Let us suppose some interplanetary traveller at a remote time, of a quite different species from ourselves. Somewhere on his travels he comes upon this relic of a long-lost civilisation: a rectangular case constructed of semi-rigid, possibly organic material, which opens to disclose a cellular array – there are some twenty-four rectangular cells of the same organic material, each containing an identical object, rounded, hard and smooth to the touch. He is quite excited by the find as it reminds him of another he has come across – again a cellular array in a case of semi-rigid, possibly organic material, and again each cell containing a smooth, hard, rounded object, though there are differences of detail both in the shape of the cells and the objects. He may submit a learned paper to the Intergalactic Open University speculating on the purpose of these strikingly similar discoveries; he is in no doubt that they are variants of the same thing, and share a common purpose, on account of the numerous points of resemblance.

Were we at his side we might smile, since one is a packet of lightbulbs and the other a box of eggs; it is likely that the resemblances that strike him as the best clues to their purpose might elude us altogether, since we would dismiss them as irrelevant – ‘that is just how they happen to be packaged, for ease of transport or storage: it has no bearing on what they are for. As to the slight similarities of shape and texture, that is mere coincidence. These objects are entirely unrelated, and could not be more unlike.’

It is worth considering the key difference between us and the interplanetary traveller that allows us to smile at his ill-founded speculation. These are familiar objects to us, and we can connect them at once to a context or situation in which they belong, where they fit in and have purpose; our ‘reading’ of them is entirely different from the alien traveller’s – we disregard all that seems to him most striking, because we know it is of no significance. We see that the apparent similarity has nothing to do with the objects themselves, but the fact that they are both in storage, awaiting use; neither is ‘active’, i.e. in the situation or context where they are used and have purpose.

How far an examination of the objects in detail might allow our traveller to deduce, on the one hand, a national grid for distributing electricity from power stations to homes and workplaces rigged with lighting circuits, and the delights of omelettes and of fried, poached and scrambled eggs on the other, depends on quite how alien he is – if he is a gaseous life-form sustained by starlight, he is unlikely to penetrate far into their mystery. On the other hand, if his own existence has ‘forms of life’ or activities similar to ours, he might make much better and even surprisingly accurate guesses.

That, after all, is how we ourselves proceed if we come across artefacts or objects that are unfamiliar: we guess at their purpose by thinking of the kind of the thing they might be, the sort of use they might have, by analogy with our own activities or ‘forms of life’ (and it is no accident that truly mystifying objects are often tentatively described as having ‘possible religious or ritual significance’ since in our own experience this is where many things are found whose use could not easily be guessed; and in this connection consider the use made of everyday objects in burial rites – offerings of food put alongside the dead, or cooking or eating utensils for use on the onward journey).

I would suggest that, as far as the mechanism by which they operate goes, ‘purpose’ and ‘meaning’ are the same, since both are defined in the same way, viz. by placing the thing in question in relation to some context, situation or larger activity where it has a place, where it ‘makes sense’, if you like (imagine our alien traveller’s reaction to being shown the circumstances in which a lightbulb is used – the mystery of the object disappears once the connections are made – literally, in this case).

This brings out important aspects of meaning that are often overlooked, not least because – as I observed at the outset – they are contradicted by most popular accounts of what meaning is. The first aspect is that meaning is not inherent: no amount of studying or dissecting the object in isolation will discover it – I emphasise ‘in isolation’ because discovering, say, the filament in the light bulb and how it is connected to the fitting at the base will advance our understanding only if we can relate them to other things: if we have no notion of electricity, or that it will make a wire filament glow brightly, then they will tell us nothing.

The second aspect is slightly trickier to explain but of greater significance. If we agree that meaning is not inherent, not something that can be found simply by examining the object no matter how minutely, then we can reasonably ask where it is located. One answer, from what we have said, is that it lies in the relation or connection to the context, situation or ‘form of life’; but I think that is not quite right.

Rather, it consists in being related to, or being connected with – in other words, it exists as the result of an action by the onlooker, and where it exists – where it means – is in that onlooker’s mind. This is not the usual account that is given of meaning, which is generally more like this, from Wikipedia:

‘meaning is what the source or sender expresses, communicates, or conveys in their message to the observer or receiver, and what the receiver infers from the current context.’

At first sight, this might not seem significantly different – we have relation to context, we have a process of inference; the main addition appears to be that the source or sender is taken into account, as well as the receiver. However, there is one slight-seeming but important difference, which is the notion of the meaning as something which retains its identity throughout, and which exists prior to the communication taking place and survives after it – the model that springs readily to mind is the letter, which the sender puts in the envelope, which is then conveyed to the recipient who takes it out and reads it.

The analogy with the letter is probably what makes this seem a ‘common sense’ account that most people would agree with, but the logic of it gives rise to problems. If we picture the process of sending a letter, we might start with the sender at her desk, pen in hand, poised to write; she puts the message on the page, folds the page and seals it in the envelope then sends it off; at the other end the recipient takes it out, reads it, and ‘gets the message’. What is the difficulty there, you might ask?

It begins to emerge if you try to make the analogy consistent. At first glance, it seems that

meaning = letter
message (that which conveys the meaning) = envelope

But there is a problem here: the message is in the letter, rather than the envelope; in actual fact, the envelope is superfluous – the message could be sent without it, by hand, say, simply as a folded note. Still, that seems trivial – the sender puts her ideas into words, the receiver reads the words and gets her ideas: isn’t that just the same?

Not quite. The question is whether the meaning (or the message) exists before it is put into words; and if so, in what form? Again, this may seem unproblematic: of course the message exists before she writes it down; and indeed she might change her mind and, instead of writing, make a phone call and say what she means directly, as it were.

But we must be careful here: the question is not whether the message exists before she writes it down – or even before she speaks it – but before she puts it into words. This is where the image of the letter in the envelope is at its most misleading: isn’t the meaning just something we put into words, in the same way we put the letter in the envelope?

That is the notion Wittgenstein attacks with his ‘private language’ argument – the idea that my thoughts exist in my head in some form to which I have privileged access, and which I could choose to give public form if I wished, thereby making them accessible to others. Though this again seems like common sense, when examined closely it proves problematic. It forms the basis of popular notions of telepathy, but trying to imagine what such ‘direct transmission’ would actually consist of highlights the difficulty.

If you convey your thoughts to me, what do I experience? Do I hear your voice speaking in my head? If so, we are back to ‘putting things in words’ and no nearer any prior form our thoughts might take. The temptation is to fall back on images, as if these were somehow more immediate (a picture is worth a thousand words, after all), but what would they be images of? And how, having received them, would I be able to infer your thought from them? A more illuminating (but no less problematic) possibility is that we might hear your thoughts as a musical phrase which we intuitively understand.

This works to some extent because we are accustomed to the idea that music can consistently evoke definite feelings in us – ‘that passage always makes me feel this way, invariably calls this to mind’ – though we have no idea how: ‘it just does’; so that seems consistent with our finding in it something that someone else has put there; but it still leaves the question of what happens at the other end – how would such a musical message originate?

The options here would seem to be either that the message is originally in some other form which I then embody in the music – which takes us back to where we started: if it’s comprehensible to me in that form, why can’t I convey that directly instead of ‘translating’ it into music? – or else we have to accept that only in expressing it do I find what I am thinking; what we experience prior to that is an urge, a sort of pressure which can only be relieved by giving it expression in some way – whether it is an inarticulate cry of rage, a musical phrase (the terrifying opening of the Dies Irae in Verdi’s Requiem, for instance), an image (Munch’s The Scream, maybe) or the words ‘I am very angry about this!’

This brings us by a roundabout route to something I have been trying to articulate for a while – the key distinction between language as the instrument of thought and as one means of expressing experience; but that is a subject for another article. In the meantime, I would conclude by saying that, if this account of the mechanism of meaning is accurate, then it has some interesting implications. It suggests, for instance, that meaning (like beauty) is in the eye (or mind) of the beholder; that it is not fixed, but variable; that it is impermanent; and – perhaps most importantly – it is inseparable from its context, the ‘form of life’ or wider activity of which it forms a part and on which it depends.