Following on from my last post, here are some more links to some interesting online content that is related to the topics I write about here, along with some commentary from me. First of all, have a look once again at the blogroll in the sidebar -- I've added some more blogs, including Whats in a brain? which has a recent post on linguistic relativity and time. Now in today's post I've got some longer lecture-type items, mostly by academics but aimed at a broad non-specialist audience. Wherever possible I'll try to include links to both video and audio versions for you to choose from.

First up is a talk by James Burke titled "Admiral Shovel and the Toilet Roll" (also available in iTunes). In addition to his usual connections approach, it makes an excellent argument for the importance of the interdisciplinary approach. It's also very witty and entertaining, as usual for Burke.

At the end of the last post I linked to some basic introductory linguistics videos, and here is another very good introduction to the basics of linguistics, "Linguistics as a Window to Understanding the Brain" by Steven Pinker. This lecture is part of the Floating University initiative, and in it Pinker does a pretty good job of not only presenting basic linguistic concepts but also introducing and giving a balanced treatment of some controversial issues such as language universals and linguistic relativity, subjects that he has fairly strong views on. Here is the lecture on YouTube:

On the subject of time, here is Claudia Hammond's RSA talk "Time Warped" based on her book of the same name (the full audio of the talk is also available in iTunes). Hammond discusses many interesting issues about time perception. Here is a YouTube video of the edited highlights of this talk:

Cognitive scientist David Eagleman also works on time perception (as well as a variety of other topics). Here are two lectures of his from The Up Experience. In the first, he gives a good summary of his work on how we perceive time and how our sense of time is largely a construction by the brain:

In this second Eagleman talk, he discusses, among other things, the relationship between the present self and the future self, drawing on a story of Odysseus and the sirens from the ancient Greek epic The Odyssey to describe what he calls the Odysseus contract:

Economist M. Keith Chen also draws on the idea of future discounting, which Eagleman refers to in that last video, in his highly controversial proposed connection between how languages handle the future tense and future planning (which I've discussed before here and here). Here is his TED talk presenting this theory:

A quick news article post today. This story has been making the rounds online, and it's relevant to my earlier discussion of interdisciplinarity. Jonathon Allen, a biochemistry student with an interdisciplinary interest in history, heard about tree-ring evidence for a spike in carbon-14 levels in Japanese cedar trees in the year AD 774, the kind of spike usually caused by a supernova or solar flares, and made the connection with a mention in the Anglo-Saxon Chronicle of a strange red crucifix in the sky:

"This year the Northumbrians banished their king, Alred, from York at Easter-tide; and chose Ethelred, the son of Mull, for their lord, who reigned four winters. This year also appeared in the heavens a red crucifix, after sunset; the Mercians and the men of Kent fought at Otford; and wonderful serpents were seen in the land of the South-Saxons." (OMACL translation)

He suggests it might have been caused by a supernova partially covered by a dust cloud. This is a nice example of an intersection between history and science. Of course there are many such strange portents recorded in chronicles like these (for instance the odd mention of wonderful serpents in Sussex in the F version of the Chronicle), so it's hard to know what they might mean, but I'd imagine that a medieval or ancient observer is much more reliable than the average modern one. Simply put, they knew their night sky better than we often do, since they had no electric lights. And it's a nice example of a clever insight arising from interdisciplinary knowledge.

I wanted to write today about something that is in many ways central to much of what I write about here on this blog, and has been fundamental to much of my research more generally: interdisciplinarity. For those not that familiar with academia, universities are generally rigidly divided into a variety of disciplines. At the larger level, there are main groupings such as the sciences and humanities, which are further divided into departments such as physics, biology, English literature, history, and philosophy. There are certainly good organisational reasons for this kind of division: it fosters in-depth interaction among those working on similar areas within a discipline, and puts researchers and students together with others of a like mind, who might best be able to appreciate their shared material.
I suppose ultimately this kind of division goes back to the medieval arrangement of education into what is known as the trivium and the quadrivium. The first level, the trivium with three subjects, is what we might now think of as the arts: grammar, rhetoric, and dialectic (that is, logic). The second level, the quadrivium with four subjects, consisted of what we might call the sciences: arithmetic, geometry, astronomy, and music (that is, the theoretical study of harmonics). The word trivium means literally ‘three ways’ (referring obviously to the three parts), from Latin tri- ‘three’ and via ‘way, road’, and the same word was also used to refer to a crossroads where three streets met. From this also comes the modern English word trivial.1 Now I’m not suggesting that intense specialisation is trivial,2 but the research produced from increasingly specialised disciplines runs the risk of being appreciated only by the few who are in the know. This kind of detailed work is very important for advancing and solidifying what we know about the world -- quantum physicists may discover the fundamental operations of reality, or palaeographers may determine the provenance of a particular manuscript. But sometimes great advances come from the unexpected intersection of different knowledge sets, and because few people can be experts in more than one field, these kinds of perspectives can be missed.

This serendipity is at the heart of much of the writing of the popular science historian James Burke, and indeed Burke makes a particularly strong case for this kind of boundary crossing in this podcast, if you care to listen. The point is, groundbreaking discoveries often come from people venturing outside their normal, comfortable, well-defined area of study. Sometimes they connect the familiar with the unfamiliar. Or they connect an external idea with their usual subject matter. This is in part what lies behind interdisciplinarity.

My background, as I've mentioned before, is in medieval studies, which combines a variety of disciplines such as language, literature, palaeography, codicology, history, archaeology, musicology, and so forth, over a wide geographical range, with the unifying chronological parameter of the middle ages. My doctoral dissertation would not have been possible in an English literature department, combining as it did linguistics, literature, anthropology, philosophy, history, and a variety of other approaches. It may have left me as something of a jack of all trades, master of none, but it allowed me a range that I found quite interesting. Cognitive science, my latest interest, is also an example of an interdisciplinary field, even more so when applied to literature, culture, or history, as researchers like myself do.

Recently I've noticed some quite inventive interdisciplinary projects worthy of note. Culturomics, applying detailed quantitative analysis to a large database of texts with a view to making cultural connections, has garnered some broad attention, particularly with a recent paper examining the birth and death of words. This culturomics approach has led to the Google ngram tool, which is free for anyone to play around with. Searchable electronic corpora of texts are often only available to those within an academic institution, and take some specialised knowledge to use. When I work with Old English, for instance, I make extensive use of the Old English Corpus from the Dictionary of Old English Project. The ngram tool from Google allows anyone to experiment.

Closer to my own background in Old English, there is the Lexomics project, which brings statistical analysis and computer science to the study of Old English poetry. This project has taken the Old English Corpus and applied some quite sophisticated statistical analysis with a view to finding lexical connections between different texts. This could, for instance, be used for research on authorship or literary influence. And again, many of the Lexomics tools are freely available online.
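
As a rough illustration of the kind of lexical comparison such tools perform (this is my own toy sketch, not actual Lexomics code, and the "texts" here are invented stand-ins rather than real corpus data), one can represent each text as a word-frequency vector and measure how much vocabulary two texts share:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity of the two texts' word-frequency vectors:
    1.0 for identical vocabulary profiles, 0.0 for none shared."""
    a, b = Counter(text_a.split()), Counter(text_b.split())
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm

# Invented pseudo-Old-English word strings, just to show the mechanics.
poem_a = "hwaet we gardena in geardagum theodcyninga thrym gefrunon"
poem_b = "hwaet we in geardagum hwaet thrym"
print(round(cosine_similarity(poem_a, poem_b), 3))
```

Run over many text segments at once, scores like this can be clustered to suggest which passages share vocabulary unusually closely, which is the starting point for arguments about authorship or influence.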

And as I've written about before, M. Keith Chen has tried to explore the intersection of economics, language, and psychology, and Chen’s other work also employs similar cross-boundary approaches, as is clear from his website.

Of course, inevitably such boundary-pushing research will be criticised by specialists in the fields concerned, sometimes rightly and sometimes not. Naturally, when working outside of your comfort zone you may make faulty assumptions that a specialist would not, and thus the scrutiny of specialists is important for refining this kind of work. I’ve already written about the criticisms of Chen’s proposed linguistic relativistic effects on saving behaviour. There are, indeed, some potentially serious problems with his categorisation of languages, and perhaps with the type of statistical analysis he uses. Nevertheless, I think this research raises some interesting questions, which merit further investigation. While I don’t think Chen has definitively proven anything, I do think that he has cast light on an intriguing correlation between language and thought, which can be further explored with both statistical and experimental work. And the culturomics work has similarly been criticised due to problems with the corpus of texts they’re working with.3

In particular, I’m personally quite fascinated by scientific approaches to literature and the humanities in general, bridging that traditional trivium/quadrivium divide I mentioned earlier. Culturomics is one such example, conducted as it is by physicists and mathematicians. Another science/humanities crossover is the project to determine the provenance of medieval manuscripts by analysing the DNA of the parchment. My own post on the history of sailing technology is in its way a crossover between science and the humanities. And one thing the sciences do well, which is often not a factor in the humanities, is collaboration. I’d like to see more collaborative work in the humanities.

So in the end, all of this was to call attention to the basis of much of what I’m writing about on this blog. As I’ve said, my research, including my doctoral dissertation, is inherently interdisciplinary. I use language as an entry point to investigate history, culture (including literature), and thought. On this blog I may push the boundaries even further than I do in my other research in an effort to see what sticks. I do think that by taking those chances real progress can be made, but I certainly do welcome any constructive criticism.

Coming up soon, a related post on interconnectedness...

1 Presumably from the sense of 'commonplace', hence 'unimportant'. [back]

2 Nor am I suggesting that the arts are more trivial than the sciences. [back]

3 See here, and a humorous response to this kind of interdisciplinary approach can be seen here. [back]

I’ve been asked before what I mean by cognitive philology. I perhaps mean something slightly different, or at least something more specific, by it than what's described in the Wikipedia entry. My background training has been largely in literary studies and traditional philology, with some formal training in more theoretical areas such as structural linguistics and generative grammar, though recently I’ve taken to the cognitive school of linguistics with all the fervour of a new convert. So I thought I’d write a bit of a potted history of linguistics to show where I’m coming from. I must admit this is highly subjective, as I’ve focused on the areas of linguistics that have influenced me the most, and so I’m leaving out all kinds of stuff that is important and influential generally, but just not to me personally, and I’ve often chosen to focus on particular linguists who are interesting and/or influential. I’m also grossly oversimplifying things as it suits my purposes, so take it all with a grain of salt. This post is mainly intended for those with an interest in language, but not much background knowledge; actual linguists need not read any further.1

In traditional philology one studies the history of languages by looking at old texts. So for instance one studies the history of the English language by analysing old texts such as, let's say, medieval poems like Beowulf. By looking at texts from different time periods, from different regions or dialects, and by comparing related languages, such as Old English with various related Germanic languages like Old Norse or Old Saxon, one can make pronouncements about how the language has changed over time and some of the forces that may have driven these changes. A philologist records and reports on the evidence found in historical texts. This approach to the study of language is a very old one, dating back to the 19th century (and earlier) with such founding fathers as Jacob Grimm (yes, that Jacob Grimm of the Brothers Grimm). In addition to collecting folktales, Grimm pioneered the field of comparative philology, devising Grimm’s Law, which described the relationship between certain consonant sounds in various Indo-European languages and how they came into Germanic languages like English.2 Indeed many scholars of the 19th century and earlier, such as Franz Bopp, Thomas Young, and Sir William Jones, theorised about the relationship between the various languages which have now become known as the Indo-European language family, and at least as far back as the 16th century comments had been made about such similarities.3 Grimm's Law was one of the first systematic expressions of this correspondence. This interest in Indo-European comparative philology kicked off a whole cottage industry in comparative studies of Indo-European (and other) cultures, such as comparative mythology and folklore, which included Grimm's other great work, the collection of folktales/fairytales he compiled with his brother Wilhelm.
As for philology, Grimm and his contemporaries were some of the first to take a truly scientific approach to language, focusing largely on phonology (the sounds of language) and language change. Another term for this sort of study is historical linguistics.4

Incidentally, in addition to Grimm, I could have listed here any number of philologists who were important to the study of Old English in particular and the history of the English language in general: for instance, Karl Verner, who formulated Verner’s Law, which explained the seeming exceptions to Grimm’s Law; the many lexicographers from Samuel Johnson to James Murray, famed editor of the Oxford English Dictionary, which revolutionised how dictionaries are made and used; and the many editors of early texts, such as those involved in the Early English Text Society. One in particular worthy of at least brief mention is Henry Sweet, who in addition to writing various works on Germanic philology, including A Student’s Dictionary of Anglo-Saxon, was the model for Henry Higgins in George Bernard Shaw’s Pygmalion.5 All of these 19th-century philologists pushed forward our understanding of language and its history during an explosively productive time.

Aside from historical linguistics, the study of language throughout the 20th century and into the 21st has become highly theoretical, and often focuses on the spoken language of living speakers, who can be test subjects in the lab. Linguists started to theorise about how language worked at some deeper level, rather than just describing it and its history in great detail.6 To a large extent the late 19th- and early 20th-century linguist Ferdinand de Saussure is responsible for this theoretical and structural approach to language. Saussure, for instance, hit upon the distinction between the theoretical system of language and its actual practice, what he termed langue and parole. Saussure is often regarded as the founder of modern linguistics.7

Perhaps one of the most important and influential linguistic ideas of the 20th century, which comes out of this structural approach, is the idea of universal grammar, proposed by the famous linguist Noam Chomsky.8 This theory holds that all human languages share a basic universal structure which is hard-wired into the human brain, and that the differences between languages are mostly surface level, not indicative of the deeper structure of language. This also presupposes that there is a special language centre of the brain and that language is a specialised function of the brain. Furthermore, this internal grammar is said to be generative, in that it is a set of basic rules through which a theoretically infinite number of expressions can be generated.9 This is the difference between human language and the forms of communication that animals use. While human language is infinitely flexible and can generate new expressions and ideas, animal communication is limited to a certain number of fixed expressions. Animals, as far as we know, have no syntax or generative grammar with which to produce novel expressions. They simply repeat the same few messages as required by the circumstances. Followers of Chomsky have sought to define how this universal and generative grammar works, and how we acquire this ability. This approach to language is often referred to as generative linguistics or formal linguistics.
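
To see what "generative" means in practice, here's a toy sketch (my own illustration, not Chomsky's formalism) of how a couple of recursive rules can produce sentences of unbounded length, by re-embedding a relative clause inside a noun phrase:

```python
# Two tiny "rules": a noun phrase may contain a relative clause, and
# that clause itself contains a noun phrase. The recursion means a
# finite rule set yields an unbounded set of well-formed sentences.
def noun_phrase(depth):
    """'the dog', or recursively 'the dog that <noun phrase> chased'."""
    if depth == 0:
        return "the dog"
    return f"the dog that {noun_phrase(depth - 1)} chased"

def sentence(depth):
    return f"{noun_phrase(depth)} barked"

for d in range(3):
    print(sentence(d))
```

Each extra level of embedding produces a new, longer, still grammatical sentence (if an increasingly hard-to-parse one), which is the point: the rules are few and fixed, but the output is limitless.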

There have certainly been alternative opinions, such as the famous Sapir-Whorf hypothesis, associated with Edward Sapir10 and Benjamin Lee Whorf,11 which argues for linguistic relativity: that language difference is indicative of different ways of thinking, and that the language you speak can influence the way you perceive and think about the world.12 Linguistic relativity too has its foundations in Saussure’s ideas, particularly the relationship between thought and language and Saussure’s notions of linguistic signs and the concepts they signify. But the universal grammar model came to be the mainstream of linguistics.13 In the latter half of the 20th century the idea of linguistic relativity fell out of favour, and even now it is not the mainstream opinion.

Also quite influential in the 20th century has been the field of sociolinguistics, which examines language in the context of society -- how language is used in different social contexts, such as in different social and socioeconomic groups, and how society affects language. A notable linguist in this field is William Labov, who is widely known for his study of Black English Vernacular (BEV). Labov argued that BEV should not be thought of as substandard, but was in fact remarkably expressive and grammatically consistent, and worthy of academic study.14 This is part of a larger trend in 20th-century linguistics away from prescriptivism (instructing people in the so-called ‘correct’ way to speak and write) and towards descriptivism (studying how language is actually used in the real world, rather than in theoretical and abstract grammar books, without value judgement). A related field is pragmatics, which studies context-dependent meaning, and seeks to uncover the elements of meaning that can’t be explained by the study of the structure of language in formal linguistics. Thus, for instance, the utterance “It’s cold in here” when said to someone sitting next to an open window is a request to close the window. It is the context of the discourse that determines the meaning. Labov engaged in this kind of discourse analysis, which has led to my own interest in examining discourse markers (supposedly meaningless, though actually significant and meaningful, filler phrases such as “so”, “you know”, “I mean”, and “actually”) and their use in medieval narrative.15

More recently, a new branch of linguistics has developed called cognitive linguistics, which attempts to explain language by drawing on more general theories of human cognition developed by cognitive scientists and cognitive psychologists. Simply put, according to the cognitive approach, language is explained with reference to the same basic principles that govern human cognition generally. Thus cognitive linguistics can often stray into areas of neuroscience and cognitive psychology. Instead of seeing language as the product of a specialised language organ in the brain, as per the universal grammar model, cognitive linguistics sees language as intricately linked to human conceptualisation generally. It’s almost as if language is a by-product of human cognition. And so general principles of human cognition, as explored by cognitive scientists, must be able to account for the way language works as far as possible. Thus cognitive linguistics is a necessarily interdisciplinary field, just as cognitive science is, incorporating areas such as neuroscience, psychology, artificial intelligence, linguistics, anthropology, and philosophy.16

One of the most well-known and influential ideas to come out of the cognitive linguistic approach is the emphasis on metaphor, which George Lakoff and Mark Johnson explore in Metaphors We Live By. The argument is that metaphor is fundamental to all language, and indeed to all thought. We use metaphors as a way of thinking about and talking about abstract things by relating them to more concrete things. Thus when we say we are “in trouble” we are metaphorically conceiving of trouble as a container.17 Or when you say you “spend time” doing something you are drawing on the “time is money” metaphor. Some other significant scholars working in these kinds of areas include Mark Turner and Rafael E. Núñez.

Along with the greater focus on the cognition of speech has very recently come a resurgence in interest in the linguistic relativity question. In particular, over the past decade cognitive scientists such as Lera Boroditsky have been exploring the ways in which language shapes thought. No longer holding to the hard-line strong version of the Sapir-Whorf hypothesis that claimed that language determined thought and thus that language was some kind of limiting factor, the argument now is that language can be a factor in shaping habitual thought.18

Thus finally we come back to my use of the term cognitive philology. I suppose what I’m driving at is the literary and philological implications of cognitive linguistics. Of course there are other literary scholars who have sought to apply the ideas of cognitive science, and cognitive linguistics in particular, to the study of literature, and this has led to a field often referred to as cognitive poetics. Frequently, for instance, such scholarship examines the use of conceptual metaphor in literary texts, and how that contributes to the meaning of literature.19 But what I’m aiming at is perhaps somewhat broader than just cognitive poetics, involving the intersection of language, thought, history and culture.20 Language is an entry point into the mind, and a particularly useful one when studying historical periods, as the people themselves are long dead. All human culture is the product of human cognition, and can thus not really be understood without the lens of cognitive science; and if language is intimately linked with cognition, it is an important window into the human cognitive process and therefore human culture. Through the study of language, we can study the history of human thought and culture, and I hope to be able to show, in a small way at least, why this is interesting and important for everyone to understand. Language is the monument of thousands of years of human culture.

So writing this post has served a double purpose -- it explains some of the theoretical background to what I want to blog about, especially important if you don’t already have a background in languages or linguistics, but it’s also an opportunity for me to think out loud a bit. If you’ve made it this far, thanks for reading! As a reward, here’s a sketch by Stephen Fry and Hugh Laurie discussing language, which picks up on notions of Saussurian langue and parole, Chomskyan generative grammar, the relationship between language and culture, pragmatics, and the formulaic nature of language. See how much funnier it is now that you’ve read all that?21

1 Indeed it might actually infuriate you to read this post if you’re an actual linguist, so you may as well stop now. Still here? Okay, but don’t say I didn’t warn you! [back]

2 For instance the Germanic f in English words such as foot, father, and fish corresponds predictably with p in other IE languages, such as Latin pes/pedis, pater, and piscis (and also with Greek πούς and πατηρ, for that matter). Feel free to find correspondences in other languages, or consider other basic words like numbers (one, two, three, etc.) in as many languages as you know. It’s fun! [back]

3 As an interesting side note, the Danish philologist Rasmus Christian Rask more or less came up with the same idea as Grimm, and Grimm initially credited him with it, hence it’s sometimes referred to as the Rask-Grimm Rule. [back]

4 At least that’s what linguists working in linguistics departments would call it. Today, scholars focusing on language in literary fields, such as my own medieval studies, still tend to use the term philology. [back]

5 Sweet, not known for a sweet disposition but instead something of an irascible figure -- the prefaces to his various works make for quite entertaining reading -- was somewhat embittered that he did not receive the university professorship he thought he deserved. I’ve often felt quite a kinship with Sweet, and have often said that, though I wouldn’t have wanted to live in the middle ages, I would love to have been a Victorian medieval philologist. [back]

7 Saussure’s work also kicked off the field of semiotics, which has become significant in the literary critical world. [back]

8 Interestingly, I suspect as many people know of Chomsky for his political commentary as for his linguistics work. I guess politics is more flashy than linguistics. However, in a future post I’d like to explore how politics is really all about language anyway. [back]

9 Chomsky devised the sentence “Colorless green ideas sleep furiously”, demonstrating among other things that the generative grammar can produce a syntactically well-formed sentence even if it is semantically nonsensical. Lewis Carroll’s poem “Jabberwocky” achieves much the same effect. [back]

11 Whorf has often been criticised as a dilettante, as he was originally trained as a chemical engineer and only came to the study of linguistics later in life. However, the strengths and weaknesses of his work should speak for themselves. [back]

12 Essentially the claim is that language can affect the way you think in very fundamental ways; language differences can lead to differences in cognition. The original strong version of this theory, linguistic determinism, claimed that the language you spoke determined the way you were able to think about the world. The more recent versions of this idea claim that language can have an effect on the way you think but is not an absolute determiner of it. You can read more about linguistic relativity on Wikipedia or even better in this easily approachable and very well explained entry by Lera Boroditsky in the Encyclopedia of Cognitive Science. [back]

14 BEV is now more commonly referred to as African American Vernacular English (AAVE) or Ebonics. For an interesting discussion of this topic, have a listen to episode 4 of Slate’s excellent language podcast Lexicon Valley. [back]

15 A number of medievalists, such as Suzanne Fleischman, Laurel J. Brinton, and Peter Richardson, have focused on pragmatics and discourse markers, and they've been big influences on me as well. I’ll probably write a fuller post on this topic, as pragmatics and discourse markers were the focus of my research for the first five years or so after completing my doctorate. [back]

16 This interdisciplinarity is what lies behind the graphics I've included at the beginning and end of this post. The graphics use the hexagram and heptagram to show the interrelated nature of these fields in cognitive science, an image which also lies behind the name of this blog -- the endless knot is a phrase from the 14th century poem Sir Gawain and the Green Knight and refers to the emblem of the pentangle or pentagram which Gawain has emblazoned on his shield. It is an image of the interconnectedness of things. Interconnectivity and interdisciplinarity are at the very heart of what I'm doing, and I'll be blogging much more about them soon. [back]

17 Children learn the basic conceptual schema of “container” when they spend hours putting things into and taking them out of a box or other container. It seems to fascinate them endlessly to repeat this simple act, which becomes fundamental to their thinking about the world in later life. [back]