Foundations of Language

Brain, Meaning, Grammar, Evolution

The Object-Oriented Turn in Generative Grammar

The goal of generative grammar, since its initial fashioning in the hands of
Noam Chomsky in the 1950s, has been to assemble an account of the mechanisms by
means of which human minds make language work: not the physiology of ears or
throats, but the mental processes involved when we produce or comprehend
language, whether it be spoken, written, signed, whistled, or
what-not. Even the schools of linguistics which have arisen in vehement
reaction to Chomsky largely retain this vision of linguistics as a way of
understanding how the mind works.

In a sense, it didn't have to be this way. One could imagine linguists who,
say, described grammatical rules in various languages, their changes over time,
and so on, without taking the way they described those rules to imply
anything about the minds of people who spoke those languages. Such linguistics
would be pure phenomenology (in the physicists' sense, not the philosophers'),
and in fact this seems to be more or less how computational linguists working
on statistical language
processing think about their models. Nobody supposes that people have
trigram word-frequency models in their heads, estimated by means of the
Baum-Welch algorithm.

Still, since we're all good materialists and mechanists these days, we have
to suppose that language is implemented in the brain somehow, and it
would be nice to know how. Put a little differently: my version of English has
a certain structure to it; therefore there must be something in my mind, in my
brain and its interaction with its environment, which corresponds to that
structure, just as is true of my ability to throw a frisbee, cook qorma-e-behi, or find my way around downtown Ann Arbor. If there are various
ways of describing the mathematical structure of my version of English, which
there are, it would be nice to know which one most closely corresponded to the
mental mechanisms involved. These could, of course, be totally idiosyncratic,
but that would be very odd, and it seems more reasonable to assume that the way
I implement English is very similar to the way other speakers of my dialect do.
It's a bigger leap, but it's not unreasonable to assume that the way I
implement my native language has got basically the same organization as the way
my cousins in Tamil Nadu implement theirs, even though English and Tamil are
not related languages.

Thus the enterprise of generative grammar: characterize the structure of
human languages in ways which illuminate the mental mechanisms involved in its
use. Ray Jackendoff has devoted his professional life to this ambitious
undertaking — in his book The
Linguistics Wars, Randy Allen Harris describes him
as "Chomsky's conscience", the guy who did the hard work of filling in the
messy details needed to make Uncle Noam's proposals actually work. Nor has his
commitment to generative grammar (as opposed to Chomsky) weakened.
But Foundations of Language is not intended just as an incremental
advance in generative grammar, or even as a summary of what has been achieved,
but rather as a fairly significant reformulation and reorientation.

Over the last half century, Chomsky has put forward many versions of
generative grammar, but they have all been strikingly syntactocentric.
There is only one truly generative component, which is syntax, and it drives
both phonology and semantics. This is very odd, when one thinks about it: if
anything, one would presume that comprehension is driven by phonology (what we
hear or see) and production by semantics (what we have in mind to tell
someone). The most striking feature of Jackendoff's new scheme is that he
treats phonology, syntax and semantics as three parallel generative processes,
coordinated through interface processes. Indeed, following previous work by
others in phonology, he's inclined to further subdivide each of these three
processes into various "tiers", themselves coordinated by interfaces. (In
phonology, for instance, there are separate tiers for syllabification, for
metrical stress and for the phonemes.) These interfaces are not sensitive to
every aspect of the processes they coordinate, so, for instance, phonology is
affected by some aspects of syntax, but not vice versa. This lets Jackendoff
recover the sane, commonsensical view that phonology drives comprehension and
semantics drives production, while not forcing these to be simplistic
feed-forward processes.
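This parallel architecture can be caricatured in code. The following is a toy sketch under my own assumptions — all the class names, field names, and example features are my invention for illustration, not Jackendoff's formalism: three independent generative components, coordinated by interfaces that are declared sensitive only to some features of each component's structure.

```python
# Toy caricature of the parallel architecture: each component has a full
# internal structure, but an interface only sees the features it is
# declared sensitive to. (All names here are illustrative inventions.)

class Component:
    """A generative component (phonology, syntax, or semantics)."""
    def __init__(self, name, structure):
        self.name = name
        self.structure = structure  # full internal representation

    def visible(self, features):
        # An interface sees only a restricted view of the structure.
        return {k: v for k, v in self.structure.items() if k in features}

class Interface:
    """Coordinates two components, reading only selected features of each."""
    def __init__(self, a, b, sensitive_to):
        self.a, self.b = a, b
        self.sensitive_to = sensitive_to

    def link(self):
        # Pair up the two restricted views; neither side exposes its
        # full internal workings to the other.
        return (self.a.visible(self.sensitive_to),
                self.b.visible(self.sensitive_to))

phonology = Component("phonology",
                      {"syllables": ["dogs"], "stress": [1], "phonemes": "dOgz"})
syntax    = Component("syntax",
                      {"category": "N", "number": "plural", "tree": "[NP dogs]"})
semantics = Component("semantics",
                      {"concept": "DOG", "plurality": "many"})

# Phonology is affected by some aspects of syntax (e.g. number, which
# conditions the plural affix), but the interface never sees the full tree.
syn_phon = Interface(syntax, phonology,
                     sensitive_to={"number", "syllables", "phonemes"})
```

The asymmetry in the text — phonology affected by some aspects of syntax, but not vice versa — corresponds here to which features an interface is declared sensitive to, while everything else (like the syntactic tree) stays hidden.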

What we normally think of as a word, in Jackendoff's account, is really an
interface rule, specifying a certain coordination between a sound pattern,
syntactic features and a conceptual scheme. He goes on to ingeniously account
for many linguistic phenomena by means of interface rules, and to finally argue
that grammatical rules themselves are just interface rules involving variables,
rather than constants. He is quite insistent and unapologetic about
formulating things in terms of rules, symbols and variables; these all have to
be implemented somehow by neural dynamics, but if current connectionist models
can't do that, so much the worse for them as models of the brain. (The
argument here explicitly follows Gary Marcus in his
The
Algebraic Mind.) As a student of symbolic dynamics, I approve wholeheartedly.
Jackendoff is very far from providing a detailed account of how language, as he
conceives it, might be instantiated in the brain, but he is at least trying to
move generative grammar in that direction. Just as much of Jackendoff's
grammatical argument consists of respectful but firm disagreement with Chomsky,
much of his argument about the brain proceeds by way of picking apart Jerry
Fodor's ideas about mental modularity, until one is left wondering why anyone
ever found them plausible. (Well, I've always wondered that, and
I'm not alone.)
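The idea that a grammatical rule is just an interface rule containing variables, where a word contains constants, can be caricatured too. Here is a toy formulation of my own (the representation choices are mine, not Jackendoff's): the English plural rule pairs a variable noun stem with an affixed sound pattern and a plural meaning.

```python
# Toy version of "rules as interface rules with variables": where a lexical
# entry coordinates constant sound, syntax, and meaning, the plural rule
# coordinates a *variable* stem X with sound X+'z', syntax [N, plural],
# and meaning PLURAL(X). (Illustrative invention, not Jackendoff's notation.)

def plural_rule(stem_sound, stem_meaning):
    """Interface rule with variable X instantiated by a noun stem."""
    return {
        "sound": stem_sound + "z",
        "syntax": {"category": "N", "number": "plural"},
        "meaning": ("PLURAL", stem_meaning),
    }

# Instantiating the variable with a particular stem yields a coordinated
# triple, just as a stored word would be.
dogs = plural_rule("dOg", "DOG")
```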

This leads me to one of the most remarkable things about Foundations
of Language. Jackendoff describes the language faculty as a collection
of modular computational processes, which communicate with each other through
message-passing interfaces with limited access to the processes' internal
states. These processes acquire their properties not just through inheritance
in a hierarchy of types, but through multiple inheritance. In other
words, it sounds just like object-oriented
programming, but he never uses that phrase or anything like it. Since I'm
reliably informed he knows nothing about either the theory or practice of OOP,
I can only conclude he came up with all this himself, and am suitably awed.
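The analogy transcribes almost line for line into OOP idiom. A minimal sketch, again under my own naming assumptions (nothing here is from the book): each representational type hides its internal state behind a narrow message-passing surface, and a word inherits from several type hierarchies at once.

```python
# Sketch of the OOP analogy: encapsulated modules with narrow public
# interfaces, and a lexical item acquiring its properties by multiple
# inheritance. (All class and method names are illustrative inventions.)

class PhonologicalForm:
    def __init__(self, phonemes):
        self._phonemes = phonemes          # internal state, not exposed
    def pronounce(self):                   # narrow public interface
        return self._phonemes

class SyntacticFeatures:
    def __init__(self, category):
        self._category = category
    def category(self):
        return self._category

class ConceptualStructure:
    def __init__(self, concept):
        self._concept = concept
    def meaning(self):
        return self._concept

class Word(PhonologicalForm, SyntacticFeatures, ConceptualStructure):
    """A word as a coordination of sound, syntax, and meaning,
    inheriting from all three hierarchies at once."""
    def __init__(self, phonemes, category, concept):
        PhonologicalForm.__init__(self, phonemes)
        SyntacticFeatures.__init__(self, category)
        ConceptualStructure.__init__(self, concept)

dog = Word("dOg", "Noun", "DOG")
```

Other processes can send `dog` the messages `pronounce()`, `category()`, or `meaning()`, but never inspect the underscored internals directly — which is the "limited access to internal states" of the previous paragraph.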

Fodor reappears as a foil in the concluding chapters on semantics, where
Jackendoff forcefully argues against Fodor's "language of thought" idea. He
also comes down against the philosophers' habit of trying to relate language
directly to the world. This is a mistake: the relation is mediated
through the conceptual schemes or mental models of language users. When
everything is working properly, this mediation is so transparent we can almost
ignore it, but analytically it's wrong to do so. I think Jackendoff concedes
too much to subjectivism on this basis, i.e., more than he has to, but that's
really an issue at the margin. One topic I wish had received more attention
here is the way a coherent discourse builds up a conceptual structure in the
minds of its recipients. On the one hand, some quite interesting work has
been done in this area of discourse processing, like Catherine
Emmott's models of Narrative Comprehension, and on the other
it's reminiscent of the picture, due to the old Russian formalist and Czech
structuralist literary critics, of literary works as a series of layers,
ascending from the sheer sound-pattern of a poem up through (supposedly) a
system of norms or values implicit in it. (A convenient and accessible
presentation of this view is in Wellek and Warren's
Theory
of Literature.)

Evolution completes the list of topics in Jackendoff's subtitle. If one
thinks of the language faculty as a single perfect block, weirdly driven by
syntax, it becomes very hard to see how it could have evolved, or indeed why.
Once the faculty is picked apart into a lot of messy sub-components, some of
which are clearly related to aspects of the mind we share with other primates,
the puzzle is, if not necessarily dissolved, then at least reduced. Jackendoff
sketches a
plausible, though of course speculative, series of stages between primate vocal
calls and modern human language, in the course of which syntax emerges as a
kind of monstrously swollen interface between semantics and phonology.
Moreover, language is perfectly comprehensible as a means of communication
— a way
of coordinating
our thoughts, and so our actions. It is remarkable that some linguists
have so lost sight of this fact that Jackendoff needs to argue for it!
(Cf. this
recent paper by Pinker and Jackendoff, also available as
a PDF
preprint.)

I'm not really competent to judge Jackendoff's presentation of the state of
the literature on various topics, except for cognitive modeling and, to a
lesser degree, evolutionary reasoning. (Here
is an opinion by an actual linguist.) I will permit myself to say that two
things bugged me about the way Jackendoff proceeds here, but that these seem
endemic among linguistic theorists. First, this is an armchair exercise:
almost all of his examples are made up, rather than being drawn from corpora or
other data about actual language use. Second, the promise to get at the
workings of the mind is never redeemed in an actual model of any part
of the language faculty, or even enough mechanistic detail to help construct a
model. This worries me especially about his interface processes, which end up
having to do a lot of work; it's not even clear to me that the jobs he's asking
them to do are computationally tractable.

Having made these disclaimers, I will throw caution to the winds and say
this is a really good book. It requires some prior familiarity with linguistic
ideas and jargon, but not an overwhelming amount, and Jackendoff goes out of
his way to keep his writing clear, and his argument fair, modest, and
intellectually open to other disciplines. Read this if you have any serious
interest in either language or cognition.

A precis of Foundations of Language appeared as an article
in Behavioral and Brain Sciences [vol. 26 (2003), pp. 651--665],
with extensive commentary by others [pp. 666--695] and a reply by Jackendoff
[pp. 695--702]. There's a preprint
version of the precis.

8 February 2005. Thanks to John Loutzenhiser and Mark Liberman. (Typo
correction 19 February 2008; thanks to Michael Meadon)