On Tool Grammar

Among Noam Chomsky’s major scientific contributions was to turn linguistic science toward these questions: What does a speaker know that enables her to speak her language? What inherent capabilities enabled language to be learned? And what does our knowledge of the structure of language tell us about how it might have arisen during the evolutionary process? The formidable force and impetus of the intellectual process that issued from this paradigm-changing perspective cannot be denied, but it also directs us toward future enquiries. What is next? It takes only a step to one side to observe that through generative linguistic studies large domains of linguistic science have not only been explored but also, as Chomsky recognizes, opened to new inquiry.

The purpose of this section is first to identify, factor out, and catalogue a number of Chomsky’s most recent central directives and claims about the human language capacity, with a view to comparing them with modestly different assumptions. Second, I seek to identify a set of particular assumptions that have served the generative program as working principles without, evidently, ever having had an empirical foundation.

By way of preview, my perspective and prejudice is to ask: what dictates that generative grammar must be defined to connect expression to full semantic interpretation, speaker to hearer? Given the pervasive possibility of the misunderstanding of meaning, I ask whether the generative paradigm can support a reconfiguration that instead connects linguistic intention to a form of expression.

To organize this survey, I itemize and summarize Chomsky’s points from a recent overview essay, “The Galilean Challenge.” I propose in particular to circumscribe the manners of creativity that Chomsky is addressing in his recent work, that is, to ask: what is the Chomskyan Notion of Creativity, and how does it differ from other possible conceptions such as our notion of Tool Grammar Creativity?

To understand the relationship of Chomsky’s notions to Tool Grammar, it is sufficient to keep in mind the very simple definition of what Tool Grammar is, namely the set of investigations that follow from the assumption that linguistic theory must have reference to linguistic intention on the part of the speaker: linguistic rules involve specification of linguistic intent.

Chomsky has been writing clearly and elegantly, so it is not hard to interpret his recent ideas, but we will use direct quotes extensively to minimize misrepresentation and distortion of context. Quoted material in the following is taken from this article unless otherwise noted. The purpose of this section is to examine the larger philosophical issues in order to lead in, contrastively, to the approach of Tool Grammar. A more technical analysis based on Chomsky’s more detailed linguistic analyses is under development in another section.

Chomsky quotes Galileo on the challenge of Full Creativity:

what sublimity of mind was his who dreamed of finding means to communicate his deepest thoughts to any other person, though distant by mighty intervals of place and time! Of talking with those who are in India; of speaking to those who are not yet born and will not be born for a thousand or ten thousand years; and with what facility, by the different arrangements of twenty characters upon a page!

We note that Galileo clearly refers to the full creativity of human communication, from the formulation of thought through externalized speech to understanding by the listener. For this reason I propose to call Galileo’s idea Full Creativity, or End-to-End Creativity, in contradistinction to the more limited conception that Chomsky advocates.

Chomsky correlates his scientific inspiration with that of Antoine Arnauld and Claude Lancelot per a quote from their Grammaire générale:

…one of the great spiritual advantages of human beings compared to other animals, and which is one of the most significant proofs of reason: that is, the method by which we are able to express our thoughts, the marvelous invention by which using twenty five or thirty sounds we can create the infinite variety of words, which… permit us to express all our secrets, and which allow us to understand what is not present to consciousness, in effect, everything that we can conceive…

Consider also Chomsky’s own inclusivity:

In the generation of the language of thought, the atomic elements are word-like, though not exactly words; for each language, the set of these elements is its lexicon. Lexical items are commonly regarded as cultural products, varying widely with experience, and linked to extra-mental entities. The latter assumption is expressed in the titles of standard works, such as W. V. O. Quine’s influential study, Word and Object. Closer examination of even the simplest words reveals a very different picture, one that poses many mysteries.

This characterization bears on the creation of lexical vocabulary, so it is useful to separate that dimension out as Lexical Creativity, which, we shall see, Chomsky leaves as a challenge and mystery for future investigation. Chomskyan Creativity may implicitly include Lexical Creativity, but its inclusion is postponed in the current state of the science.

He also quotes Descartes who proposed a new creative principle to account for the “human capacity for the unbounded and appropriate use of language”:

our ability to do or not do something… when the intellect puts something forward for affirmation or denial or for pursuit or avoidance… we do not feel we are determined by any external force.

Chomsky takes from this that the creativity of language not only distinguishes human beings from non-human animals, but also human beings from machines. Chomsky notes “A machine may be impelled to act in a certain way, but it cannot be inclined; with human beings, it is often the reverse. Explaining why this is so is the Galilean challenge.”

Chomsky thereby implies a concept of Unmechanizable Creativity, at least per current conceptions, and acknowledges the importance of any work showing that his linguistic framework cannot be mechanized.

Chomsky also credits Wilhelm von Humboldt for attention to the general theme: “…language must provide for the possibility of producing an undefinable set of phenomena, defined by the conditions imposed upon it by thought… It must, therefore, make infinite use of finite means…”

Here, he circumscribes a separate concept wherein many objects may be generated from a finite system. This concept, which is putatively separable from other dimensions of creativity, is formal and mathematical in nature, so it can be envisaged in a machine implementation. We can refer to it as Infinite Generation Creativity.

A further separable, unmechanizable dimension of linguistic creativity, a form of free-will instigation in expressive choice, is distinguished by the relative freedom of its functionally relevant content selection. While it is the interface from which linguistic structural representation emanates, and may be proposed as a separate module or domain of knowledge, it is nevertheless inherently coupled to sentence formulation, so it deserves consideration as Free Choice Creativity.

As we see, Chomsky fairly couches his goals for his linguistic program as the Galilean Challenge, but there are multiple dimensions of selectivity in formulating varieties of the general idea. In view of these distinctions we can seek to specify what Chomsky intends among these various possibilities.

Here is his definition and characterization of the subject matter to be investigated indicating an orientation to quite a general idea: “The capacity for language is species specific, something shared by humans and unique to them.”

In this preliminary, general statement he does not narrow the scope of this capacity for investigation, but exclusions are presented in the course of discussion.

Chomsky undertakes to understand what he calls the Basic Property of human language, which involves: “the means to construct a digitally infinite array of structured expressions; each is semantically interpreted as expressing a thought, and each can be externalized by some sensory modality, such as speech.”

Significantly, the Basic Property includes neither decisions from thought formation about the manner or content of expression, nor the production of a physical output. For this reason it is useful to refer to it as the more limited Basic Property of Structured Expression.

He understands this

infinite set of semantically interpreted objects to be a language of thought capable of linguistic expression, and that enters into reflection, inference, planning, and other mental processes [which] when externalized, can be used for social interactions.

Here we make an essential observation: characterizing these structured objects as essentially a language of thought that is capable of use in communication excludes the possibility that they are a language of communicable cognition, adapted both to thought and to communication. This contention is a formidable scientific hypothesis that requires proof and validation over time, so it is useful to mark it for reference as the Internal Cognition Structure Hypothesis. This theme of structural analysis to the exclusion of communicative function is a hallmark of the Chomskyan Challenge, and it should be noted as proposed but not necessarily proven in empirical investigations. Nor may it be provable as an empirical modularization of human capacity except through its scientific byproducts over a long period. Notably, there is a conundrum here: other scientists may take the position that there are domains of thought oriented to the manner of communication, so the proposed distinction between thought and communication may remain confounded or convoluted until further operational distinctions are made.

Chomsky seems for the time being to assume (without necessarily ruling out alternatives) the formation of structured expression in a representation that neither includes, nor interacts linguistically with, whatever part of cognition is used to plan sentences and develop intentions for their formation. We term the separation of intention computation Linguistic Intent Exclusion, and refer to structure generation without it as the Intent Orthogonality Hypothesis.

Chomsky’s proposals do explicitly involve an adjustment to the original Galilean challenge.

…traditional formulations were in terms of the production of expressions. In these terms, the challenge overlooked some basic issues. Like perception, production accesses an internal language, but cannot be identified with it. We must distinguish the internalized system of knowledge from the processes that access it.

We follow the history of linguistic philosophy as spearheaded by Chomsky himself by referring to this as the Linguistic Competence Separation Hypothesis.

He seeks to justify his distinction by association with processes of computation: “The theory of computability enables us to establish the distinction, which is an important one, familiar in other domains.”

Such asides, which appeal to an assent Chomsky assures us other scholars would offer, perhaps warrant further explication. The rhetorical technique is valuable because it opens possibilities of interdisciplinary resonance, but it serves better as an introduction to specifics than as general reinforcement of the argument. In the present case the point is clearly relevant: in computation, one process may of course refer to a self-contained module, database, or schema external to it. But many questions arise about interaction effects, depending on the nature of the interrelations as well as the content and structure of the system design.

Chomsky notes that science was limited in the theory of computation prior to the twentieth century:

…tools were not available for formulating the problem in a way clear enough to be seriously addressed. That changed thanks to the work of Alonzo Church, Kurt Gödel, Emil Post, and Alan Turing, who established the general theory of computability. Their work demonstrated how a finite object like the brain could generate an infinite variety of expressions.

Chomsky proposes a strict modularity for the structural component of language, comparing it to arithmetic:

In studying human arithmetical competence, we distinguish the internal system of knowledge from the actions that access it. When multiplying numbers in our heads, we depend on many factors beyond our intrinsic knowledge of arithmetic. Constraints of memory are the obvious example. The same is true of language. Production and perception access the internal language, but involve other factors as well, including short-term memory.

In proposing the Linguistic Competence Separation Hypothesis, Chomsky characteristically advances science by bold provocation, involving operationally helpful simplifications of a problem space. This has been fundamental to his role as a dramatic catalyst of scientific advancement. It is important to note, however, that the advancement such a move represents must necessarily discount dimensions of possibility that have not emerged because they have not yet been explored. The separation can thus be imagined and evidenced with whatever possibilities are available, but strictly speaking it does not rise to the level of firm scientific grounding so long as the domain’s contingencies are not yet understood. This pattern, wherein scientific exploration begins with intuition, speculation, and conceptualization, is inherent and unavoidable in discoveries involving major breakthroughs: these must begin as unconfirmed theories, at least temporarily undermotivated by available data.

Chomsky adverts to the mysteries surrounding the fullness of creativity in human language but also circumscribes it to exclude a class of actions: “There has been considerable progress in understanding the nature of the internal language, but its free creative use remains a mystery.”

He cites the complexity involved even in the simple action of raising an arm and, quoting neuroscientists Emilio Bizzi and Robert Ajemian, points to the extent of the complication to emphasize how much more involved human language must be:

which critically involves coordinate and variable transformations from spatial movement goals to muscle activations [which need] to be elaborated further. Phrased more fancifully, we have some idea as to the intricate design of the puppet and the puppet strings, but we lack insight into the mind of the puppeteer.

I refer to this distinction as the Physical Action Exclusion Hypothesis.

At this juncture I am able to address a main theme of the analysis. While Chomsky sees his way clear to separating physical action, including its necessary intermediations, from structures of internal thought, this distinction leaves unaddressed the concept of internal thoughts that represent or implement the intention and conception of a linguistic expression. At the center of this question are the intentions or decisions to express a particular configuration of thought, and the related intentions and decisions about how to configure that expression. At the considerable depth where Chomsky leaves the formulation of the Galilean Challenge, the role of internal mental intentional decisions has, importantly, not been taken up. That is, Chomsky leaves for the future the question of mental actions that are decisions to say so and so in such and such a manner. Crucially for my own investigations, I term this mental intermediation between the well of thought and the drawing of expression the Intention Inclusion Hypothesis. Chomsky places his concerns elsewhere, while Tool Grammar views this as a crucially important empirical juncture. The specification of linguistic intention as part of the generative capacity either does or does not advance the possibilities of simplicity, understanding, and explanation in linguistic science. This is the definition of the Tool Grammar extension to the Generative Program.

This brings us to the role of simplicity in Linguistic Theory, which has been a long-standing theme for the developments within the Minimalist Program.

Couching the Basic Property in the framework of computational systems, Chomsky views it as “a set of atomic elements, and rules to construct more complex structures from them” and expects simplicity and efficiency to have an explanatory role: “we therefore expect to respect general conditions on computational efficiency.” Also:

…we are bound to seek the simplest computational procedure consistent with the data of language. Simplicity is implicit in the basic goals of scientific inquiry. …only simple theories can attain a rich explanatory depth. … It is the task of the scientist to demonstrate this, from the motion of the planets, to an eagle’s flight, to the inner workings of a cell, to the growth of language in the mind of a child.

Further, Chomsky interestingly speculates:

Linguistics seeks the simplest theory for an additional reason: it must face the problem of evolvability. Not a great deal is known about the evolution of modern humans. The few facts that are well established, and others that have recently been coming to light, are rather suggestive. They conform to the conclusion that the language faculty is very simple; it may, perhaps, even be computationally optimal, precisely what is suggested on methodological grounds.

I summarize Chomsky’s position as dominantly favoring simple and efficient theoretical postulations and refer to it as the Dominance of Simplicity Hypothesis.

This brings us to a second main point of contrast between the Chomskyan Challenge and the related endeavor envisaged by Tool Grammar. Chomsky relies more heavily on efficiency and simplicity than on any notion of functional or operational purpose such as Tool Grammar proposes. Like all theories, those of Tool Grammar must aim for the simplest explanation, but the notion of what is simple may be affected by the range of data considered. For Tool Grammar, the various intentions of speakers are themselves a form of data that adds an empirical dimension of purpose. From this perspective it seems fair to speculate that Chomsky envisages a step-wise progression of science whereby new observational data is acquired and incorporated in stages over time, allowing simplicity to function as an efficient metric at any particular stage. While this can be an effective methodology, Tool Grammar places slightly less emphasis on stasis and simplicity, with more impetus to move quickly to the inclusion of new levels of data analysis. It remains an open question how effective this adjustment, which I term the Cyclic Scope-Simplicity Hypothesis, will prove, but my belief in its usefulness stands perhaps in contrast with Chomsky’s approach.

Given the intellectual scaffolding for his scientific approach, Chomsky proceeds to investigate the properties of the basic mechanism. The concepts and analysis presented so far can be explored further by considering Chomsky’s use of actual linguistic examples.

First we must acknowledge the massive importance of Chomsky’s theories as a stimulus for major advances in linguistic science. The following claim by Chomsky on his contribution is a dominant truth about linguistics since the publication of Syntactic Structures; it would be an injustice to minimize it in any way: “Universal properties of the language faculty came to light as soon as serious efforts were undertaken to construct generative grammars.”

Among the massive accumulation of important innovations, Chomsky has recently been emphasizing the rather simpler matter of structure dependence: “These included simple properties that had never before been noticed… One such property is structure-dependence.”

By structure-dependence Chomsky refers to hierarchical embedding, closely related to the concept of constituency that has long been central to linguistic analysis. Structure-dependence, importantly, is implied by the operation he calls Merge, discussed later. By it he means that parts can themselves have parts, in contrast to a linear string of elements without embedding, “ignoring properties of the externalized signal, even such simple properties as linear order.”

Here are basic illustrative examples of the Structure Dependence Hypothesis that he frequently cites:

Take, say, the sentence The boy and the girl are here. With marginal exceptions, no one is tempted to say is here, even though the closest local relation is “girl + copula.” … Without instruction, we rely on structure not linearity, taking the phrase and not the local noun to determine agreement. Or take the sentence He saw the man with the telescope, which is ambiguous, depending on what we take to be the phrases, although the pronunciation and linear order do not change under either interpretation.

To take a subtler example, consider the ambiguous sentence Birds that fly instinctively swim. The adverb “instinctively” can be associated with the preceding verb (fly instinctively), or the following one (instinctively swim). Suppose now that we extract the adverb from the sentence, forming Instinctively, birds that fly swim. The ambiguity is now resolved. The adverb is interpreted only with the linearly more remote but structurally closer verb swim, not the linearly closer but structurally more remote verb fly. The only possible interpretation—birds swim—is unnatural. That doesn’t matter. The rules apply rigidly, independent of meaning and fact, ignoring the simple computation of linear distance, and keeping to the far more complex computation of structural distance.

His summary includes a statement that there is no current understanding of why this should be so: “The property of structure-dependence holds for all constructions in all languages, and it is, indeed, puzzling.”
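The contrast between linear and structural distance in these examples can be made concrete with a small sketch. To be clear, the constituent labels, the tree shape for Instinctively, birds that fly swim, and the depth-based distance metric below are my own simplifying assumptions for illustration, not a formal analysis from Chomsky’s text.

```python
# Illustrative sketch only: labels and tree shape are assumptions.
# Sentence: "Instinctively, birds that fly swim."
tree = ("S",
        [("Adv", ["instinctively"]),
         ("NP", [("N", ["birds"]),
                 ("RC", [("C", ["that"]), ("V", ["fly"])])]),
         ("VP", [("V", ["swim"])])])

def words(node):
    """Flatten the tree into its linear string of words."""
    if isinstance(node, str):
        return [node]
    _, children = node
    out = []
    for child in children:
        out.extend(words(child))
    return out

def depth_of(node, target, depth=0):
    """Structural distance: constituent boundaries crossed from the
    root down to the target word (None if the word is absent)."""
    if isinstance(node, str):
        return depth if node == target else None
    _, children = node
    for child in children:
        found = depth_of(child, target, depth + 1)
        if found is not None:
            return found
    return None

linear = words(tree)
# Linearly, "fly" is closer to the fronted adverb than "swim" is...
assert linear.index("fly") < linear.index("swim")
# ...but structurally, "swim" sits closer to the clause level where the
# adverb attaches, and it is "swim" that receives the interpretation.
assert depth_of(tree, "swim") < depth_of(tree, "fly")
```

The point of the sketch is only that a rule keeping to structural distance and a rule keeping to linear distance pick out different verbs here, and speakers uniformly follow the structural one.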

Nor does he find that approaches based on patterns of language use adequately meet or address the fundamental problem:

Structure-dependence is one of the few non-trivial properties of language that usage-based approaches to language have sought to accommodate. … All fail. … Why does this property hold for all languages, and all constructions?

Chomsky orients his theoretical direction to formal simplicity rather than functional explanation, which he posits as dramatically beneficial in both biological and methodological contexts:

The computational operations of language are the simplest possible. This is the outcome that we hope to reach on methodological grounds, and that is to be expected in the light of evidence about the evolution of language.

Since it impinges on the discussion below, it is worthwhile here to point out that aspects of the sentence Instinctively, birds that fly swim are not addressed by an analysis relying solely on structure-dependence. By not addressing observable features of the sentence, important dimensions of data can be bypassed. Note that the word “instinctively” can be moved to the front of the sentence, which raises the important question of why a speaker might or might not do that. Without diverging into details here, it is evident that the speaker must have had some intention behind that particular decision. This illustrates that structure can reflect function, and such effects are in many circumstances simply a further layer of empirical observation that can be added to the analysis.

Moreover, one might ask why the speaker specified “birds that fly” in the first place, rather than simply “birds.” That extra predication was necessarily an intentional decision, made for a reason that can be explored for the purpose of linguistic understanding. In effect it is a restrictive action limiting the class of birds under discussion. The value of considering such additional data can be assessed by formulating alternative hypotheses of explanation. In the present case, we observe that the restricted class “birds that fly” is further restricted if “fly” is itself restricted to “fly instinctively.” Given this, it is natural to hypothesize that the reason for structure-dependence is a general constraint: if something is to be restricted, it should be restricted locally rather than by dispersing sub-predications randomly throughout a structure. The core of Chomsky’s point about adherence to structure still holds, since the adverb in either position is structurally local, but the utility of the data of intent is also evident, since it is no obscure fact that “instinctively” can be moved to the front to emphasize or focus its content. If there is validity here, it provides the beginning of an answer to the very question that otherwise might be considered mysterious: why is structure-dependence a constant in human language?

The optional positioning of “instinctively” also highlights another question: whether positioning the adverb at the front represents a different thought, as might be implied by one interpretation of Chomsky’s discussion. Absent some intricacy of mechanism, the view that structure reflects the language of thought would seem to indicate that differences in structure reflect differences in thought. This simple logic leads to the conclusion that the functional or stylistic movement of the adverb does reflect a difference in mental intention, an action in thought. Under these assumptions we reach a necessary conclusion, akin to a proof, that intention is part and parcel of structural formation. By this thinking we see the possibility of including linguistic intent within the realm of theorizing from the logic of Chomsky’s own framework.

Further, observe that the movement of the adverb is evidently for reasons of emphasis to the listener, in other words for communicative purposes, so that a highlighted predication might not be so easily overlooked in the process of interpretation. If this were not true, we would have to wonder why a single thought might take two different forms of expression. To the extent that it is true, these examples offer a clear illustration that structure formation is adapted not only to reflect the core instigating thought but also to configure the output for effect on the listener. Such effects might, of course, be encompassed within the conception of the thought being expressed, but the need for that convolution would seem to make the very point. All this allows for the possibility that language structure involves dimensions of communicative purpose. Does language serve a dual purpose, related both to the structure of thought and to the exigencies of efficient communication? If structure reflects thought, we see the possibility of a logical inference that it must be connected to communicative function even within the assumptions of the generative program.

Proceeding to the theoretical question of the provenance of structure-dependence and mathematical creativity, Chomsky posits that structure-dependence and simplicity come together as co-benefits in the elementary Merge operation he proposes to account for human language structure.

To see why this is the case, consider the simplest recursive operation, embedded in one or another way in all others. This operation takes two objects already constructed, say X and Y, and forms a new object Z, without modifying either X or Y, or adding any further structure to them. Z can be taken to be just the set {X, Y}. In current work, the operation is called Merge. Since Merge imposes no order, the objects constructed, however complex, will be hierarchically structured, but unordered; operations on them will necessarily keep to structural distance, ignoring linear distance. The linguistic operations yielding the language of thought must be structure-dependent, as indeed is the case.

The simplicity of Merge is seen as explanatory for the ubiquitous structure-dependence in language. “An appeal to simplicity appears to answer the question why all human languages must exhibit structure-dependence.”

We refer to this as the Mathematical Merge Orientation in linguistic theory.
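The Merge operation as just quoted can be given a minimal sketch, under the assumption that Python frozensets are an adequate stand-in for the unordered set {X, Y}:

```python
# Merge(X, Y) = {X, Y}: a new unordered object Z, leaving X and Y
# unmodified. Frozensets are an illustrative stand-in for such sets.
def merge(x, y):
    return frozenset([x, y])

# Build a hierarchical object bottom-up from word-like atoms.
dp = merge("that", "book")   # {that, book}
vp = merge("read", dp)       # {read, {that, book}}

# Merge imposes no order: {X, Y} and {Y, X} are the same object.
assert merge("that", "book") == merge("book", "that")
# X and Y are untouched and remain recoverable as parts of Z, so the
# result is hierarchically structured but unordered.
assert dp in vp and "read" in vp
```

Because the objects carry no linear order, any operation defined over them can only see structural containment, which is the sense in which structure-dependence follows from the simplest recursive operation.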

Chomsky amplifies the scientific justification for Merge by positing an extension of its applicability to syntactic displacement:

Displacement is a ubiquitous and puzzling property of language. Phrases are heard in one position but interpreted in two, both in their original position and in some other position that is silent, but grammatically possible. The sentence, “Which book will you read?” means roughly, “For which book x, you will read the book x,” with the nominal phrase book heard in one position but interpreted in two.

As an aside, we note that Chomsky again adds a reference to computational systems in support of his argument without fully detailing the logic of the connection: “Displacement is never built into artificial symbolic systems for metamathematics, programming, or other purposes. It had long been assumed to be a peculiar and puzzling imperfection of natural language.”

Chomsky’s approach benefits from casting displacement phenomena simply as internal Merge, involving two copies in different places. He does not, however, reflect in detail here on the factors triggering the copy, nor on the implications should copies be massively generated in anticipation of filtered reduction at a transformational or interpretive stage: “Merge automatically yields displacement with copies—in this case, two copies of which book. The correct semantic interpretation follows directly.”

The value of simplicity is seen as reinforced by the proposal that displacement conflates with Merge: “Far from being an imperfection, displacement with copies is a predicted property of the simplest system. Displacement is, in some respects, even simpler than Merge, since it calls on far more limited computational resources.”
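Displacement as internal Merge can be sketched in the same style; again, the set representation and the helper below are illustrative assumptions rather than Chomsky’s formal apparatus:

```python
# Internal Merge: re-merging a piece that is already inside the object,
# yielding two occurrences ("copies") of the displaced phrase.
def merge(x, y):
    return frozenset([x, y])

def contains(z, x):
    """True if x is a term of z: z itself, or inside one of z's members."""
    if z == x:
        return True
    if isinstance(z, frozenset):
        return any(contains(member, x) for member in z)
    return False

wh = merge("which", "book")               # "which book"
vp = merge("read", wh)                    # "read which book"
clause = merge(merge("you", "will"), vp)  # "you will read which book"
question = merge(wh, clause)              # internal Merge of "which book"

# The phrase now occurs in two positions: at the edge of the question,
# and in its original position inside the clause.
assert wh in question
assert contains(clause, wh)
```

No new mechanism is added: the same operation, applied to a subpart of an already constructed object, produces the two copies that the semantic interpretation requires.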

Here it is fair to observe that Chomsky’s scientific advance leaves considerable work in its wake, work that will bear on the ultimate validity of his principal claims. Specifically, one looks forward to discovering whether the extreme simplicity of the fundamentals will have an inverse effect, creating unwarranted complexity when others try to solve myriad linguistic problems using the core principles. Chomsky is methodologically correct, though of course not yet historically confirmed, in assuming that increased simplicity will be pervasive in larger solutions. We term this question the Monotonic Complexity Hypothesis.

Chomsky reinforces his argument by extending it to other phenomena and further expanding elsewhere “for such properties as referential dependence and quantifier-variable interaction.”

Consider the sentence “the boys expect to see each other,” where “each other” refers to the boys, thus obeying an obvious locality condition of referential dependency. Consider now the sentence, “Which girls do the boys expect to see each other?” The phrase “each other” does not refer back to the closest antecedent, “the boys,” as such phrases universally do; rather, it refers back to the more remote antecedent, “which girls.” … That is what reaches the mind under Merge-based computation with automatic copies…

Chomsky proceeds to argue that deletions in displacements are actually inefficient from the point of view of the listener, raising the question why there are deletion phenomena at all.

Deletion of the copy in externalization causes processing problems. [They] are among the major problems of automatic parsing and perception. In … “Who do you think ____ left the show early?” the gap marks the place from which the interrogative has been moved, creating a long-distance dependency between the interrogative and the finite verb. If the interrogative copy were not deleted, the problem would be much reduced. Why is it deleted?

Chomsky associates his analysis with “principles of efficient computation” to explain which copy is retained: “At least one copy must appear or there is no evidence that displacement took place at all. In English and languages like English, that copy must be structurally the most prominent one.”

But no explanation is provided for why there should be deletion given the consequence that interpretation by the hearer becomes complicated: “The result is to leave gaps that must be filled by the hearer. This is a matter that can become quite intricate.”
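The copy-and-delete mechanism under discussion can be made concrete in a toy sketch. The following Python model is our own illustrative encoding, not Chomsky’s formalism: internal Merge remerges a phrase at the edge while leaving a copy in its base position, and externalization pronounces only the structurally most prominent copy, leaving the gap the hearer must fill.

```python
# Toy illustration of displacement with copies (illustrative only,
# not Chomsky's formalism). Internal Merge copies a phrase to the
# edge; externalization keeps only the most prominent (leftmost)
# copy, so lower copies surface as gaps.

def internal_merge(structure, phrase):
    """Remerge `phrase` at the edge, leaving a copy in its base position."""
    return [phrase] + [("copy", x) if x == phrase else x for x in structure]

def externalize(structure):
    """Pronounce the highest copy; deleted lower copies become gaps."""
    words = []
    for item in structure:
        if isinstance(item, tuple) and item[0] == "copy":
            words.append("____")  # deleted copy = gap for the hearer
        else:
            words.append(item)
    return " ".join(words)

base = ["you", "think", "who", "left", "the", "show", "early"]
moved = internal_merge(base, "who")
print(externalize(moved))  # who you think ____ left the show early
```

If the lower copy were pronounced rather than deleted, the long-distance dependency in the example sentence would be transparent to the parser, which is exactly the puzzle raised above.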

Chomsky sees evidence here that language is not so much a tool of communication as one of thought, putting to one side the evident possibility that deletion itself is for communication efficiency:

Language design appears to maximize computational efficiency, but disregards communicative efficiency. In every known case in which computational and communicative efficiency conflict, communicative efficiency is ignored. These facts run counter to the common belief, often virtual dogma, that communication is the basic function of language.

Our own perspective in Tool Grammar is that linguistic phenomena become less puzzling as functional and intentional factors are considered. In fact, it will be observed below that the basis of linguistic recombination Chomsky refers to as the Merge operation might be viewed as historiographically implicit in the tradition of Dependency Grammar, or even genetically cognate with the very idea of constituency present in many theoretical approaches. Since Dependency Grammar views the notion of predication as formative and fundamental, we can easily imagine a necessary implication of its recursive power, roughly suggested in phrase-structure rules or other formulations:

Predication => {Predicator, (Support, …)}

Support => {NP, (Predication)}

(This is indicative only and not intended as a formal statement)

This modestly different perspective has the attribute that it provides an extra-mathematical, functional explanation for the existence of Merge or Predication, namely that expression requires the application of attributes to things; otherwise there is no information. This connects linguistic theory to information theory as well as to the mathematical simplicity of theories.

On the notion of complexity generally, it should be noted that when symbolic systems are compiled and realized for real-world application, it is often advantageous to abandon simplicity in favor of redundancy for the sake of speed and efficiency. Similarly, one observes tradeoffs among interacting systems, where reduced complexity in one results in increased complexity in another. These comparisons are, of course, only suggestive oblique references rather than empirical arguments, but they might be considered alongside the similar references to computation that Chomsky includes in his analysis. It is certainly the case that some generative solutions proposed in the past involve a level of complexity that makes them seem counter-intuitive. The main conclusion should be that the question of simplicity must perhaps be considered quite globally in any complex set of hypotheses about the nature of human language.

Summarizing: Chomsky carefully adjusts the Galilean challenge so that it putatively includes only a simple computational system of internal knowledge and excludes sentence production and speech, i.e., linearization and pronunciation.

He has distinguished “…language from speech, and… production from internal knowledge,” and has posited “a language of thought, a system that might be remarkably simple,” while attributing the complexity and variety of language to externalization:

Secondary processes map the structures of language to one or another sensorimotor system for externalization. These processes appear to be the locus of the complexity and variety of linguistic behavior, and its mutability over time.

Apart from the foregoing discussion of syntactic structure, Chomsky steps back from an analysis of the lexical items from which sentences are built: “The origins of computational atoms remain a complete mystery.”

He recognizes further that the notion of creativity which he seeks to capture in his proposals excludes the primary conception of the scholars who inspired him:

So does the Cartesian question of how language can be used in its normal creative way, in a manner appropriate to situations, but not caused by them, incited and inclined, but not compelled. The mystery holds for even the simplest forms of voluntary motion.

There is a justifiable claim to the enormous progress that has been made in beginning to understand human language since the advent of his generative program:

A great deal has been learned about language since the Biolinguistic Program was initiated. It is fair to say, I think, that more has been learned about the nature of language, and about a very wide variety of typologically different languages, than in the entire two-thousand-five-hundred-year prior history of inquiry into language.

Finally, Chomsky inspires a new generation of linguists much in the way he has inspired two or three previously: “The more we learn, the more we discover what we do not know. And the more puzzling it often seems.”

In closing, we can summarize the limited contrasts between the Chomskyan Challenge and the approach taken in Tool Grammar. Chomsky is generally flexible, modest, and forward-looking with regard to dimensions of inquiry for which proper avenues seem mysterious at the present time. He excludes Lexical Knowledge and fundamental explanations for the Basic Principle and Structure-Dependence on these grounds. Further, he orients formal scientific investigation of the core linguistic capacity away from the instigation of linguistic expressions and communicative processes, as well as from the physical enactment of language, and restricts linguistic science to the discipline of simplicity.

Tool Grammar is similar in many respects but views the intermediary mental intentions to select and structure information for expression as amenable to empirical investigation of widened scope. In addition, Tool Grammar seeks explanations beyond (and perhaps even explanatory of) mathematical simplicity in the functional processes that can be observed in human language. Chomsky highlights the human capacity as being quite different from that of a machine, yet prefers to consolidate knowledge of linguistic structure in a separable domain using methodologies and analogies that refer significantly to the properties of machines. Tool Grammar, in contrast, incorporates immediately and directly a set of properties of human behavior that seem quite apart from any machines we know or can imagine. By incorporating specifications of intention in linguistic analysis, Tool Grammar seeks to advance more fully the understanding of human abilities that have motivated Chomsky and those who inspired him.

Larry Smith

Larry Smith is an entrepreneur, former academic, and a specialist in the Inuktitut (Eskimo) language for computational linguistics research.