Humans and nonhuman primates can learn about the organization of stimuli in the environment using implicit sequential pattern learning capabilities. However, most previous artificial grammar learning studies with nonhuman primates have involved relatively simple grammars and short input sequences. The goal in the current experiments was to assess the learning capabilities of monkeys on an artificial grammar-learning task that was more complex than most others previously used with nonhumans. Three experiments were conducted using a joystick-based, symmetrical-response serial reaction time task in which two monkeys were exposed to grammar-generated sequences at sequence lengths of four in Experiment 1, six in Experiment 2, and eight in Experiment 3. Over time, the monkeys came to respond faster to the sequences generated from the artificial grammar compared to random versions. In a subsequent generalization phase, subjects generalized their knowledge to novel sequences, responding significantly faster to novel instances of sequences produced using the familiar grammar compared to those constructed using an unfamiliar grammar. These results reveal that rhesus monkeys can learn and generalize the statistical structure inherent in an artificial grammar that is as complex as some used with humans, for sequences up to eight items long. These findings are discussed in relation to whether or not rhesus macaques and other primate species possess implicit sequence learning abilities that are similar to those that humans draw upon to learn natural language grammar.

Proceedings of the Royal Society B: Biological Sciences 285(1871):e8559-578, 2018

Languages with many speakers tend to be structurally simple while small communities sometimes develop languages with great structural complexity. Paradoxically, the opposite pattern appears to be observed for non-structural properties of language such as vocabulary size. These apparently opposite patterns pose a challenge for theories of language change and evolution. We use computational simulations to show that this inverse pattern can depend on a single factor: ease of diffusion through the population. A population of interacting agents was arranged on a network, passing linguistic conventions to one another along network links. Agents can invent new conventions, or replicate conventions that they have previously generated themselves or learned from other agents. Linguistic conventions are either Easy or Hard to diffuse, depending on how many times an agent needs to encounter a convention to learn it. In large groups, only linguistic conventions that are easy to learn, such as words, tend to proliferate, whereas small groups where everyone talks to everyone else allow for more complex conventions, like grammatical regularities, to be maintained. Our simulations thus suggest that language, and possibly other aspects of culture, may become simpler at the structural level as our world becomes increasingly interconnected.
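The population model summarized above lends itself to a compact simulation. The sketch below is our own minimal illustration under stated assumptions (round-based interaction, a single convention seeded in one agent, a fully connected small group versus a sparse ring lattice for the large group), not the authors' code:

```python
import random

def simulate(num_agents, neighbors, threshold, rounds, seed=1):
    """Toy convention-diffusion model (an illustration, not the paper's
    code). One convention starts with agent 0; each round, every agent
    who knows it addresses one random neighbor, and a listener adopts
    it after `threshold` exposures (Easy = 1, Hard = several)."""
    rng = random.Random(seed)
    exposures = [0] * num_agents
    knows = [False] * num_agents
    knows[0] = True
    for _ in range(rounds):
        for speaker in range(num_agents):
            if not knows[speaker]:
                continue
            listener = rng.choice(neighbors(speaker))
            if not knows[listener]:
                exposures[listener] += 1
                if exposures[listener] >= threshold:
                    knows[listener] = True
    return sum(knows) / num_agents  # fraction of the population that knows it

# Small community: everyone talks to everyone else.
SMALL_N = 10
small = lambda i: [j for j in range(SMALL_N) if j != i]

# Large community on a sparse ring: each agent has only two contacts.
LARGE_N = 200
large = lambda i: [(i - 1) % LARGE_N, (i + 1) % LARGE_N]

easy_large = simulate(LARGE_N, large, threshold=1, rounds=200)  # word-like
hard_small = simulate(SMALL_N, small, threshold=5, rounds=200)  # grammar-like
hard_large = simulate(LARGE_N, large, threshold=5, rounds=200)
```

In this toy version, the hard (grammar-like) convention saturates the small, densely connected community but stalls in the large, sparse one, while the easy (word-like) convention diffuses through both, mirroring the qualitative pattern the abstract reports.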

Structured sequence processing tasks inform us about statistical learning abilities that are relevant to many areas of cognition, including language. Despite the ubiquity of these abilities across different tasks and cognitive domains, recent research in humans has demonstrated that these cognitive capacities do not represent a single, domain-general system, but are subject to modality-specific and stimulus-specific constraints. Sequence processing studies in nonhuman primates have provided initial insights into the evolution of these abilities. However, few studies have examined similarities and/or differences in sequence learning across sensory modalities. We review how behavioural and neuroimaging experiments assess sequence processing abilities across sensory modalities, and how these tasks could be implemented in nonhuman primates to better understand the evolution of these cognitive systems.

Human language is composed of sequences of reusable elements. The origins of the sequential structure of language is a hotly debated topic in evolutionary linguistics. In this paper, we show that sets of sequences with language-like statistical properties can emerge from a process of cultural evolution under pressure from chunk-based memory constraints. We employ a novel experimental task that is non-linguistic and non-communicative in nature, in which participants are trained on and later asked to recall a set of sequences one-by-one. Recalled sequences from one participant become training data for the next participant. In this way, we simulate cultural evolution in the laboratory. Our results show a cumulative increase in structure, and by comparing this structure to data from existing linguistic corpora, we demonstrate a close parallel between the sets of sequences that emerge in our experiment and those seen in natural language.

If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach.

Memory is fleeting. New material rapidly obliterates previous material. How, then, can the brain deal successfully with the continual deluge of linguistic input? We argue that, to deal with this "Now-or-Never" bottleneck, the brain must compress and recode linguistic input as rapidly as possible. This observation has strong implications for the nature of language processing: (1) the language system must "eagerly" recode and compress linguistic input; (2) as the bottleneck recurs at each new representational level, the language system must build a multilevel linguistic representation; and (3) the language system must deploy all available information predictively to ensure that local linguistic ambiguities are dealt with "Right-First-Time"; once the original input is lost, there is no way for the language system to recover. This is "Chunk-and-Pass" processing. Similarly, language learning must also occur in the here and now, which implies that language acquisition is learning to process, rather than inducing, a grammar. Moreover, this perspective provides a cognitive foundation for grammaticalization and other aspects of language change. Chunk-and-Pass processing also helps explain a variety of core properties of language, including its multilevel representational structure and duality of patterning. This approach promises to create a direct relationship between psycholinguistics and linguistic theory. More generally, we outline a framework within which to integrate often disconnected inquiries into language processing, language acquisition, and language change and evolution.

Networks of interconnected nodes have long played a key role in Cognitive Science, from artificial neural networks to spreading activation models of semantic memory. Recently, however, a new Network Science has been developed, providing insights into the emergence of global, system-scale properties in contexts as diverse as the Internet, metabolic reactions, and collaborations among scientists. Today, the inclusion of network theory into Cognitive Sciences, and the expansion of complex-systems science, promises to significantly change the way in which the organization and dynamics of cognitive and behavioral processes are understood. In this paper, we review recent contributions of network theory at different levels and domains within the Cognitive Sciences.

We propose a simple model for genetic adaptation to a changing environment, describing a fitness landscape characterized by two maxima. One is associated with “specialist” individuals that are adapted to the environment; this maximum moves over time as the environment changes. The other maximum is static, and represents “generalist” individuals not affected by environmental changes. The rest of the landscape is occupied by “maladapted” individuals. Our analysis considers the evolution of these three subpopulations. Our main result is that, in the presence of a sufficiently stable environmental feature, as in the case of an unchanging aspect of a physical habitat, specialists can dominate the population. By contrast, rapidly changing environmental features, such as language or cultural habits, are a moving target for the genes; here, generalists dominate, because the best evolutionary strategy is to adopt neutral alleles not specialized for any specific environment. The model we propose is based on simple assumptions about evolutionary dynamics and describes all possible scenarios in a non-trivial phase diagram. The approach provides a general framework to address such fundamental issues as the Baldwin effect, the biological basis for language, or the ecological consequences of a rapid climate change.
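The two-peak landscape described here can be illustrated with a small evolutionary simulation. The following is our own toy version under assumed parameters (a Lorentzian specialist peak drifting at speed v, a static generalist payoff of 0.6, fitness-proportional reproduction), not the authors' model:

```python
import random

def run(v, generations=400, pop_size=200, seed=2):
    """Toy version of the two-peak model (our illustration). Specialists
    carry a trait x and are fit when x is near the environment e(t),
    which drifts at speed v; generalists get a fixed, middling payoff
    regardless of e. Returns the final share of generalists."""
    rng = random.Random(seed)
    pop = [('G', None) if rng.random() < 0.5 else ('S', rng.gauss(0.0, 1.0))
           for _ in range(pop_size)]
    e = 0.0
    for _ in range(generations):
        e += v  # the specialist peak moves as the environment changes

        def fitness(ind):
            kind, x = ind
            if kind == 'G':
                return 0.6                     # static generalist peak
            return 1.0 / (1.0 + (x - e) ** 2)  # moving specialist peak

        pop = rng.choices(pop, weights=[fitness(i) for i in pop], k=pop_size)
        # mutation: a specialist's trait drifts slightly each generation
        pop = [(k, x + rng.gauss(0.0, 0.1)) if k == 'S' else (k, x)
               for k, x in pop]
    return sum(1 for k, _ in pop if k == 'G') / pop_size

stable_env = run(v=0.0)  # unchanging environmental feature
fast_env = run(v=0.5)    # rapidly changing feature (e.g., language)
```

With a stationary peak (v = 0), specialists tracking it out-compete the generalists; when the peak moves quickly (v = 0.5), specialists cannot keep up and generalists take over, as in the abstract's phase diagram.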

In contrast with animal communication systems, diversity is characteristic of almost every aspect of human language. Languages variously employ tones, clicks, or manual signs to signal differences in meaning; some languages lack the noun-verb distinction (e.g., Straits Salish), whereas others have a proliferation of fine-grained syntactic categories (e.g., Tzeltal); and some languages do without morphology (e.g., Mandarin), while others pack a whole sentence into a single word (e.g., Cayuga). A challenge for evolutionary biology is to reconcile the diversity of languages with the high degree of biological uniformity of their speakers. Here, we model processes of language change and geographical dispersion and find a consistent pressure for flexible learning, irrespective of the language being spoken. This pressure arises because flexible learners can best cope with the observed high rates of linguistic change associated with divergent cultural evolution following human migration. Thus, rather than genetic adaptations for specific aspects of language, such as recursion, the coevolution of genes and fast-changing linguistic structure provides the biological basis for linguistic diversity. Only biological adaptations for flexible learning combined with cultural evolution can explain how each child has the potential to learn any human language.

It is generally assumed that hierarchical phrase structure plays a central role in human language. However, considerations of simplicity and evolutionary continuity suggest that hierarchical structure should not be invoked too hastily. Indeed, recent ...

This article addresses the logical problem of language evolution that arises from a conventional universal grammar (UG) perspective and investigates the biological and cognitive constraints that are considered when explaining the cultural evolution of language. The UG perspective holds that language acquisition should not be viewed as a process of learning at all, but rather as a process of growth, analogous to the growth of the arm or the liver. UG is intended to characterize a set of universal grammatical principles that hold across all languages. Language has the same status as other cultural products, such as styles of dress, art, music, social structure, moral codes, or patterns of religious belief, though it may be particularly central to culture, acting as the primary vehicle through which much other cultural information is transmitted. Biological and cognitive constraints help to determine which types of linguistic structure tend to be learned, processed, and hence transmitted from person to person and from generation to generation. The communicative function of language is likely to shape language structure, both in relation to the thoughts that are transmitted and with respect to the processes of pragmatic interpretation that people use to understand each other's behavior. A further source of constraints derives from the nature of cognitive architecture, including learning, processing, and memory. Language processing involves generating and decoding regularities from highly complex sequential input, indicating a connection between general-purpose cognitive mechanisms for learning and processing sequential material and the structure of natural language.

Although there may be no true language universals, it is nonetheless possible to discern several family resemblance patterns across the languages of the world. Recent work on the cultural evolution of language indicates the source of these patterns is unlikely to ...

Recent research has demonstrated that systematic mappings between phonological word forms and their meanings can facilitate language learning (e.g., in the form of sound symbolism or cues to grammatical categories). Yet, paradoxically from a learning ...

Proceedings of the 8th International Conference on the Evolution of Language, pages 26-33, 2010

Understanding language evolution in terms of cultural transmission across generations of language users raises the possibility that some of the processes that have shaped language evolution can also be observed in historical language change. In this paper, we explore how constraints on production may affect the cultural evolution of language by analyzing the emergence of the Romance languages from Latin. Specifically, we focus on the change from Latin's flexible but OV (Object-Verb) dominant word order with complex case marking to fixed SVO (Subject-Verb-Object) word order with little or no noun inflections in Romance Languages. We suggest that constraints on second language learners' ability to produce sentences may help explain this historical change. We conclude that historical data on linguistic change can provide a useful source of information relevant to investigating the cognitive constraints that affect the cultural evolution of language.

Recent research suggests that language evolution is a process of cultural change, in which linguistic structures are shaped through repeated cycles of learning and use by domain-general mechanisms. This paper draws out the implications of this viewpoint for understanding the problem of language acquisition, which is cast in a new, and much more tractable, form. In essence, the child faces a problem of induction, where the objective is to coordinate with others (C-induction), rather than to model the structure of the natural world (N-induction). We argue that, of the two, C-induction is dramatically easier. More broadly, we argue that understanding the acquisition of any cultural form, whether linguistic or otherwise, during development, requires considering the corresponding question of how that cultural form arose through processes of cultural evolution. This perspective helps resolve the 'logical' problem of language acquisition and has far-reaching implications for evolutionary psychology.

Language acquisition and processing are governed by genetic constraints. A crucial unresolved question is how far these genetic constraints have coevolved with language, perhaps resulting in a highly specialized and species-specific language 'module,' and how much language acquisition and processing redeploy preexisting cognitive machinery. In the present work, we explored the circumstances under which genes encoding language-specific properties could have coevolved with language itself. We present a theoretical model, implemented in computer simulations, of key aspects of the interaction of genes and language. Our results show that genes for language could have coevolved only with highly stable aspects of the linguistic environment; a rapidly changing linguistic environment does not provide a stable target for natural selection. Thus, a biological endowment could not coevolve with properties of language that began as learned cultural conventions, because cultural conventions change much more rapidly than genes. We argue that this rules out the possibility that arbitrary properties of language, including abstract syntactic principles governing phrase structure, case marking, and agreement, have been built into a 'language module' by natural selection. The genetic basis of human language acquisition and processing did not coevolve with language, but primarily predates the emergence of language. As suggested by Darwin, the fit between language and its underlying mechanisms arose because language has evolved to fit the human brain, rather than the reverse.

Proceedings of the 31st Annual Conference of the Cognitive Science Society, 2009

The past couple of decades have seen an explosion of research on language evolution, initially fueled by Pinker and Bloom's (1990) groundbreaking article arguing for the natural selection of biological structures dedicated to language. The new millennium has seen a shift toward explaining language evolution in terms of cultural evolution rather than biological adaptation. Crucially, this research has many important implications for cognitive science, not only in terms of the nature of the biases to consider in language acquisition but also for cognition, more generally. In this symposium, we therefore take stock of current work on the cultural evolution of language, highlighting key implications of this work for cognitive scientists from different perspectives, ranging from philosophical considerations (Chater) and Bayesian analyses (Griffiths) to evolutionary psycholinguistics (Kirby) and molecular genetics (Christiansen).

This article discusses human language in the context of the major evolutionary transitions in the history of life. Because of its unique structure, language enables the transmission of unlimited cultural information in our species. Understanding its evolution is therefore an important topic for both cognitive science and evolutionary theory more widely. This article highlights points of consensus on what is crucial for progress in this area: understanding preadaptations; the necessity for interdisciplinarity; and the importance of modeling, comparative approaches, and genetics. It also discusses current controversies: biological versus cultural evolution, vocal versus manual origins, and the nature of protolanguage.

Most current approaches to linguistic structure suggest that language is recursive, that recursion is a fundamental property of grammar, and that independent performance constraints limit recursive abilities that would otherwise be infinite. This article presents a usage-based perspective on recursive sentence processing, in which recursion is construed as an acquired skill and in which limitations on the processing of recursive constructions stem from interactions between linguistic experience and intrinsic constraints on learning and processing. A connectionist model embodying this alternative theory is outlined, along with simulation results showing that the model is capable of constituent-like generalizations and that it can fit human data regarding the differential processing difficulty associated with center-embeddings in German and cross-dependencies in Dutch. Novel predictions are furthermore derived from the model and corroborated by the results of four behavioral experiments, suggesting that acquired recursive abilities are intrinsically bounded not only when processing complex recursive constructions, such as center-embedding and cross-dependency, but also during processing of the simpler, right- and left-recursive structures.

This chapter begins with a brief discussion of the general perspective in the linguistic community on language universals. It then presents an overview of the subsequent chapters in this book. This is followed by a discussion of the importance of interdisciplinary research and a multidisciplinary approach towards understanding language universals.

A key challenge for theories of language evolution is to explain why language is the way it is and how it came to be that way. It is clear that how we learn and use language is governed by genetic constraints. However, the nature of these innate constraints has been ...

Language has a fundamentally social function. Processes of human interaction along with domain-general cognitive processes shape the structure and knowledge of language. Recent research in the cognitive sciences has demonstrated that patterns of use strongly affect how language is acquired, is used, and changes. These processes are not independent of one another but are facets of the same complex adaptive system (CAS). Language as a CAS involves the following key features: The system consists of multiple agents (the speakers in the speech community) interacting with one another. The system is adaptive; that is, speakers' behavior is based on their past interactions, and current and past interactions together feed forward into future behavior. A speaker's behavior is the consequence of competing factors ranging from perceptual constraints to social motivations. The structures of language emerge from interrelated patterns of experience, social interaction, and cognitive mechanisms. The CAS approach reveals commonalities in many areas of language research, including first and second language acquisition, historical linguistics, psycholinguistics, language evolution, and computational modeling.

Studies of language change have begun to contribute to answering several pressing questions in cognitive sciences, including the origins of human language capacity, the social construction of cognition and the mechanisms underlying culture change in general. Here, we describe recent advances within a new emerging framework for the study of language change, one that models such change as an evolutionary process among competing linguistic variants. We argue that a crucial and unifying element of this framework is the use of probabilistic, data-driven models both to infer change and to compare competing claims about social and cognitive influences on language change.

It is widely assumed that language in some form or other originated by piggybacking on pre-existing learning mechanisms not dedicated to language. Using evolutionary connectionist simulations, we explore the implications of such assumptions by determining the effect of constraints derived from an earlier evolved mechanism for sequential learning on the interaction between biological and linguistic adaptation across generations of language learners. Artificial neural networks were initially allowed to evolve "biologically" to improve their sequential learning abilities, after which language was introduced into the population. We compared the relative contribution of biological and linguistic adaptation by allowing both networks and language to change over time. The simulation results support two main conclusions: First, over generations, a consistent head-ordering emerged due to linguistic adaptation. This is consistent with previous studies suggesting that some apparently arbitrary aspects of linguistic structure may arise from cognitive constraints on sequential learning. Second, when networks were selected to maintain a good level of performance on the sequential learning task, language learnability is significantly improved by linguistic adaptation but not by biological adaptation. Indeed, the pressure toward maintaining a high level of sequential learning performance prevented biological assimilation of linguistic-specific knowledge from occurring.

Natural languages share common features known as linguistic universals but the nature and origin of these features remain controversial. Generative approaches propose that linguistic universals are defined by a set of innately specified linguistic constraints in universal grammar (UG). The UG hypothesis is primarily supported by Poverty of Stimulus (POS) arguments that posit that the structure of language cannot be learned from exposure to the linguistic environment. This chapter reviews recent computational and empirical research in statistical learning that raises serious questions about the basic assumptions of POS arguments. More generally, these results question the validity of UG as the basis for linguistic universals. As an alternative, the chapter proposes that linguistic universals should be viewed as functional features of language, emerging from constraints on statistical learning mechanisms themselves and from general functional and pragmatic properties of communicative interactions.

It is widely assumed that human learning and the structure of human languages are intimately related. This relationship is frequently suggested to derive from a language-specific biological endowment, which encodes universal, but communicatively arbitrary, principles of language structure (a Universal Grammar or UG). How might such a UG have evolved? We argue that UG could not have arisen either by biological adaptation or non-adaptationist genetic processes, resulting in a logical problem of language evolution. Specifically, as the processes of language change are much more rapid than processes of genetic change, language constitutes a "moving target" both over time and across different human populations, and, hence, cannot provide a stable environment to which language genes could have adapted. We conclude that a biologically determined UG is not evolutionarily viable. Instead, the original motivation for UG arises because language has been shaped to fit the human brain, rather than vice versa. Following Darwin, we view language itself as a complex and interdependent "organism," which evolves under selectional pressures from human learning and processing mechanisms. That is, languages themselves are shaped by severe selectional pressure from each generation of language users and learners. This suggests that apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics.

Our target article argued that a genetically specified Universal Grammar (UG), capturing arbitrary properties of languages, is not tenable on evolutionary grounds, and that the close fit between language and language learners arises because language is shaped by the brain, rather than the reverse. Few commentaries defend a genetically specified UG. Some commentators argue that we underestimate the importance of processes of cultural transmission; some propose additional cognitive and brain mechanisms that may constrain language and perhaps differentiate humans from nonhuman primates; and others argue that we overstate or understate the case against co-evolution of language genes. In engaging with these issues, we suggest that a new synthesis concerning the relationship between brains, genes, and language may be emerging.

We conducted a large-scale corpus analysis indicating that pronominal object relative clauses are significantly more frequent than pronominal subject relative clauses when the embedded pronoun is personal. This difference was reversed when impersonal pronouns ...

Proceedings of the 6th International Conference on the Evolution of Language, pages 27-34, 2006

Human languages are characterized by a number of universal patterns of structure and use. Theories differ on whether such linguistic universals are best understood as arbitrary features of an innate language acquisition device or functional features deriving from cognitive and communicative constraints. From the viewpoint of language evolution, it is important to explain how such features may have originated. We use computational simulations to investigate the circumstances under which universal linguistic constraints might get genetically fixed in a population of language learning agents. Specifically, we focus on the Baldwin effect as an evolutionary mechanism by which previously learned linguistic features might become innate through natural selection across many generations of language learners. The results indicate that under assumptions of linguistic change, only functional, but not arbitrary, features of language can become genetically fixed.
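The fixation dynamics that such simulations explore can be illustrated with a Hinton–Nowlan-style toy model (a minimal sketch under illustrative assumptions, not the authors' actual simulation): agents carry innately fixed or learnable "linguistic" alleles, an innate mismatch with the target language is costly, and fewer learnable slots mean faster acquisition and hence higher fitness, which is the Baldwinian pressure toward genetic fixation.

```python
import random

random.seed(0)

L, POP, GENS, MUT = 10, 100, 150, 0.02  # illustrative parameters
# a stable target "language" whose features selection can fix innately
target = [random.randint(0, 1) for _ in range(L)]

def fitness(agent):
    # an innately fixed allele that mismatches the language is costly;
    # '?' alleles can still be learned, but learning takes time, so
    # fewer '?' slots yield higher fitness (the Baldwin-effect pressure)
    if any(g != '?' and g != t for g, t in zip(agent, target)):
        return 0.1
    return 1.0 + sum(1 for g in agent if g != '?')

def innate_correct(pop):
    # mean number of correct innately specified alleles per agent
    return sum(g == t for a in pop for g, t in zip(a, target)) / len(pop)

pop = [[random.choice([0, 1, '?']) for _ in range(L)] for _ in range(POP)]
start = innate_correct(pop)

for _ in range(GENS):
    parents = random.choices(pop, weights=[fitness(a) for a in pop], k=POP)
    pop = [[random.choice([0, 1, '?']) if random.random() < MUT else g
            for g in a] for a in parents]

end = innate_correct(pop)
print(f"innately fixed correct alleles: {start:.2f} -> {end:.2f}")
```

Re-randomizing parts of `target` each generation undermines this fixation, which corresponds to the moving-target condition: under linguistic change, arbitrary features cannot become genetically fixed.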

Proceedings of the 6th International Conference on the Evolution of Language, pages 333-340, 2006

Simultaneous acquisition of multiple languages to a native level of fluency is common in many areas of the world. This ability must be represented in any cognitive mechanisms used for language. Potential explanations of the evolution of language must also account for the bilingual case. Surprisingly, this fact has not been widely considered in the literature on language origins and evolution. We consider an array of potential accounts for this phenomenon, including arguments by selectionists on the basis for language variation. We find scant evidence for specific selection of the multilingual ability prior to language origins. Thus it seems more parsimonious that bilingualism "came for free" along with whatever mechanisms did evolve. Sequential learning mechanisms may be able to accomplish multilingual acquisition without specific adaptations. In support of this perspective, we present a simple recurrent network model that is capable of learning two idealized grammars simultaneously. These results are compared with recent studies of bilingual processing using eyetracking and fMRI showing vast overlap in the areas of the brain used in processing two different languages.

Considerable research in language acquisition has addressed the extent to which basic aspects of linguistic structure might be identified on the basis of probabilistic cues in caregiver speech to children. In this chapter, we examine systems that have the capacity ...

Much ink has been spilled arguing over the idea that ontogeny recapitulates phylogeny. The discussions typically center on whether developmental stages reflect different points in the evolution of some specific trait, mechanism, or morphological structure. For example, the ...

The leading scholars in the rapidly growing field of language evolution give readable accounts of their theories on the origins of language and reflect on the most important current issues and debates. As well as providing a guide to their own published research ...

What is it that makes us human? If we look at the impact that we have had on our environment, it is hard not to think that we are in some way 'special'—a qualitatively different species from any of the ten million others. Perhaps we only feel that way because it is hard ...

Prior to the emergence of writing systems, no direct evidence remains to inform theories about the evolution of language. Only by amassing evidence from many different disciplines can theorizing about the evolution of language be sufficiently constrained to remove it ...

Why is language the way it is? How did language come to be this way? And why is our species alone in having complex language? These are old unsolved questions that have seen a renaissance in the dramatic recent growth in research being published on the origins and evolution of human language. This review provides a broad overview of some of the important current work in this area. We highlight new methodologies (such as computational modeling), emerging points of consensus (such as the importance of pre-adaptation), and the major remaining controversies (such as gestural origins of language). We also discuss why language evolution is such a difficult problem, and suggest probable directions research may take in the near future.

There are an enormous number of communication systems in the natural world (Hauser, 1996). When a male Tungara frog produces `whines' and `chucks' to attract a female, when a mantis shrimp strikes the ground to warn off a competitor for territory, even when a bee is attracted to a particular flower, communication is taking place. Humans as prodigious communicators are not unusual in this respect. What makes human language stand out as unique (or at least very rare indeed, Oliphant, 2002) is the degree to which it is learned.

The frog's response to mating calls is determined by its genes, which have been tuned by natural selection. There is an inevitability to the use of this signal. Barring some kind of disaster in the development of the frog, we can predict its response from birth. If we had some machine for reading and translating its DNA, we could read off its communication system from the frog genome. We cannot say the same of a human infant. The language, or languages, that an adult human will come to speak are not predestined in the same way. The particular sounds that a child will use to form words, the words themselves, the ways in which words will be modified and strung together to form utterances - none of this is written in the human genome.

Whereas frogs store their communication system in their genome, much of the details of human communication are stored in the environment. The information telling us the set of vowels we should use, the inventory of verb stems, the way to form the past tense, how to construct a relative-clause, and all the other facts that make up a human language must be acquired by observing the way in which others around us communicate. Of course this does not mean that human genes have no role to play in determining the structure of human communication. If we could read the genome of a human like we did with the frog, we would find that, rather than storing details of a communication system, our genes provide us with mechanisms to retrieve these details from the behaviour of others.

From a design point of view, it is easy to see the advantages of providing instructions for building mechanisms for language acquisition rather than the language itself. Human language cannot be completely innate because it would not fit in the genome. Worden (1995) has derived a speed-limit on evolution that allows us to estimate the maximum amount of information in the human genome that codes for the cognitive differences between us and chimpanzees. He gives a paltry figure of approximately 5 kilobytes. This is equivalent to the text of just the introduction to this chapter.

The implications of this aspect of human uniqueness are the subject of this chapter. In the next section we will look at the way in which language learning leads naturally to language variation, and what the constraints on this variation tell us about language acquisition. In section three, we introduce a computational model of sequential learning and show that the natural biases of this model mirror many of the human learner's biases, and help to explain the universal properties of all human languages.

If learning biases such as those arising from sequential learning are to explain the structure of language, we need to explore the mechanism that links properties of learning to properties of what is being learned. In section four we look in more detail at this issue, and see how learning biases can lead to language universals by introducing a model of linguistic transmission called the Iterated Learning Model. We go on to show how this model can be used to understand some of the fundamental properties of human language syntax.
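The transmission dynamics behind such models can be sketched in a few lines (an illustrative toy, not the Iterated Learning Model itself; the meanings, forms, and bottleneck size are invented): each learner probability-matches the forms it observes through a narrow transmission bottleneck, and repeated transmission alone tends to drive the language toward deterministic meaning–form mappings.

```python
import random

random.seed(1)

MEANINGS, BOTTLENECK, GENERATIONS = 4, 3, 50  # illustrative sizes

# the language: for each meaning, the probability of using form "A"
# rather than form "B" when expressing it
language = [0.5] * MEANINGS

def speak(lang, meaning):
    return "A" if random.random() < lang[meaning] else "B"

def learn(teacher):
    # probability-matching learner: adopt the observed frequency of "A"
    # from a small (bottlenecked) sample of the teacher's utterances
    new = []
    for m in range(MEANINGS):
        data = [speak(teacher, m) for _ in range(BOTTLENECK)]
        new.append(data.count("A") / BOTTLENECK)
    return new

for _ in range(GENERATIONS):
    language = learn(language)

# sampling noise through the bottleneck drives each meaning toward
# a fixed form (probability 0.0 or 1.0)
print(language)
```

Even with an unbiased learner, the finite sample per generation makes fully regular languages the absorbing states of the transmission chain; adding learner biases, as in the Iterated Learning Model, shapes which structured languages emerge.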

Finally, we look at the implications of our work for linguistic and evolutionary theory. Ultimately, we argue that linguistic structure arises from the interactions between learning, culture and evolution. If we are to understand the origins of human language, we must understand what happens when these three complex adaptive systems are brought together.

After having been plagued for centuries by unfounded speculations, the study of language evolution is now emerging as an area of legitimate scientific inquiry. Early conjectures about the origin and evolution of language suffered from a severe lack of empirical evidence to ...

Introduction

The acquisition and processing of language is governed by a number of universal constraints, many of which undoubtedly derive from innate properties of the human brain. These constraints lead to certain universal tendencies in how languages are structured and used. More generally, the constraints help explain why the languages of the world take up only a small part of the considerably larger space defined by the logically possible linguistic subpatterns. Although there is broad consensus about the existence of innate constraints on the way language is acquired and processed, there is much disagreement over whether these constraints are linguistic or cognitive in nature. Determining the nature of these constraints is important not only for theories of language acquisition and processing, but also for theories of language evolution. Indeed, these issues are theoretically intertwined because the constraints on language define the endpoints for evolutionary explanations: theories about how the constraints evolved in the hominid lineage are thus strongly determined by what the nature of these constraints is taken to be.

The Chomskyan approach to language suggests that the constraints on the acquisition and processing of language are linguistic, rather than cognitive, in nature. The constraints are represented in the form of a Universal Grammar (UG)--a large biological endowment of linguistic knowledge (e.g. Chomsky 1986). It is assumed that this knowledge-base is highly abstract, comprising a complex set of linguistic rules and principles that could not be acquired from exposure to language during development. Opinions differ about how UG emerged as the endpoint of language evolution. Some researchers have suggested that it evolved through a gradual process of natural selection (e.g., Newmeyer 1991; Pinker 1994; Pinker and Bloom 1990), whereas others have argued for a sudden emergence through non-adaptationist evolutionary processes (e.g., Bickerton 1995; Piattelli-Palmarini 1989). An important point of agreement is the emphasis in their explanations of language evolution on the need for very substantial biological changes to accommodate linguistic structure.

More recently an alternative perspective is gaining ground, advocating a refocus in thinking about language evolution. Rather than concentrating on biological changes to accommodate language, this approach stresses the adaptation of linguistic structures to the biological substrate of the human brain (e.g., Batali 1998; Christiansen 1994; Christiansen and Devlin 1997; Deacon 1997; Kirby 1998, 2000, 2001). Languages are viewed as dynamic systems of communication, subject to selection pressures arising from limitations on human learning and processing. Some approaches within this framework have built in a certain amount of linguistic machinery, such as context-free grammars (Kirby 2000). In this chapter we argue that many of the constraints on linguistic adaptation derive from non-linguistic limitations on the learning and processing of hierarchically organized sequential structure. These mechanisms existed prior to the appearance of language, but presumably also underwent changes after the emergence of language. However, the selection pressures are likely to have come not only from language but also from other kinds of complex hierarchical processing, such as the need for increasingly complex manual combinations following tool sophistication. Consequently, many language universals may reflect nonlinguistic, cognitive constraints on learning and processing of sequential structure rather than an innate UG.

Sequential learning plays a role in a variety of common tasks, such as human language processing, animal communication, and the learning of action sequences. In this article, we investigate sequential learning in non-human primates from a comparative perspective, focusing on three areas: the learning of arbitrary, fixed sequences; statistical learning; and the learning of hierarchical structure. Although primates exhibit many similarities to humans in their performance on sequence learning tasks, there are also important differences. Crucially, non-human primates appear to be limited in their ability to learn and represent the hierarchical structure of sequences. We consider the evolutionary implications of these differences and suggest that limitations in sequential learning may help explain why non-human primates lack human-like language.

Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language structures. The model is trained on simple artificial languages. We find that the qualitative performance profile of the model matches human behavior, both on the relative difficulty of center-embedding and cross-dependency, and between the processing of these complex recursive structures and right-branching recursive constructions. We analyze how these differences in performance are reflected in the internal representations of the model by performing discriminant analyses on these representations both before and after training. Furthermore, we show how a network trained to process recursive structures can also generate such structures in a probabilistic fashion. This work suggests a novel explanation of people's limited recursive performance, without assuming the existence of a mentally represented competence grammar allowing unbounded recursion.
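The next-symbol prediction setup used in such work can be sketched with a minimal Elman-style simple recurrent network (an illustration only, with one-step truncated gradients and an invented three-symbol toy corpus; it is not the model or the artificial languages from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# invented toy corpus: 0='a', 1='b', 2='#' (end-of-sequence marker)
corpus = [[0, 1, 2], [0, 0, 1, 1, 2], [0, 1, 0, 1, 2]]
V, H, lr = 3, 8, 0.05            # vocabulary size, hidden units, step size

Wx = rng.normal(0, 0.1, (H, V))  # input -> hidden
Wh = rng.normal(0, 0.1, (H, H))  # context (previous hidden) -> hidden
Wy = rng.normal(0, 0.1, (V, H))  # hidden -> next-symbol prediction

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def mean_loss():
    # cross-entropy of next-symbol prediction over the whole corpus
    total, count = 0.0, 0
    for seq in corpus:
        h = np.zeros(H)
        for t in range(len(seq) - 1):
            h = np.tanh(Wx @ one_hot(seq[t]) + Wh @ h)
            z = Wy @ h
            p = np.exp(z - z.max())
            p /= p.sum()
            total += -np.log(p[seq[t + 1]])
            count += 1
    return total / count

before = mean_loss()
for _ in range(300):
    for seq in corpus:
        h = np.zeros(H)
        for t in range(len(seq) - 1):
            x, h_prev = one_hot(seq[t]), h
            h = np.tanh(Wx @ x + Wh @ h_prev)
            z = Wy @ h
            p = np.exp(z - z.max())
            p /= p.sum()
            dz = p - one_hot(seq[t + 1])     # softmax + cross-entropy grad
            dpre = (Wy.T @ dz) * (1 - h * h)  # through tanh
            Wy -= lr * np.outer(dz, h)        # one-step (truncated) updates
            Wx -= lr * np.outer(dpre, x)
            Wh -= lr * np.outer(dpre, h_prev)
after = mean_loss()
print(f"mean loss: {before:.3f} -> {after:.3f}")
```

The copied-back context layer gives the network a graded memory of the prefix; its prediction error on held-out strings is the kind of measure compared against human reading-time profiles for embedded versus right-branching structures.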

Across the languages of the world there is a high degree of consistency with respect to the ordering of heads of phrases. Within the generative approach to language these correlational universals have been taken to support the idea of innate linguistic ...

The performance of any learning system may be assessed by its ability to generalize from past experience to novel stimuli. Hadley (this issue) points out that in much connectionist research, this ability has not been viewed in a sophisticated way. Typically, the 'test-set' ...

This thesis presents a connectionist theory of how infinite languages may fit within finite minds. Arguments are presented against the distinction between linguistic competence and observable language performance. It is suggested that certain kinds of finite state automata--i.e., recurrent neural networks--are likely to have sufficient computational power, and the necessary generalization capability, to serve as models for the processing and acquisition of linguistic structure. These arguments are further corroborated by a number of computer simulations, demonstrating that recurrent connectionist models are able to learn complex recursive regularities and have powerful generalization abilities. Importantly, the performance evinced by the networks is comparable with observed human behavior on similar aspects of language. Moreover, an evolutionary account is provided, advocating a learning and processing based explanation of the origin and subsequent phylogenetic development of language. This view construes language as a nonobligate symbiont, arguing that language has evolved to fit human learning and processing mechanisms, rather than vice versa. As such, this perspective promises to explain linguistic universals in functional terms, and motivates an account of language acquisition which incorporates innate, but not language-specific, constraints on the learning process. The purported poverty of the stimulus is re-appraised in this light, and it is concluded that linguistic structure may be learnable by bottom-up statistical learning models such as connectionist neural networks.