
PRINTED FROM the OXFORD RESEARCH ENCYCLOPEDIA, LINGUISTICS (linguistics.oxfordre.com). (c) Oxford University Press USA, 2018. All Rights Reserved. Personal use only; commercial use is strictly prohibited (for details see Privacy Policy and Legal Notice).

date: 19 November 2018

Artificial Languages

Summary and Keywords

Artificial languages—languages which have been consciously designed—have been created for more than 900 years, although the number of them has increased considerably in recent decades, and by the early 21st century the total figure probably was in the thousands. There have been several goals behind their creation; the traditional one (which applies to some of the best-known artificial languages, including Esperanto) is to make international communication easier. Some other well-known artificial languages, such as Klingon, have been designed in connection with works of fiction. Still others are simply personal projects.

A traditional way of classifying artificial languages involves the extent to which they make use of material from natural languages. Those artificial languages which are created mainly by taking material from one or more natural languages are called a posteriori languages (which again include well-known languages such as Esperanto), while those which do not use natural languages as sources are a priori languages (although many a posteriori languages have a limited amount of a priori material, and some a priori languages have a small number of a posteriori components). Between these two extremes are the mixed languages, which have large amounts of both a priori and a posteriori material. Artificial languages can also be classified typologically (as natural languages are) and by how and how much they have been used.

Many linguists seem to be biased against research on artificial languages, although some major linguists of the past have been interested in them.

The term artificial languages can refer to several types of objects. The subject of this article will not be computer languages such as BASIC and C++ or systems of logic such as predicate logic, both of which are sometimes called artificial languages, but rather languages which have been constructed to be similar in function (broadly speaking) to natural languages.

Some proponents of artificial languages apparently feel that the term artificial languages has negative connotations and thus prefer other terms, such as planned languages or constructed languages. Another terminological issue is that many Esperantists refer to Esperanto as “the international language”; this is misleading, since many artificial languages were designed to be international languages, and some of them have functioned as such (to a limited extent), not to mention the fact that some natural languages have functioned or currently function as international languages, such as Latin and English.

It might be claimed that it is also misleading to refer to these languages as artificial, since many of them consist of material gathered from several natural languages, or even a single natural language. However, if one holds this view, to be consistent one would have to agree with the assertion that there is nothing artificial in the physical world, as even highly artificial things, such as computers and spacecraft, ultimately consist of materials which occur naturally, such as iron. What is artificial about those artificial languages which draw on natural languages is the choice and arrangement of the items from natural languages which they use (and often also the modifications to these items).

On the other hand, one could argue that many or all supposedly natural languages are in part artificial, as they also have been subject to conscious modification, or attempts at it, for example, prescriptivist efforts to try to prevent the “degeneration” of a language.

Artificial languages are informally called conlangs (constructed languages), and the study of artificial languages and related matters is interlinguistics.

2. The History of Artificial Languages

Although most artificial languages were created in the last 150 years (and many in the last 25 years), the idea of designing a language goes back at least several centuries. Some early attempts at language creation include the Lingua Ignota by Hildegard of Bingen (1098–1179) (see Higley, 2007) and Balaibalan (possibly designed in the 16th century).

However, it was only in the 17th century that artificial language creation picked up momentum. Some major thinkers of the era, such as Descartes, Newton, and Leibniz, were concerned with it. (See Slaughter, 1982 for 17th-century artificial languages and Knowlson, 1975 for those of the 17th and 18th centuries.) The 17th century was the era of philosophical languages (see section 3.1.1); these languages turned out to be difficult to use, and none of them met with widespread success.

Artificial language designing continued in the 18th century, but few languages of significance were created: Pei (1968, p. 93) states, “Once we are past the seventeenth century . . . popular interest seems to flag. Throughout the seventeen hundreds, only four plans are found that need be mentioned, and only two of the four have features that make them worth mentioning.”

The 19th century saw the appearance of artificial languages which attracted relatively large numbers of proponents and speakers: Solresol, Volapük, and Esperanto, created by François Sudre, Johann Martin Schleyer, and Ludwik Lejzer Zamenhof, respectively, as well as some less popular ones. (See Porset, 1979 for artificial languages of the 19th century.) However, there was disagreement within both the Volapük and Esperanto communities about details of their languages, which led to splits within these movements. The Volapük movement did not survive; Esperanto, a language with less inflectional morphology than Volapük, has survived to the present, although some former Esperantists created a rival language, Ido (a modified version of Esperanto), which is also still in use today. However, there are far fewer Idists than Esperantists. The relative success of Esperanto is perhaps shown not only by the large number of periodicals that exist and have existed in the language, but also by the considerable amount of literature composed in it (see Sutton, 2008). However, Esperanto has not achieved the goal that Zamenhof had hoped for it, namely to be the world’s main language for international communication (English currently fills that role).

More artificial languages were created in the 19th century than in previous centuries, but even more artificial languages appeared in the 20th century, created for a variety of purposes, and a large number have already been designed in the 21st century. The internet has proven to be a popular means of disseminating information about artificial languages, and it may well have encouraged the creation of more artificial languages in recent decades.

Most artificial languages, and in particular most international auxiliary languages, have been created by Europeans or Americans, but there have been exceptions, for example Frater (1957), Zilengo (circa 1890), and Panamane (1936), created by authors from Vietnam, Japan, and Panama respectively.

3. The Classification of Artificial Languages

Artificial languages can be classified in various ways, including in terms of their envisaged function(s) or the extent to which they draw on natural languages, although these two classifications are not entirely independent of each other. We can also classify them typologically and in terms of the degree of use which they have seen.

3.1 Classification by Function

Until fairly recently, the most common envisaged function for artificial languages was as a means of international communication, that is, as an international auxiliary language (or, informally, auxlang); this was the motivation behind the creation of Esperanto and of Volapük before it. It should perhaps be emphasized that the vast majority of such languages were meant to be second languages and not to replace native languages.

Volapük and Esperanto were meant to be used by speakers of any natural language, but there have also been artificial auxiliary languages aimed at particular groups of speakers. Most of these are artificial zonal languages, languages intended for use by speakers of a group of related languages or by speakers in a particular geographical area. Many of these languages have been intended for speakers of Slavic languages. The earliest example of a pan-Slavic zonal language may be Juraj Križanić’s work in the mid-17th century. Recent efforts include Slovio and Interslavic.

Tutonish, designed by Elias Molee, is an example of a zonal language for speakers of Germanic languages. Molee (1904, p. 6) justifies the limited intended usership of his language as follows:

The advantages of confining ourselves to our race only, and let other races form union tongues if they will, is that we should obtain so many previously well-known words and idioms that the common language could gradually be introduced through the schools as the only final national tongue for home use. If we should undertake to introduce an impartial world’s union language, there would be so many new and strange words and idioms in it, that it would not become easy enough to any people to learn it. No race would take enough pride in it, to introduce it as a supplementary study in all the nation’s schools. [. . .] The several races are so different in taste, feeling and religion, that they would not enjoy the same language and literature.

However, Couturat and Leau (1907, p. 62) are critical of Tutonish (and apparently of zonal languages in general):

It is inspired by motives absolutely opposed to the humanitarian and civilizing goal of the international language and to the neutrality which is required of it. Moreover, even from a practical point of view, a single international language would be better than two or three; if the auxiliary language were not unique, it would lose much of its usefulness and reason for existing.

Zonal languages have been created in more recent times as well. Romanova, made public in 1999, was intended for speakers of Romance languages, drawing upon French, Italian, Spanish, and Portuguese (although, as with many recent internet-based auxiliary languages, it is difficult to know whether it was an entirely serious project). Budinos (from the first decade of the 21st century) was created for speakers of Finno-Ugric languages, and Jalpi Türk Tili (which also appears to be a product of the 21st century) was designed for speakers of Turkic languages.

The artificial language American is aimed at a group of people which is geographically defined; O’Conner (1917, p. 3) states, “The following pages contain an attempt much more modest than a world language. American . . . is designed for use among only those persons living in North and South America, Hawaii and the Philippine Islands.”

As one might guess from its name, North American was designed for North Americans. One might hesitate to call it an artificial language, as it basically consists of code-switching among English, French, and Spanish (Grevor, 1966, p. 12, translation of non-English parts p. 13):

(1)

(The capital letters at the end of “commencemenT” and “dE” indicate that these words are French words, and thus to be pronounced as such; Spanish words are marked by a space between their last letter and the preceding letter, and English words are marked by a space between their first letter and the following letter, as in “convalescent.” If a word has only two letters and there is a space between them, as in “e n,” that word is Spanish.)

Two artificial zonal languages were designed for use by Africans, Afrihili (circa 1970) and Guosa. The latter is an auxiliary language, but not an international one, as it was meant for use by speakers of different languages in Nigeria. Igbinẹ́wẹ́ká (1987, p. 5) says the following about Guosa: “As of now, there are at least 21 different Nigerian languages and dialects incorporated into the language.” However, Hausa, Igbo, and Yoruba may be the main sources.

Intended userships of auxiliary languages could be restricted in other ways than by native language family or geographical location. Silarg is meant for lesbians, gays, bisexuals, and transsexuals.

A small number of artificial language projects were not meant to actually function as auxiliary languages, but rather were created to illustrate a point relating to them or as a sort of practice in creating an auxiliary language. For example, Elam (1932, p. 5) says the following in the introduction to his book on the language Oz: “This book is not offered as a solution of the Auxiliary International Language problem, but is intended merely to present the case for the a priori method of language construction.”

Many, if not all, philosophical languages were intended to be auxiliary languages, and thus there is overlap between these two categories. However, philosophical languages also had the goal of representing the world in a clearer and more precise way than natural languages. For this reason, most or all philosophical languages were of the a priori type (see section 3.2). As mentioned in section 2, the heyday of these languages was the 17th century, and the languages set out in Dalgarno (1661) and Wilkins (1668) were of this kind. Philosophical languages generally involved a complex classification of things, organisms, and other categories in the universe, and words with similar meanings had similar forms.

Also overlapping to some extent with auxiliary languages are logical languages, artificial languages whose grammar is based on logic. The best-known such languages are Loglan and its offshoot Lojban. Loglan was created “to supply an instrument for experimental investigation of the Leibniz-Whorf hypothesis [i.e., the Sapir-Whorf hypothesis]” (Brown, 1960, p. 55). Both it and Lojban are difficult to use, but the movements supporting them seem at least open to the possibility of the use of their languages as international auxiliary languages.

3.1.2 Fictional Languages

Some artificial languages, fictional languages, have been created in connection with a work of fiction, or a series of fictional works. The best-known of these languages are Klingon (which was developed by the linguist Marc Okrand), Newspeak (from George Orwell’s novel 1984), and the languages created by J. R. R. Tolkien for his Middle-earth books. An early fictional language is Utopian, from Thomas More’s 1516 book Utopia. Other fairly well-known recent languages of this type are Dothraki (from the A Song of Ice and Fire series of books and the television version, Game of Thrones), Láadan (from the Native Tongue series of books), Na’vi (from the movie Avatar), and Tsolyáni (from the role-playing game Empire of the Petal Throne). Although the languages mentioned so far in this paragraph are well developed, it is not uncommon for an author of a science-fiction or fantasy novel to include only a relatively small number of words and/or phrases of a language spoken by some characters in the book. (For artificial languages of science fiction see Cheyne, 2008.)

There is a sort of combination of an auxiliary language and a fictional language: an auxiliary language connected with a work of fiction or with an imagined country, world, etc. Such languages have been called fauxlangs (cf. auxlang). Related to these are micronational languages, languages supposedly spoken in micronations (diplomatically unrecognized nations created for various reasons, often or usually for fun). Perhaps the most developed micronational language is Talossan, connected with the Kingdom of Talossa (located partly in Milwaukee). It has been provided with more than 35,000 words. One might argue that some micronational languages are not fictional languages, since they are used by real people who are “citizens” of micronations in places which actually exist.

3.1.3 Other Functions

Many fictional languages are supposed to be (in the fictional works involved) spoken by extraterrestrial aliens. There has also been a serious attempt to create a language for communication with aliens (if any should be discovered), Lincos, described in Freudenthal (1960).

Languages can also be created for one’s own fulfillment. Such artificial languages are called personal languages. An example of a personal language is Jim Henry’s gjâ-zym-byn (which dates from 1998); it has a well-developed grammar and an extensive vocabulary.

One should be cautious when using the system of classification given in this section (it is not an exhaustive list of functions, and note that classifications in some other sources will differ in major or minor points). The boundaries between different functions may not always be clear, and an artificial language may be created for more than one function, or it may not have the function stated by its designer. For example, an artificial language may be referred to as an auxiliary language, but it may not be seriously intended for that function, and may have been created (mainly) for fun. (Given the existence of so many artificial auxiliary languages, one might wonder about the seriousness or judgment of those who create yet another one.) In addition, a language might be used for a purpose other than its intended one. For example, Volapük has seen a small revival in recent years, but it is unlikely that its current users have learned it as a serious means of international communication; rather, it is probably learned as a sort of hobby.

3.2 Classification by Sources of Material (if Any)

Many artificial languages are based mainly on one or more natural languages; these languages are known as a posteriori languages. The majority of the most successful artificial auxiliary languages, including Esperanto, are of this type. This can be seen in the Esperanto example below (from Stuttard, 1973, p. 33), in which most of the roots clearly come from European languages:

(2)

On the other hand, some artificial languages have been attempts to build a language “from scratch”; such languages are called a priori languages. In general these languages have not met with much success. Among the better known a priori languages are Solresol and Ro; some other languages of this type are aUI, Sona, and Suma. At least on the surface, a priori languages appear alien and unusual, as shown by the following examples (from Weilgart, 1979, p. 28 and Okamoto, 1962, p. 25):

(3)

(4)

The classification of artificial languages in these terms is a spectrum rather than a strict dichotomy: some a posteriori languages have a priori components, and some a priori languages have a small number of a posteriori items. When an artificial language has a substantial amount of both a priori and a posteriori material, it is referred to as a mixed language. Volapük, which became quite popular (compared to other artificial languages) in the late 19th century, was a language of this type.

We can further classify the a posteriori languages in terms of which natural language(s) they draw upon. Some of them have a single language as their (main) source; usually they are a simplified version of this language. An obvious way of simplifying a natural language is to use only a subset of the lexicon, grammatical possibilities, and semantic possibilities of that language. These languages are called controlled (natural) languages. The best-known of these languages is Basic English, which has a main vocabulary of 850 words. Some controlled versions of English have been created for particular types of situations, for example Seaspeak for maritime operations. One might argue that such languages are not artificial languages, as all of their components occur in a natural language. However, artificial limitations are placed on the language, and these limitations can have considerable effects on the feel or style of that language.

Various modified versions of Latin have been designed (see Libert, 2004), the best-known of which is Latino sine Flexione, created by the mathematician Giuseppe Peano (1858–1932). Some of these involve only slight changes from Latin, while others are substantially different.

At the other extreme from artificial languages based on a single language are those drawing on languages from different families, sometimes a large number of them. Ardano is supposed to have drawn on all the world’s languages (although the language does not seem to have been developed enough to realize this ambition). Unish draws upon 14 major natural languages, including Arabic, Hindi, Chinese, Japanese, and Korean, as well as upon Esperanto.

In between these extremes are artificial languages which use a fairly limited number of source languages, with these sources (largely) being genetically or areally related. Esperanto is such a language: all of its major source languages are Indo-European and spoken in Europe. Such languages have been criticized for their lack of internationality or neutrality; Esperanto is probably considerably easier for a native speaker of a major West European language than for a native speaker of a non-Indo-European language who does not know a West European language. The same is true of most of the other main artificial auxiliary languages, such as Interlingua and Ido. The term Euroclone was coined to describe (apparently with a negative connotation) artificial auxiliary languages which use Western European Indo-European languages as sources. Such languages can be read without much difficulty if one knows one or more major Western European languages, as shown by the following example from Eurolengo (Jones, 1972, p. 12, translation p. 13):

(5)

One could defend Eurolengo against any criticism for lack of internationality, because it was apparently a zonal language (see section 3.1), as indicated by the subtitle of Jones (1972): “The Language for Europe.” That is, it was only meant to be an inter-European language, not a language for the entire world, so a lack of internationality is not relevant. This defense will not work for Esperanto, since it was not intended to be only a zonal language.

Sambahsa-mundialect (presented publicly in 2007) might be said to take a middle position between languages based on (West) European languages and languages drawing on a wide range of languages: its main sources are European languages, but it also includes words from other languages such as Arabic and Chinese. Pandunia (2012) takes yet another approach: it has two main sources, the unrelated languages English and Chinese, but in addition draws upon various other languages, both European and non-European, such as French, Spanish, Arabic, Indonesian, and Swahili.

There are artificial languages which are based mainly on a single other artificial language; such languages generally arise because of dissatisfaction with that artificial language, and so are modified versions of it. The best-known example is Ido (1907), a modification of Esperanto, and there have been many other modified versions of Esperanto, such as Ekselsioro (1906), Esperanto sen Fleksio (1996), and Modern Esperanto (possibly from 1958) (see Libert, 2008 for such languages). There have also been revisions of Ido, such as Ido Avancit (circa 1924).

Several revised versions of Volapük were designed, for example Balta (1887) and Veltparl (1896), none of which came close to the success of the original Volapük. A possible exception is Idiom Neutral, which was a radical revision of Volapük. Idiom Neutral itself was modified by its designer, Waldemar Rosenberger (1848–1918); he called the new version of the language Reform-Neutral (1912). Other modifications of Idiom Neutral were Idiom Neutral modifiked (or modifiket; sources differ on the name) (1909) and Idiom Neutral reformed (1907).

In addition, some artificial language designers repeatedly revised their own language, sometimes changing the name, sometimes not. Edward P. Foster wrote several books on his a priori language Ro with at least some minor changes in some of the books. For example, according to Foster (1910, p. 15) the Ro word for ‘cloud’ is bodef, but in Foster (1913, p. 20) it is bifka, and it changes to bifab, bif, and bifa in Foster (1919, p. 10), Foster (1921, p. 6), and Foster (1931, p. 3) respectively. In such cases we must speak of different versions of the language, qualifying them by date, for example (1913) Ro.

3.3 Typological Classification

One can apply the same sorts of typological classification to artificial languages as are applied to natural languages. For example, artificial languages can be classified in terms of their basic word order (if there is one) and whether they have fixed or free word order. Many artificial languages, including some a priori languages such as aUI and Babm, and some mixed languages such as Volapük, have SVO as their basic word order. This should not be too surprising, since SVO is a common order among West European languages; because so many artificial languages were created by native speakers of these languages, this word order might have appeared the natural or only choice, even for an a priori language. (However, Babm was designed by a native speaker of Japanese, in which the basic order is SOV.)

Artificial languages can also be classified in terms of the medium through which they were intended to be used. Most artificial languages were meant to be both spoken and written, and some artificial languages have been provided with their own writing systems. Artificial languages which were only meant to be written have been called pasigraphies; as Guérard (1921, p. 80) says, a “symbol [of a pasigraphic system] is understood by all, but pronounced in many different ways” (depending on the language of a reader). Many or most of the earliest attempts at artificial languages were of this type, like the system presented in Wilkins (1668). A recent pasigraphy is Blissymbolics (which dates from 1942). Some of these use ideographs or pictographs, and some use numerals. The latter are referred to as numerical languages, one of which is Timerio (1921). The Timerio sentence meaning ‘I love you’ is 1-80-17 (a sentence cited in several books on artificial languages).

The 19th-century language Solresol was musically based: each “syllable” is a musical note (e.g., do-re means ‘I, me’), although other media, such as writing, could also be used with it. A much more recent musically based language is Eaiea.

Gestural artificial languages are also possible. In the mid-20th century, Stephen Streeter designed a gestural system for international use which attracted some attention (see, e.g., Burchardt, 1965). However, it seems to have been rather limited, since it consisted of only 72 signs (several borrowed from the Plains Indian Sign Language), although they could be joined together.

Related to artificial languages are reformed spelling systems and shorthand systems, as they both involve modification of (written) language. When language designers design an artificial language based on a particular natural language, the modifications which they make sometimes involve spelling, as with Voldu (1945), a modified form of English. One of the early artificial languages, that proposed in Dalgarno (1661), “started out as an endeavor to improve a shorthand system” (Maat, 2004, p. 34). Dutton Speedwords (1943) is both a shorthand system and an auxiliary language.

3.4 Classification by Extent and Ways of Use

Blanke (1989, p. 68) asserts that “it is necessary to add a classification according to the real role of communication which definite planned language systems play or played,” as “insignificant projects and a functioning language such as Esperanto are treated at the same level.” (This was probably meant to apply only to artificial auxiliary languages, though it could be applied to some other types of artificial languages.) Those who are interested primarily in the structural details of artificial languages may not see this as a problem (except in cases when a language designer only gives a limited amount of information about his language, which is often the case), but Blanke (1989, p. 68) feels that “a language should not be reduced to structural elements, but is realized only within society as an instrument of communication, thought, and information exchange.” Nevertheless, Blanke’s classification presents an interesting and possibly useful way of viewing artificial languages. On pp. 69–70, he gives 19 stages of development with respect to extent of use:

The life of more than 900 projects ended immediately after the (1) publication of its structure. In other cases, there often followed (2) a production of texts, sometimes appearing in a small journal, accompanied by discussion of linguistic details and information to be used as propaganda. [. . .] Often the authors of the projects succeeded in finding a few interested persons from different countries who learned the system and used it, mainly for (3) international correspondence. [. . .] Further steps were characterized by (4) a certain organization of the adepts and somewhat systematic publicity. [. . .] Further steps worth mentioning as steps toward becoming a language are (5) the creation of literature, (6) the appearance of certain (small) journals, and (7) a certain application to specialized texts. [. . .] Ido, Occidental-Interlingue and Interlingua-Gode were also (8) taught to a certain extent and were (9) applied internationally in speech. [. . .] Only Esperanto went further: (10) further specialized practical usage (specialized journals and organizations), (11) a developed network of national and international organizations, (12) a wide range of literature, (13) relatively wide instruction (sometimes state-supported), (14) large periodically occurring international events, (15) regular radio programs, (16) clear social and political distinctions in the already formed language community and its linguistic reflection, (17) an independent youth movement, (18) a certain evolution of independent cultural elements linked to the language community, (19) bilingualism (involving an ethnic and a planned language) of children in (most often international) families.

Blanke (1989, p. 70) admits that this is not a full list of stages, and one can imagine a step prior to his (1), in which a description of a language has been written down but not published, and there are indeed some artificial languages which did not get past this stage (and thus some artificial languages have been permanently lost). One could also imagine several stages in publication: in the case of many artificial languages, published material only describes some parts of the language (e.g., only a fairly small number of vocabulary items are given).

4. Artificial Languages and Linguistics

Linguists have generally not done research on artificial languages or been interested in designing them; Martinet (1946, p. 37) speaks of the “prejudice” which linguists have against their fellow linguists who are involved in language creation. More recently Versteegh (1993, pp. 539–540) has stated, “It is not an exaggeration to say that most linguists feel that Esperanto, Ido, Volapük, etc., not being natural languages, do not belong to the domain of linguistics.” However, several prominent linguists have written about artificial languages, including de la Grasserie (1892) and Harris (1962). Otto Jespersen designed the artificial language Novial.

More recently Comrie (1996, p. 54) has written that “The study of international languages [including Esperanto] . . . is an interesting theoretical study.” On the other hand, Chomsky has asserted that “Esperanto is not a language” (<https://www.youtube.com/watch?v=XR9oboNAxkI>).

Several areas of the grammar of artificial languages have been examined by Moskovsky and Libert (2009); Libert and Moskovsky (2011) have looked at some phonetic and grammatical features of artificial languages as well as at parts of their vocabulary. Libert (2013) has looked at various aspects of pragmatics in artificial languages; see also Traunmüller (1991). There are now some native speakers of Esperanto, so first-language acquisition of it, and its properties when it is a first language, can be studied, as in Versteegh (1993) and Bergen (2001).

Links to Digital Materials

As with other subjects, internet sources on artificial languages should be used with caution. In the case of artificial languages, not only may there be inaccuracies, but also much available material could be biased in favor of one or another artificial language (as also happens with hard-copy publications from different artificial-language movements). The websites listed here are recommended as sources for learning (about) particular artificial languages, although they may be partisan. In addition, one can get an idea of how active artificial language proponents are on the internet, and how many artificial languages have a substantial internet presence. Most or all of the major currently used artificial languages, as well as several major older languages, are represented in the following list.