... Based on data from 561 languages, the survey reveals that unnatural classes are widespread: among 6077 unique classes of sounds which are targets or triggers of phonological processes, analyzed in three popular feature theories ..., no single theory is able to characterize more than 71% of the classes, and over 24% are not characterizable in any of the theories. While other theories are able to account for specific subsets of these classes, none is able to predict the wide range of classes which actually occur and recur. ...

The argument the author makes agrees with my view of linguistics in general.

Undoubtedly there are processing constraints on languages - no one has found a stack-based human language, for example, and almost certainly there isn't one. In the case of phonetics and phonology, the patterns are also driven by the ability of humans to perceive and produce different sounds. And the evidence is that there is almost certainly some kind of basic built-in language faculty, since there are types of brain damage that seem to affect only language use.

But I think that a lot of what gets codified as supposedly universal grammar is simply the product of the interaction between broader biological constraints, the communicative needs of speakers, and the environment, and that these are more diachronic tendencies than inviolable synchronic rules. What's more, a lot of the assumptions seem to be driven by some sense of theoretical niceness that has no apparent evidential basis whatsoever. For example, as I said in this thread, there seems to be no reason to assume that humans don't store redundant lexical information rather than applying complex rules in many cases:

And some of the current theoretical preferences seem to me to be truly bizarre. Take, for example, the assumption that any feature framework should be binary in nature. I think it's widely recognised that, at least at the phonetic (production) level, there are gradients rather than discrete points. But even if we assume that there is some universal underlying set of discrete points, why this obsession with binary ones?

Take ±high and ±low. These are used in some feature systems to differentiate, say, the front vowels i e a. But this is nonsense, because it is not physically possible to produce a consonant or vowel that is [+high, +low] under most definitions of those features. The only reason for these two features to be separate is this insistence on binarity, which is not supported by any evidence whatsoever. A ternary feature would cover exactly the same range of sounds but directly recognise the impossibility of [+high, +low]. The same pattern is repeated for other sets of incompatible binary features, such as [+constricted glottis] / [+spread glottis], which IMO should also be a single ternary feature, if we must have such a feature theory. How can the people creating these theories consistently fake ternary distinctions with pairs of mutually exclusive binary features without being tempted to take the logical step of combining them?
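To make the combinatorics concrete, here is a minimal sketch (my own illustration, not drawn from any published feature theory): two binary features generate four combinations, one of which is articulatorily impossible, while a single ternary height feature encodes exactly the three possible values, so the impossible combination cannot even be stated.

```python
from enum import Enum
from itertools import product

# Two binary features allow four combinations, one of which
# (+high together with +low) is articulatorily impossible:
binary_combos = list(product(["+high", "-high"], ["+low", "-low"]))
assert len(binary_combos) == 4
assert ("+high", "+low") in binary_combos  # the impossible combination

# A single ternary height feature has exactly three values, so
# the grammar of the feature system itself rules out the
# impossible case instead of needing an extra co-occurrence ban:
class Height(Enum):
    HIGH = "high"  # e.g. i
    MID = "mid"    # e.g. e
    LOW = "low"    # e.g. a

assert len(Height) == 3
```

The point is purely formal: the ternary encoding has no "wasted" cell to legislate away.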

Exactly the same argument can be levelled against strict binary branching in syntactic theories, and it has been. I particularly enjoyed reading the following book, which argues for grammar based on constructions (essentially templates) which need not be binary:

In the process, the book strongly attacks the current notions about Universal Grammar.

EDIT: I should say that my sound change applier does not assume binary features, mainly because I don't like the binarity assumption. It has some special syntactic sugar for them, but you can have features with as many potential values as you like (well, up to an architecture/compiler dependent limit of at least 536,870,910, assuming you don't run out of memory before you've defined that many value names).

EDIT EDIT: This book may also be worth reading for arguing against the penchant for binary branching and nulls all over the place, although I found it to be somewhat repetitive and monotonous:


Maybe I should do, although as an Englishman I don't use Amazon.com much... I don't think the Amazon sites share reviews, which is a pity, so any review on Amazon.co.uk won't make it onto Amazon.com or vice versa.

I really did enjoy the book. It wasn't purely dedicated to arguing against generative grammar and other theories, since it was an exposition of Croft's own view of what a theory of syntax should look like. Of course some of the book does justify why he thinks his approach is better than other approaches, but it's not purely an attack on other theories. But it was refreshing because Croft starts by taking the typological facts available seriously and moving from there, whereas a lot of other theories of syntax (and the people who develop them) seem to start with a few languages and then try to shoe-horn everything else in somehow.

http://www.antoniodenebrija.org/index.html

Site on Antonio de Nebrija. You can find an online version of his GRAMMATICA (late 15th century), albeit greatly adapted to modern orthography. (Things like double <ss> are conserved, but you can see some clear editing here.)

Here are the resources I had posted on the KneeQuickie (except the standard Japanese portion, which wasn't me), with a few edits and a ton of additions to several entries. The list features only articles and documents that are available online (a select few might require university or library access), and excludes most general wordlists (since we could find a few hundred lists for Okinawan).

This list is kept up to date and focuses primarily on Southern Japanese, Ryukyuan and Hachijo. Some titles have been modified to be more meaningful to English readers. If someone who is a researcher stumbles upon this list, please consider making your work openly accessible.

Quote:

Undoubtedly there are processing constraints on languages - no one has found a stack-based human language, for example, and almost certainly there isn't one.

A stack is a concept in computing. So here it's a reference to computer languages.

More specifically: a stack is a last in - first out structure. Essentially, it's a bit like a pile of papers on your desk - the last thing you add onto the top of the pile is the first thing you take off.

Stacks are used for a wide variety of things in computing. Commonly, they are a way of structuring calls to functions or the passing of information. Consider the following commands:

"+" takes the top 2 things on the stack and then adds the result of adding them onto the top"*" takes the top 2 things on the stack and then adds the result of multiplying them onto the toppush adds something new onto the top of the stack

Now look at the following sequence:

push 2
push 3
*
push 5
push 6
*
+

The result of executing this is (2 * 3) + (5 * 6) = 36. As it is executed, the stack goes through the following states:

[2] → [3, 2] → [6] → [5, 6] → [6, 5, 6] → [30, 6] → [36]

(here, the newest item on the stack is to the left, and older items are to the right)

This is essentially how the instruction sets of high-level virtual machines normally work. But no human language would work like this, probably because of the demands on working memory.
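The evaluation described above can be sketched in a few lines of Python (the operation names `push`, `+` and `*` follow the example; everything else, including representing the program as a list, is my own illustration). The top of the stack is the end of the Python list:

```python
# A minimal stack-machine sketch of the push/+/* example above.
def run(program):
    stack = []
    for op in program:
        if op == "+":
            # Pop the top two items and push their sum.
            stack.append(stack.pop() + stack.pop())
        elif op == "*":
            # Pop the top two items and push their product.
            stack.append(stack.pop() * stack.pop())
        else:
            # ("push", n): put a new value on top of the stack.
            stack.append(op[1])
    return stack

program = [("push", 2), ("push", 3), "*",
           ("push", 5), ("push", 6), "*", "+"]
print(run(program))  # [36], i.e. (2 * 3) + (5 * 6)
```

Real virtual machines (the JVM, for instance) work on the same principle, just with many more instruction types.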

If you're looking for a good word generator, this works fairly well. I don't think it handles Unicode, and it was originally a Dwarf Fortress tool, but it might be useful. Play around with it.

While this does indeed look useful, I'm afraid that, in my infinite ignorance of technology, I can't figure out how to run it (on a Mac). I've tried directing it to the proper file in terminal, but whenever I try to get it to do something, terminal tells me,

Quote:

/Library/Frameworks/Python.framework/Versions/2.6/Resources/Python.app/Contents/MacOS/Python: can't open file 'DFLang': [Errno 2] No such file or directory

