Why there must be universal grammar

The Guardian ran an interview with Daniel Everett yesterday. Everett is a linguist most famous for his claim that universal grammar (the hypothesis, popularized by Chomsky, that some rules of grammar are "hard-wired" into the brain) is false. Specifically, he believes that the Pirahã language lacks recursion.

His claims are quite controversial, but one thing worth mentioning is that universal grammar is (for a reasonable definition of "proof") provably correct. By this I mean:

Theorem: Learning grammar is so hard that the only way humans (or anyone) can do it is if they have innate structures.

This is related to Chomsky's poverty of the stimulus argument.

It can be proven in the following way: suppose we restrict ourselves to just the subset of English sentences consisting only of nouns and verbs. "I like John" and "You see Mary" would be two examples. These both follow the pattern "noun verb noun". A sentence like "jump run you" is ungrammatical, because "verb verb noun" is not an acceptable pattern in English.

Now let's consider how long it would take a learner to learn these patterns. With two parts of speech, there are 2^3 = 8 possible patterns of length three, so a learner who thinks they're all possible will have to test all eight. ("Mommy, is 'jump run you' a sentence?")

Most sentences have many more than three words, of course, so a learner would also need to test the 2^4 = 16 four-word patterns, the 2^5 = 32 five-word patterns, and so on. In general, there are 2^n possible patterns of length n, meaning that the number of tests the learner needs to run is exponential in the number of words.
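
The counting itself is easy to check. Here is a minimal Python sketch (an illustration of the argument, not a model of any real learner) that enumerates every noun/verb pattern of length n:

```python
from itertools import product

# Enumerate every noun/verb pattern of length n; a learner with no
# prior constraints would have to test each one.
def all_patterns(n):
    return [" ".join(p) for p in product(["noun", "verb"], repeat=n)]

for n in range(1, 6):
    print(f"length {n}: {len(all_patterns(n))} patterns to test")
# length 1: 2 patterns to test
# ...
# length 5: 32 patterns to test
```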

The Cobham-Edmonds thesis says that a problem is tractable only if it can be solved in polynomial time; a problem that takes exponential time is, in practice, unsolvable.

Why is this true? There are, depending on your definition of "part of speech", about 20 parts of speech in English. If you tested one grammar per second, it would take you about a month to test all the five-word grammars. The six-word grammars would take you two years, and you would be forty before you learned all the seven-word grammars. That last sentence had 22 words, and it would take you 10^21 years to test all of the 22-word grammars. The universe is only about 10^10 years old.
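
The arithmetic is easy to reproduce; a quick sketch, assuming (as above) 20 parts of speech and one test per second:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# With 20 parts of speech there are 20**n candidate grammars of
# length n; at one test per second:
for n in [5, 6, 7, 22]:
    years = 20 ** n / SECONDS_PER_YEAR
    print(f"{n}-word grammars: {years:.2g} years")
# 5-word grammars: 0.1 years  (about a month)
# 6-word grammars: 2 years
# 7-word grammars: 41 years
# 22-word grammars: 1.3e+21 years
```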

So who knows whether all languages are recursive. But it seems unlikely that human children consider all possible grammars equally. They must use some shortcuts, and those shortcuts must, by definition, be innate.

6 comments:

  1. Curiously, you use the argument that there isn't enough time in a human lifespan to test all grammars. But you further show that there isn't enough time in the universe to do so. If your argument implies that we cannot learn the rules of English from scratch in a lifetime, then, by the same reasoning, doesn't it also imply that the rules of the English language couldn't have evolved in the span of human history?

    I think the problem is that you're assuming the only way to learn a grammar is to test all possible word sequences. In reality, the Kolmogorov complexity of English grammar is _much_ smaller than 2^n where n is the number of English words.

    A discussion of this post has been started at Reddit: http://www.reddit.com/r/math/comments/rdcp0/why_there_must_be_universal_grammar/

    1. Anonymous: thank you for your comment.

      You have hit the nail on the head: there must be some way of learning a grammar besides testing all possible word sequences. This is precisely the claim that universal grammar makes!

Regarding "doesn't it also imply that the rules of the English language couldn't have evolved in the span of human history?": I don't think so. I can produce some random sequence 1011001001... in linear time that would take you an exponential amount of time to guess.
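
      (To make that asymmetry concrete, here is a minimal Python sketch with a hypothetical 20-bit secret: producing the sequence takes one pass, while a brute-force guesser must enumerate up to 2^20 candidates.)

      ```python
      import random
      from itertools import product

      # Producing a random n-bit sequence is linear in n...
      n = 20
      secret = "".join(random.choice("01") for _ in range(n))

      # ...but a brute-force guesser may have to enumerate all 2**n candidates.
      guesses = 0
      for candidate in product("01", repeat=n):
          guesses += 1
          if "".join(candidate) == secret:
              break
      print(f"found the {n}-bit secret after {guesses:,} of {2**n:,} possible guesses")
      ```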

    2. I hasten to add, though, that the point of the article is to convince you that brute force is too slow. If you believe that English cannot be created, much less learned, by brute force, so much the better!

    3. This is all way outside of my area of specialization, so apologies if I'm misunderstanding some concepts, but consider the following example:

      Take the grammar consisting of all binary sequences of length ≤ 100 that have an equal number of 0s and 1s. I don't think it's a stretch to suppose that a neural net or other AI algorithm could learn this grammar reasonably quickly, even though the language itself is quite large.

      Does that mean that neural nets have a universal grammar that includes the grammar I defined? Unless I completely misunderstand what is meant by a universal grammar, I would argue not. The neural net can learn the grammar because it has an extremely low Kolmogorov complexity.
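
      A minimal Python sketch of that point: the membership test for this language fits in one line, even though the language contains roughly 10^29 strings.

      ```python
      # The whole grammar fits in one line: its description (its Kolmogorov
      # complexity) is tiny, even though the language is huge.
      def in_language(s):
          return len(s) <= 100 and s.count("0") == s.count("1")

      print(in_language("0101"))     # True: equal 0s and 1s
      print(in_language("0111"))     # False: unbalanced
      print(in_language("01" * 60))  # False: length 120 > 100
      ```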

    4. Anonymous: You're right, with one tiny modification: the learner needs to *know* that the target language is simple. Otherwise they will spend all their time guessing complex strings.

      In other words, the learner needs "innate knowledge".

  2. Read a typical comment on YouTube and tell me people have learned grammar completely.
