Universal Grammar

Chomsky's theory that humans are born with an innate set of linguistic principles and parameters that constrain the possible forms of human languages.

Universal Grammar (UG), proposed by Noam Chomsky, is the theory that all human languages share a common structural foundation: an innate set of principles that constrain the forms languages can take. Children are born not with knowledge of any specific grammar but with a "language acquisition device" that supplies the structural framework onto which the grammar of any particular language is mapped through exposure. UG remains one of the most debated ideas in the cognitive sciences.

Principles and Parameters

The Principles and Parameters framework proposes that UG consists of universal principles (for example, all languages have nouns and verbs; all languages have hierarchical phrase structure) and parameters, binary options that vary across languages (e.g., the head-direction parameter: heads of phrases precede their complements in English, "read books," but follow them in Japanese). Language acquisition, in this view, involves setting the parameters of UG based on input from the ambient language, which greatly constrains the hypothesis space the child must search.
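
To make the parameter-setting idea concrete, here is a minimal sketch written for illustration, not a formal model from the linguistics literature; the parameter names, the binary values, and the assumption that a single piece of positive evidence fixes a parameter are all simplifications introduced for this example.

```python
# Toy illustration of parameter setting. Universal principles are taken as
# fixed; the learner only has to choose a value for each binary parameter
# from evidence in the input. All names and values here are hypothetical.

PARAMETERS = {
    "head_direction": {"head-initial", "head-final"},  # "read books" vs. Japanese order
    "pro_drop": {"yes", "no"},                         # may subject pronouns be omitted?
}

def set_parameters(observations):
    """Fix each binary parameter from observed evidence.

    `observations` maps a parameter name to the value the input supports,
    e.g. hearing verb-object order ("read books") supports "head-initial".
    Parameters with no relevant evidence are left unset.
    """
    grammar = {}
    for name, options in PARAMETERS.items():
        value = observations.get(name)
        if value in options:
            grammar[name] = value  # set by positive evidence from the input
    return grammar

# English-like input: verbs precede their objects, subjects are obligatory.
print(set_parameters({"head_direction": "head-initial", "pro_drop": "no"}))
# Japanese-like input so far: only head direction has been observed.
print(set_parameters({"head_direction": "head-final"}))
```

The point of the toy is only that, once the parameter inventory is fixed in advance, acquisition reduces to choosing among a small, finite set of candidate grammars rather than searching an unconstrained hypothesis space.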

Arguments For UG

The poverty of the stimulus argument remains the strongest case for UG: children reliably acquire complex grammatical knowledge even though, proponents argue, the input underdetermines the grammar in the absence of innate constraints. Language universals (structural properties shared across all known languages) provide evidence for common underlying principles. The speed of acquisition, with complex grammar largely mastered by age five, and the uniformity of the developmental sequence across languages suggest biological preparation.

Criticisms of UG

Critics argue that usage-based approaches can account for language acquisition through general-purpose learning mechanisms (statistical learning, analogy, categorization) without positing language-specific innate knowledge. Linguistic diversity, the enormous variation in grammatical structures across languages, challenges the claim that a rich UG constrains possible grammars. Finally, computational models have shown that neural networks can learn aspects of grammar from distributional information alone, suggesting that the input may be richer than the poverty of the stimulus argument claims.
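
As a concrete illustration of one such general-purpose mechanism, the sketch below computes transitional probabilities between syllables and posits word boundaries where predictability drops, in the spirit of classic statistical-learning experiments on word segmentation. The syllable stream, the made-up "words" (golabu, tibadu, padoki), and the threshold are toy choices for this example, not a published model.

```python
# Toy statistical-learning sketch: compute transitional probabilities between
# syllables and posit word boundaries where predictability drops. The stream
# is built from three made-up "words" (golabu, tibadu, padoki); the threshold
# is an arbitrary choice for this example.
from collections import Counter

stream = ("go la bu ti ba du pa do ki go la bu pa do ki "
          "ti ba du go la bu ti ba du pa do ki").split()

pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])

def transitional_probability(a, b):
    """P(b | a): how predictable syllable b is, given that a was just heard."""
    return pair_counts[(a, b)] / first_counts[a]

THRESHOLD = 0.75  # within-word transitions in this toy stream have probability 1.0
words, current = [], [stream[0]]
for a, b in zip(stream, stream[1:]):
    if transitional_probability(a, b) < THRESHOLD:
        words.append("".join(current))  # low predictability: treat as a word boundary
        current = []
    current.append(b)
words.append("".join(current))

print(words)  # recovers golabu / tibadu / padoki without any innate grammar
```

In the toy stream, syllable-to-syllable transitions within a word are perfectly predictable while transitions across word boundaries are not, so the dips in transitional probability line up with the boundaries; nothing grammar-specific is built into the learner.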
