Finite set of formal rules that will produce all the grammatical sentences of a language. The idea of a generative grammar was first definitively articulated by Noam Chomsky in Syntactic Structures (1957). The generative grammarian's task is ideally not just to define the interrelation of elements in a particular language, but also to characterize universal grammar—that is, the set of rules and principles intrinsic to all natural languages, which are thought to be an innate endowment of the human intellect. See also grammar, syntax.
Generative grammar originates in the work of Noam Chomsky, beginning in the late 1950s. (Early versions of Chomsky's theory were called transformational grammar.) There are a number of competing versions of generative grammar currently practiced within linguistics. Chomsky's current theory is known as the Minimalist Program. Other prominent theories include or have included head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, and tree-adjoining grammar.
Noam Chomsky has argued that many of the properties of a generative grammar arise from an "innate" universal grammar, which is common to all languages. Proponents of generative grammar have argued that most grammar is not the result of communicative function and is not simply learned from the environment. In this respect, generative grammar takes a point of view different from functional and behaviourist theories.
Most versions of generative grammar characterize sentences as either grammatically correct (also known as well formed) or not. The rules of a generative grammar typically function as an algorithm to predict grammaticality as a discrete (yes-or-no) result. In this respect, generative grammar differs from stochastic grammar, which treats grammaticality as a probabilistic variable. However, some work in generative grammar (e.g. recent work by Joan Bresnan) uses stochastic versions of Optimality Theory.
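The discrete view of grammaticality can be made concrete with a small sketch: a toy context-free grammar whose recognizer returns a strict yes or no for any string of words, with no probabilities involved. The grammar, lexicon, and function names below are illustrative assumptions for this sketch, not part of the theory itself.

```python
# A toy generative grammar as an algorithm for discrete grammaticality.
# GRAMMAR maps each nonterminal to its possible expansions; any symbol
# not listed as a key (e.g. "the", "dog") is a terminal word.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"]],
    "VP": [["V", "NP"]],
    "D":  [["the"]],
    "N":  [["dog"], ["bone"]],
    "V":  [["ate"]],
}

def derives(symbol, words):
    """Return True iff `symbol` can derive exactly the word list `words`."""
    if symbol not in GRAMMAR:                      # terminal symbol
        return list(words) == [symbol]
    return any(matches(rhs, words) for rhs in GRAMMAR[symbol])

def matches(rhs, words):
    """Can the symbol sequence `rhs` derive exactly `words`?"""
    if not rhs:
        return not words
    head, rest = rhs[0], rhs[1:]
    # The first symbol must cover some prefix; the rest cover the remainder.
    splits = range(1, len(words) + 1) if rest else [len(words)]
    return any(derives(head, words[:i]) and matches(rest, words[i:])
               for i in splits)

print(derives("S", "the dog ate the bone".split()))   # grammatical
print(derives("S", "bone the ate dog the".split()))   # ungrammatical
```

The answer is always exactly True or False; a stochastic grammar would instead assign the second string a (very low) probability rather than rejecting it outright.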
Generative grammar has been under development since the late 1950s, and has undergone many changes in the types of rules and representations that are used to predict grammaticality. In tracing the historical development of ideas within generative grammar, it is useful to refer to various stages in the development of the theory.
At a higher level of complexity in the Chomsky hierarchy are the context-free grammars (type 2). The derivation of a sentence by a grammar can be depicted as a derivation tree. Linguists working in generative grammar often view such derivation trees as a primary object of study. According to this view, a sentence is not merely a string of words, but rather a tree with subordinate and superordinate branches connected at nodes.
The resulting sentence could be The dog ate the bone. Such a tree diagram is also called a phrase marker. Phrase markers can be represented more compactly in text form (though the result is less easy to read); in this format the sentence above would be rendered as:
[S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]
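A brief sketch can make explicit that this labeled-bracket notation is just a flat rendering of the derivation tree: each node is written as [Label children...]. The nested-tuple representation of the tree below is a hand-built illustration, not a format defined by the theory.

```python
# The derivation tree for "The dog ate the bone", written as nested
# (label, *children) tuples; leaves are plain strings (the words).
tree = ("S",
        ("NP", ("D", "The"), ("N", "dog")),
        ("VP", ("V", "ate"),
               ("NP", ("D", "the"), ("N", "bone"))))

def bracket(node):
    """Render a (label, *children) tree as a labeled-bracket string."""
    if isinstance(node, str):          # a leaf: an actual word
        return node
    label, *children = node
    return "[" + label + " " + " ".join(bracket(c) for c in children) + " ]"

print(bracket(tree))
# [S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]
```

Walking the tree top-down and emitting a bracket pair per node reproduces exactly the text form shown above, which is why the two representations are interchangeable.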
Chomsky has argued that phrase structure grammars are also inadequate for describing natural languages, and has formulated the more complex system of transformational grammar.
In any case, most native speakers would reject many sentences produced even by a phrase structure grammar. For example, although very deep embeddings are allowed by the grammar, sentences with deep embeddings are not accepted by listeners; the limit of acceptability is an empirical matter that varies between individuals, not something that can be easily captured in a formal grammar. Consequently, the influence of generative grammar in empirical psycholinguistics has declined considerably.
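The embedding point can be illustrated with a short sketch: a single recursive rule (an object relative clause inside a subject noun phrase) licenses arbitrarily deep center-embeddings, all of which the formal grammar accepts, yet listeners reliably reject the deeper ones. The particular nouns and verbs here are illustrative assumptions.

```python
# Center-embedding: "the rat [the cat [the dog chased] bit] squeaked".
# Each extra level of depth is licensed by the same recursive rule,
# so the formal grammar imposes no upper bound on depth.
def center_embed(depth):
    """Build a sentence with `depth` center-embedded relative clauses."""
    nouns = ["rat", "cat", "dog", "mouse", "bird"]
    verbs = ["bit", "chased", "scared", "saw"]
    subject = "the " + nouns[0]
    middle_nouns = ["the " + nouns[i + 1] for i in range(depth)]
    # The innermost clause's verb comes first, the outermost last.
    middle_verbs = [verbs[depth - 1 - i] for i in range(depth)]
    return " ".join([subject] + middle_nouns + middle_verbs + ["squeaked"])

for depth in range(4):
    print(center_embed(depth))
```

Depth 0 and 1 are unremarkable, depth 2 is already hard to process, and depth 3 is effectively unintelligible to most listeners, even though all four strings are equally grammatical by the rule that generates them; that gap between formal grammaticality and empirical acceptability is precisely the difficulty described above.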