Generative grammar is a highly formal system; it is the framework within which linguist Noam Chomsky discussed three models for the description of language in Syntactic Structures. In linguistics, generative grammar is grammar (the set of language rules) that indicates the structure and interpretation of sentences that native speakers of a language accept as belonging to their language. Adopting the term ‘generative’ from mathematics, Chomsky introduced the concept of generative grammar in the 1950s. This theory is also known as transformational grammar, a term still used today.
- Generative grammar is a theory of grammar, first developed by Noam Chomsky in the 1950s, that is based on the idea that all humans have an innate language capacity.
- Linguists who study generative grammar are not interested in prescriptive rules; rather, they are interested in uncovering the foundational principles that guide all language production.
- Generative grammar accepts as a basic premise that native speakers of a language will find certain sentences grammatical or ungrammatical and that these judgments give insight into the rules governing the use of that language.
Grammar refers to the set of rules that structure a language, including syntax (the arrangement of words to form phrases and sentences) and morphology (the study of words and how they are formed). Generative grammar is a theory of grammar that holds that human language is shaped by a set of basic principles that are part of the human brain (and even present in the brains of small children). This “universal grammar,” according to linguists like Chomsky, comes from our innate language faculty.
Generative vs. Prescriptive Grammar
Generative grammar is distinct from other grammars such as prescriptive grammar, which attempts to establish standardized language rules that deem certain usages “right” or “wrong,” and descriptive grammar, which attempts to describe language as it is used (including the study of pidgins and dialects). Instead, generative grammar attempts to get at something deeper—the foundational principles that make language possible across all of humanity.
For example, a prescriptive grammarian may study how parts of speech are ordered in English sentences in order to lay out rules (nouns precede verbs in simple sentences, for example). A linguist studying generative grammar, however, is more likely to be interested in issues such as how nouns are distinguished from verbs across multiple languages.
Principles of Generative Grammar
The main principle of generative grammar is that all humans are born with an innate capacity for language and that this capacity shapes the rules for what is considered “correct” grammar in a language. The idea of an innate language capacity, or “universal grammar,” is not accepted by all linguists. Some believe, to the contrary, that all languages are learned and that their rules therefore derive from usage and experience rather than from innate constraints.
Proponents of the universal grammar argument believe that children, when they are very young, are not exposed to enough linguistic information to learn the rules of grammar. That children do learn the rules of grammar is proof, according to some linguists, that there is an innate language capacity that allows them to overcome the “poverty of the stimulus.”
Examples of Generative Grammar
As generative grammar is a “theory of competence,” one way to test its validity is with what is called a grammaticality judgement task. This involves presenting a native speaker with a series of sentences and having them decide whether the sentences are grammatical (acceptable) or ungrammatical (unacceptable).
E.g.,
1. The man is happy.
2. Happy man is the.
Here, a native speaker would judge the first sentence to be acceptable and the second to be unacceptable.
A particular grammar describes a language as the set of all the sentences it generates. In principle, this set of sentences may be either finite or infinite in number. English comprises an infinite number of sentences, because there are sentences and phrases in the language that can be extended indefinitely and yet still be accepted as perfectly normal by native speakers. Since no definite limit can be set to the length of English sentences, it must be accepted that, in theory, the number of grammatical sentences in the language is infinite.
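To make this concrete, here is a minimal sketch of how a finite set of rules and a finite vocabulary can generate an unbounded set of sentences. The toy vocabulary and the single recursive rule are invented for illustration; they stand in for the embedding devices (relative clauses, sentential complements, coordination) that allow real English sentences to be extended indefinitely.

```python
import random

# A toy grammar (invented for illustration) with one recursive rule:
#   S -> NP V
#   S -> NP REPORT_V "that" S     <- the rule that calls itself
# Because S can reintroduce S, finitely many rules and words yield
# an unbounded set of sentences.

NOUNS = ["the man", "the dog", "Mary"]
VERBS = ["sleeps", "laughs"]
REPORT_VERBS = ["says", "thinks", "claims"]

def sentence(p_embed: float = 0.5) -> str:
    """Generate one sentence; p_embed is the chance of embedding again."""
    np = random.choice(NOUNS)
    if random.random() < p_embed:
        # Recursive choice: embed a whole new sentence as a complement.
        return f"{np} {random.choice(REPORT_VERBS)} that {sentence(p_embed)}"
    return f"{np} {random.choice(VERBS)}"

for _ in range(3):
    print(sentence())
# e.g. "Mary thinks that the dog claims that the man sleeps"
```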
The number of words in the vocabulary of English, by contrast, is assumed to be finite. There is considerable variation in the words known to different native speakers, and there may well be some difference between the active and passive vocabulary of every individual. Indeed, neither the active nor the passive vocabulary of any native speaker of English is fixed and static, even over relatively short periods.
Sentences can be represented at two levels: at the syntactic level as a sequence of words, and at the phonological level as a sequence of phonemes. Following Chomsky, it can be said that every different sequence of words is a different sentence. Under this definition, not only are The dog bit the man and The man bit the dog different sentences, but so also are I had an idea on my way home and On my way home I had an idea. From the purely syntactic point of view, the phonological structure of words is irrelevant; we could represent them in any one of a variety of ways.
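As a small illustration of the two levels, the same sentence can be written down either as a sequence of words or as a sequence of phonemes. The broad phonemic transcription below is illustrative only, not a claim about any particular accent.

```python
# The same sentence represented at the two levels described above.
syntactic = ["the", "dog", "bit", "the", "man"]          # sequence of words
phonological = ["ð", "ə", "d", "ɒ", "g", "b", "ɪ", "t",  # sequence of phonemes
                "ð", "ə", "m", "æ", "n"]

# On the syntactic definition used here, word order alone individuates
# sentences: a different sequence of words is a different sentence.
assert ["the", "dog", "bit", "the", "man"] != ["the", "man", "bit", "the", "dog"]
```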
Terminal and Auxiliary Elements
Terminal elements are those which occur in sentences: words at the syntactic level and phonemes at the phonological level. All other terms and symbols that are employed in the formulation of grammatical rules may be described as auxiliary elements.
In generative grammar, the fact that a particular word belongs to a particular class must be made explicit within the grammar. In effect, this means, in grammars of the type formalised by Chomsky, that every word in the vocabulary must be assigned to the syntactic class or classes to which it belongs, rather than leaving it to the person consulting the grammar to decide whether a particular word satisfies a definition or not.
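Here is a minimal sketch of what this explicitness looks like in practice. The lexicon and rules below are invented for illustration: every terminal element (word) is explicitly assigned to its class, while the auxiliary elements (S, NP, VP, Det, N, V) appear only in the formulation of the rules.

```python
import random

# Terminal elements are the words themselves; auxiliary elements are the
# symbols (S, NP, VP, Det, N, V) used only in formulating the rules.
# Every word is explicitly assigned to its syntactic class:
LEXICON = {
    "Det": ["the", "a"],
    "N":   ["man", "dog", "idea"],
    "V":   ["bit", "saw", "had"],
}

# Each rule rewrites one auxiliary symbol as a sequence of symbols:
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}

def expand(symbol: str) -> str:
    if symbol in LEXICON:                 # word class: pick a terminal
        return random.choice(LEXICON[symbol])
    parts = random.choice(RULES[symbol])  # auxiliary symbol: rewrite it
    return " ".join(expand(s) for s in parts)

print(expand("S"))  # e.g. "the dog bit a man"
```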
The simplest grammars discussed by Chomsky that are capable of generating an infinite number of sentences, using a finite number of recursive rules operating upon a finite vocabulary, are what he calls finite-state grammars. These are based on the view that sentences are generated by a series of choices made from left to right: that is to say, after the first, or leftmost, element has been selected, every subsequent choice is determined by the immediately preceding elements. One way of representing graphically what has just been said in words is by means of a ‘state diagram’.
We can think of such a grammar as a machine, or device, which moves through a finite number of internal states as it passes from the initial state (start) to the final state (stop) in the generation of a sentence.
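The state-diagram idea can be sketched directly as a transition table; the states and vocabulary below are invented for illustration. Each arc emits one word and moves the device to a new state, so every choice depends only on the state reached via the immediately preceding elements, and a sentence is any path from start to stop.

```python
import random

# A minimal finite-state grammar, written as a transition table.
# Each entry maps a state to the arcs leaving it: (word emitted, next state).
TRANSITIONS = {
    "START": [("the", "Q1"), ("a", "Q1")],
    "Q1":    [("man", "Q2"), ("dog", "Q2")],
    "Q2":    [("sleeps", "STOP"), ("runs", "STOP"), ("sees", "Q3")],
    "Q3":    [("the", "Q4"), ("a", "Q4")],
    "Q4":    [("man", "STOP"), ("dog", "STOP")],
}

def generate() -> str:
    """Walk from START to STOP, emitting one word per transition."""
    state, words = "START", []
    while state != "STOP":
        word, state = random.choice(TRANSITIONS[state])
        words.append(word)
    return " ".join(words)

print(generate())  # e.g. "the dog sees a man"
```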
Chomsky’s proof of the inadequacy of finite-state grammar rests upon the fact that there may be dependencies holding between non-adjacent words, and that these interdependent words may themselves be separated by a phrase or clause containing another pair of non-adjacent interdependent words. For example, in a sentence like Anyone who says that is lying, there is a dependency between the words anyone and is lying. They are separated by the simple clause who says that (in which there is a dependency between who and says).
E.g., Anyone who says that people who deny that…are wrong is foolish.
The result is a sentence with ‘mirror image properties’: that is to say, a sentence of the form a + b + c … x + y + z, where there is a relationship of compatibility or dependency between the outermost constituents (a and z), between the next outermost (b and y), and so on. Any language that contains an indefinitely large number of sentences with such ‘mirror image properties’ is beyond the scope of finite-state grammar.
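A minimal abstract sketch of such a mirror-image language (the symbols are invented for illustration): a single recursive rule, S -> a S a | b S b | empty, pairs the outermost symbols first, then the next outermost, and so on. Verifying these nested pairings requires remembering an unbounded amount of earlier material, which is exactly what a device with a fixed, finite number of states cannot do.

```python
import random

def mirror(max_depth: int = 4) -> str:
    """Generate a string from S -> a S a | b S b | empty."""
    if max_depth == 0 or random.random() < 0.3:
        return ""
    x = random.choice("ab")
    inner = mirror(max_depth - 1)
    return x + inner + x  # dependency between the two outermost symbols

def is_mirror(s: str) -> bool:
    # Checking the nested pairings amounts to checking that the string
    # reads the same backwards: an even-length palindrome.
    return s == s[::-1] and len(s) % 2 == 0

w = mirror()
print(w, is_mirror(w))  # e.g. "abba True"
```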