Transformational grammar

Transformational grammar is a broad term describing grammars (almost exclusively those of natural languages) which have been developed in a Chomskyan tradition. The term is usually synonymous with the slightly more specific transformational-generative grammar (TGG).

Deep structure and surface structure

In the early to mid-1960s, Noam Chomsky developed the idea that each sentence in a language has two levels of representation: a deep structure and a surface structure.[1] The deep structure was (more or less) a direct representation of the basic semantic relations underlying a sentence, and was mapped onto the surface structure (which followed the phonological form of the sentence very closely) via transformations. Chomsky believed that there would be considerable similarities between the deep structures of different languages, and that these structures would reveal properties, common to all languages, which were concealed by their surface structures. However, this was perhaps not the central motivation for introducing Deep Structure. Transformations themselves had been proposed prior to the development of Deep Structure, essentially as a means of increasing the mathematical and descriptive power of context-free grammars. Similarly, Deep Structure was devised largely for narrow technical reasons relating to early semantic theory. Chomsky emphasizes the importance of modern formal mathematical devices in the development of grammatical theory:

But the fundamental reason for [the] inadequacy of traditional grammars is a more technical one. Although it was well understood that linguistic processes are in some sense "creative", the technical devices for expressing a system of recursive processes were simply not available until much more recently. In fact, a real understanding of how a language can (in Humboldt's words) "make infinite use of finite means" has developed only within the last thirty years, in the course of studies in the foundations of mathematics.
(Aspects of the Theory of Syntax, p. 8 [1])
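
The point about recursion can be made concrete with a toy rewrite system: a finite set of rules in which one category can contain another instance of the same category generates an unbounded set of sentences. The following sketch is purely illustrative; the grammar and vocabulary are invented for this example and are not taken from Chomsky's text:

    import random

    # A toy phrase-structure grammar with one recursive rule: a VP can embed a
    # whole sentence (S), so finitely many rules license unboundedly many sentences.
    GRAMMAR = {
        "S":  [["NP", "VP"]],
        "NP": [["the cat"], ["the dog"], ["the linguist"]],
        "VP": [["sleeps"], ["thinks that", "S"]],  # "thinks that S" re-introduces S
    }

    def generate(symbol="S"):
        """Expand a symbol by choosing one of its rules at random; terminals are returned as-is."""
        if symbol not in GRAMMAR:
            return symbol
        rule = random.choice(GRAMMAR[symbol])
        return " ".join(generate(part) for part in rule)

    for _ in range(3):
        print(generate())
    # Possible output: "the linguist thinks that the dog thinks that the cat sleeps"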

Development of basic concepts

Though transformations continue to be important in Chomsky's current theories, he has now abandoned the original notion of Deep Structure and Surface Structure. Initially, two additional levels of representation were introduced (LF — Logical Form, and PF — Phonetic Form), and then in the 1990s Chomsky sketched out a new program of research known as Minimalism, in which Deep Structure and Surface Structure no longer featured and PF and LF remained as the only levels of representation.

To complicate the understanding of the development of Chomsky's theories, the precise meanings of Deep Structure and Surface Structure have changed over time; by the 1970s, the two were normally referred to simply as D-Structure and S-Structure by Chomskyan linguists. In particular, the idea that the meaning of a sentence was determined by its Deep Structure (taken to its logical conclusions by the generative semanticists during the same period) was dropped for good by Chomskyan linguists when LF took over this role (previously, Chomsky and Ray Jackendoff had begun to argue that meaning was determined by both Deep and Surface Structure).[2][3]

Innate linguistic knowledge

Terms such as "transformation" can give the impression that theories of transformational generative grammar are intended as a model for the processes through which the human mind constructs and understands sentences. Chomsky is clear that this is not in fact the case: a generative grammar models only the knowledge that underlies the human ability to speak and understand. One of the most important of Chomsky's ideas is that most of this knowledge is innate, with the result that a baby can have a large body of prior knowledge about the structure of language in general, and need only actually learn the idiosyncratic features of the language(s) it is exposed to. Chomsky was not the first person to suggest that all languages had certain fundamental things in common (he quotes philosophers writing several centuries ago who had the same basic idea), but he helped to make the innateness theory respectable after a period dominated by more behaviorist attitudes towards language. Perhaps more significantly, he made concrete and technically sophisticated proposals about the structure of language, and made important proposals regarding how the success of grammatical theories should be evaluated.

Chomsky goes so far as to suggest that a baby need not learn any actual rules specific to a particular language at all. Rather, all languages are presumed to follow the same set of rules, but the effects of these rules and the interactions between them can vary greatly depending on the values of certain universal linguistic parameters. This is a very strong assumption, and is one of the more subtle ways in which Chomsky's current theory of language differs from most others.

Grammatical theories

In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. The first was the distinction between competence and performance. Chomsky noted the obvious fact that people, when speaking in the real world, often make linguistic errors (e.g. starting a sentence and then abandoning it midway through). He argued that these errors in linguistic performance were irrelevant to the study of linguistic competence (the knowledge that allows people to construct and understand grammatical sentences). Consequently, the linguist can study an idealised version of language, greatly simplifying linguistic analysis (see the "Grammaticalness" section below).

The second idea related directly to the evaluation of theories of grammar. Chomsky made a distinction between grammars which achieved descriptive adequacy and those which went further and achieved explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar which achieves explanatory adequacy has the additional property that it gives an insight into the underlying linguistic structures in the human mind; that is, it does not merely describe the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, so if a grammatical theory has explanatory adequacy it must be able to explain the various grammatical nuances of the languages of the world as relatively minor variations in the universal pattern of human language. Chomsky argued that, even though linguists were still a long way from constructing descriptively adequate grammars, progress in terms of descriptive adequacy would only come if linguists held explanatory adequacy as their goal. In other words, real insight into the structure of individual languages could only be gained through the comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.

"I-Language" and "E-Language"

In 1986, Chomsky proposed a distinction between I-Language and E-Language, similar but not identical to the competence/performance distinction.[4] I-Language is taken to be the object of study in syntactic theory; it is the mentally represented linguistic knowledge that a native speaker of a language has, and is therefore a mental object. From this perspective, most of linguistics is a branch of psychology. E-Language encompasses all other notions of what a language is, for example that it is a body of knowledge or behavioural habits shared by a community. Thus, E-Language is not itself a coherent concept,[5] and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge, i.e. competence, even though they may seem sensible and intuitive, and useful in other areas of study. Competence, he argues, can only be studied if languages are treated as mental objects.

"Grammaticalness"

Chomsky argued that the notions "grammatical" and "ungrammatical" could be defined in a meaningful and useful way. In contrast, an extreme behaviorist linguist would argue that language can only be studied through recordings or transcriptions of actual speech, the role of the linguist being to look for patterns in such observed speech, but not to hypothesize about why such patterns might occur, nor to label particular utterances as either "grammatical" or "ungrammatical". Although few linguists in the 1950s actually took such an extreme position, Chomsky was at an opposite extreme, defining grammaticality in an unusually (for the time) mentalistic way.[6] He argued that the intuition of a native speaker is enough to define the grammaticalness of a sentence; that is, if a particular string of English words elicits a double take or a feeling of wrongness in a native English speaker, then (when various extraneous factors affecting intuitions are controlled for) it can be said that the string of words is ungrammatical. This (according to Chomsky) is entirely distinct from the question of whether a sentence is meaningful or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example "colorless green ideas sleep furiously". But such sentences manifest a linguistic problem distinct from that posed by meaningful but ungrammatical (non-)sentences such as "man the bit sandwich the", the meaning of which is fairly clear, but which no native speaker would accept as well formed.

The use of such intuitive judgments freed syntacticians from studying language through a corpus of observed speech, since they were now able to study the grammatical properties of contrived sentences. Without this change in philosophy, the construction of generative grammars would have been almost impossible, since it is often the relatively obscure and rarely used features of a language that give linguists clues about its structure, and it is very difficult to find good examples of such features in everyday speech.

Minimalism

Minimalism in the sense described here has no philosophical association with Minimalism, the artistic and cultural movement.

Much current research in transformational grammar is inspired by Chomsky's Minimalist Program.[7] The new research direction involves the further development of ideas involving economy of derivation and economy of representation, which had started to become significant in the early 1990s, but were still rather peripheral aspects of TGG theory.

Economy of derivation is a principle stating that movements (i.e. transformations) only occur in order to match interpretable features with uninterpretable features. An example of an interpretable feature is the plural inflection on regular English nouns, e.g. dogs. The word dogs can only be used to refer to several dogs, not a single dog, and so this inflection contributes to meaning, making it interpretable. English verbs are inflected according to the grammatical number of their subject (e.g. "Dogs bite" vs "A dog bites"), but in most sentences this inflection just duplicates the information about number that the subject noun already has, and it is therefore uninterpretable.

Economy of representation is the principle that grammatical structures must exist for a purpose, i.e. the structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticalness (note that this does not rule out complex sentences in general, only sentences that have superfluous elements in a narrow syntactic sense).

Both notions, as described here, are somewhat vague, and indeed the precise formulation of these principles is a major area of controversy in current research. An additional aspect of minimalist thought is the idea that the derivation of syntactic structures should be uniform; that is, rules should not be stipulated as applying at arbitrary points in a derivation, but instead apply throughout derivations. Recently, it has been suggested that derivations proceed in phases. Deep Structure and Surface Structure are not present in Minimalist theories of syntax, and the most recent phase-based theories also eliminate LF and PF as unitary levels of representation.
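
The idea that an operation is licensed only when it checks an uninterpretable feature against a matching interpretable one can be pictured schematically. The following toy sketch is an invented illustration of feature checking, not a formalization of the Minimalist Program; the class names, the check function and the treatment of agreement are all assumptions made for the example:

    from dataclasses import dataclass

    # Schematic toy of feature checking: an uninterpretable feature must be
    # valued against a matching interpretable feature, and an operation is
    # licensed only if it performs such checking (economy of derivation).

    @dataclass
    class Feature:
        name: str            # e.g. "number"
        value: str           # e.g. "plural"
        interpretable: bool

    @dataclass
    class Node:
        label: str
        features: list

    def check(probe, goal):
        """Value the probe's uninterpretable features against the goal's
        interpretable ones; return True if any checking took place."""
        licensed = False
        for uf in (f for f in probe.features if not f.interpretable):
            for f in goal.features:
                if f.interpretable and f.name == uf.name:
                    uf.value = f.value   # copy the value; the feature is now checked
                    licensed = True
        return licensed

    # "Dogs bite": plural on the noun is interpretable; the agreement feature
    # on the verb merely duplicates it and is uninterpretable.
    subject = Node("dogs", [Feature("number", "plural", True)])
    verb = Node("bite", [Feature("number", "?", False)])
    print(check(verb, subject))   # True: agreement (and any associated movement) is licensed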

Mathematical representation

Returning to the more general mathematical notion of a grammar, an important feature of all transformational grammars is that they are more powerful than context-free grammars.[8] This idea was formalized by Chomsky in the Chomsky hierarchy. Chomsky argued that it is impossible to describe the structure of natural languages using context-free grammars.[9] His general position regarding the non-context-freeness of natural language has held up since then, although his specific examples were later disproven.[10][11]
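
The containments of the Chomsky hierarchy can be stated as follows (this is the standard textbook formulation, added here for reference rather than drawn from the sources cited above):

    \[
    \text{regular} \;\subsetneq\; \text{context-free} \;\subsetneq\; \text{context-sensitive} \;\subsetneq\; \text{recursively enumerable}
    \]

The result on generative power cited above[8] is usually taken to show that, without further restrictions, transformational grammars can generate any recursively enumerable language, placing them at the very top of this hierarchy.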

Transformations

Some of the rules of transformational-generative grammar are quite simple, such as the Head Initial/Final rules:

  • Head Initial - the head occurs before its complement.
  • Head Final - the head occurs after its complement (in English, head finality is observed mainly in morphology):
    • police station - a compound noun whose head, station, comes last

Most languages tend to favor one of these structures over the other, though there are exceptions. Japanese is a Head Final language, whereas English is a Head Initial language. Across the world's languages, Head Final languages are slightly more common.
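
As a rough illustration of how a single parameter setting determines the relative order of a head and its complement, consider the following sketch (the function name, its arguments and the example phrases are invented for illustration and are not part of any formal proposal):

    # Toy sketch of a head-direction parameter.
    def linearize(head, complement, head_initial=True):
        """Order a head and its complement according to the head-direction parameter."""
        return f"{head} {complement}" if head_initial else f"{complement} {head}"

    # English verb phrases are head-initial: the verb precedes its object.
    print(linearize("read", "the book", head_initial=True))    # read the book

    # Japanese verb phrases are head-final: the object precedes the verb.
    print(linearize("yonda", "hon o", head_initial=False))     # hon o yonda ('read the book')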

Other rules are more complex, such as the so-called "Wh-Question Formation Rule" for English, which can be summarized as:

1. Begin with a simple declarative, with a missing item:

  • He gave X a book.

2. Insert the appropriate Wh-word for the expected lexical category of the answer:

  • He gave to whom a book.

3. Change the form of the verb to the appropriate "did X" construction:

  • He did give to whom a book.

4. Use Subject-Auxiliary Inversion to form an interrogative:

  • Did he give to whom a book?

5. Move the Wh-word element to the front of the sentence:

  • To whom did he give a book?
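
The five steps can be mimicked mechanically at the level of strings. The sketch below is purely illustrative (the function and variable names are invented, and it handles only this one example rather than English wh-questions in general); it simply applies the five rewrites in order:

    # Toy walk-through of the five steps above for one example sentence.
    def wh_question(subject, verb_past, verb_base, indirect_obj_wh, direct_obj):
        # 1. Begin with a simple declarative with a missing item (X).
        step1 = f"{subject} {verb_past} X {direct_obj}."
        # 2. Insert the appropriate wh-word for the expected answer.
        step2 = f"{subject} {verb_past} {indirect_obj_wh} {direct_obj}."
        # 3. Change the verb to the "did + base form" construction.
        step3 = f"{subject} did {verb_base} {indirect_obj_wh} {direct_obj}."
        # 4. Subject-auxiliary inversion.
        step4 = f"Did {subject.lower()} {verb_base} {indirect_obj_wh} {direct_obj}?"
        # 5. Move the wh-phrase to the front of the sentence.
        step5 = f"{indirect_obj_wh.capitalize()} did {subject.lower()} {verb_base} {direct_obj}?"
        return [step1, step2, step3, step4, step5]

    for line in wh_question("He", "gave", "give", "to whom", "a book"):
        print(line)
    # He gave X a book.
    # He gave to whom a book.
    # He did give to whom a book.
    # Did he give to whom a book?
    # To whom did he give a book?

A genuinely transformational treatment would operate on phrase-structure trees rather than strings; the string version is given only to make the ordering of the steps concrete.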

While Chomsky and others have abandoned much of traditional TGG (the mechanisms described in the example above have been out of date since the late 1960s), it continues to have useful applications in syntactic analysis and the study of children's language acquisition.

References

  1. ^ a b Chomsky, Noam (1965). Aspects of the Theory of Syntax. MIT Press.
  2. ^ Jackendoff, Ray (1974). Semantic Interpretation in Generative Grammar. MIT Press.
  3. ^ May, Robert C. (1977). The Grammar of Quantification. MIT PhD dissertation. (Supervised by Noam Chomsky, this dissertation introduced the idea of "logical form".)
  4. ^ Chomsky, Noam (1986). Knowledge of Language. New York: Praeger.
  5. ^ Chomsky, Noam (2001). "Derivation by Phase". In Michael Kenstowicz (ed.) Ken Hale: A Life in Language. MIT Press. pp. 1–52. (See p. 49 fn. 2 for comment on E-Language.)
  6. ^ Newmeyer, Frederick J. (1986). Linguistic Theory in America (Second Edition). Academic Press.
  7. ^ Chomsky, Noam (1995). The Minimalist Program. MIT Press.
  8. ^ Peters, Stanley; Ritchie, R. W. (1973). "On the generative power of transformational grammars". Information Sciences. 6: 49–83.
  9. ^ Chomsky, Noam (1956). "Three models for the description of language". IRE Transactions on Information Theory. 2: 113–124.
  10. ^ Shieber, Stuart (1985). "Evidence against the context-freeness of natural language". Linguistics and Philosophy. 8: 333–343.
  11. ^ Pullum, Geoffrey K.; Gazdar, Gerald (1982). "Natural languages and context-free languages". Linguistics and Philosophy. 4: 471–504.

See also