Syntax

Transformational Grammar: Concepts and Perspectives

Transformational grammar

Transformational grammar is a kind of generative grammar, an approach to language developed by Noam Chomsky that provides a set of rules to predict the combinations of words that appear in grammatically correct sentences.

Avram Noam Chomsky is an American linguist, philosopher, writer and political analyst. Born on December 7, 1928 in Philadelphia, he is Emeritus Professor of Linguistics at MIT.

Grammar is the most widely used and most productive framework for any intellectual approach to the manifestations and norms of language. Through grammar we either establish general rules for the correct use of a language (the task of so-called normative grammar) or simply describe how the speakers of a particular linguistic community use their language (the task of descriptive grammar).

Traditionally, grammar was content with this. It established norms, classified all the word types of a language into a small number of groups, and then established the relations between them, making it possible to document the great differences that exist between languages.

Formal definition

Chomsky's adviser Zellig Harris considered transformations to be relationships between sentences such as “I finally met this talk show host you always hated” and simpler (kernel) sentences such as “I finally met this talk show host” and “You always hated this talk show host.” A transformational-generative (or simply transformational) grammar involved two types of productive rules: phrase structure rules, such as “S → NP VP” (a sentence may consist of a noun phrase followed by a verb phrase), which can be used to generate grammatical sentences with associated parse trees (phrase markers, or P-markers); and transformational rules, such as rules for converting statements into questions or active into passive voice, which acted on phrase markers to produce further grammatically correct sentences. Hjelmslev had called rules that convert word order permutations.
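
For concreteness, here is a minimal sketch, not from the article, of how phrase structure rules such as “S → NP VP” can generate a sentence together with its parse tree (P-marker). The tiny grammar and lexicon are illustrative assumptions, not a serious fragment of English.

```python
import random

# Toy phrase structure rules and lexicon (illustrative assumptions only).
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}
LEXICON = {
    "Det": ["the", "a"],
    "N":   ["host", "show"],
    "V":   ["met", "hated"],
}

def generate(symbol):
    """Expand a symbol into a (symbol, children) tree, i.e. a P-marker."""
    if symbol in LEXICON:                      # preterminal: choose a word
        return (symbol, random.choice(LEXICON[symbol]))
    expansion = random.choice(RULES[symbol])   # apply a phrase structure rule
    return (symbol, [generate(child) for child in expansion])

def yield_of(tree):
    """Read the generated sentence off the leaves of the tree."""
    _, rest = tree
    return [rest] if isinstance(rest, str) else [w for c in rest for w in yield_of(c)]

print(" ".join(yield_of(generate("S"))))   # e.g. "the host met a show"
```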

In this context, transformational rules are not strictly necessary for generating the set of grammatical sentences in a language, since this can be done using phrase structure rules alone, but the use of transformations provides economy in some cases (the total number of rules can be reduced) and also provides a way of representing grammatical relations between sentences that would not otherwise be reflected in a system with phrase structure rules alone.

This notion of transformation proved adequate for subsequent versions of generative grammar, including the “extended”, “revised extended”, and Government-Binding (GB) versions, but it may no longer be sufficient for current minimalist grammar, since Merge may require a formal definition that goes beyond the tree-manipulating character of Move α.

Development of basic concepts

While transformations continue to be important in Chomsky's current theories, he has now abandoned the original notions of deep structure and surface structure. Initially, two additional levels of representation were introduced: logical form (LF) and phonetic form (PF). In the 1990s, Chomsky outlined a new research program initially known as Minimalism, in which deep structure and surface structure no longer feature and PF and LF remain the only levels of representation.

To complicate the understanding of the development of Chomsky's theories, the precise meanings of deep structure and surface structure have changed over time. By the 1970s, Chomskyan linguists commonly called them D-Structure and S-Structure. In particular, Chomskyan linguists definitively abandoned the idea that the deep structure of a sentence determines its meaning (an idea taken to its logical conclusion by the generative semanticists during the same period) when LF took on this role (previously, Chomsky and Ray Jackendoff had begun to argue that both deep and surface structure determine meaning).

Innate linguistic knowledge

Using a term like “transformation” might give the impression that theories of transformational generative grammar are intended as a model of the processes by which the human mind constructs and understands sentences, but Chomsky clearly stated that a generative grammar models only the knowledge that underlies the human ability to speak and understand, arguing that because most of this knowledge is innate, an infant may have a large body of knowledge about the structure of language in general and therefore need to learn only the idiosyncratic features of the language(s) to which it is exposed.

Chomsky is not the first person to suggest that all languages have certain fundamental things in common; he cited philosophers who had posited the same basic idea several centuries earlier. But Chomsky helped make the idea of innateness respectable after a period dominated by more behaviorist attitudes toward language. He made concrete and technically sophisticated proposals about the structure of language, as well as important proposals about how the success of grammatical theories should be evaluated.

Grammatical theories

In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. One was the distinction between competence and performance. Chomsky noted the obvious fact that when people speak in the real world, they often make linguistic errors, such as starting a sentence and then abandoning it midway. He argued that such errors in linguistic performance are irrelevant to the study of linguistic competence, the knowledge that enables people to construct and understand grammatical sentences. Consequently, the linguist can study an idealized version of language, which greatly simplifies linguistic analysis (see the “Grammaticality” section below).

The other idea was directly related to the evaluation of grammatical theories. Chomsky distinguished between grammars that achieve descriptive adequacy and those that go further and achieve explanatory adequacy. A descriptively adequate grammar for a specific language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar that achieves explanatory adequacy has the additional property of providing insight into the underlying linguistic structures of the mind. In other words, it does not limit itself to describing the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, and therefore, if a grammatical theory has explanatory adequacy, it must be able to explain the grammatical nuances of different languages as relatively minor variations on the universal pattern of human language.

Chomsky argued that while linguists were still a long way from constructing descriptively adequate grammars, progress in descriptive adequacy would come only if linguists kept explanatory adequacy as their goal: real insight into the structure of individual languages can be gained only through the comparative study of a wide variety of languages, on the assumption that they are all cut from the same cloth.

“I-language” and “E-language”

In 1986, Chomsky proposed a distinction between I-language and E-language that is similar but not identical to the competence/performance distinction. “I-language” is internal language; “E-language” is external language. I-language is taken to be the object of study in linguistic theory; it is the mentally represented linguistic knowledge that a native speaker of a language possesses and is therefore a mental object. From this perspective, most of theoretical linguistics is a branch of psychology. E-language encompasses all other notions of what a language is, such as a body of knowledge or behavioral habits shared by a community. Thus, E-language is not a coherent concept in itself, and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge, or competence, even though they may seem sensible, intuitive, and useful in other areas of study. Competence, he argues, can be studied only if languages are treated as mental objects.

Grammaticality

Chomsky argued that “grammatical” and “ungrammatical” can be defined in a meaningful and useful way. In contrast, an extreme behaviorist linguist would argue that language can be studied only through recordings or transcriptions of actual speech and that the role of the linguist is to look for patterns in such observed speech, not to hypothesize about why such patterns might occur or to label specific utterances as grammatical or ungrammatical. Few linguists in the 1950s actually took such an extreme position, but Chomsky was at the opposite extreme, defining grammaticality in an unusually mentalistic way for the time. He argued that a native speaker's intuition is sufficient to define the grammaticality of a sentence; that is, if a particular sequence of English words provokes a double take or a sense of wrongness in a native English speaker, with the various extraneous factors affecting intuitions controlled for, the sequence of words can be said to be ungrammatical. This, according to Chomsky, is entirely distinct from the question of whether a sentence is meaningful or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example “colorless green ideas sleep furiously”. But such sentences manifest a linguistic problem distinct from that posed by meaningful but ungrammatical (non-)sentences such as “man the piece of sandwich”, whose meaning is quite clear but which no native speaker would accept as well formed.

The use of such intuitive judgments allowed generative syntacticians to base their research on a methodology in which studying language through a corpus of observed speech was minimized, since the grammatical properties of constructed sentences were considered appropriate data for building a grammatical model.

Minimalist program

From the mid-1990s onwards, much research in transformational grammar has been inspired by Chomsky's minimalist program. Its aim is to further develop ideas involving economy of derivation and economy of representation, which began to become significant in the early 1990s but were still peripheral aspects of transformational-generative grammar:

  • Economy of derivation is the principle that movements, or transformations, occur only in order to match interpretable features with uninterpretable features. An example of an interpretable feature is the plural inflection on regular English nouns, e.g. dogs. The word dogs can be used to refer only to multiple dogs, not to a single dog, and so the inflection contributes to meaning, making it interpretable. English verbs are inflected according to the number of their subject (“Dogs bite” vs. “A dog bites”), but in most sentences this inflection merely duplicates the information about number that the subject noun already carries, and so it is uninterpretable.
  • Economy of representation is the principle that grammatical structures must exist for a purpose: the structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.

Both notions, as described here, are somewhat vague, and their precise formulation is controversial. A further aspect of minimalist thinking is the idea that the derivation of syntactic structures should be uniform: rules should not be stipulated as applying at arbitrary points in a derivation but should instead apply throughout derivations. Minimalist approaches to phrase structure have resulted in “Bare Phrase Structure”, an attempt to eliminate X-bar theory. In 1998, Chomsky suggested that derivations proceed in phases. The distinction between deep structure and surface structure is not present in minimalist theories of syntax, and more recent phase-based theories also eliminate LF and PF as unitary levels of representation.

Mathematical representation

An important feature of all transformational grammars is that they are more powerful than context-free grammars. Chomsky formalized this idea in the Chomsky hierarchy, and he argued that it is impossible to describe the structure of natural languages with context-free grammars. His general position on the non-context-freeness of natural language has since stood, although his specific examples of the inadequacy of CFGs in terms of their weak generative capacity have been refuted.
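
As one illustration, not from the article and much simplified, the non-context-freeness arguments that still stand appeal to crossing dependencies whose abstract shape is the string set a^n b^m c^n d^m: each “a” must be matched by a “c” and each “b” by a “d”. A simple counting procedure recognizes this pattern, but no context-free grammar generates exactly this language.

```python
import re

def is_cross_serial(s: str) -> bool:
    """Recognize the abstract crossing-dependency pattern a^n b^m c^n d^m."""
    m = re.fullmatch(r"(a*)(b*)(c*)(d*)", s)
    if not m:
        return False
    # The a-block must match the c-block in length, and the b-block the d-block.
    return len(m.group(1)) == len(m.group(3)) and len(m.group(2)) == len(m.group(4))

print(is_cross_serial("aabccd"))  # True: two a's ~ two c's, one b ~ one d
print(is_cross_serial("aabcd"))   # False: the counts do not line up
```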

Transformations

The usual use of the term “transformation” in linguistics refers to a rule that takes an input, usually called a deep structure (in the Standard Theory) or D-structure (in the Extended Standard Theory or in government and binding theory), and changes it in some constrained way to produce a surface structure (or S-structure). In TG, phrase structure rules generate deep structures. For example, a typical transformation in TG is subject-auxiliary inversion (SAI). This rule takes as its input a declarative sentence containing an auxiliary, such as “John has eaten all the heirloom tomatoes”, and transforms it into “Has John eaten all the heirloom tomatoes?” In the original formulation (Chomsky 1957), these rules were stated as rules holding over strings of terminals, constituent symbols, or both, for example:

X NP AUX Y → X AUX NP Y

(NP = noun phrase and AUX = auxiliary)
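
A minimal sketch, not from the article, of how such a string-based rule could be applied mechanically; the part-of-speech tagging of the example sentence is an illustrative assumption.

```python
def subject_aux_inversion(tagged):
    """X NP AUX Y -> X AUX NP Y: swap the first adjacent NP AUX pair."""
    for i in range(len(tagged) - 1):
        if tagged[i][1] == "NP" and tagged[i + 1][1] == "AUX":
            return tagged[:i] + [tagged[i + 1], tagged[i]] + tagged[i + 2:]
    return tagged  # rule does not apply if no NP AUX sequence is found

declarative = [("John", "NP"), ("has", "AUX"), ("eaten", "V"),
               ("all the heirloom tomatoes", "NP")]
question = subject_aux_inversion(declarative)
print(" ".join(word for word, tag in question) + "?")
# -> has John eaten all the heirloom tomatoes?  (capitalization aside)
```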

In the 1970s, by the time of the Extended Standard Theory and following Joseph Emonds's work on structure preservation, transformations came to be viewed as holding over trees. By the end of government and binding theory, in the late 1980s, transformations were no longer structure-changing operations; instead, they added information to already existing trees by copying constituents.

The earliest conceptions of transformations treated them as construction-specific devices. For example, there was a transformation that turned active sentences into passive ones. A different transformation raised embedded subjects into the main-clause subject position in sentences like “John seems to be gone”, and a third reordered arguments in the dative alternation. With the shift from rules to principles and constraints in the 1970s, these construction-specific transformations gave way to general rules (all the examples mentioned are instances of NP movement), which eventually became the single general rule Move α, or simply Move.

Transformations actually come in two types: the post-deep-structure kind mentioned above, which are string- or structure-changing, and generalized transformations (GTs). GTs were originally proposed in the earliest forms of generative grammar (as in Chomsky 1957). They take small structures, either atomic or generated by other rules, and combine them. For example, the generalized transformation of embedding would take the kernel “Dave said X” and the kernel “Dan likes to smoke” and combine them into “Dave said that Dan likes to smoke”. GTs are therefore structure-building rather than structure-changing. In the Extended Standard Theory and in government and binding theory, GTs were abandoned in favor of recursive phrase structure rules, although the underlying idea resurfaces in Minimalism in the structure-building operation Merge.
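
As a small sketch, not from the article, of what a structure-building generalized transformation does, the embedding step can be reduced to splicing one kernel into a placeholder slot in another; the placeholder “X” and the helper function are illustrative assumptions.

```python
def embed(matrix_kernel: str, embedded_kernel: str, slot: str = "X") -> str:
    """Toy embedding GT: replace the slot in the matrix kernel with an embedded clause."""
    return matrix_kernel.replace(slot, "that " + embedded_kernel)

print(embed("Dave said X", "Dan likes to smoke"))
# -> Dave said that Dan likes to smoke
```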

In generative phonology, another form of transformation is the phonological rule, which describes a mapping between an underlying representation (the phonemic form) and the surface form that is articulated during natural speech.
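
As an illustration, not from the article, one commonly cited rule maps the English plural suffix, underlyingly /z/, to [s] after a voiceless consonant; the simplified segment set and the two example stems below are assumptions made for the sketch.

```python
VOICELESS = set("ptkfs")   # simplified inventory of voiceless final consonants

def plural_surface(stem: str) -> str:
    """Apply the devoicing rule z -> s / voiceless __ to the underlying plural /z/."""
    suffix = "s" if stem[-1] in VOICELESS else "z"
    return stem + suffix

print(plural_surface("cat"))   # cats  (underlying /z/ surfaces as [s])
print(plural_surface("dog"))   # dogz  (underlying /z/ surfaces unchanged as [z])
```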

Critical reception

In 1978, the linguist and historian E. F. K. Koerner hailed transformational grammar as the third and final Kuhnian revolution in linguistics, arguing that it brought about a shift from Ferdinand de Saussure's sociological approach to a Chomskyan conception of linguistics as analogous to chemistry and physics. Koerner also praised the philosophical and psychological value of Chomsky's theory.

In 1983, Koerner retracted his earlier assessment, suggesting that transformational grammar was a 1960s fad that had spread across the United States at a time when the federal government was investing heavily in new language departments. He also claimed that Chomsky's work was not original when compared with other syntactic models of the time. According to Koerner, Chomsky's rise to fame was orchestrated by Bernard Bloch, editor of Language, the journal of the Linguistic Society of America, and Roman Jakobson, a personal friend of Chomsky's father. Koerner suggested that large sums of money were spent to bring foreign students to the 1962 international congress at Harvard, where an exceptional opportunity was arranged for Chomsky to give an opening speech making questionable claims of belonging to the rationalist tradition of Saussure, Humboldt, and the Port-Royal grammar in order to gain popularity among Europeans. The transformational agenda was later pushed through at American conferences, where students instructed by Chomsky regularly attacked and ridiculed potential opponents verbally.
