Summary
American structuralism is a label attached to a heterogeneous but distinctive style of language
scholarship practiced in the United States, the heyday of which extended from around 1920
until the late 1950s. There is certainly diversity in the interests and intellectual stances of
American structuralists. Nevertheless, some minimum common denominators stand out.
American structuralists valued synchronic linguistic analysis, independent of—but not to the
exclusion of—study of a language’s development over time; they looked for, and tried to
articulate, systematic patterns in language data, attending in particular to the sound properties
of language and to morphophonology; they identified their work as part of a science of
language, rather than as philology or as a facet of literary studies, anthropology, or the study
of particular languages. Some American structuralists tried to establish the identity or
difference of linguistic units by studying their distribution with respect to other units, rather
than by relying on identity or difference of meaning. Some (but not all) American
structuralists avoided cross-linguistic generalizations, perceiving them as a threat to the hard-
won notion of the integrity of individual languages; some (but not all) avoided attributing
patterns they discovered in particular languages to cultural or psychological proclivities of
speakers. A considerable amount of American structuralist research focused on indigenous
languages of the Americas. One outstanding shared achievement of the group was the
institutionalization of linguistics as an autonomous discipline in the United States,
materialized by the founding of the Linguistic Society of America in 1924.
This composite picture of American structuralists needs to be balanced by recognition of their
diversity. One important distinction is between the goals and orientations of foundational
figures: Franz Boas (1858–1942), Edward Sapir (1884–1939), and Leonard Bloomfield
(1887–1949). The influence of Boas, Sapir, and Bloomfield was strongly felt by the next
generation of language scholars, who went on to appropriate, expand, modify, or otherwise
retouch their ideas to produce what is called post-Bloomfieldian linguistics. Post-
Bloomfieldian linguistics displays its own internal diversity, but still has enough coherence to
put into relief the work of other language scholars who were close contemporaries to the post-
Bloomfieldians, but who in various ways and for various reasons departed from them.
American structuralism has at least this much heterogeneity.
This article illustrates the character of American structuralism in the first half of the 20th
century. Analysis of a corpus of presidential addresses presented to the Linguistic Society of
America by key American structuralists grounds the discussion, and provides a microcosm
within which to observe some of its most salient features: both the shared preoccupations of
American structuralists and evidence of the contributions of individual scholars to a
significant collaborative project in the history of linguistics.
Keywords
mid-20th-century linguistics in the United States
American structuralism
descriptive linguistics
Linguistic Society of America
Subjects
History of Linguistics
In linguistics, generative grammar is grammar (the set of language rules) that indicates the
structure and interpretation of sentences that native speakers of a language accept as
belonging to their language.
Adopting the term generative from mathematics, linguist Noam Chomsky introduced the
concept of generative grammar in the 1950s. This theory is also known as transformational
grammar, a term still used today.
Generative Grammar
• Generative grammar is a theory of grammar, first developed by Noam Chomsky in the
1950s, that is based on the idea that all humans have an innate language capacity.
• Linguists who study generative grammar are not interested in prescriptive rules; rather, they are interested in uncovering the foundational principles that guide all language production.
• Generative grammar accepts as a basic premise that native speakers of a language will find
certain sentences grammatical or ungrammatical and that these judgments give insight into the
rules governing the use of that language.
Definition of Generative Grammar
Grammar refers to the set of rules that structure a language, including syntax (the
arrangement of words to form phrases and sentences) and morphology (the study of words
and how they are formed). Generative grammar is a theory of grammar that holds that human
language is shaped by a set of basic principles that are part of the human brain (and even
present in the brains of small children). This "universal grammar," according to linguists like
Chomsky, comes from our innate language faculty.
In Linguistics for Non-Linguists: A Primer With Exercises, Frank Parker and Kathryn Riley
argue that generative grammar is a kind of unconscious knowledge that allows a person, no
matter what language they speak, to form "correct" sentences. They continue:
"Simply put, a generative grammar is a theory of competence: a model of the psychological
system of unconscious knowledge that underlies a speaker's ability to produce and interpret
utterances in a language ... A good way of trying to understand [Noam] Chomsky's point is to
think of a generative grammar as essentially a definition of competence: a set of criteria that
linguistic structures must meet to be judged acceptable," (Parker and Riley 2009).
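To make the idea of rules that "generate" sentences concrete, the sketch below (in Python) treats a grammar as a small set of rewrite rules and enumerates every sentence those rules license. The rule set and vocabulary are invented for illustration only; they are a toy stand-in for the kind of formal grammar the theory has in mind, not a model of English.

```python
from itertools import product

# A toy generative grammar: each non-terminal symbol maps to a list of
# possible expansions (sequences of symbols or terminal words).
# These rules are hypothetical and purely illustrative.
RULES = {
    "S":   [["NP", "VP"]],          # a sentence is a noun phrase + verb phrase
    "NP":  [["Det", "N"]],
    "VP":  [["V", "Adj"]],
    "Det": [["the"]],
    "N":   [["man"], ["child"]],
    "V":   [["is"], ["seems"]],
    "Adj": [["happy"], ["tired"]],
}

def expand(symbol):
    """Return every word sequence the symbol can rewrite to."""
    if symbol not in RULES:                      # terminal word
        return [[symbol]]
    sentences = []
    for expansion in RULES[symbol]:
        # Expand each child symbol and combine the alternatives.
        options = [expand(child) for child in expansion]
        for combo in product(*options):
            sentences.append([w for part in combo for w in part])
    return sentences

for words in expand("S"):
    print(" ".join(words).capitalize() + ".")   # e.g. "The man is happy."
```

Run as written, the eight sentences printed are exactly the strings these rules generate; any other string, however meaningful, falls outside this toy grammar.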
Generative Vs. Prescriptive Grammar
Generative grammar is distinct from other grammars such as prescriptive grammar, which
attempts to establish standardized language rules that deem certain usages "right" or "wrong,"
and descriptive grammar, which attempts to describe language as it is actually used (including
the study of pidgins and dialects). Instead, generative grammar attempts to get at something
deeper—the foundational principles that make language possible across all of humanity.
For example, a prescriptive grammarian may study how parts of speech are ordered in English
sentences, with the goal of laying out rules (nouns precede verbs in simple sentences, for
example). A linguist studying generative grammar, however, is more likely to be interested in
issues such as how nouns are distinguished from verbs across multiple languages.
Principles of Generative Grammar
The main principle of generative grammar is that all humans are born with an innate capacity
for language and that this capacity shapes the rules for what is considered "correct" grammar
in a language. The idea of an innate language capacity—or a "universal grammar"—is not
accepted by all linguists. Some believe, to the contrary, that languages are entirely learned, and that whatever constraints they exhibit arise from experience rather than from an innate endowment.
Proponents of the universal grammar argument believe that children, when they are very
young, are not exposed to enough linguistic information to learn the rules of grammar. That
children do in fact learn the rules of grammar is proof, according to some linguists, that there
is an innate language capacity that allows them to overcome the "poverty of the stimulus."
Examples of Generative Grammar
As generative grammar is a "theory of competence," one way to test its validity is with what
is called a grammaticality judgment task. This involves presenting a native speaker with a
series of sentences and having them decide whether the sentences are grammatical
(acceptable) or ungrammatical (unacceptable). For example:
The man is happy.
Happy man is the.
A native speaker would judge the first sentence to be acceptable and the second to be
unacceptable. From this, we can make certain assumptions about the rules governing how
parts of speech should be ordered in English sentences. For instance, a "to be" verb linking a
noun and an adjective must follow the noun and precede the adjective.
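The same idea can be run in the other direction: instead of generating sentences, a small recognizer can "judge" whether a word string is derivable from a set of rules. The sketch below, again using an invented toy rule set that encodes only the Det-N-"to be"-Adj pattern just described, accepts the first example and rejects the second. It simulates the shape of a grammaticality judgment task, not the psychology behind one.

```python
# A toy grammaticality check: a sentence counts as "grammatical" here only if
# it can be derived from the start symbol S of this hand-written rule set.
# The rules encode just the Det-N-"to be"-Adj pattern discussed above and are
# illustrative assumptions, not a description of English.
RULES = {
    "S":   [["NP", "BE", "ADJ"]],
    "NP":  [["DET", "N"]],
    "DET": [["the"], ["a"]],
    "N":   [["man"], ["woman"]],
    "BE":  [["is"], ["was"]],
    "ADJ": [["happy"], ["tall"]],
}

def derivable(symbols, words):
    """True if the symbol sequence can rewrite to exactly this list of words."""
    if not symbols:
        return not words                          # both exhausted -> success
    head, rest = symbols[0], symbols[1:]
    if head not in RULES:                         # terminal: must match next word
        return bool(words) and words[0] == head and derivable(rest, words[1:])
    return any(derivable(expansion + rest, words) for expansion in RULES[head])

def judge(sentence):
    words = sentence.lower().rstrip(".").split()
    verdict = "grammatical" if derivable(["S"], words) else "ungrammatical"
    print(f"{sentence!r}: {verdict}")

judge("The man is happy.")   # grammatical
judge("Happy man is the.")   # ungrammatical
```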
Universal Grammar
Universal grammar is the theoretical or hypothetical system of categories, operations, and principles shared by all human languages and considered to be innate. Since the 1980s, the term has often been capitalized; the concept is also known as Universal Grammar Theory.
Linguist Noam Chomsky explained, "'[U]niversal grammar' is taken to be the set of properties,
conditions, or whatever that constitute the 'initial state' of the language learner, hence the basis on
which knowledge of a language develops." ("Rules and Representations." Columbia University Press,
1980)
The concept is closely connected to children's ability to learn their native language.
"Generative grammarians believe that the human species evolved a genetically universal
grammar common to all peoples and that the variability in modern languages is basically on the
surface only," wrote Michael Tomasello. ("Constructing a Language: A Usage-Based Theory of
Language Acquisition." Harvard University Press, 2003)
And Steven Pinker elaborates:
"In cracking the code of language...children's minds must be constrained to pick out just the right
kinds of generalizations from the speech around them....It is this line of reasoning that led Noam
Chomsky to propose that language acquisition in children is the key to understanding the nature
of language, and that children must be equipped with an innate Universal Grammar: a set of plans
for the grammatical machinery that powers all human languages. This idea sounds more
controversial than it is (or at least more controversial than it should be) because the logic
of induction mandates that children make some assumptions about how language works in order for
them to succeed at learning a language at all. The only real controversy is what these assumptions
consist of: a blueprint for a specific kind of rule system, a set of abstract principles, or a mechanism
for finding simple patterns (which might also be used in learning things other than language)." ("The
Stuff of Thought." Viking, 2007)
"Universal grammar is not to be confused with universal language," noted Elena Lombardi, "or with
the deep structure of language, or even with grammar itself" ("The Syntax of Desire," 2007). As
Chomsky has observed, "[U]niversal grammar is not a grammar, but rather a theory of grammars, a
kind of metatheory or schematism for grammar" ("Language and Responsibility," 1979).
History and Background
The concept of a universal grammar (UG) has been traced to the observation of Roger Bacon, a 13th-century Franciscan friar and philosopher, that all languages are built upon a common grammar. The expression was popularized in the 1950s and 1960s by Chomsky and other linguists.
Components considered universal include the notion that words can be classified into different groups, such as nouns or verbs, and that sentences follow a particular structure. Sentence structures may differ between languages, but each language has some kind of framework that lets speakers understand one another rather than produce gibberish. By definition, the particular grammar rules, borrowed words, and idioms of an individual language are not part of universal grammar.
Challenges and Criticisms
As with any academic theory, universal grammar has drawn challenges and criticism from others in the field, who contest and build on the idea in the published literature.
Swarthmore College linguist K. David Harrison noted in The Economist, "I and many fellow linguists
would estimate that we only have a detailed scientific description of something like 10% to 15% of
the world's languages, and for 85% we have no real documentation at all. Thus it seems premature to
begin constructing grand theories of universal grammar. If we want to understand universals, we
must first know the particulars." ("Seven Questions for K. David Harrison." Nov. 23, 2010)
And Jeff Mielke argues that the phonetic case for universal grammar is weak:
"[T]he phonetic motivation for Universal Grammar is extremely weak. Perhaps the most compelling
case that can be made is that phonetics, like semantics, is part of the grammar and that there is an
implicit assumption that if the syntax is rooted in Universal Grammar, the rest should be too. Most of
the evidence for UG is not related to phonology, and phonology has more of a guilt-by-association
status with respect to innateness." ("The Emergence of Distinctive Features." Oxford University
Press, 2008)
Iain McGilchrist disagrees with Pinker and sides with the view that children learn language largely through imitation, a broadly behaviorist position, as opposed to Chomsky's poverty-of-the-stimulus argument:
"[I]t is uncontroversial that the existence of a universal grammar such as Chomsky conceived it is
highly debatable. It remains remarkably speculative 50 years after he posited it, and is disputed by
many important names in the field of linguistics. And some of the facts are hard to square with it.
Languages across the world, it turns out, use a very wide variety of syntax to structure sentences. But
more importantly, the theory of universal grammar is not convincingly compatible with the process
revealed by developmental psychology, whereby children actually acquire language in the real world.
Children certainly evince a remarkable ability to grasp spontaneously the conceptual and
psycholinguistic shapes of speech, but they do so in a far more holistic, than analytic, way. They are
astonishingly good imitators—note, not copying machines, but imitators." ("The Master and His
Emissary: The Divided Brain and the Making of the Western World." Yale University Press, 2009)
Chomskyan Linguistics
Chomskyan linguistics is a broad term for the principles of language and the methods of language study introduced and/or popularized by American linguist Noam Chomsky in such groundbreaking works as Syntactic Structures (1957) and Aspects of the Theory of Syntax (1965). The term is also spelled Chomskian linguistics and is sometimes treated as a synonym for formal linguistics.
In the article "Universalism and Human Difference in Chomskyan Linguistics" (Chomskyan (R)evolutions, 2010), Christopher Hutton observes that "Chomskyan linguistics is defined by
a fundamental commitment to universalism and to the existence of a shared species-wide
knowledge grounded in human biology."
Examples and Observations
"The only place a language occupies in Chomskyan linguistics is non-geographical,
in the speaker's mind."
(Pius ten Hacken, "The Disappearance of the Geographical Dimension of Language in
American Linguistics." The Space of English, ed. by David Spurr and Cornelia
Tschichold. Gunter Narr Verlag, 2005)
"Roughly stated, Chomskyan linguistics claims to reveal something about the mind,
but imperviously prefers a strictly autonomist methodology over the open dialogue
with psychology that would seem to be implied by such a claim."
(Dirk Geeraerts, "Prototype Theory." Cognitive Linguistics: Basic Readings, ed. by
Dirk Geeraerts. Walter de Gruyter, 2006)
The Origin and Influence of Chomskyan Linguistics
- "[I]n 1957, the young American linguist Noam Chomsky published Syntactic
Structures, a brief and watered-down summary of several years of original research. In
that book, and in his succeeding publications, Chomsky made a number of
revolutionary proposals: he introduced the idea of a generative grammar, developed a
particular kind of generative grammar called transformational grammar, rejected his
predecessors' emphasis on the description of data--in favour of a highly theoretical
approach based upon a search for universal principles of language (later called
universal grammar)--proposed to turn linguistics firmly toward mentalism, and laid the
foundation for integrating the field into the as yet unnamed new discipline of cognitive
science.
"Chomsky's ideas excited a whole generation of students . . .. Today Chomsky's
influence is undimmed, and Chomskyan linguists form a large and maximally
prominent cohort among the community of linguists, to such an extent that outsiders
often have the impression that linguistics is Chomskyan linguistics . . .. But this is
seriously misleading.
"In fact, the majority of the world's linguists would acknowledge no more than the
vaguest debt to Chomsky, if even that."
(Robert Lawrence Trask and Peter Stockwell, Language and Linguistics: The Key
Concepts, 2nd ed. Routledge, 2007)
- "In the latter half of the twentieth century, Chomskyan linguistics dominated most
branches of the field apart from semantics, although many alternative approaches were
proposed. All of these alternatives share the assumption that a satisfactory linguistic
theory is in principle applicable to all languages. In that sense, universal grammar is as
alive today as it was in antiquity."
(Jaap Maat, "General or Universal Grammar From Plato to Chomsky." The Oxford
Handbook of the History of Linguistics, ed. by Keith Allan. Oxford University Press,
2013)
From Behaviorism to Mentalism
"The revolutionary nature of Chomskyan linguistics must be considered within the
framework of another 'revolution,' in psychology, from behaviorism to cognitivism.
George Miller dates this paradigm shift to a conference held at M.I.T. in 1956, in
which Chomsky participated. . . . Chomsky evolves from behaviorism to mentalism
between Syntactic Structures (1957) and Aspects of the Theory of Syntax (1965). This
led psycholinguists to consider the relationship between deep structure and surface
structure in processing. However the results were not very promising, and Chomsky
himself seemed to abandon psychological reality as a relevant consideration in
linguistic analysis. His focus on intuition favored rationalism over empiricism, and
innate structures over acquired behavior. This biological turn—the search for the language 'organ,' the 'language acquisition device,' etc.—became the new foundation
for a science of linguistics."
(Malcolm D. Hyman, "Chomsky Between Revolutions." Chomskyan (R)evolutions,
ed. by Douglas A. Kibbee. John Benjamins, 2010)
Characteristics of Chomskyan Linguistics
"For the sake of simplicity, we list some of the characteristics of the Chomskyan
approach:
- Formalism. . . . Chomskyan linguistics sets out to define and specify the rules and
principles which generate the grammatical or well-formed sentences of a language.
- Modularity. The mental grammar is regarded as a special module of the mind which
constitutes a separate cognitive faculty which has no connection with other mental
capacities.
- Sub-modularity. Mental grammar is thought to be divided into other sub-modules.
Some of these sub-modules are the X-bar principle or the Theta principle. Each of
them has a particular function. The interaction of these smaller components results in
the complexities of syntactic structures.
- Abstractness. With the passing of time, Chomskyan linguistics has become more and
more abstract. By this we mean that entities and processes put forward do not overtly
manifest themselves in linguistic expressions. By way of illustration, take the case of
underlying structures which hardly resemble surface structures.
- Search for high-level generalization. Those aspects of linguistic knowledge which
are idiosyncratic and do not abide by general rules are disregarded from a theoretical
point of view since they are regarded as uninteresting. The only aspects which deserve
attention are those which are subject to general principles such as wh-movement or
raising." (Ricardo Mairal Usón, et al., Current Trends in Linguistic Theory. UNED,
2006)
The Minimalist Program
"[W]ith the passage of time, and in collaboration with a variety of colleagues . . .,
Chomsky himself has significantly modified his views, both about those features that
are unique to language—and that thus have to be accounted for in any theory of its
origin—and about its underlying mechanism. Since the 1990s, Chomsky and his
collaborators have developed what has come to be known as the 'Minimalist Program,'
which seeks to reduce the language faculty to the simplest possible mechanism. Doing
this has involved ditching niceties like the distinction between deep and surface
structures, and concentrating instead on how the brain itself creates the rules that
govern language production."
(Ian Tattersall, "At the Birth of Language." The New York Review of Books, August
18, 2016)
Chomskyan Linguistics as a Research Program
"Chomskyan linguistics is a research program in linguistics. As such, it should be
distinguished from Chomsky's linguistic theory. While both were conceived by Noam
Chomsky in the late 1950s, their aims and later development are strikingly different.
Chomsky's linguistic theory went through a number of stages in its development . . ..
Chomskyan linguistics, by contrast, remained stable during this period. It does not
refer to tree structures but specifies what a linguistic theory should explain and how
such a theory should be evaluated.
"Chomskyan linguistics defines the object of study as the knowledge of language a
speaker has. This knowledge is called the linguistic competence or internalized
language (I-language). It is not open to conscious, direct introspection, but a wide
range of its manifestations can be observed and used as data for the study of
language."
(Pius ten Hacken, "Formalism/Formalist Linguistics." Concise Encyclopedia of
Philosophy of Language and Linguistics, ed. by Alex Barber and Robert J. Stainton.
Elsevier, 2010)