Syntax
In linguistics, syntax (/ˈsɪntæks/ SIN-taks)[1][2] is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency),[3] agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). Diverse approaches, such as generative grammar and functional grammar, offer unique perspectives on syntax, reflecting its complexity and centrality to understanding human language.
Etymology
The word syntax comes from the ancient Greek word σύνταξις, meaning an orderly or systematic arrangement, which consists of σύν- (syn-, "together" or "alike"), and τάξις (táxis, "arrangement"). In Hellenistic Greek, this also specifically developed a use referring to the grammatical order of words, with a slightly altered spelling: συντάσσειν. The English term, which first appeared in 1548, is partly borrowed from Latin (syntaxis) and Greek, though the Latin term developed from Greek.[4]
Topics
The field of syntax contains a number of topics that a syntactic theory is often designed to handle. The relation between the topics is treated differently in different theories, and some of them may not be considered distinct but instead derived from one another (e.g., word order can be seen as the result of movement rules derived from grammatical relations).
Sequencing of subject, verb, and object
One basic description of a language's syntax is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare. In most generative theories of syntax, the surface differences arise from a more complex clausal phrase structure, and each order may be compatible with multiple derivations. However, word order can also reflect the semantics or function of the ordered elements.[5]
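As a toy illustration of the six logically possible orders, the following sketch (with invented example words, not drawn from any particular language description) linearizes a subject, verb, and object according to each basic-order template:

```python
# Toy illustration: linearize a subject, verb, and object according to each of
# the six logically possible basic orders. The example words are invented.
def linearize(order, subject, verb, obj):
    """Arrange the three elements according to a template such as 'SVO'."""
    slots = {"S": subject, "V": verb, "O": obj}
    return " ".join(slots[symbol] for symbol in order)

# SOV and SVO, with the subject first, account for the large majority of languages.
for order in ("SOV", "SVO", "VSO", "VOS", "OVS", "OSV"):
    print(order, "->", linearize(order, "the cat", "chased", "the mouse"))
```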
Grammatical relations
Another description of a language considers the set of possible grammatical relations in a language or in general and how they behave in relation to one another in the morphosyntactic alignment of the language. The description of grammatical relations can also reflect transitivity, passivization, and head-dependent marking or other agreement. Languages have different criteria for grammatical relations. For example, subjecthood criteria may have implications for how the subject is referred to from a relative clause or is coreferential with an element in an infinitive clause.[6]
Constituency
Constituency is the property of being a constituent, that is, of how words can work together to form a constituent (or phrase). Constituents are often moved as units, and a constituent can be the domain of agreement. Some languages allow discontinuous phrases, in which words belonging to the same constituent are not immediately adjacent but are broken up by other constituents. Constituents may be recursive, as they may consist of other constituents, potentially of the same type.
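As a rough illustration of recursion in constituency, the sketch below uses an invented tree representation (not tied to any particular theory) to encode a noun phrase containing a prepositional phrase that in turn contains another noun phrase:

```python
# Minimal sketch of constituents as a recursive tree structure. The category
# labels (NP, PP, N, P) are conventional abbreviations, used here illustratively.
from dataclasses import dataclass, field

@dataclass
class Constituent:
    label: str                                     # e.g. "NP", "PP", "N"
    children: list = field(default_factory=list)   # sub-constituents or words

    def words(self):
        """Flatten the constituent back into its string of words."""
        out = []
        for child in self.children:
            out.extend(child.words() if isinstance(child, Constituent) else [child])
        return out

# An NP containing a PP that itself contains another NP: a constituent of the
# same type nested inside itself, i.e. recursion.
np = Constituent("NP", [
    Constituent("N", ["picture"]),
    Constituent("PP", [
        Constituent("P", ["of"]),
        Constituent("NP", [Constituent("N", ["friends"])]),
    ]),
])
print(" ".join(np.words()))  # picture of friends
```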
Early history
Works on grammar had been written long before modern syntax came about. The Aṣṭādhyāyī of Pāṇini, from c. 4th century BC in Ancient India, is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory.[7] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.
For centuries, a framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld and Claude Lancelot in a book of the same title, dominated work in syntax.[8] It took as its basic premise the assumption that language is a direct reflection of thought processes, so that there is a single most natural way to express a thought.[9]
However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought and so logic could no longer be relied upon as a basis for studying the structure of language.[citation needed]
The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of Port-Royal Logic were copied or adapted from the Grammaire générale.[10]) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject – copula – predicate". Initially, that view was adopted even by the early comparative linguists such as Franz Bopp.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).[11])
Theories
There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton,[12] sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, since they regard syntax as the study of an abstract formal system.[13] Yet others (e.g., Joseph Greenberg) consider syntax a taxonomical device to reach broad generalizations across languages.
Syntacticians have attempted to explain the causes of word-order variation within individual languages and cross-linguistically. Much of such work has been done within the framework of generative grammar, which holds that syntax depends on a genetic endowment common to the human species. In that framework and in others, linguistic typology and universals have been primary explicanda.[14]
Alternative explanations, such as those offered by functional linguists, have been sought in language processing. It has been suggested that the brain finds it easier to parse syntactic patterns that are either right-branching or left-branching, but not mixed. The most widely held approach is the performance–grammar correspondence hypothesis of John A. Hawkins, who suggests that language is a non-innate adaptation to innate cognitive mechanisms. Cross-linguistic tendencies are taken to be based on language users' preference for grammars that are organized efficiently and on their avoidance of word orderings that cause processing difficulty. Some languages, however, exhibit regular inefficient patterning, such as the VO languages Chinese, with the adpositional phrase before the verb, and Finnish, which has postpositions; but there are few other profoundly exceptional languages.[15] More recently, it has been suggested that the left- versus right-branching patterns are cross-linguistically related only to the placement of role-marking connectives (adpositions and subordinators), which links the phenomena with the semantic mapping of sentences.[16]
Theoretical syntactic models
Dependency grammar
Dependency grammar is an approach to sentence structure in which syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure and all the other words in the clause are either directly or indirectly dependent on this root (i.e. the verb). Some prominent dependency-based theories of syntax are the following:
- Recursive categorical syntax, or algebraic syntax
- Functional generative description
- Meaning–text theory
- Operator grammar
- Word grammar
Lucien Tesnière (1893–1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued strongly against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S → NP VP) and remains at the core of most phrase structure grammars. In the place of that division, he positioned the verb as the root of all clause structure.[17]
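The sketch below gives a rough idea of how a dependency analysis can be represented: each word records the word it depends on, and the finite verb, as the root of the clause, depends on nothing. The sentence, relation labels, and data layout are invented for illustration and do not follow any one dependency theory.

```python
# Minimal sketch of a dependency analysis: each word records its head (governor).
# The finite verb is the root and has no head. Relation labels are illustrative.
sentence = ["Sue", "reads", "long", "books"]

# (dependent index, head index or None for the root, relation label)
dependencies = [
    (0, 1, "subject"),   # Sue   depends on reads
    (1, None, "root"),   # reads is the root of the clause
    (2, 3, "modifier"),  # long  depends on books
    (3, 1, "object"),    # books depends on reads
]

for dep, head, rel in dependencies:
    head_word = sentence[head] if head is not None else "ROOT"
    print(f"{sentence[dep]:>6} --{rel}--> {head_word}")
```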
Categorial grammar
Categorial grammar is an approach in which constituents combine as function and argument, according to combinatory possibilities specified in their syntactic categories. For example, where other approaches might posit a rule that combines a noun phrase (NP) and a verb phrase (VP), CG would posit a syntactic category NP and another, NP\S, read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)." Thus, the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a function requiring an NP as an input and producing a sentence-level structure as an output. The complex category is notated as (NP\S) instead of V. The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. That is notated as (NP/(NP\S)), which means, "A category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) which is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence."
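The following toy sketch shows function application over such categories, using the reading given above (the category to the left of the slash is what is searched for, and the slash direction indicates the side on which it is sought). The lexicon and the string-matching implementation are invented for illustration, not part of any published formalism:

```python
# Toy sketch of categorial-grammar function application, following the reading
# above: in "A\B" the category looks to its LEFT for an A and yields B, and in
# "A/B" it looks to its RIGHT for an A and yields B. Plain string matching only;
# a real implementation would parse the category formulas properly.
def combine(left, right):
    """Combine two adjacent categories by forward or backward application."""
    if left.startswith(right + "/"):          # forward:  A/B  A  =>  B
        return left[len(right) + 1:].strip("()")
    if right.startswith(left + "\\"):         # backward: A  A\B  =>  B
        return right[len(left) + 1:].strip("()")
    return None

lexicon = {"Sue": "NP", "books": "NP", "reads": "NP/(NP\\S)"}

vp = combine(lexicon["reads"], lexicon["books"])   # transitive verb + object
print(vp)                                          # NP\S (the verb-phrase category)
print(combine(lexicon["Sue"], vp))                 # subject + NP\S -> S
```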
Tree-adjoining grammar is a categorial grammar that adds in partial tree structures to the categories.
Stochastic/probabilistic grammars/network theories
Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.
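One simple, non-connectionist instance of a stochastic grammar is a probabilistic context-free grammar, in which each rewrite rule carries a probability and the probability of a derivation is the product of the probabilities of the rules it uses. The sketch below is a minimal illustration; its rules and probabilities are invented.

```python
# Minimal sketch of a probabilistic (stochastic) context-free grammar: every
# rewrite rule carries a probability, and the probability of a derivation is the
# product of the probabilities of the rules used. Rules and numbers are invented.
import random

pcfg = {
    "S":  [(["NP", "VP"], 1.0)],
    "NP": [(["she"], 0.6), (["the", "N"], 0.4)],
    "N":  [(["cat"], 0.5), (["book"], 0.5)],
    "VP": [(["sleeps"], 0.7), (["reads", "NP"], 0.3)],
}

def generate(symbol, prob=1.0):
    """Expand a symbol top-down, returning (words, probability of this derivation)."""
    if symbol not in pcfg:                      # a terminal word: nothing to expand
        return [symbol], prob
    rules = pcfg[symbol]
    rhs, p = random.choices(rules, weights=[w for _, w in rules])[0]
    words, prob = [], prob * p
    for child in rhs:
        child_words, prob = generate(child, prob)
        words.extend(child_words)
    return words, prob

words, p = generate("S")
print(" ".join(words), f"(derivation probability = {p:.3f})")
```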
Functional grammars
Functionalist models of grammar study the form–function interaction by performing a structural and a functional analysis.
- Functional discourse grammar (Dik)
- Prague linguistic circle
- Role and reference grammar (RRG)
- Systemic functional grammar
Generative syntax
Generative syntax is the study of syntax within the overarching framework of generative grammar. Generative theories of syntax typically propose analyses of grammatical patterns using formal tools such as phrase structure grammars augmented with additional operations such as syntactic movement. Their goal in analyzing a particular language is to specify rules which generate all and only the expressions which are well-formed in that language. In doing so, they seek to identify innate domain-specific principles of linguistic cognition, in line with the wider goals of the generative enterprise. Generative syntax is among the approaches that adopt the principle of the autonomy of syntax by assuming that meaning and communicative intent are determined by the syntax, rather than the other way around.
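As a minimal illustration of the "all and only" criterion, the sketch below enumerates every string generated by a tiny, invented phrase structure grammar; a grammar of this kind is adequate only if the set of strings it generates coincides with the well-formed expressions of the language being modeled.

```python
# Minimal sketch of generation from a tiny phrase structure grammar, enumerating
# every string the grammar generates. The grammar is a toy invented for
# illustration, not a fragment of any actual generative analysis.
from itertools import product

rules = {
    "S":  [["NP", "VP"]],
    "NP": [["Alex"], ["the", "dog"]],
    "VP": [["barks"], ["sees", "NP"]],
}

def expand(symbol):
    """Return every terminal string derivable from a symbol."""
    if symbol not in rules:                       # terminal word
        return [[symbol]]
    results = []
    for rhs in rules[symbol]:
        # Combine every way of expanding each symbol on the right-hand side.
        for parts in product(*(expand(sym) for sym in rhs)):
            results.append([word for part in parts for word in part])
    return results

for sentence in expand("S"):
    print(" ".join(sentence))   # e.g. "Alex barks", "the dog sees Alex", ...
```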
Generative syntax was proposed in the late 1950s by Noam Chomsky, building on earlier work by Zellig Harris, Louis Hjelmslev, and others. Since then, numerous theories have been proposed under its umbrella:
- Transformational grammar (TG) (the original theory of generative syntax, laid out by Chomsky in Syntactic Structures in 1957)[18]
- Government and binding theory (GB) (revised theory in the tradition of TG developed mainly by Chomsky in the 1970s and 1980s)[19]
- Minimalist program (MP) (a reworking of the theory out of the GB framework published by Chomsky in 1995)[20]
Other theories that find their origin in the generative paradigm are:
- Arc pair grammar
- Generalized phrase structure grammar (GPSG)
- Generative semantics
- Head-driven phrase structure grammar (HPSG)
- Lexical functional grammar (LFG)
- Nanosyntax
- Relational grammar (RG)
- Harmonic grammar (HG)
Cognitive and usage-based grammars
The Cognitive Linguistics framework stems from generative grammar but adheres to evolutionary, rather than Chomskyan, linguistics. Cognitive models often recognize the generative assumption that the object belongs to the verb phrase. Cognitive and usage-based frameworks include the following:
- Cognitive grammar
- Construction grammar
See also
- Cartographic syntax
- Metasyntax
- Musical syntax
- Semiotics
- Syntactic category
- Syntax (academic journal)
- Syntax (programming languages)
- Syntax–Semantics Interface
- Usage
Syntactic terms
- List of syntactic phenomena
- Adjective
- Adjective phrase
- Adjunct
- Adpositional phrase
- Adverb
- Antecedent
- Appositive
- Argument
- Article
- Aspect
- Attributive adjective and predicative adjective
- Auxiliary verb
- Branching
- c-command
- Category
- Catena
- Clause
- Closed class word
- Comparative
- Complement
- Compound noun and adjective
- Conjugation
- Conjunction
- Constituent
- Coordination
- Crossover
- Dangling modifier
- Declension
- Dependency grammar
- Dependent marking
- Determiner
- Dual (form for two)
- Endocentric
- Finite verb
- Function word
- Gender
- Gerund
- Government
- Head
- Head marking
- Infinitive
- Inversion
- Lexical item
- Logical form (linguistics)
- m-command
- Measure word (classifier)
- Merge
- Modal particle
- Modal verb
- Modifier
- Mood
- Movement
- Movement paradox
- Nanosyntax
- Non-finite verb
- Noun
- Noun ellipsis
- Noun phrase
- Number
- Object
- Open class word
- Part of speech
- Particle
- Periphrasis
- Person
- Personal pronoun
- Phrasal verb
- Phrase
- Phrase structure grammar
- Plural
- Predicate
- Predicative expression
- Preposition and postposition
- Pronoun
- Grammatical relation
- Restrictiveness
- Right node raising
- Scrambling
- Selection
- Sentence
- Separable verb
- Singular
- Subcategorization
- Subject
- Subordination
- Superlative
- Tense
- Uninflected word
- V2 word order
- Valency
- Verb
- Verb phrase
- Voice
- Word order
- X-bar theory
References
Citations
- ^ "syntax". Lexico UK English Dictionary. Oxford University Press. Archived from the original on 2020-03-22.
- ^ "syntax". Merriam-Webster.com Dictionary. Merriam-Webster.
- ^ Luuk, Erkki (2015). "Syntax–Semantics Interface". In Wright, James D. (ed.). International Encyclopedia of the Social & Behavioral Sciences (2nd ed.). Amsterdam: Elsevier. pp. 900–905. doi:10.1016/b978-0-08-097086-8.57035-4. ISBN 978-0-08-097087-5.
- ^ Oxford English Dictionary, s.v. “syntax (n.),” July 2023, https://doi.org/10.1093/OED/1603449563.
- ^ Rijkhoff, Jan (2015). "Word Order" (PDF). In Wright, James D. (ed.). International Encyclopedia of the Social & Behavioral Sciences (2nd ed.). Amsterdam: Elsevier. pp. 644–656. doi:10.1016/b978-0-08-097086-8.53031-1. ISBN 978-0-08-097087-5.
- ^ Shibatani, Masayoshi (2021). "Syntactic Typology". Oxford Research Encyclopedia of Linguistics. Oxford: Oxford University Press. doi:10.1093/acrefore/9780199384655.013.154. ISBN 978-0-19-938465-5.
- ^ Fortson, Benjamin W. (2004). Indo-European Language and Culture: An Introduction. Blackwell. p. 186. ISBN 978-1-4051-8896-8.
[The Aṣṭādhyāyī] is a highly precise and thorough description of the structure of Sanskrit somewhat resembling modern generative grammar...[it] remained the most advanced linguistic analysis of any kind until the twentieth century.
- ^ Arnauld, Antoine; Lancelot, Claude; Rollin, Bernard E.; Danto, Arthur Coleman; Kretzmann, Norman; Arnauld, Antoine (1975). The Port-Royal grammar: General and rational grammar. The Hague: De Gruyter. p. 197. ISBN 9789027930040.
- ^ Arnault, Antoine; Lancelot, Claude (1660). Grammaire générale et raisonnée de Port-Royal.
- ^ Arnauld, Antoine (1683). La logique (5th ed.). Paris: G. Desprez. p. 137.
Nous avons emprunté...ce que nous avons dit...d'un petit Livre...sous le titre de Grammaire générale. ["We have borrowed...what we have said...from a little book...under the title Grammaire générale."]
- ^ Graffi (2001).
- ^ See Bickerton, Derek (1990). Language & Species. Chicago: University of Chicago Press. ISBN 0-226-04610-9. and, for more recent advances, Bickerton, Derek; Szathmáry, Eörs, eds. (2009). Biological Foundations and Origin of Syntax. Cambridge, Massachusetts: MIT Press. ISBN 978-0-262-01356-7.
- ^ Gazdar, Gerald (2 May 2001). "Generalized Phrase Structure Grammar" (Interview). Interviewed by Ted Briscoe. Archived from the original on 2005-11-22. Retrieved 2008-06-04.
- ^ Moravcsik, Edith (2010). "Explaining Language Universals". The Oxford Handbook of Linguistic Typology. doi:10.1093/oxfordhb/9780199281251.013.0005. Retrieved 2022-03-13.
- ^ Song, Jae Jung (2012). Word Order. New York: Cambridge University Press. ISBN 978-1-139-03393-0.
- ^ Austin, Patrik (2021). "A semantic and pragmatic explanation of harmony". Acta Linguistica Hafniensia. 54 (1): 1–23. doi:10.1080/03740463.2021.1987685. hdl:10138/356149. S2CID 244941417.
- ^ Concerning Tesnière's rejection of the binary division of the clause into subject and predicate and in favor of the verb as the root of all structure, see Tesnière (1969:103–105).
- ^ Chomsky, Noam (1957). Syntactic Structures. The Hague: Mouton. p. 15.
- ^ Chomsky, Noam (1993). Lectures on Government and Binding: The Pisa Lectures (7th ed.). Berlin: Mouton de Gruyter. ISBN 3-11-014131-0.
- ^ Chomsky, Noam (1995). The Minimalist Program. Cambridge, Massachusetts: The MIT Press.
Sources
- Brown, Keith; Miller, Jim, eds. (1996). Concise Encyclopedia of Syntactic Theories. New York: Elsevier Science. ISBN 0-08-042711-1.
- Carnie, Andrew (2006). Syntax: A Generative Introduction (2nd ed.). Oxford: Wiley-Blackwell. ISBN 1-4051-3384-8.
- Freidin, Robert; Lasnik, Howard, eds. (2006). Syntax. Critical Concepts in Linguistics. New York: Routledge. ISBN 0-415-24672-5.
- Graffi, Giorgio (2001). 200 Years of Syntax: A Critical Survey. Studies in the History of the Language Sciences 98. Amsterdam: Benjamins. ISBN 90-272-4587-8.
- Talasiewicz, Mieszko (2009). Philosophy of Syntax – Foundational Topics. Dordrecht: Springer. ISBN 978-90-481-3287-4. An interdisciplinary essay on the interplay between logic and linguistics on syntactic theories.
- Tesnière, Lucien (1969). Eléments de syntaxe structurale (in French) (2nd ed.). Paris: Klincksieck. ISBN 2-252-01861-5.
Further reading
- Everaert, Martin; Van Riemsdijk, Henk; Goedemans, Rob; Hollebrandse, Bart, eds. (2006). The Blackwell Companion to Syntax. Malden, Massachusetts: Blackwell. ISBN 978-1-4051-1485-1. 5 Volumes; 77 case studies of syntactic phenomena.
- Isac, Daniela; Reiss, Charles (2013). I-Language: An Introduction to Linguistics as Cognitive Science (2nd ed.). Oxford: Oxford University Press. ISBN 978-0-19-966017-9.
- Moravcsik, Edith A. (2006). An Introduction to Syntax: Fundamentals of Syntactic Analysis. London: Continuum. ISBN 978-0-8264-8946-3. Attempts to be a theory-neutral introduction. The companion Moravcsik, Edith A. (2006). An Introduction to Syntactic Theory. London: Continuum. ISBN 0-8264-8943-5. surveys the major theories. Jointly reviewed in Hewson, John (2009). "An Introduction to Syntax: Fundamentals of Syntactic Analysis, And: An Introduction to Syntactic Theory (Review)". The Canadian Journal of Linguistics. 54 (1): 172–175. doi:10.1353/cjl.0.0036. S2CID 144032671.
- Müller, Stefan (2020). Grammatical Theory: From Transformational Grammar to Constraint-Based Approaches (4th revised and extended ed.). Berlin: Language Science Press. ISBN 978-3-96110-273-0.
- Roark, Brian; Sproat, Richard William (2007). Computational Approaches to Morphology and Syntax. Oxford: Oxford University Press. ISBN 978-0-19-927477-2. part II: Computational approaches to syntax.
External links
- The syntax of natural language: An online introduction using the Trees program – Beatrice Santorini & Anthony Kroch, University of Pennsylvania, 2007