“Syntax” is the theory of the construction of sentences out of words. In linguistics, syntax is distinguished from morphology, or the theory of the construction of words out of minimal units of significance, only some of which are words.
According to this division, it is a matter of morphology that the word solubility decomposes into “dissolve” + “able” + “ity”; but it is a matter of syntax to analyze the construction of the sentence, “That substance is able to dissolve.”
Although syntax is a traditional grammatical topic, it was only with the rise of formal methods growing out of the study of mathematical logic that the subject attained sufficient explicitness to be studied in depth, in works by Zellig Harris (1957) and Noam Chomsky (1957). Since then a flourishing field has been created; for it was rapidly discovered that the syntax of human languages was far more complex than at first appeared.
In this respect, the development of syntax is comparable to other fields of cognitive science such as human vision, problem-solving capacities, and the organization of commonsense knowledge, all of which gave rise to difficult problems once the goal of fully explicit representation was put in place.
The dawn of syntax is marked by the realization that the structure of sentences is hierarchical; that is, that behind the linear order of words and morphemes that is visible in natural languages there is another organization in terms of larger or smaller constituents nested one within another.
Description of sentences at this level is said to give their phrase structure. Moreover, phrases of a given kind can occur within others of the same kind: It is this recursive feature of language that enables sentences of arbitrary complexity to be constructed.
The realization that phrase structure is recursive is very old. Assuming the categories of a complete noun phrase (NP) and sentence (S), Antoine Arnauld (1662) gives the examples (rendered here in English):
- [S The divine law commands that [S kings are to be honored]]
- [S [NP Men [S who are pious]] are charitable]
In linguistic theory the recursive structure of syntax is expressed by principles of combination modeled after the clauses of an inductive definition. However, far more complex devices seem to be required for a compact description that helps to reveal the basis of the native speaker’s ability.
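The clauses of such an inductive definition can be illustrated with a small sketch (in Python). The grammar and vocabulary here are invented for illustration, loosely modeled on Arnauld's second example; each rule states how a constituent of one category is built from smaller constituents, and the recursive NP rule is what allows nesting of arbitrary depth:

```python
import itertools

# A toy phrase-structure grammar, written as clauses of an inductive
# definition. Category names (S, NP, VP) follow the article; the rules
# and words are illustrative, not drawn from any particular analysis.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["men"], ["kings"], ["NP", "who", "VP"]],  # recursive clause
    "VP": [["are", "pious"], ["are", "charitable"]],
}

def expand(symbol, depth):
    """Yield every word sequence of the given category, bounding
    the recursion depth so the enumeration terminates."""
    if symbol not in GRAMMAR:        # a terminal word
        yield [symbol]
        return
    if depth == 0:                   # cut off further recursion
        return
    for rule in GRAMMAR[symbol]:
        # Combine all expansions of each right-hand-side symbol.
        for parts in itertools.product(
                *(list(expand(sym, depth - 1)) for sym in rule)):
            yield [word for part in parts for word in part]

sentences = sorted(" ".join(words) for words in expand("S", 4))
print(len(sentences))
print("men who are pious are charitable" in sentences)
```

Even at this small depth bound the grammar yields sentences with relative clauses nested inside relative clauses, which is the recursive feature the article describes.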
Chomsky’s introduction of grammatical transformations opened the way to a variety of formalisms and developments. Chomsky also initiated the conception of linguistic theory as a study of the acquisition of a system of linguistic knowledge, or competence. Any human language is acquirable under ordinary experiential conditions by any normal child.
The space between empirical evidence and the resulting linguistic competence is sufficiently great that a kind of readiness for language, universal grammar in Chomsky’s terminology, is presupposed. Contemporary theory seeks to probe the basis for this readiness in terms of innate rules and principles of grammar.
Within philosophy too the theory of syntax came to play an important role in the systematization of mathematics, and assumed central importance in Rudolf Carnap (1934). Carnap distinguished between grammatical syntax, of the sort that a linguist might give in a description of a language, and logical syntax, whose aim was not only to specify the class of sentences (or well-formed formulas of a calculus) but also to use formal methods in constructing a theory of logical consequence and logical truth.
Carnap employed the distinction between grammatical form and logical form, which plays a crucial part in Ludwig Wittgenstein’s views both in the Tractatus and in the Philosophical Investigations, and has become part of the lore of analytic philosophy. The scope of logical syntax in Carnap’s terms took on much of the role of semantics in later philosophical discussion.
Even with the later distinction between syntax and model-theoretic semantics, syntactic properties of formalized languages are still crucial for properties of systems of logic (soundness and completeness), and proof theory is established as a part of the syntax of mathematics.
In linguistic theory syntax and semantics have become increasingly intertwined disciplines, as it was realized that there are explanatory issues in relating linguistic forms to the specific meanings, or range of meanings, associated with them. S. Lappin (1995) contains a number of useful expositions on this theme; see also R. Larson and G. Segal (1995).
The current research climate is in practice very different from conceptions associated with “ordinary language” philosophy: The contemporary view is not that ordinary speech lacks an exact logic, but rather that a diligent, collaborative effort is required to find out what the logic is. The concentration on logic implies that syntactic investigations have a metaphysical dimension.
The patterns of inference of ordinary language call for formalization as part of a general account of the structure of individual human languages, or human language in general, and this formalization may in turn lead to proposals for reification, as in Donald Davidson’s (1967) hypothesis that references to events are pervasive in ordinary action sentences.
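Davidson's proposal can be made concrete with his own much-discussed example. An action sentence such as "Jones buttered the toast in the bathroom" is analyzed as an existential quantification over an event, with the adverbial modifier contributed as a separate conjunct (notation here is a standard rendering, not Davidson's exact typography):

$$\exists e\,[\mathit{Buttered}(\mathit{Jones}, \mathit{the\ toast}, e) \land \mathit{In}(\mathit{the\ bathroom}, e)]$$

This logical form explains why the modified sentence entails the unmodified "Jones buttered the toast": dropping a conjunct under the existential quantifier is valid, so the inference pattern of ordinary language falls out of the reification of events.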
On the side of linguistics proper, the problems of morphology have been treated in a progressively more syntactic manner: our example solubility, for instance, can be seen as built up by rules of a sort familiar from syntax. The result is the area now called morphosyntax, where the question whether morphology is a distinct level of linguistic organization is under active debate; see R. Hendrick (1995) for more recent discussion.