b. Possession
   (i) The inheritance went to Philip.
   (ii) The money is Philip's.
   (iii) Susan kept the money.
c. Ascription of properties
   (i) The light went/changed from green to red.
   (ii) The light is red.
   (iii) Sam kept the crowd happy.
d. Scheduling of activities
   (i) The meeting was changed from Tuesday to Monday.
   (ii) The meeting is on Monday.
   (iii) Let's keep the trip on Saturday.
How salient is the data and what is the data for? At the minimum, the list displays the obvious fact that the same verb can be used in a variety of ways. Consider run. All sorts of things run, some without moving at all: people, dogs, noses, books, computers, cars, batteries, paints, ink, ideas, commentaries, and so on. Things can run into buildings, troubles, persons, surprises, applauses, 500 pages, the thirteenth week, big time; things can run out of breath, time, life, the house, gas, money, ink. As we saw, the Bengali verbs khaawaa and taanaa take a vast range of assorted complements. We also saw that some of these complements are shared between the two verbs. It is not surprising that go, be, change, and keep also have a variety of uses, while they share some of them. Should we think of semantic fields in terms of a list of these complements? If yes, then a list of semantic fields just represents obvious data.
Words and Concepts 153.
Jackendoff's claim of course is that these verbs not only have a variety of uses, but that these uses fall under delineable patterns captured by a principled array of semantic fields. The underlying idea seems to be that relatively stable linguistic devices such as verbs and prepositions interact in ways such that they generate a spectrum of uses to be realized in specific contexts: "The variety of uses is not accidental" (1990, 26). Thus, keep and in interact to determine spatial location of things, keep and on determine scheduling of activities, and so on. There are a number of immediate problems with this proposal.
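The shape of the proposal can be made concrete with a toy sketch. The fragment below is purely illustrative: the verb-field pairings paraphrase list (115) and the paragraph above, but the lookup-table encoding is my own simplification, not Jackendoff's formalism.

```python
# Toy illustration (not Jackendoff's actual formalism): the claim under
# discussion is that a verb and its complement type jointly determine
# a semantic field. The pairings below paraphrase list (115).
SEMANTIC_FIELD = {
    ("keep", "in + place"):   "spatial location",          # kept the bird in the cage
    ("keep", "adjective"):    "ascription of properties",  # kept the crowd happy
    ("keep", "on + date"):    "scheduling of activities",  # keep the trip on Saturday
    ("go",   "to + person"):  "possession",                # the inheritance went to Philip
    ("go",   "from X to Y"):  "ascription of properties",  # light went from green to red
}

def field_of(verb, complement_type):
    """Return the semantic field claimed for a verb-complement pair,
    or None when the pair falls outside the listed patterns."""
    return SEMANTIC_FIELD.get((verb, complement_type))
```

On this caricature, an unlisted use simply returns None, which is one way of seeing the classificatory gap pressed in the discussion that follows.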
A theoretically satisfying study of semantic fields ought to tell us how unlisted uses of the concerned verbs fit the classification reached so far.
Consider John kept the secret close to his heart. For the part John kept the secret, category (b) seems to apply, but what does it mean to "possess" a secret? Why isn't keeping a secret a scheduling of activities, and what kind of "activity" is that anyway? Add the part close to his heart.
Now it seems category (a), spatial location, (also) applies. But then what does it mean for a secret close to one's heart to have a "spatial location"? Similar remarks apply to John went into a coma/tizzy. Maybe these are "zeugmatic" uses (Fodor 1998, 50) of keep and go; maybe these should be considered separately or not at all. But how do we decide what to set aside if we cannot find a principled basis for classifying the core data? It seems that the placing of verbs in a semantic field is either "pure stipulation" (Fodor 1998, 54), or is based on a selection that privileges certain uses, as we will see more clearly below.
Further, no doubt, some uses of the cited verbs specify, say, spatial location and motion. How deep is the fact by itself? It is not surprising that specification of spatial location, possession, and the like are general ends that may be realized by a variety of actions/processes, especially when the "ends" are characterized broadly and loosely as noted. The study of semantic fields, it would seem, is meant to unravel the abstract properties of specific verb meanings to show how these properties generate a variety of uses when placed in varying complement contexts. Thus, there are two generalizations envisaged here: one for the cluster of verbs that fall within a semantic field, and another for the uses of a given verb across semantic fields. Both require theoretical identification of specific semantic fields.
The absence of sufficient guidelines plagues the prospects for the first generalization.
As to the second generalization, Jackendoff suggests that, say, the "keep sentences all denote the causation of a state that endures over a period of time," despite the uses of keep across varying semantic fields; "go sentences each express a change of some sort," and so on. Notice that the suggested characterizations are nonunique in the sense that lots of verbs satisfy them: turn, leap, run, fall, and so on all express a "change of some sort." Some of these verbs perhaps do not appear in some of the fields occupied by go. But that only brings out the well-known fact that uses of words "criss-cross and overlap," as Wittgenstein put it. Just a list such as (115) merely restates this fact in informal terms without suggesting even the beginning of an explanation.
Suppose we grant that the study of semantic fields at least brings out some nonunique but stable conditions on the meaning of verbs: keep signals causation of a state that endures over a period of time. Fodor (1998, 51) points out the obvious problem, familiar to us by now following the study of bachelor and river, that the generalization depends crucially on the univocality of causation, state, endurance, and the like. In other words, the claim that keep sentences always denote the causation of a state that endures requires that the concepts CAUSE, STATE, and ENDURANCE are invariant across the variety of uses. Since Fodor has discussed this problem at length, I will turn to what seems to me to be a more direct objection to the project.
It seems that, even if we grant invariance to the concepts that supposedly explain the meaning of a word, Jackendoff's description of how the invariant concepts help generalize the meaning of a verb is simply not true. Thus, granting invariance to ENDURANCE, why should we always suppose that keeping something results in enduring states? Suppose John promised that he would destroy the contract if Ali disagreed. Ali disagreed and John tore off the contract. We could say, John kept his word. What was the enduring state caused by/to John, and where? Following John Austin (1961, 233-252), we could say that keeping the word, like promising, is a performance that begins and ends with the utterance of the expression; in that sense, these uses of keep need not involve ENDURANCE at all. It seems that the idea of an enduring state is based on selective uses of keep as in kept the crowd happy or kept the bird in the cage; it does not apply to all uses of keep.
In fact, even for kept the bird in the cage, it is unclear if the generalization applies. Suppose the bird escaped as soon as John kept it in the cage: what was the enduring state? Jackendoff is surely depending on the normal circumstance that if you keep a bird in a cage (carefully), it stays there for a while. From familiar expectations about birds, cages, and our ability to hold them there, Jackendoff proceeds to formulate universal conditions on meaning: some birds are not likely to oblige.
We have merely scratched the massive literature in lexical semantics.11 Still, the basic problem ought to be reasonably clear by now. Undoubtedly, words (bachelor, river, keep, go) not only differ in sound, they differ in meaning as well. Also, each of these words has a variety of uses in varying contexts. It is natural to expect that the meaning, which distinguishes one word from another, also plays out in its varying uses. Here, the general suggestion in the literature is that a word A is narrowly understood in terms of a manageably finite set of concepts C, where the meaning of A is determined by the conceptual role conferred on it by C: the meaning of bachelor is captured in terms of UNMARRIED, ADULT, and MALE; the meaning of keep is determined by CAUSE, STATE, and ENDURANCE. The study of word meanings is expected to bring out neat conceptual networks in which the individual meanings are placed. The problem is to construct a sufficiently abstract theory that captures this expectation.
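The decompositional picture at issue can be sketched as a small data structure. The fragment below is a deliberate toy: the concept labels come from the text, but the dictionary encoding and the `gloss` helper are my own illustration, and its last lines make the circularity pressed in the next paragraph visible in miniature.

```python
# Toy decompositional lexicon: each word's meaning is rendered as a
# finite set of concepts, following the suggestion discussed in the text.
LEXICON = {
    "bachelor": {"UNMARRIED", "ADULT", "MALE"},
    "keep":     {"CAUSE", "STATE", "ENDURANCE"},
}

def gloss(concept):
    """'Explain' a concept label. The only move available is to hand
    back a word of the language: CAUSE -> "cause". Capitalization
    aside, we never exit the circle of word meanings."""
    return concept.lower()

# Every "explanation" of a concept is itself just another word.
assert all(gloss(c) in {"cause", "state", "endurance"}
           for c in LEXICON["keep"])
```

The point of the sketch is negative: the right-hand sides are not an independent semantic vocabulary but candidate entries for the same kind of lexicon.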
The problem looks unsolvable because the putative theory needs some terms in which to describe the system of unity-in-diversity, and those terms can only be some words in use. As suggested throughout the preceding exercise, the idealization we hope for is that we describe word meanings with a system of (nonlinguistic) concepts, perhaps with some "parameters" attached to account for varying uses. Yet, devices like capitalization notwithstanding, the expressions that supposedly stand for concepts (male, mate, source, cause, endurance) are themselves words.
So, what meaning they lend to the word meanings depends entirely on what these (concept) words themselves mean. In other words, we do not have any access to the conceptual system except through the word meanings the system is supposed to describe.
The problem arises even if each word had exactly one, definite use in every context. The problem compounds, as we saw, if the same word is used in a variety of contexts such that no delineable pattern emerges to cover all of them. In fact, even a cursory study of word use (bachelor, river, keep) suggests that the conceptual relations in which word meanings are embedded are not only indefinitely porous and open-ended, they are often in conflict for the uses of the same word. As a result, either we are restricted to a list of preferred uses or we get trapped in cycles of word meanings when we attempt to generalize. It is important to emphasize that the complaint is about the suggested form of explanation, not about a competent speaker's uses of words, for there must be universal constraints on the content of words for children to learn them rapidly and effortlessly. The suggested form of explanation does not unearth those constraints.
Recall that we entered the study of word meanings to make some progress toward a more specific account of sound-meaning correlations. The project was faced with a number of initial objections (section 4.3.1). In each case, we asked for deferment since we wanted to survey empirical research in this domain. It looks as though each of the objections is borne out from what we saw. Hornstein's objection that postulating semantic markers only pushed the problem back one step is justified since the expressions that stand for semantic markers (concepts) themselves require explanation of word meanings. The objection noted by Jackendoff is essentially borne out because, in the absence of even the first abstract generalization, no route to primitives has been found. Quine's objection to the very idea of analytic connections is plausible because we failed to discern any stable pattern between words and their conceptual structure independently of use in specific contexts. Quine's problem of indeterminacy is valid roughly for the same reason: since the desired patterns could not be detected, uncontrolled conceptual variation cannot be ruled out within the suggested form of explanation. Kripke's problem is justified directly because in each case we saw that the explanations were based on a limited list of current uses which do not generalize. Thus, current approaches in this domain do not raise the prospect of (even the beginning of) an abstract theoretical hold on the phenomena under consideration.
As Jackendoff (2002, 377) observes, "There are fundamental methodological and expository difficulties in doing lexical semantics."
4.4 Crossroads

What happens then to the classical idea that language is a system of sound-meaning correlations? To recapitulate (section 4.1), in the face of the alleged incompleteness of grammatical theory, I suggested three options for linguistic inquiry: (i) to supplement grammatical theory with theories covering the full-blooded classical notion of meaning; (ii) to declare that language theory is indeed incomplete in principle; or (iii) to dissociate the scope of grammatical theory from the putative scope of (broad) language theory. From what we saw in the last two chapters, it stands to reason that the first option is unlikely to materialize from contemporary formal and lexical semantics approaches. This negative result leads to two contrasting perspectives.
Phenomenal Complexity of Language

Global skepticism about approaches from formal and lexical semantics does not preclude the possibility that grammatical theory may cover aspects of traditional notions of meaning (truth conditions, conceptual connections, and so forth) from its own resources in the future. In principle, then, option (i) can still be resurrected and pursued from the direction of grammatical theory itself.
What are the prospects?
Insofar as the current form and scope of grammatical theory is concerned, it is unlikely that it may be extended to qualify as a complete traditional theory of language. Even if resources of grammatical theory such as hierarchy, Subject-Object asymmetry, argument structure, and so on are exploited to capture some of the abstract, structural aspects of predication and truth condition (Hinzen 2006, chapter 5), as noted, it is hard to see how grammatical theory can determine the content of these structures. Also, I do not see how all aspects of lexical richness (beyond I-meanings, that is) involved in climb, run, bachelor, river, keep, khaawaa, and so forth can be predicted from grammatical resources alone. As we saw, there is nothing grammatical about the palpable conceptual distinction between John decided to attend college and Bill tried to attend church since the conceptual distinction arises after grammatical resources have been exhausted. In the next chapter, I will suggest that the scope of semantics in grammar is basically restricted to what I will call "FL-driven interpretive systems." As things stand now, grammatical theory will continue to be an incomplete theory of language (option (ii)) for establishing sound-meaning correlations, if our desired notion of language includes the full-blooded concept of meaning.
This is because, as we saw, any composite theory that attempts to harness the phenomena that formal and lexical semantic approaches seek to cover needs to plunge into the massive complexity of natural languages. I recall Chomsky's remarks with which I opened chapter 1: "The systems found in the world will not be regarded as languages in the strict sense, but as more complex systems, much less interesting for the study of human nature and language."
To get a feel for what this implies, consider the putative components of a "full" theory of language that begins with grammatical theory. First, apart from the core computational system CS, we need a systematic account of the complex human lexicon. Next, we need a principled enumeration of conditions enforced by the complex array of "external systems" (at least) at the two interfaces.12 Finally, we need a theory of language use in the complex conditions of world knowledge, including speakers' and hearers' intentions and goals, not to mention more "local" factors such as personal idiosyncrasies, historical accidents, cultural effects, mixing of populations, injunctions of authority, and the like. Needless to say, an account of the extremely complex human conceptual system needs to be woven into many of the preceding components if we are to capture anything like the full-blooded concept of meaning. The foundational difficulties with the study of the conceptual system (assuming there to be one) alone cast doubt on the intelligibility of the enterprise, notwithstanding the salience of the grammatical component and its (very) restricted extensions. The phenomenal complexity of language thus keeps the second option alive, suggesting that a study of complex systems is unlikely to unearth "real properties" of nature, as Chomsky suggested.
Simplicity of Grammar

In sharp contrast, as progressively abstract theoretical inquiry was directed at the phenomenal complexity of language, linguistic theory was able to extract a strikingly simple grammatical system. Even if there are sharp differences as to how language-specific features are to be covered under a general theory, there is wide agreement that the computational principles and operations are invariant across languages.13 On the one hand, given the formal part of the human lexicon, the system establishes sound-meaning correlations of a restricted sort in some core cases (Chomsky 2000c, 95); in this sense, the system captures the essential feature of what counts as a language "in the strict sense."
On the other, the system does not cover conceptual and categorial organization, structure of intentionality, background beliefs, pragmatic factors such as speaker's intentions, and the like (Chomsky and Lasnik 1977, 428; Chomsky 2000d, 26). This much abstraction was fully in force during the G-B phase, but it is discernible throughout the biolinguistic project. Thus, without failing to be languages in the strict sense, grammars radically depart from traditional conceptions of language both in terms of coverage and the principles contained in it.
Looking back at the recent thrust of research in linguistic theory, it seems to me that the enterprise looks at the rest of the phenomenal complexity of language basically to isolate just those parts that can be brought into harness with the grammatical system. As a result, as far as I can see, linguistic inquiry is getting increasingly restricted to just those phenomena that can be directly explained with the properties of the computational system and the relevant properties of the lexicon alone. At least in Chomsky's own work (my focus), there is less and less concern with the sound system and, as we will see, even the quasi-semantic topics of G-B (θ-theory, Binding theory, scope ambiguity, and so on) are progressively set aside from the (core) domain of grammar, without losing sight of the phenomena, of course.
It is a lesson from the history of the sciences that once a simple optimal design of a system has been reached from rigorous empirical argumentation, it becomes harder to handle phenomenal complexity that abounds in any case. Typically, we try to "protect" the simple design. It took several centuries of research, supported by a range of mathematical innovations, to spell out and identify the scope of Newton's strikingly simple laws of motion. Phenomenal complexity continuously threatened those laws, but they were too elegant to be dispensed with. It is one of the wonderful mysteries of science that, in the favorable cases, these restrictive moves actually lead to wider coverage of (previously unanticipated) phenomena.
In any case, as noted at the outset, the history of the more established sciences suggests that once some observed complexity has been successfully analyzed into parts, a study of the original complexity often drops out of that line of research, perhaps forever. I am possibly overstating the current case for grammatical theory, but that seems to be one of the prominent future directions; the other, of course, is to continue to pursue the first option somehow.
As we will soon see, in the more recent Minimalist Program, the even more restrictive grammatical system, called "CHL," consists of exactly one recursive operation, Merge, and a small set of principles of efficient computation that constrain the operations of Merge. This object is buried deep down somewhere in the total cognitive architecture of humans such that people do not have introspective access to it in any intelligible sense.
Its existence was not even known until recently. In CHL, linguistic theory is most likely to have unearthed a "real property of matter."
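The single operation just mentioned, Merge, is standardly rendered in the minimalist literature as unordered set formation over two syntactic objects: Merge(a, b) = {a, b}. A minimal sketch (the example lexical items are mine, chosen only for illustration):

```python
def merge(a, b):
    """Merge: the single recursive structure-building operation of CHL,
    standardly rendered as unordered set formation: Merge(a, b) = {a, b}."""
    return frozenset({a, b})

# Recursive application of this one binary operation yields unbounded
# hierarchical structure:
dp = merge("the", "man")   # {the, man}
vp = merge("saw", dp)      # {saw, {the, man}}
assert dp in vp            # hierarchy: the DP is a constituent of the VP
```

Nothing in the sketch models the economy principles that constrain Merge; it displays only the recursive, binary, set-forming character of the operation.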
These considerations strongly suggest an even more focused line of inquiry directed at CHL itself, but on a plane different from that of classical linguistic inquiry. The existence of this object raises a new set of questions. What are the relations between the different components of CHL?
What is the notion of meaning minimally captured by the operations of this system? Given the very abstract character of its principles, is the effect of CHL restricted to human language? What does it mean to have such a system in the broad architecture of the human mind? In addressing these and related questions, we essentially leave the traditional choice (the first option) behind, and pursue the third.
5 Linguistic Theory II

Since its inception, the biolinguistic project has always viewed languages as systems consisting of a computational system (CS) and a lexicon. CS works on elements of the lexicon to generate sound-meaning connections of human languages. This foundational aspect of the project was firmly articulated in Chomsky 1980, 54-55, where a principled distinction was made between the computational and the conceptual aspects of languages. A computational system includes the syntactic and the semantic systems that together provide the "rich expressive power of human language"; the conceptual system was taken not to belong to the language faculty at all but to some other faculty that provides "common sense understanding" of the world.
To my knowledge, the strikingly more abstract notion of the single computational system of human language, CHL, was first used officially in Chomsky 1994a.1 To appreciate how the notion emerged, recall that the G-B system postulated a computational system each of whose principles and operations was universal; that is, the principles themselves did not vary across languages. But most of the principles came with parametric choices such that particular languages could not be described in this system until the parameters were given specific values. In that sense, the G-B system also worked with a plurality of computational systems for describing human languages. A very different conception of the organization of the language faculty emerged as follows.
To track the printed story, Chomsky (1991c, 131) proposed that "parameters of UG relate, not to the computational system, but only to the lexicon." If this proposal is valid, then "there is only one human language, apart from the lexicon, and language acquisition is in essence a matter of determining lexical idiosyncrasies" (p. 131). The "lexical idiosyncrasies" are viewed as restricted to the morphological part of the lexicon; the rest of the lexicon is also viewed as universal. In that sense, Chomsky (1993, 170) held that "there is only one computational system and one lexicon." Finally, in Chomsky 1994a, this computational system was given a name: a single computational system CHL for human language. "CHL" was used extensively in Chomsky 1995b.
The guiding assumption, mentioned earlier but spelled out much later, in Chomsky 2000b, is that language variation itself ought to be viewed as a problem, an "imperfection," for learnability of languages; there are just too many of them. Notice that this idea has a different flavor from the classical considerations from the poverty of the stimulus. Even if the salience of those considerations is questioned (Pullum and Scholz 2002; Crain and Pietroski 2002 for response), no one can deny that there are thousands of languages and dialects. Linguistic theory, therefore, should be guided by the "Uniformity Principle" (116).
(116) In the absence of compelling evidence to the contrary, assume languages to be uniform, with variety restricted to easily detectable properties of utterances.
Assuming the detectable properties to reside (essentially) in the morphological part of the lexicon, the conception of a single computational system for (all) human languages follows. The Minimalist Program (MP) for linguistic theory attempts to articulate this conception. As usual, I will keep to the basic design of the system following Chomsky.
5.1 Minimalist Program

To set the stage for MP, let me quickly restate one of the major aspects of the principles-and-parameters framework that was stressed at the very beginning (see section 2.2): the property Gcp (construction-particularity) disappears.2 Literally, there are no rules in the G-B system; there are only universal principles with parametric choices. Move-α is the only transformational operation; it applies to any category. Although the property Glp (language-particularity) cannot disappear, nothing in the system explicitly mentions specific languages such as English or Hopi.
Specific languages are thus viewed as "epiphenomenon" produced by the system when triggered by particular experience (Chomsky 1995b, 8).
In this sense, the framework abstracts away from the vast particularities of languages to capture something like "human language" itself.
By any measure, this is a striking abstraction. It is surprising that, despite this leap in abstraction, empirical coverage has in fact substantially increased, explaining the exponential growth in crosslinguistic studies in the last three decades. On the basis of what we can see from here with limited understanding, the Minimalist Program proposes to push the growing abstraction very close to the limit.
5.1.1 Conceptual Necessity

As a preparation to that end, let us ask, following Chomsky 1993: Can the linguistic system be described entirely in terms of what is (virtually) conceptually necessary? Suppose we are able to form some conception of what is minimally required of a cognitive system that is geared to the acquisition, expression, and use of human language, with its specific properties of unboundedness and construction of "free expression" for speech and interpretation (Chomsky 1994a). Chomsky is asking whether this minimal conception can be put to maximal explanatory use. In other words, can the relevant facts of language be fully explained with this conception alone?
To appreciate what is involved, let us see if there are nonminimal conceptions in the earlier theory. Consider the distinction between "inner" and "outer" levels of grammatical representations. We assume that (at least) two outer levels are needed in any case since the system must be accessible to external systems: sensorimotor (SM) and conceptual-intentional (C-I). Keeping to G-B, let us assume for now that PF and LF are those levels; PF and LF are then the "interface" levels of representation. The inner levels (d- and s-structures), in contrast, are just theoretical constructs to which no other system of the mind has access. In current terms, the inner levels are not conceptually necessary. So we have a lexicon and two interface levels for the SM and the C-I systems.
The task of the computational system then is to map lexical information onto the two interfaces for the "external" systems to read them: satisfaction of legibility conditions.
As for the design of CHL, we not only assume, as noted, that it is completely universal, we would want the system to contain only those principles and operations that are required under conceptual necessity. Now, we know that there is at least one "imperfection" in human languages in that sound-meaning connections are often indirect: the displacement problem. We return to the issue of whether it in fact is an imperfection. Since syntactic objects need to be displaced, conceptual necessity suggests that there is one operation that affects it: Move-α; better, Affect-α. This part of conceptual necessity (as with much else), then, was already achieved in G-B.
Since movement is "costly," we assume, again following the spirit of conceptual necessity, that movement happens only as a "last resort" and with "least effort." These are notions of economy that endow an optimal character to a system. We assume that only economy principles with "least-effort" and "last-resort" effects occur in the system to enforce optimal computation. Move-α (or Affect-α) may now be viewed quite differently: nothing moves unless there is a reason for movement, and then the preference is for the most economical movement (Marantz 1995). Finally, we ensure, following conceptual necessity, that optimal computation generates only those syntactic objects that are least costly for the external systems to read. In other words, representations at the interfaces also must meet conditions of economy. Let us say, a system that meets these design specifications is a "perfect" system.
We thus reach the Strong Minimalist Thesis (SMT): "Language is an optimal solution to interface conditions that FL must satisfy." If SMT held fully, "UG would be restricted to properties imposed by interface conditions" (Chomsky 2006c). (See figure 5.1.)

Figure 5.1 Minimalist program

The first thing we would want to know is how the displacement problem is now addressed within the restrictive conditions. We saw that postulation of s-structures allowed computations to branch off to accommodate this fact. Without s-structures, the problem reappears. We saw that displacement requires movement. So the issue of how to handle displacement gets closely related to the issue of what causes movement and what constrains it. The basic idea in MP is that certain lexical features cause movement under conditions of optimality enforced by economy principles. The idea is explained and implemented as follows.
5.1.2.