Feature Checking

Suppose the system is presented with an array of lexical items. This array is called a "numeration" N, which is a (reference) set {LI, i}, where LI is a lexical item and i its index showing the number of occurrences of LI in the array. A lexical item enters the system with the operation Select. Select maps N to a pair of representations at the interfaces. Each time Select picks a lexical item, the index is reduced by one. A representation is not well formed until all indices reduce to zero; essentially, the procedure guarantees in purely algorithmic terms that each occurrence of every lexical item of the given array has entered the system. Since, by the time computation reaches the interfaces, complex syntactic objects must be formed for inspection by the external systems, two or more objects picked up by Select need to combine. The operation that repeatedly combines lexical items as they individually enter the system is Merge.
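The bookkeeping behind Select can be sketched programmatically. The following Python fragment is only an illustrative gloss of the procedure just described; the data structures and function names are my own, not part of the theory.

```python
# Illustrative sketch: a numeration maps each lexical item LI to its
# index i (the number of occurrences still to enter the computation).

def select(numeration, li):
    """Pick one occurrence of li from the numeration, reducing its index."""
    if numeration.get(li, 0) == 0:
        raise ValueError(f"no occurrence of {li!r} left in the numeration")
    numeration[li] -= 1
    return li

def all_entered(numeration):
    """Well-formedness bookkeeping: every index must reduce to zero."""
    return all(i == 0 for i in numeration.values())

N = {"the": 1, "man": 1, "saw": 1, "it": 1}  # numeration for "the man saw it"
select(N, "the")
select(N, "man")
print(all_entered(N))   # False: "saw" and "it" have not entered yet
select(N, "saw")
select(N, "it")
print(all_entered(N))   # True: each occurrence has entered the system
```

The point of the exercise is only that the well-formedness check is purely algorithmic: it counts occurrences and nothing else.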

What are the items that must be "visible" at the interfaces for the external systems to read? It seems the following items are needed at most: properties of lexical items, and certain types of larger units formed of lexical items, units traditionally called "noun phrase," "verb phrase," and so on. As we will see, the semantic component seems to recognize things that can be interpreted as topic, new item, argument structure, perhaps even proposition, and the like. For this, individual lexical items (their interpretable features) and structures like CP, DP, and "light-verb"

phrases are needed at the interface. In X-bar theoretic terms, just the minimal and maximal projections are needed; in particular, X-bar levels are not needed. Conceptually speaking, lexical items enter the system via Select anyway and Merge combines larger units from them "online." So, X-bar theory is not needed under conceptual necessity. Having suggested the conceptual point, I will leave matters at that, since, technically, it is not clear that bar levels can in fact be dispensed with.3 The elimination of bar levels suggests another minimalist move. Recall that bar levels were introduced during computation; similarly for indices for binding theory, not to speak of d- and s-structures. Under conceptual necessity, lexical information must enter CHL. In contrast, bar levels, indices, and inner structures seem to be just theoretical devices, and of these, as we saw, some are certainly eliminable. In the spirit of minimalism, then, we adopt the Inclusiveness Condition: no new objects are added during

Chapter 5.

computation apart from rearrangement of lexical properties. In effect, the condition requires that what must enter the system is the only thing that enters. Conceptually, the Inclusiveness Condition captures the natural idea that to know a language is to know its words (Wasow 1985); the computational system does not have anything else to access.



We recall that most of the constraints on movement were located at the inner levels of computation (figure 2.1). With the inner levels removed, the system will generate wildly. The effects of some of these constraints, therefore, need to be rephrased or captured somehow in minimalist terms.

Under conceptual necessity, there are a limited number of options available: either the burden of the inner levels is distributed over the outer levels, or they are traced back to the lexicon, or the principles of the system are redesigned; or, as it turns out, all of them together. Basically, we will expect the conditions of computational efficiency, such as least effort and last resort, to constrain the operations of Merge such that, after optimal computation, all and only those structures are typically retained which meet legibility conditions. The huge empirical task is to show that this in fact is the case.

Suppose Merge combines two lexical items a and b to form the object K, a "phrase," which is at least the set {a, b}. K cannot be identified with this set because we need to know which phrase it is: verb phrase, noun phrase, and so on. So K must be of the form {g, {a, b}}, where g identifies the type of K; g is, therefore, the label of K. As discussed below, lexical items are nothing but collections of features. By the inclusiveness condition, g can be either the set of features in a or in b or a union of these sets or their intersection. For simplicity, suppose g is either a or b, the choice depending on which of the two items in the combination is the head: the head of a combination is the one that is "missing something,"

one that needs supplementation. Suppose it is a;4 the other one, b, is then the complement. Thus K = {a, {a, b}} (for more, see Uriagereka 1998, 176–182).

When the and man are combined by Merge, we get the structure K = [the [the man]]. Note that K here is of the type "determiner phrase"

(DP). Suppose Merge forms another object L from saw and it. Merging K and L, we get (117).

(117) [VP saw [DP the [the man]] [V′ saw [saw it]]]
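The construction of (117) can be mimicked with a toy implementation of labelled Merge. The encoding below (tuples for labelled objects, frozensets for the two-membered sets) is my own convenience, not anything in the text.

```python
# Toy labelled Merge: K = {g, {a, b}}, with g the label of the head.
# A lexical item is a string; a phrase is a pair (label, frozenset of parts).

def label(obj):
    """A lexical item labels itself; a phrase carries its stored label."""
    return obj if isinstance(obj, str) else obj[0]

def merge(a, b, head):
    """Form {g, {a, b}}, where g is the label of the projecting member."""
    assert head == a or head == b
    return (label(head), frozenset([a, b]))

K = merge("the", "man", "the")   # the DP {the, {the, man}}
L = merge("saw", "it", "saw")    # {saw, {saw, it}}
M = merge(K, L, L)               # (117): a "verb phrase," labelled by saw
print(label(K), label(M))        # the saw
```

Note that the label is just one of the merged items over again, as the inclusiveness condition demands: nothing new is added.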

The resulting structure (117) is not exactly pretty in that it is difficult to interpret visually; the proposed tree diagrams are even more so (see Chomsky 1993). We seem to lose the familiar sense of sentencehood. But

then the subroutines that underlie the single-click operations Delete or Escape in your computer, not to speak of Change All Endnotes to Footnotes, are not simple either. We expect a genuine theory to describe (states of) its objects at a remove from common expectations. (117) is what the mind "sees," only a part of which is made available to the articulatory systems via economy conditions internal to the phonological system.

This unfamiliar structure enables a range of novel theoretical moves.

Recalling X-bar concepts, think of K, which is the maximal projection of the, as the "Specifier" of saw, and the maximal projection of saw as the "Verb Phrase": all and only items needed at the LF-interface have been made available under minimalist assumptions. The following local domains are also available: specifier-head, head-complement, along with a modified c-command, details of which I put aside. Notice that X-bar theory, especially the intermediate projections (the bar levels), is essentially abandoned on the basis of most elementary assumptions (Chomsky 1995b, 249). Notice also that the scheme naturally incorporates the VP-internal-Subject hypothesis, which, as noted, has independent motivation, for example, cliticization and external θ-assignment. There are other advantages of generating phrase structure "online" as Select and Merge introduce and combine sets of lexical features. We will see that it leads to a very natural and economical theory of displacement.

Regarding the nature of (lexical) information that enters the system, the basic idea is that lexical items are a cluster of features: phonetic features (how it sounds), semantic features (what is its linguistic meaning) and formal features (what is its category). Suppose we introduce a general requirement that formal features need to be "checked" in the course of a derivation. "Checking" really means checking for agreement, a grammatical well-formedness condition known since antiquity. For example, the number feature of verbs has to agree with the same feature of Subject-nouns: they do versus he does. The number feature makes a difference to the semantic interpretation of nouns, for example, the difference between singularity and plurality. But, although the verbs have to agree with the nouns in this respect, the number feature of verbs does not contribute to its semantic interpretation: do means exactly what does does.

The fact that the number feature of verbs is uninterpretable at the semantic interface conflicts with optimal design because a principle of economy on representations, Full Interpretation (FI), will be violated. The presence of uninterpretable features thus renders the system "imperfect."

Uninterpretable features abound in languages: number, gender and person features on verbs; Case features on nouns; number and gender


features on articles, and so on. The peculiar and ubiquitous property of the agreement system in human languages is at once variously exploited and explained in MP.

For one such exploitation, we "save" semantic interpretation by letting the number feature of verbs be wiped out from the semantic part of the computation once the sound requirement of agreement is met. Generalizing sweepingly on this, we first impose the requirement of feature checking of certain formal features, which means, in effect, that another instance of the (agreeing) feature to be checked be found somewhere in the structure. By minimalism, the relation between the feature-checker and the feature-checked must be established in a local domain: least effort. So, if an item is not in the local domain of its feature-checker, it must move, as a last resort, obeying least effort throughout, to a suitable position to get its feature checked (and be wiped out if it is not making any semantic difference). Thus, certain formal features of lexical items such as gender, person, and number, often called "φ-features," force displacement under the stated conditions.

To take a simple example and skipping many details, consider the sentence she has gone (Radford 1997, 72). The item she has the formal features third-person female nominative in the singular (3FNomS). Apart from the tense feature (Present), the item has has the feature that it needs to agree with a 3NomS Subject. Has is in the local domain of she (specifier-head) and its features match those in the Subject; similarly for the features in has and gone. The structure is well formed. Among features, Nom is a Case feature. Case features do not make any difference to semantic interpretation. So, it is wiped out from she once the requirement of has has been met; similarly with the 3S features of has. However, the 3SF features of she remain in the structure since these are semantically significant.
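On one toy rendering (the feature names follow the text; the matching procedure is my simplification), the checking in she has gone looks like this:

```python
# Simplified checking: "has" demands a 3NomS Subject; once the demand is
# met, the Case feature (semantically inert) is wiped from the Subject,
# while the interpretable 3SF features survive.

she = {"person": "3", "gender": "F", "number": "S", "case": "Nom"}
has = {"tense": "Pres",
       "requires": {"person": "3", "number": "S", "case": "Nom"}}

def check(head, subject):
    """Check the head's agreement requirement against the Subject."""
    if any(subject.get(f) != v for f, v in head["requires"].items()):
        return False                 # mismatch: structure ill formed
    del subject["case"]              # Case makes no semantic difference
    return True

print(check(has, she))   # True: requirement met
print(she)               # {'person': '3', 'gender': 'F', 'number': 'S'}
```

After checking, only the semantically significant 3SF features remain on she, mirroring the prose above.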

The situation is different with she mistrusts him. Following Radford 1997, 122, suppose Merge creates the structure (118) in three steps.

(118)

[she [Infl [mistrusts him]]]
 3FSNom        Pres 3SNom

Now, mistrusts requires a 3SNom Subject to check its uninterpretable 3SNom features. She is the desired Subject, but she and mistrusts are not in a local relation since Infl (= Tense) intervenes. The tense feature of Infl, on the other hand, lacks any value. Let us say Infl attracts the tense feature Pres of mistrusts. Suppose that the attraction of the Pres feature

moves other features of mistrusts as well: general pied-piping. This results in a local relation between the (features of) she and the attracted features such that the checking of the rest of the features of mistrusts takes place, and uninterpretable features are wiped out. Additionally, Infl gets filled with tense information Pres such that Infl now becomes LF-interpretable (satisfies FI).

The scheme just described addresses an obvious problem. Surely, uninterpretability is something that external systems determine; how does the computational system know which features are uninterpretable without "look-ahead" information? The basic solution is that CHL looks only for valued and unvalued features, not interpretable and uninterpretable ones. For example, CHL can inspect that she has the value "female" of the gender feature or that some INFL is unvalued for the tense feature, as we saw. So, the computation will be on values of features rather than on their interpretability. Assume that features that remain unvalued after computation is done will be uninterpretable by the external systems. CHL will "know" this in advance, and the computation will crash. Again, a host of complex issues arise that I am putting aside.
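The valuation idea admits an equally simple sketch (the encoding is mine): the computation inspects values only, and a derivation that ends with an unvalued feature is one that will crash at the interfaces.

```python
UNVALUED = None   # an unvalued feature carries no value at all

def crashes(items):
    """A derivation crashes if any feature is still unvalued at the end:
    unvalued features will be uninterpretable by the external systems."""
    return any(v is UNVALUED for item in items for v in item.values())

she  = {"gender": "female", "person": "3"}
infl = {"tense": UNVALUED}        # Infl enters unvalued for tense

print(crashes([she, infl]))       # True: tense was never valued

infl["tense"] = "Pres"            # value attracted from the verb
print(crashes([she, infl]))       # False: the computation converges
```

No "look-ahead" to the external systems is needed: the test is stated entirely over values, which are visible to the computation itself.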

5.1.3 (New) Merge

What I just sketched belongs basically to the early phase of the program (Chomsky 1995b). Developments in the years that followed suggest that we can do even better in terms of SMT. I must also note that, as with every major turn in theory, early adherents begin to branch out as the logic of a given proposal is pursued from different directions. Hence, as noted in the preface to this work, there is much less agreement with Chomsky's more recent proposals than with the early phase of the minimalist program, even within biolinguistics.5 With this caveat in mind, recall that MP postulated two elementary operations: Merge for forming complex syntactic objects (SO), and Move for taking SOs around. Do we need both? We can think of Merge as an elementary operation that has the simplest possible form: Merge(a, b) = {a, b}, incorporating the No Tampering Condition (NTC), which leaves a and b intact. The effect is that Merge, in contrast to the earlier formulation, now projects the union of a and b simpliciter without labels, that is, without identifying the type of syntactic object constructed. As a result, the phrase-structure component stands basically eliminated from the scheme, leading to the postulation of "phases" as we will presently see.
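The simplest form of the operation is directly expressible; in the sketch below (again my own gloss, not the program's notation), frozenset captures the relevant properties at once: no label, no order, and no tampering with the merged objects.

```python
def merge(a, b):
    """New-style Merge: just the two-membered set {a, b}, no label.
    frozenset is immutable, so a and b are left intact (NTC)."""
    return frozenset([a, b])

ab = merge("a", "b")
edge = merge("g", ab)             # g sits on top (at the edge) of {a, b}

print(merge("a", "b") == merge("b", "a"))   # True: no order, only sisterhood
print(ab in edge)                            # True: containment, i.e. hierarchy
```

Nothing in the output encodes linear order; the only relations the object supports are sisterhood (co-membership) and containment, which is the point made below about hierarchy.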

Also, Merge takes only two objects at a time (again the simplest possibility; Boeckx 2006, 77–78), yielding "unambiguous paths" in the form of binary branching as in figure 5.2.6 It is obvious that NTC is at work: g is not inserted inside a and b; it is inserted on top (= edge) of {a, b}: NTC forces hierarchy.

Figure 5.2 Unambiguous paths

Merge is conceptually necessary. That is, "unbounded Merge or some equivalent is unavoidable in a system of hierarchic discrete infinity" because complex objects need to form without bound, so "we can assume that it 'comes free'" (Chomsky 2005, 12; also 2006c, 137). The present formulation of Merge is the simplest since, according to Chomsky, anything more complex (for example, Merge forms the ordered pair ⟨a, b⟩) needs to be independently justified. The argument works even if it is suggested that, due to the physical conditions imposed on human sign systems, a specific order (linearity) is inevitable (Uriagereka 1999, 254–255). If the physical design forces linearity anyway, why should Merge specifically reflect that fact? The simplest design of the language faculty thus treats linear ordering of linguistic sequences as a property enforced by the sensorimotor systems since humans can process acoustic information only in a single channel; apparently, dolphins can process acoustic information in two channels simultaneously. The computational system itself does not enforce order; it enforces only hierarchy, that is, sisterhood and containment.7 The emergence of Merge signaled the "Great Leap Forward" in evolution.

Now, unless we make the special assumption that a and b in Merge (a, b) are necessarily distinct, b could be a part of a. Since special assumptions are "costly," we allow the latter since it comes "free." In that case, (Internal) Merge can put parts together repeatedly as long as other things are equal. The original part will appear as copies (= traces) conjoined to other parts:

(119) The book seems [the book] to have been stolen [the book]


Here, displacement of the book from the original Object position just means that only one of the copies, that is, the leftmost one, is sounded for reasons of economy in the phonological component; others are left as covert elements to be interpreted by the C-I systems. Internal Merge thus functions as Move under copy theory of movement.
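Copy-and-pronounce-leftmost can be pictured with a small fragment (the list encoding of linear order is mine; the system itself, as just noted, enforces only hierarchy):

```python
def internal_merge(structure, part):
    """Re-merge an existing part at the edge, leaving a copy in place."""
    assert part in structure      # b must already be a part of a
    return [part] + structure

def pronounce(structure):
    """Phonology sounds only the leftmost copy of each item; the other
    copies stay covert for interpretation by the C-I systems."""
    sounded, seen = [], set()
    for item in structure:
        if item not in seen:
            sounded.append(item)
            seen.add(item)
    return sounded

s = internal_merge(["seems", "to have been stolen", "the book"], "the book")
print(s)              # ['the book', 'seems', 'to have been stolen', 'the book']
print(pronounce(s))   # ['the book', 'seems', 'to have been stolen']
```

The unpronounced copy is not deleted from the structure; it is simply not sounded, which is what makes it available for interpretation as in (119).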

5.1.3.1 Merge and Syntax

Consider the effects of Merge in the system.

As noted, linguistic information enters the computational system in the form of lexical features divided into three types: phonetic, formal, semantic. External Merge takes lexical features as inputs and constructs complex SOs obeying NTC; internal Merge sets up further relations within SO. These ideas enable us to redescribe the familiar phenomenon of dislocation. Following Chomsky 2002, I will give a brief and informal description (see Chomsky 2000b, 2000c, 2006c, for more formal treatment).

To implement dislocation, three things are needed:

(i) A target (= probe) located in a head that determines the type of category to be dislocated
(ii) A position to be made available for dislocation
(iii) A goal located at the category to be dislocated

By the inclusiveness condition, lexical information is all and only the information available to the computational system. Lexical information is distributed as properties of features. So, the preceding three requirements can be met if we can identify three features that have the relevant properties. In fact, there are these three features in a range of cases requiring dislocation.

For example, in some cases, the goal is identified by the uninterpretable Structural Case, the probe is identified by redundant agreement features, and the dislocable position (= landing site) is identified by the EPP (extended projection principle) feature. It is easy to see how the phenomenon represented in (118) can now be redescribed in probe-goal terms (Infl = Tense is the locus of EPP). The scheme provides a natural explanation for the existence of the EPP feature. Previously, the Extended Projection Principle, the Subject requirement typically satisfied by pleonastic there and it in English (see section 2.3.1.3), was considered "weird" because no semantic role is involved. Now we know what the role is: it is a position for dislocation, namely the Subject position (Chomsky 2002; see Bošković 2007 for a different approach to EPP).
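Schematically (with feature names from the text but data structures and search of my own invention), the probe-goal-EPP interplay can be rendered as:

```python
def dislocate(probe, structure):
    """If the probe has agreement features and EPP, find the matching goal
    (by its uninterpretable Case) and copy it to the edge position."""
    if not probe.get("EPP"):
        return structure                         # no position made available
    for item in structure:
        if item.get("case") == probe.get("agree"):   # goal found
            return [item] + structure                # EPP position filled
    return structure

she  = {"word": "she", "case": "Nom"}
infl = {"word": "Infl", "agree": "Nom", "EPP": True}

out = dislocate(infl, [{"word": "mistrusts"}, she, {"word": "him"}])
print([x["word"] for x in out])   # ['she', 'mistrusts', 'she', 'him']
```

Each of the three ingredients is a feature: the probe's agreement features drive the search, the goal's Case identifies the target, and EPP licenses the landing site at the edge.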

To see how the basic scheme extends to wh-movement and to illustrate the subtlety of research, I have followed Pesetsky and Torrego (2006) and


Pesetsky (2007), who propose somewhat different ideas than Chomsky regarding valuation and interpretability. Setting many details (and controversies) aside, figure 5.3 displays the phenomenon of dislocation of which book in the structure (120); only relevant features for wh-movement are shown.

Figure 5.3 Merge in language

(120) (I wonder) which book the girl has read.

In the diagram (adapted from Pesetsky 2007), (external) Merge generates the basic structure in six steps. According to Pesetsky and Torrego, the probe C (on analogy with T, which is typically unvalued and which hosts EPP) is viewed as containing an unvalued but interpretable wh-feature along with EPP; it thus seeks a matching valued feature to "share" the value of the latter. A valued (interrogative) but uninterpretable wh-feature is located in the goal as indicated. The EPP feature of C determines the position C′, which is the projection of the head C, for (internal) Merge to place a copy of the goal which book there.8 The dislocation of which book to the edge satisfies the "external" condition that which book is to be interpreted as a quantifier. In effect, conditions on meaning follow from the satisfaction of narrow conditions on computation.

Furthermore, the incorporation of Merge enforces a drastic reduction in computational complexity. As Chomsky (2005) explains, the G-B

model required three internal levels of representation (D-Structure, S-Structure, and LF) in addition to the interface levels. This increases cyclicity. Intuitively, a syntactic "cycle" refers to the syntactic operations in a domain, where the domain is determined by a selection from the lexicon (see Boeckx 2006 for more). Now, the postulation of three internal levels in G-B required five cyclic operations on a selection from the lexicon: (i) the operations forming D-Structures by the cyclic operations of X-bar theory; (ii) the overt syntactic cycle from D- to S-Structure; (iii) the phonological/morphological cycle mapping S-Structure to the sound interface; (iv) the covert syntactic cycle mapping S-Structure to LF; and (v) formal semantic operations mapping LF compositionally to the meaning interface.

SMT suggests that all this can be reduced to a single cycle, dispensing with all internal levels; as a result, the distinction between overt and covert computation is given up. Incorporation of Merge enables the elimination of compositional cycles (i), (ii), and (iv). This leaves two mapping operations, (iii) and (v), to the interfaces. For these, Chomsky invokes the notion of a minimal SO called a "phase." Phases denote syntactic domains constructed by Merge. In some cases, phases look like classical phrases such as CP and DP (see figure 5.3), in some others these are new objects such as "light-verb" phrases (vP) with full argument structure.

Details (and controversies) aside, the basic point for SMT is that phases are all that the grammatical system generates and transfers to the interfaces; perhaps the same phase is transferred to both the interfaces.

5.1.3.2 Merge and Semantics

As noted, computations at the sound end of the system might pose problems for this coveted austere picture since phonological computation is thought to be more complicated than computation to the C-I interface (= narrow syntax): "We might discover that SMT is satisfied by phonological systems that violate otherwise valid principles of computational efficiency, while doing the best it can to satisfy the problem it faces" (Chomsky 2006c, 136).

The austere picture might still hold if we recall classical conceptions of language "as a means of 'development of abstract or productive thinking'" with "communicative needs a secondary factor in language evolution" (Chomsky 2006c, 136).9 In contemporary terms, this means that the principal task of FL is to map syntactic objects to the C-I interface optimally; mapping to the SM interface will then be viewed as "an ancillary process."10 SM systems are needed for externalization such as the ability to talk in the dark or at a distance; if humans were equipped with


telepathy, SM systems would not be needed. Thus, SM systems have little, if anything, to do with the productivity of language. Viewing narrow syntax thus as the principal effect of FL, syntactic operations essentially feed lexical information into the C-I systems single-cyclically (phase by phase), period.

Finally, the Merge-based mechanisms just sketched have interesting consequences, in line with SMT, for the meaning-side. First, the single-cyclic operation eliminates the level of representation LF/SEM; this is because, by (v) above, the existence of this level of representation will add another cycle. Second, Merge constructs two types of structures, each a type of phase, that are essentially geared to semantic interpretation.

According to Chomsky, most of the grammatically sensitive semantic phenomena seem to divide into two kinds: argument structure and "elsewhere." "Elsewhere" typically constitutes semantic requirements at the edge such as questions, topicalization, old/new information, and the like, as noted. These requirements seem to fall in place with what Merge does: as we saw, external Merge constructs argument structures, internal Merge moves items to the edge. At the sound end, internal Merge forces displacement.

For our purposes, it is of much interest that, although the relevant format for the C-I systems is made available by Merge, much of the semantic phenomena handled in G-B is no longer covered inside the narrow syntax of MP. However, as argued above (chapter 3), some semantic computation must be taking place in mapping lexical information to the C-I interface; there is no alternative conception of what is happening in this cycle. In that sense, the narrow syntax of MP captures what may be currently viewed as "minimum" semantics. Given SMT, much of what fell under grammatical computation earlier (binding, quantifier scope, antecedent deletion condition, theta theory, and so on) can no longer be viewed as parts of FL proper. According to Chomsky (2002, 159), operations that are involved in these things (assignment of θ-roles to arguments to enforce the θ-criterion, free indexing to set up coindexing for Binding theory, and mechanisms for marking scope distinctions) "are countercyclic, or, if cyclic, involve much more complex rules transferring structures to the phonological component, and other complications to account for lack of interaction with core syntactic rules."11 These systems are best viewed as located just outside FL, hence, at the edge of C-I systems, which, to emphasize, are performance systems: "It is conceivable that these are just the interpretive systems on the meaning side, the analogue to articulatory and acoustic phonetics, what is going

on right outside the language faculty" (Chomsky 2002, 159). These interpretive systems are activated when data for anaphora, thematic roles, and so on carry information for these procedures. It enables us to think of CHL itself as independent of these procedures.12 To speculate, although Chomsky has placed these systems "right outside" FL, it seems to me that they require a special location there. So far, the "outside" of FL at the semantic side was broadly covered by the entire array of the conceptual-intentional (C-I) systems. These comprised the conceptual system, the "pragmatic" systems giving instructions for topicalization, focus, and perhaps instructions for truth conditions; call the package "classical C-I." The systems just expelled from FL seem to differ quite radically from classical C-I systems: (i) they enforce structural conditions like scope distinction and referential dependency; (ii) their description needs grammatical notions such as anaphors and wide scope; and, most importantly, (iii) few of them, if any, are likely to be shared with, say, chimpanzees, who are otherwise viewed as sharing much of the C-I elements with humans (Hauser, Chomsky, and Fitch 2002; Premack and Premack 2003; Reinhart 2006).

To capture the distinction between these and the classical C-I systems, let us call the former "FL-driven Interpretation" (FLI) systems. To follow Chomsky's tale, classical C-I (and sensorimotor) systems were already in place when the brain of an ape was "rewired" to insert FL. In contrast, as we will see in section 5.2, although FLI systems are not language-specific, they are best viewed as linguistically specific; we do not expect every cognitive system requiring manipulation of symbols to have them. In other words, although FLI systems are invariant across specific languages such as English and Hopi, their application is restricted to the domain of human language. Given their linguistically specific nature, it is hard to think of FLI systems as existing in the ape's brain prior to the insertion. In that sense, FLI systems belong to FL without belonging to CHL.

In the extended view of FL just proposed, FLI systems occupy the space between CHL and classical C-I.13 If this perspective makes sense, then we may not view FLI systems as "external" in the sense in which classical C-I systems are external. In this sense, they are dedicated to language even if they are viewed as located outside of CHL. We are just going by a tentative list of these systems so far. As discussed in chapter 3, much of the preparatory structural work for the thought systems seems to take place here: the θ-criterion is satisfied, referential dependency is established, scope ambiguities are resolved, and so forth. What falls under


FLI is an empirical issue, but the conception ought to be reasonably clear. The proposed organization thus retains the spirit of G-B in that the information encoded at the output representation of FLI systems recaptures LF; however, in G-B, these "modules" were seen as operative in narrow syntax itself.

FLI systems thus (immediately) surround CHL. In doing so, they carve the path "interpretation must blindly follow" (Chomsky 2006a). CHL is a computational device that recursively transfers symbolic objects optimally to FLI systems. If CHL has optimal design, then we will expect CHL to transfer (mostly) those objects that meet the conditions enforced by FLI systems. For example, we will expect the structural conditions for establishing various dependencies in Binding theory, or scope distinctions for quantifiers, to follow from computational principles contained in CHL (Chomsky 2006a; 2006d; Rouveret 2008). FLI systems then enforce additional conditions, perhaps countercyclically, to establish sound-meaning correlations within grammar. In doing so, they generate structures to which classical C-I systems add content. From this perspective, most of the interesting work trying to articulate grammatically sensitive semantics of human language is in fact focused on FLI systems (Hinzen 2006; Reinhart 2006; Uriagereka 2008).

5.1.4 Economy Principles

Chomsky (1995b) extensively discussed two global ideas: least effort and last resort. As far as I can figure, these were discussed as "considerations"

or "factors" that must be implemented somehow in the working of the computational system for human language; it is hard to locate an actual statement of them as principles on a par with, say, the projection principle or the principles of binding.14 Be that as it may, the conceptual significance of these ideas is not difficult to appreciate. As the name suggests, last resort requires that syntactic operations, especially operations effecting dislocation, are resorted to only when some syntactic requirement cannot be met by other means. This is because syntactic operations are "costly." Least effort requires that, in case syntactic operations are to be executed, global preference is for executions that are least costly. The point is that, whatever be the specific articulation of these ideas, the minimalist program needs them in one form or another. Hence, subsequent controversies about the character of specific principles (noted below) do not affect the general requirement.

Although Chomsky examines a range of syntactic phenomena directly from these "considerations," he does propose some specific principles that

appear to fall under the last resort and least effort categories. For example, Chomsky 1995b made extensive use of the principles of greed and procrastinate. Greed required that a syntactic operation, essentially Move, applies to an item a only to satisfy morphological properties of a, and of nothing else. This clearly is a last resort principle (Chomsky christened it "self-serving last resort") which has a markedly descriptive character.

It is eliminable if we simply require that Move works only under morphological considerations, that is, Move raises features, which, in fact, is all that Move does. The idea is even more naturally incorporated in the probe-goal framework with (Internal) Merge, as we saw. Procrastinate, another last resort principle, required that covert operations to LF be delayed until overt operations to PF are completed because PF operations are "costly." Skipping details, this had to do with a distinction between strong (phonological) features and weak features (for more, see Lasnik and Uriagereka 2005, 3.4). As the distinction was given up, so was procrastinate. In any case, as we saw, in the recent design there is no overt/covert distinction.

The point to note is that both of these last resort principles were cast in linguistic terms (morphological features and strong features) and both were eliminated soon after they were proposed. But the elimination of specific principles does not mean that the last resort condition has disappeared. As Boeckx (2006) shows for a variety of cases, the last resort condition is now deeply embedded in "local" syntactic mechanisms; for example, an element is no longer accessible for computation once its Case has been checked, which means that Case checking is a last resort.

We can state this rather specific condition because it is a reflex of the general last resort condition. In this sense, the last resort idea remains central, but as a consideration or an effect to be implemented in syntax whenever required (Lasnik and Uriagereka 2005, 174–175).

As for least-effort principles, three principles stand out: Full Interpretation (FI), Minimal Link Condition (MLC), and Shortest Derivation Condition (SDC), all drawn essentially from G-B. Although the general conceptions of last resort and least effort are clear enough, it is not obvious that specific principles fall neatly under them. Take Procrastinate. In the preceding discussion, Procrastinate was taken to be a last resort principle because it delayed (covert) computation to LF; that is, between PF- and LF-computation, LF-computation is the last one resorted to.

By the same token, since Procrastinate allows PF-computation first, it can be viewed as a least effort principle that reduces the number of "costly" overt operations. This is reaffirmed as follows.
