From mcginnis at ucalgary.ca Mon Jan 10 23:37:01 2000
From: mcginnis at ucalgary.ca (Martha McGinnis)
Date: Mon, 10 Jan 2000 16:37:01 -0700
Subject: Alec Marantz: Lecture notes - Fall morphology class
Message-ID:

Dear DM-List-ers,

With the hope of generating more productive discussion about morphological theory, I have decided to post the lecture notes I wrote up for the fall 99 morphology course at MIT. Anticipating some responses to the questions raised in these notes, I delayed posting until after the semester was over, since I was reluctant to both push forward with the topics I promised to discuss in the course and engage thoughtfully in the discussion over the notes on previous topics. I still can't promise to reply to your comments or to clarify everything that is left unclear in the notes.

Some formatting may be lost in the pasting of Word documents into these e-mails. I will monitor the results when I receive the postings myself. There should be twelve sets of notes in all; I hope to finish posting them by the end of January.

I'm re-reading the notes and making small corrections, particularly in response to class discussions. There will still be errors remaining, along with much opaqueness and unclarity. Please think twice before quoting these notes in print. In addition to flat-out errors, there is sometimes a dialog going on in the text, with positions other than my own being presented for discussion.

Yours with some trepidation,
Alec Marantz

Lecture Notes, Morphology 9/10/99

Jackendoff, R. 1997. "Idioms and Other Fixed Expressions." In The Architecture of the Language Faculty, LI Monograph 28. Cambridge: MIT Press, pp. 153-177.
Marantz, A. 1997. "No Escape from Syntax." In A. Dimitriadis et al., eds., U Penn Working Papers in Linguistics 4.2, pp. 201-225.
Fodor, J.A., & E. Lepore. 1998. "The Emptiness of the Lexicon." LI 29.2, 269-288.
Pustejovsky, J. 1998. "Generativity and Explanation in Semantics." LI 29.2, 289-311.

1.
Lieber, Deconstructing Morphology

"The starting point for this theory of word formation is a somewhat odd one - the fringes of morphology, so to speak, where the syntax of words and that of phrases seem to converge."

a floor of a birdcage taste
over the fence gossip
off the rack dress
God is dead theology

2. This betrays the hidden assumption that the syntax deals with the arrangement of words.

3. Jackendoff, The Architecture of the Language Faculty

"What is in the lexicon and how does it get into sentences? So far, we have dealt only with the lexical insertion/licensing of morphologically trivial words like cat (chapter 4). Now we turn to morphologically complex elements of various sorts."

There's an illusion that we know what a word is -- "cat" is a word -- and an assumption that we do syntax with words. Of course every syntactician violates this assumption from square one. D'ya wanna debate this point?

Latin: fe:les NOM singular, fe:lem ACC, fe:lis GEN

4. Suppose we begin with the features or categories of language. In phonology, phonological features are constitutive of sounds - that is, if you strip the features and structure from a sound, there is no remnant; there's nothing left. Are there non-constitutive (say, classificatory) features of the terminal (combining) nodes of the syntax (or the Lexicon)?

5. cat has at least a root and a category feature ("noun"). If syntactic categories can be classificatory features of roots, then the word cat could be a syntactic and Lexical atom - an atom with a syntactic category. However, if syntactic features are constitutive, then there must be a remnant to cat when its nounness is removed (just like there's a remnant to a /t/ when its [-voice] is removed).

6. Word and Paradigm theories, like A-Morphous Morphology, assume that words have properties required by the syntax. Suppose Word and Paradigm theories existed without the notion of a "paradigm."
The words of a language, an arbitrary set, would have whatever features (properties) they have. Syntactic combination, however, requires words with particular properties. So Latin direct objects must have the ACC property. Only if there is a word meaning 'cat' with the property ACC in the language will Latin allow 'cat' as a direct object.

7. Paradigm theory says that "words" have different "forms." The set of features for, say, a Noun that the syntax might require (all the different cases, for example) forms a paradigm space (crossing the features, e.g., case and number, yields an n-dimensional space, e.g., a 2-dimensional case-by-number space). Words come in families identified as forms of the same word. The forms of a word fill out paradigm space, so a particular form may fill several "cells."

8. Anderson: Forms of a word are created from a base form via morphophonological rules. On this logic, the base form is the (universal) default form, since no morphophonological rules need to apply to a base with a full set of syntactic features (assigned to the base by the syntax).

9. Anderson: The features realized by inflectional morphology - that is, the features that determine paradigm space - differ from language to language. For example, in some languages number on nouns is inflectional and in some it is derivational. Since the syntax itself must treat inflectional features specially, this means that the basic operation of the syntax is different from language to language.

10. Why isn't a whole sentence simply a "form of a word" in some paradigm space defined by the syntax?

Everyone assumes that one reaches bedrock at the "content" words. So, "The cat is on the mat," can't be a form of the word "cat" since "mat" must be an independent word. However, "He's always singing," could be a form of the word "sing," and might be realized as a single phonological word in some language.

11. Should we take anything for granted?
If we don't actually do syntax with phonological words, is there any reason to suppose that we have a notion of "word" suitable for the atoms of syntax? Should we assume that cat is atomic, that it is an atomic Noun, that it is a syntactic atom, that we even know what Jackendoff is referring to when he writes, "morphologically trivial words like cat"?

12. The basic questions any theory of grammar must ask include: what are the atomic, combinatory units of the syntax, what are the atomic, combinatory units of the (morpho-)phonology, and how do these atomic units connect in the derivation of a phonological form?

13. One could imagine that, as far as the grammar is concerned, there are only "constitutive" features, each feature is an atomic, combinatory unit in the syntax, and each feature corresponds to an atomic morphophonological unit, call it a "vocabulary item." To the extent that roots (the "remnants" of "content words" when all constitutive features (save the root) are stripped away) have classificatory features, they may play a role in semantic interpretation (at the LF interface with conceptual systems) but would not enter into any grammatical principle (rule, constraint, what have you…).

14. Let's suppose that this idealized view of grammar is too simplistic. On the syntax side, for example, sets of features, as opposed to individual features, may operate as atomic units in the syntax (which means that the features aren't combined by "merge" and thus that their combination need not be interpreted at PF or LF). On the phonology side, sets of syntactic features may correspond to single vocabulary items. We call the theory of pre-syntactic "bundling" of features into single atomic units for the syntax the theory of the Lexicon, where the bundles are the Lexical items. The strongest claim we can make (other than the claim implied in 13.
- that there is no Lexicon in this sense and that therefore every feature is a syntactic atom subject to "merge") is that there is no "bundling" (Fusion) outside the Lexicon and that therefore every syntactic atom is the locus for vocabulary insertion in the phonology.

15. Jackendoff explicitly assumes that "mismatches" between "conceptual structure" and "phonological structure" can include arbitrary many-to-one correspondences wherein whole chunks of hierarchically arranged syntactico-semantic atoms correspond to atomic phonological pieces. So simplex "went" can correspond to [GO + Past] (and simplex "kill" could be CAUSE(X, DIE(Y))).

16. Fodor, on the other hand, supposes (in some sense, to be made clear) a one-to-one correspondence between words and concepts. Note that Fodor acknowledges that "cats" has "cat" in it, and that "book shelf" probably has "book" in it as well. He doesn't dwell on the issue of whether "redden" has "red" in it, or "raise" includes "rise." The issue he's after is whether logical inference is run off constitutive features of (in our terms) roots. Is the conclusion that "John is unmarried" from "John is a bachelor" made on the basis of a decomposition of "bachelor" into features like "unmarried" (with or without remnants), or is this conclusion drawn on the basis of classificatory features (or properties) of "bachelor"? Fodor never considers the possibility that "bachelor" decomposes into the root "bachel" and the suffix "or," where the nominalizing suffix -or here is associated with the meaning of "occupation" or "socially identifying classification of a human" (cf. butcher, pensioner, widower, spinster…).

17. For Jackendoff, in the best of all possible worlds, phonological structure transparently reflects conceptual structure and logical inference can be run formally off conceptual structure. What's interesting is that both Jackendoff and Fodor argue essentially from a "what you see is what you get" perspective.
Fodor sees words and grapples with what he takes as a transparent truth: that linguistic structure doesn't support logical inference. Jackendoff sees just slightly messed-up conceptual structures in phonological structures, once the proper form of lexical representations is revealed. Both think they know a word and a noun when they see one (cat).

18. The failure to question the obvious has led morphology into darkness and despair.

A. No one takes the serious morphologists seriously. I don't know of a single phonologist or syntactician that has really come to grips with Anderson or Lieber (adopted their theories - I mean really adopted, rather than waved at what they thought the theories were - or rejected the theories on reasoned grounds), for example. Most syntacticians and phonologists operate outside any defensible or well-supported theory of morphology (and I'm willing to name names here).

B. The serious morphologists, on the other hand, who generally tend to be phonologists at heart, end up making up their own theories of syntax, by themselves. Anderson and Lieber think they're adopting standard syntactic theories, but they were several years out of date in syntax when they created their theories, and since syntacticians didn't take them seriously anyway, they were never subjected to the type of critique of their understanding of syntax that could have shown them the error of their ways. [Wunderlich and Beard just make the syntax up - as a serious enterprise for Wunderlich, although out of the mainstream.]

C. Phonologists generally punt the morphology entirely these days. Standard Optimality Theory is simply [i.e., literally] incoherent from the standpoint of morphology. [It is impossible to tell how standard OT deals with the usual issues of morphological theory.]

19. We know some things about morphology.

A. First, there's a paradigmatic dimension to at least non-root morphemes.

i. "gaps" are the exception rather than the rule.
So we expect every noun to be able to appear in every case position, every verb to have a past tense, etc.

ii. there is "competition" among forms for the realization of features, a competition that yields syncretism and (at least the appearance of) underspecification.

This paradigmatic quality of morphology already disconfirms standard theories of Lexical Morphology in which the feature structure of a word is built up via percolation or other computation over the feature structure of morphemes qua lexical atoms of sound/meaning correspondence. You can't prevent "He comb his hair" for "He combs his hair" unless you have competition, and thus "separation" in Beard's sense. You have to know that you're shooting for third person singular and thus "combs" (really, /z/) can beat out "comb" (really, /ø/) for the realization of third person, being more specific than (default) "comb"/zero. Proponents of contemporary Lexical Morphology reintroduce paradigms in some other way (see Lieber and Wunderlich), but these mechanisms have never seriously been examined by anyone.

B. Second, the phonological realizations of syntactic features are pieces with properties (we call them vocabulary items); thus A-Morphous Morphology as a general approach to morphophonology can't be right. In fact, as shown by Halle and Marantz, A-Morphous Morphology as a theory (as opposed to a descriptive framework in which anything could be described) had no properties independent from its principles of disjunction, which are empirically disconfirmed with data that no one has ever disputed. And Anderson's new version of A-Morphous Morphology has pieces - his theory of clitics (which must be treated as the output of morphophonological rules by the general principles of A-Morphous Morphology) requires constraints ordering the clitics as pieces with properties.

20.
The paradigmatic and piece properties of morphology require something like Distributed Morphology - that is, they require a theory in which phonological pieces are organized into hierarchical tree structures and in which these pieces are underspecified with respect to syntactic features and compete with each other for the realization of syntactic features. All the burning issues in morphology can be stated as disputes within this general framework. There are no alternatives to Distributed Morphology (i.e., a theory with late insertion of piece-like vocabulary items into non-root terminal nodes, where the vocabulary items are underspecified with respect to the syntactic features they realize and where vocabulary insertion is governed by "elsewhere" competition). A real alternative would need some explicit account of the paradigmatic and piece properties of morphology.

21. Nevertheless, we will find some recent proposals for alternative theories in the literature. In particular, we will review some proposals for notions of "lexical relatedness" other than "shares the same pieces," where connections between "stored" derived and/or inflected forms play a role in the grammar. An extreme here is the Bybee/Burzio position that speakers store all tokens of all words (for Bybee, probably all sentences) that they hear and draw generalizations over stored forms. Steriade makes related claims. We have to ask how we can evaluate these claims, which are literally incoherent - i.e., do not cohere with any general theory of grammar.

22. Learning roots like "cat" probably requires a prior conceptual space, such that you know the meaning of "cat" before you learn the word ("oh, that's what we call those things!"). Still, an innate conceptual space may or may not imply a decomposition of meaning into constitutive features (is "fuzzy" a necessary/definitional property of "cat"?).
In any case, the essential issue for linguistics is whether the feature space that creates the meaning of "cat" plays any grammatical role, similar to that played by features like [+past]. If the features of roots play a role in the grammar, then we might expect syncretism (from underspecification) and/or suppletion (contextual allomorphic vocabulary items) for roots.

23. Suppose true modularity for root semantics. The features of roots, be they constitutive or classificatory, may be irrelevant/invisible to the grammar. This assumption predicts no syncretism/suppletion for roots.

24. General assumptions: The Distributed Morphology (Sept. 99 Marantz version) Framework.

* = inevitable assumption (conceptual necessity)
! = arguable assumption/assumption with testable consequences

There exists a Universal set of grammatical features, U *
A language chooses a subset of U for its grammar !
The language bundles some subsets of its subset into Lexical Items !
The Lexical Items include only a generic root node and no specific roots !
Root nodes aren't bundled with any features into Lexical Items !
Syntactic Merger is the only process that combines Lexical Items *
The phonological word is not a cyclic domain of the syntax !
Re-Merger (Morphological Merger) occurs at PF (i.e., without LF consequences) !
Vocabulary Insertion and phonological realization occur at the "phase" level *
(As McCawley emphasized, the cyclic nature of grammatical derivation may be the most important and least questionable discovery of generative grammar.)
Vocabulary items are connections between features of U and phonological forms !/*
Vocabulary insertion works bottom up !
Features from U in Lexical Items may be deleted prior to VI (Impoverishment) !
There may be multiple VI in a single Lexical Item (fission) !
There are Vocabulary Item specific ordering constraints at a single hierarchical level !
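[Editorial aside, not part of the original notes.] For readers who find the "elsewhere" competition of points 19-20 and 24 easier to see procedurally, here is a minimal sketch. The feature labels and the specificity metric (simply counting matched features) are illustrative assumptions, not claims about any particular DM implementation; the /z/ vs. zero example is the "combs"/"comb" case from 19.A.ii.

```python
# Hypothetical sketch of vocabulary insertion as "elsewhere" competition.
# Each vocabulary item pairs a phonological exponent with a (possibly
# underspecified) set of syntactic features; the most specific item whose
# features are a subset of the terminal node's features wins.

VOCABULARY = [
    # (exponent, features realized) -- illustrative English agreement items
    ("/z/", frozenset({"3rd", "singular", "present"})),
    ("/0/", frozenset()),  # default ("elsewhere") item: zero exponent
]

def insert(node_features):
    """Return the exponent of the most specific matching vocabulary item."""
    candidates = [(exp, feats) for exp, feats in VOCABULARY
                  if feats <= node_features]
    # Specificity here = number of features realized (a simplification;
    # ties and feature hierarchies are not handled in this sketch).
    exp, _ = max(candidates, key=lambda c: len(c[1]))
    return exp

print(insert({"3rd", "singular", "present"}))  # /z/ beats the default: "combs"
print(insert({"1st", "singular", "present"}))  # only the default fits: "comb"
```

The point of the sketch is that underspecification plus most-specific-wins competition is what rules out "He comb his hair": the zero default is always a candidate, but /z/ blocks it wherever the 3sg features are present.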
marantz at mit.edu

From mcginnis at ucalgary.ca Tue Jan 11 16:44:20 2000
From: mcginnis at ucalgary.ca (Martha McGinnis)
Date: Tue, 11 Jan 2000 09:44:20 -0700
Subject: Alec Marantz: Lecture Notes II
Message-ID:

Dear List-ers,

Without waiting for response to the first set of lecture notes, I present you with set two. Please refer to the readings listed at the beginning of the first set of notes for an indication of what I'm talking about when I discuss "Fodor" and "Pustejovsky."

--Alec

Morphology II, 9/17/99
Roots and decomposition

1. Fodor asks us to separate the linguistic from the conceptual, and decompositional features from other properties. Fodor agrees that it's odd to say, "John began the rock" (out of context) because rocks aren't events and don't imply a characteristic event (as "book" does). However, he asks what reason there is to believe our knowledge about rocks is either linguistic or decompositional.

2. Pustejovsky retorts that Fodor is simply avoiding the interesting question: how can we represent the knowledge that speakers have about "rock" that leads to their judgements about sentences containing "rock"? If the knowledge that rocks don't name/imply events is conceptual, it at least must be represented in a way that interacts with the representation of linguistic semantics to yield sentence meanings from the linguistic system.

3. Fodor points out that inferences based on Pustejovsky-style representations are defeasible ("I baked a cake" sometimes implies I made the cake, but "I removed the twinkie from its cellophane wrapper and baked it in the sun" is fine with no "creative" implications). On the other hand, if "I baked a cake," then it's true that I baked a cake. (And if "I saw cats" is true, then there is more than one cat such that I saw it.) The question would be: where do Pustejovsky's observations belong, and is his "theory" (simply) an elaborate method to encode some of these generalizations?

4.
The decompositional nature of Pustejovsky's lexical entries implies that it makes sense to consider, e.g., what a lion would be if it weren't an animal. Or what a cake would be if it weren't made by baking. Or what a hammer would be if it weren't used for hammering. Cf. what's "raising" without agentive cause (=rising). If a "singer" is "one who sings," what is "John sings"?

From the standpoint of morphology, one would ask why it is that the categories Pustejovsky deals in aren't morphologically expressed in any language. Why is there no "telos" morpheme indicating the class of something according to what it's for?

5. Three dimensions of controversy over root meanings:
A. Linguistic nature of meanings
B. Criteria of identity between roots
C. Private nature of meanings/source of meanings in individual

6. Linguistic nature of meanings:

a. Are there strict meaning/syntax correlations - for the type of meaning categories Pustejovsky uses in his decompositions? Fodor questions whether distributional facts distinguishing "eat" and "devour" represent a true generalization about semantics/syntax mappings.

I ate it up/I devoured it up/I finished it up.
I ate it raw/I devoured it whole

Cf. Keyser & Roeper on the "abstract clitic" hypothesis.

b. Are the word-internal decompositional meanings ever expressed systematically in syntactic composition? Compare eat/devour with 'walk around in circles'/'walk to the store'; bake/bake+applicative in languages with overt applicative morphology and double object applicative constructions.

c. Are the semantic decompositional categories ever systematically marked by overt morphology in any language? Compare the animate/inanimate distinction with the 'inherently for doing something'/'created in a particular way' distinction; stage/individual level vs. state/event distinction.

6. Note that this discussion takes place against the bedrock assumption of compositionality in the syntax.
Everyone agrees that the semantics of "kick the ball" is compositional. One could ask whether monomorphemic "blick" could mean "kick the ball" (cf. Jackendoff on "kick the bucket" = "die"). Or one could ask whether there isn't a different sort (different primitives, different means of composition) of compositionality below the word level, where the syntax and the below-the-word-level composition mix only at the word level; the syntax never does what the word-internal mechanisms do and vice-versa.

7. Criteria of identity between roots

Polysemy (same thing has multiple meanings) vs. homophony (different things have same sound). Pustejovsky claims that Fodor provides no account of productive polysemy. Fodor claims that Pustejovsky's polysemy is sometimes homophony and sometimes requires no linguistic account (and thus is technically not polysemy or homophony). Is "bake" polysemous or homophonous? Is it the same "cat" in "The cat is on the mat" and "Don't let the cat out of the bag"? Bank of a river vs. bank with the money. Clearly these questions can't be asked without a fairly articulated linguistic theory.

Morphophonology will be crucial here - where is information about allomorphy stored? With the phonological form of a root? With the meaning/identity of the root as an interpreted object? Somehow in a system of interrelated stored words? Consider the "flied out to center field" type of example. What does this tell us about the "fly" in "fly out"? Is this a question of structure or of identity?

Related to allomorphy: conjugational/declension classes of roots/stems. What's the connection between the identity of a root and its (arbitrary) class? See Embick on deponent verbs in Latin.

8. Private nature of meanings/source of meanings in individual

What makes your "cat" the same as my "cat"?

a.
Nothing; they're not the same, but they're close enough. The biology of cats and the biology/psychology of humans interact to make my concept of "cat" close enough to your concept that communication is possible and almost perfect. If chimps could talk, their "cat" would be more or less the same as our "cat" as well. However, if snakes could talk, their "cat" might be quite different.

b. The nature of cats

"Cat" is a natural class in nature. A more or less "blank slate" perceptual/learning system will discover the class through interaction with the environment. So "cat" is a class of nature and an emergent category of the perceptual/conceptual system. If snakes could talk, their cat category would be the same as ours - or they would be getting things wrong.

c. The structure of the human linguistic and conceptual system

Jackendoff and Pustejovsky more or less adopt this stance. "Cat" is like "plural," only more complex. The feature space of language (of natural language semantics) leaves a "cell" in the paradigm for "cat," waiting to be filled during language acquisition. Here, chimps can't talk so they can't have our concept of "cat." Why chimps seem to be able to do complex things that would implicate complex conceptual structures, yet don't/can't use language, is an interesting question for Jackendoff (along with why most people have a very hard time describing scenes they see and can operate within).

The issues here are tightly bound up with questions of acquisition. Both in a. and in c., the child will know a cat when s/he sees one and will be able to say, oh, "cat" means cat. Both scenarios leave open the possibility of suppletion for roots like "cat" (singular, "cat," plural, "blicks"), depending crucially though on the exact structure of the grammar (where do roots enter the grammar and in what form?).

9. Pustejovsky can't win simply by pointing out what we know about the meanings of words in sentential contexts.
Fodor is denying it makes sense to separate the word from the concept, and he therefore refuses to accept evidence about speakers' knowledge of concepts as evidence for anything other than knowledge of concepts. Fodor is thus committed to a modularity between linguistic structure and conceptual meanings; what's "below" the word level isn't strictly speaking linguistic, and isn't compositional. Pustejovsky must show that there's a strong resemblance between the compositionality below and above the word level. Fodor accepts the what-you-see evidence of syntactic composition as sufficient for supporting syntactic decomposition; I imagine he could be convinced to see overt morphological composition in the same way, arguing for word decomposition when you can see the composition. Since Pustejovsky is pushing decomposition where there is no overt composition, he needs very strict, reproducible evidence for syntax/semantics correlations (in putative cases of polysemy and for supporting a claim of a semantic feature with syntactic consequences). Fodor says Pustejovsky fails empirically, and I tend to agree.

marantz at mit.edu

From mcginnis at ucalgary.ca Mon Jan 24 18:18:38 2000
From: mcginnis at ucalgary.ca (Martha McGinnis)
Date: Mon, 24 Jan 2000 11:18:38 -0700
Subject: DM list posting re: alec's lecture notes
Message-ID:

dear DM-listers -- I've read Alec's posting now, but not the reading list it came with; here are a few quick questions/comments off the cuff, with caveats about my uneducatedness, late-afternoon-on-fridayness, etc.ness.

A.
w/r to posting one, para 6 and para 19 Ai: seems like the lack of a paradigmatic structure for root morphemes (lack of gaps) is a coherent reason for assuming that roots are not subject to competition/not paradigmatic in any sense relevant to language, contra the representation of Jackendoff 'n Pustejovsky in posting two, para 8a, and in favor of Rolf's and my distinction between l- (non-competing roots) and f- (competing non-roots) affixes. Is there any way to get the competes/doesn't distinction result out of the stipulative realm and into the principled? I.e. Noyer & Harley argued for a sorting of the Vocabulary Items into roots and everything else, on the basis of competition. What does that correlate with? Can't be association with encyclopedic info (or maybe it can?), because derivational morphemes are in the f-morpheme block too (unless the types of info derivational morphemes seem to refer to ("agent of X action", "state or quality of X..." etc.) have privileged status? e.g. derivational morphemes occur in restricted syntactic frames that encode these meanings, much as we (some o' us) ascribe the appearance of theta-roles to certain frames/f-morphemic relations?)

Question: in all you English-speakers' judgements out there, does the benefactive alternation when used with a potentially creation-implying verb like "bake" force the creation reading? That is, can "I baked Mary a Twinkie" mean "I put a Twinkie in the sun for Mary"? Or can it only mean "I created a Twinkie for Mary"? I can't tell; I've convinced myself in both directions in the past hour.

B. You imply (Alec) that Fodor could be convinced of the reality of morphological decomposition, and admit morphologically complex words to contain within them several atomic concepts. Indeed, he's basically said as much, that I've read. Could he, however, be convinced of the existence of zero morphemes? (My personal favorite, of course, being v=CAUSE in English).
Alternatively, could he be convinced of the conceptual complexity of multimorphemic words whose morphemes don't obviously mean anything? (Latinate words in English, I'm mainly thinking of here).

With respect to the last parentheses: in posting two, para 6a, is there supposed to be a star/question mark on "I devoured it up"? Contextually, I assume there's not; it seems to be presented as an argument against the devour=telic generalization, I think. Judgementally, though, it seems pretty lousy to me, and it reminds me of remarks I've heard on some occasions from Alec and one particular discussion w/ Rolf and Alec, where the conjecture arose that English speakers aggressively segment multisyllabic words (with the appropriate properties) into morphemes, and that this has syntactic ramifications. In particular, we concluded that the "a-" in "arise" is probably a prepositional head functioning like a particle in a verb-particle construction, forcing the verb into a certain aktionsart. And (many apologies if I'm remembering this wrong) it seems to me that I've heard the suggestion of a similar account of the failure of dative shift in Latinate-sounding ditransitive verbs from you, Alec: they are aggressively segmented (do+nate, e.g.), and have a corresponding syntactic complexity, which forces them into the double complement construction, not permitting whatever the necessary zero-morph is that gets you the double object construction (let's, say, call it G). Seems like a similar story for the failure (if it exists) of "*devour it up" would work nicely.

In this vein, I've recently been talking loosely with Mike Hammond about phonological evidence for aggressive morphological segmentation in English; among other things, if Latinate words are monomorphemic in English, then a bunch of medial consonant clusters that are otherwise forbidden would have to be allowed.
For example, the medial cluster [dh] is forbidden in English -- unless "adhere" is monomorphemic, in which case it occurs in English, but only in words of Latinate origin. Mike says that there are a lot of nice phonological generalizations to make if you can claim that, e.g., "honest" is aggressively segmented into "hon+est". The point for current purposes, though, is: could Fodor be convinced of conceptual complexity on the basis of morphological complexity *in the absence of* good clear concept-morpheme matches?

Another Fodorian remark: I've just read his recent book "Concepts: Where Cognitive Science Went Wrong", which is quite the cogent defense of conceptual atomism, and find to my surprise at the end that Fodor's conceptual atomism no longer entails for him radical nativism: he's now prepared to think that we do acquire DOORKNOB and aren't born with it, having come up with a metaphysical out that satisfies him. Just FYI. It's got a very entertaining and cutting chapter in it called "The Demise of Definitions (Part I): the Linguist's Tale."

C: w/r to word-internal morphological rules being different or not from syntactic rules: Pinker presents a bunch of well-known facts in "Words and Rules" about the difference between morphological and syntactic composition; I'd sort of forgotten about them, but I was wondering what people (those of us for whom it's syntax all the way down) thought about 'em. He says: people have confusion about how to treat sequences like 'hole in one', 'gin and tonic', 'jack-in-the-box', 'mother-in-law'. Some analyse them as regular phrasal idioms, with associated syntactic structure, and inflect them as such: 'holes in one', 'jacks in the box', etc. Others analyse them as complex morphological words, without phrasal components (e.g. as flat structures rather than N+PP), and inflect them like words: 'hole in ones', 'jack in the boxes'. What does a Distributed Morphologist have to say about flat-structure analyses of these sequences?
Are they multimorphemic? If so, why their evident non-phrasalness? I'm sure I'm embarrassingly naive about this. But help is very welcome. So, that's it from me for the moment. Thoughts? comments? derogatory remarks? best, hh --------------------------------------------------------------------- Heidi Harley (520) 626-3554 Department of Linguistics hharley at u.arizona.edu Douglass 200E Fax: (520) 626-9014 University of Arizona Tucson, AZ 85721 From mcginnis at UCALGARY.CA Thu Jan 27 18:18:00 2000 From: mcginnis at UCALGARY.CA (Martha McGinnis) Date: Thu, 27 Jan 2000 11:18:00 -0700 Subject: Martha McGinnis: Responses to Alec's and Heidi's postings In-Reply-To: Message-ID: Here are some comments and questions in response to the previous three postings. Cheers, Martha ---------------------------------- Alec's first posting: >8. Anderson: Forms of a word are created from a base form via >morphophonological rules. On this logic, the base form is the (universal) >default form, since no morphophonological rules need to apply to a base >with a full set of syntactic features (assigned to the base by the syntax). I'm not sure what this means -- what's a universal default form? And if forms of a word are created by the application of morphophonological rules to a base form, why say that no such rules need to apply to a base form with a full set of syntactic features? >10. Why isn't a whole sentence simply a "form of a word" in some >paradigm space defined by the syntax? > >Everyone assumes that one reaches bedrock at the "content" words. So, "The >cat is on the mat," can't be a form of the word "cat" since "mat" must be >an independent word. However, "He's always singing," could be a form of >the word "sing," and might be realized as a single phonological word in >some language. Indeed. A fine observation! >14. Let's suppose that this idealized view of grammar is too >simplistic. 
On the syntax side, for example, sets of features, as opposed >to individual features, may operate as atomic units in the syntax (which >means that the features aren't combined by "merge" and thus that their >combination need not be interpreted at PF or LF). Trying to figure out what this would mean... Say one language lexically bundles together Tense and Agr (or, say, Aspect) into a single head, while another projects them as two separate heads (e.g. along lines suggested in Bobaljik's thesis). Would we expect a difference in LF interpretation? >23. Suppose true modularity for root semantics. The features of roots, >be they constituitive or classificatory, may be irrelevant/invisible to the >grammar. This assumption predicts no syncretism/suppletion for roots. Does it? As (e.g.) Noyer showed in his thesis, there's both contextual and intrinsic specification of vocabulary items. Even if the intrinsic features of roots are invisible to the grammar, the features of functional categories in their local context are visible -- so we might expect contextually determined root suppletion. Alec has argued in the past that any specification of root vocabulary items will lead to trouble, under the DM claim that more highly specified vocabulary items will always block less specified vocabulary items if both will fit into a particular context. Suppose, for example, that we had a root vocabulary item specified for nominal plural contexts ("cattle", say), while other root vocabulary items are unspecified for the singular/plural distinction ("cat"). Assuming that Encyclopedic features (distinguishing cattle from cats) are invisible to the grammar, "cattle" will ALWAYS block "cat" from being inserted in a plural nominal context. Thus (the reasoning goes), under DM, "cattle" must not be specified for nominal plural contexts. But, as far as I can see, this is a stipulation -- it doesn't follow from the invisibility of the features of roots. 
Moreover, I don't think it's the only logical possibility. Another possibility is that "cattle" can't be inserted into the nominal plural contexts into which "cat" can be inserted because "cattle" is inconsistent with some _other_ feature in these contexts (for argument's sake, say [- herd animal]) -- that is, "cat" and "cattle" belong to distinct root classes. We would then predict other differences in their syntactic distribution. This alternative may be a non-starter for "cat" and "cattle," but it seems more promising for roots that participate in different verb alternations, for example. ---------------------------------- Alec's second posting: >4. The decompositional nature of Pustejovsky's lexical entries implies >that it makes sense to consider, e.g., what a lion would be if it weren't >an animal. Or what a cake would be if it weren't made by baking. Or what >a hammer would be if it weren't used for hammering. > >Cf., what's "raising" without agentive cause (=rising). If a "singer" is >"one who sings," what is "John sings"? OK, I give up. What IS "John sings"? (No idea what this is about.) (Heidi wrote: >With respect to the last parentheses, in posting two, para 6a, is >there supposed to be a star/question mark on "I devoured it up?" I assumed so, but found this bit hard to follow too.) >7. Criteria of identity between roots > >polysemy (same thing has multiple meanings) vs. homophony (different things >have same sound) Eh? I thought polysemy was used for cases where the meanings typically have a shared core (e.g. cases of underspecification in DM), while homophony was used for cases where distinct meanings "accidentally" have the same pronunciation (e.g., separate vocabulary items in DM). Or maybe that's what was meant here... >8. Private nature of meanings/source of meanings in individual... > > a. Nothing; they're not the same but they're close enough > b. The nature of cats > c. 
The structure of the human linguistic and conceptual system > >The issues here are tightly bound up with questions of acquisition. Both >in a. and in c., the child will know a cat when s/he sees one and will be >able to say, oh "cat" means cat. Both scenarios leave open the possibility >of suppletion for roots like "cat" (singular, "cat," plural, "blicks"), >depending crucially though on the exact structure of the grammar (where do >roots enter the grammar and in what form?). I don't see the connection between these different views and root suppletion. Does (b) rule out the possibility of root suppletion? How? ---------------------------------- Heidi's posting: >Question: in all you English-speakers' judgements out there, does the >benefactive alternation when used with a potentially >creation-implying verb like "bake" force the creation reading? That is, >can "I baked Mary a Twinkie" mean "I put a Twinkie in the sun for Mary"? >or can it only mean "I created a Twinkie for Mary?" I think either is possible, given appropriate context. >[Pinker] says: people are confused about how to treat sequences like >'hole in one', 'gin and tonic', 'jack-in-the-box', 'mother-in-law'. Some >analyse them as regular phrasal idioms, with associated syntactic >structure, and inflect them as such: 'holes in one', 'jacks in the box', >etc. Others analyse them as complex morphological words, without phrasal >components (e.g. as flat structures rather than N+PP), and inflect them >like words: 'hole in ones', 'jack in the boxes'. What does a Distributed >Morphologist have to say about flat-structure analyses of these sequences? Titles are an even more extreme case -- why don't they make any more "One Flew Over the Cuckoo's Nest"'s or "Guess Who's Coming to Dinner"'s? I don't have anything insightful to say about these myself, but is there any reason why we couldn't add a nominalizing head (little n) to any old phrasal constituent and pluralize it at will? 
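[Editorial illustration, not part of the original posting.] The blocking reasoning Martha discusses above -- a more highly specified vocabulary item always beats a less specified one whenever both fit a context -- and her root-class alternative can both be sketched as a toy competition procedure. The feature labels (`plural`, `classX`, `classY`) and the scoring by feature count are invented assumptions for illustration only:

```python
# Toy sketch of vocabulary-item competition (feature names invented).
def insert(node_features, items):
    """Pick the most highly specified vocabulary item whose
    features are a subset of the terminal node's features."""
    candidates = [(name, feats) for name, feats in items
                  if feats <= node_features]
    # "Elsewhere" competition: more specified beats less specified.
    return max(candidates, key=lambda c: len(c[1]))[0]

# Scenario A: root features invisible, "cattle" specified [plural].
items_a = [("cat", set()), ("cattle", {"plural"})]
print(insert({"plural"}, items_a))  # "cattle" always blocks "cat"

# Scenario B (the alternative above): the roots belong to distinct
# classes, so "cat" and "cattle" never compete for the same node.
items_b = [("cat", {"classX"}), ("cattle", {"classY", "plural"})]
print(insert({"classX", "plural"}, items_b))  # only "cat" fits
print(insert({"classY", "plural"}, items_b))  # only "cattle" fits
```

Under Scenario B the two items are never in competition at all, which is what yields the predicted difference in syntactic distribution.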
mcginnis at ucalgary.ca From mcginnis at UCALGARY.CA Fri Jan 28 22:34:28 2000 From: mcginnis at UCALGARY.CA (Martha McGinnis) Date: Fri, 28 Jan 2000 15:34:28 -0700 Subject: Dan Everett: Response to McGinnis posting Message-ID: > Martha McGinnis wrote: > > Here are some comments and questions in response to the previous > three postings. > > Cheers, > Martha > > ---------------------------------- > Alec's first posting: > > >10. Why isn't a whole sentence simply a "form of a word" in some > >paradigm space defined by the syntax? > > > >Everyone assumes that one reaches bedrock at the "content" words. So, "The > >cat is on the mat," can't be a form of the word "cat" since "mat" must be > >an independent word. However, "He's always singing," could be a form of > >the word "sing," and might be realized as a single phonological word in > >some language. > > Indeed. A fine observation! In fact, an entire sentence can be a word (although not exactly like the cases mentioned here). In my grammar of Wari' (ROUTLEDGE), co-authored with Barbara Kern, we discuss several cases of 'verbalization', whereby entire sentences are interpreted as and function as verbs, undergoing derivational morphology, etc. Yet the sentences themselves have undergone WH-Movement, etc. prior to verbalization. Moreover, there are co-reference requirements between constituents of the verbalized sentences and the matrix clause. A summary of the facts is found in my chapter in the _HANDBOOK OF MORPHOLOGY_. I have just emerged from several weeks of fieldwork in the Amazon, so I have missed a lot of the discussion here. But let me second Heidi's recommendation of Fodor's two new excellent books. All linguists oughta read them. 
Dan Everett From mcginnis at ucalgary.ca Mon Jan 10 23:37:01 2000 From: mcginnis at ucalgary.ca (Martha McGinnis) Date: Mon, 10 Jan 2000 16:37:01 -0700 Subject: Alec Marantz: Lecture notes - Fall morphology class Message-ID: Dear DM-List-ers, With the hope of generating more productive discussion about morphological theory, I have decided to post the lecture notes I wrote up for the fall 99 morphology course at MIT. Anticipating some responses to the questions raised in these notes, I delayed posting until after the semester was over, since I was reluctant to both push forward with the topics I promised to discuss in the course and engage thoughtfully in the discussion over the notes on previous topics. I still can't promise to reply to your comments or to clarify everything that is left unclear in the notes. Some formatting may be lost in the pasting of Word documents into these e-mails. I will monitor the results when I receive the postings myself. There should be twelve sets of notes in all; I hope to finish posting them by the end of January. I'm re-reading the notes and making small corrections, particularly in response to class discussions. There will still be errors remaining, along with much opaqueness and unclarity. Please think twice before quoting these notes in print. In addition to flat-out errors, there is sometimes a dialog going on in the text, with positions other than my own being presented for discussion. Yours with some trepidation, Alec Marantz Lecture Notes, Morphology 9/10/99 Jackendoff, R. 1997. "Idioms and Other Fixed Expressions." In The Architecture of the Language Faculty, LI Monograph 28, Cambridge, MIT Press, pp. 153-177. Marantz, A. 1997. "No Escape from Syntax," in A. Dimitriadis et al., eds., U Penn Working Papers in Linguistics, 4.2, pp. 201-225. Fodor, J.A., & E. Lepore. 1998. "The Emptiness of the Lexicon." LI 29.2, 269-288. Pustejovsky, J. 1998. "Generativity and Explanation in Semantics." LI 29.2, 289-311. 1. 
Lieber, Deconstructing Morphology "The starting point for this theory of word formation is a somewhat odd one - the fringes of morphology, so to speak, where the syntax of words and that of phrases seem to converge." a floor of a birdcage taste over the fence gossip off the rack dress God is dead theology 2. This betrays the hidden assumption that the syntax deals with the arrangement of words. 3. Jackendoff, The Architecture of the Language Faculty "What is in the lexicon and how does it get into sentences? So far, we have dealt only with the lexical insertion/licensing of morphologically trivial words like cat (chapter 4). Now we turn to morphologically complex elements of various sorts." There's an illusion that we know what a word is -- "cat" is a word - and an assumption that we do syntax with words. Of course every syntactician violates this assumption from square one. D'ya wanna debate this point? Latin: fe:les NOM singular fe:lem ACC fe:lis GEN 4. Suppose we begin with the features or categories of language. In phonology, phonological features are constituitive of sounds - that is, if you strip the features and structure from a sound, there is no remnant; there's nothing left. Are there non-constituitive (say, classificatory) features to the terminal (combining) nodes of the syntax (or the Lexicon)? 5. cat has at least a root and a category feature ("noun"). If syntactic categories can be classificatory features of roots, then the word cat could be a syntactic and Lexical atom - an atom with a syntactic category. However, if syntactic features are constituitive, then there must be a remnant to cat when its nounness is removed (just like there's a remnant to a /t/ when its [-voice] is removed). 6. Word and Paradigm theories, like A-Morphous Morphology, assume that words have properties required by the syntax. Suppose Word and Paradigm theories existed without the notion of a "paradigm." 
The words of a language, an arbitrary set, would have whatever features (properties) they have. Syntactic combination, however, requires words with particular properties. So Latin direct objects must have the ACC property. Only if there is a word meaning 'cat' with the property ACC in the language will Latin allow 'cat' as a direct object. 7. Paradigm theory says that "words" have different "forms." The set of features for, say, a Noun that the syntax might require (all the different cases, for example) form a paradigm space (crossing the features, e.g., case and number, yields an n-dimensional space, e.g., 2 dimensional case by number space). Words come in families identified as forms of the same word. The forms of a word fill out paradigm space, so a particular form may fill several "cells." 8. Anderson: Forms of a word are created from a base form via morphophonological rules. On this logic, the base form is the (universal) default form, since no morphophonological rules need to apply to a base with a full set of syntactic features (assigned to the base by the syntax). 9. Anderson: The features realized by inflectional morphology - that is, the features that determine paradigm space - differ from language to language. For example, in some languages number on nouns is inflectional and in some it is derivational. Since the syntax itself must treat inflectional features specially, this means that the basic operation of the syntax is different from language to language. 10. Why isn't a whole sentence simply a "form of a word" in some paradigm space defined by the syntax? Everyone assumes that one reaches bedrock at the "content" words. So, "The cat is on the mat," can't be a form of the word "cat" since "mat" must be an independent word. However, "He's always singing," could be a form of the word "sing," and might be realized as a single phonological word in some language. 11. Should we take anything for granted? 
If we don't actually do syntax with phonological words, is there any reason to suppose that we have a notion of "word" suitable for the atoms of syntax? Should we assume that cat is atomic, that it is an atomic Noun, that it is a syntactic atom, that we even know what Jackendoff is referring to when he writes, "morphologically trivial words like cat"? 12. The basic questions any theory of grammar must ask include, what are the atomic, combinatory units of the syntax, what are the atomic, combinatory units of the (morpho-) phonology, and how do these atomic units connect in the derivation of a phonological form? 13. One could imagine that, as far as the grammar is concerned, there are only "constituitive" features, each feature is an atomic, combinatory unit in the syntax, and each feature corresponds to an atomic morphophonological unit, call it a "vocabulary item." To the extent that roots (the "remnants" of "content words" when all constituitive features (save the root) are stripped away) have classificatory features, they may play a role in semantic interpretation (at the LF interface with conceptual systems) but would not enter into any grammatical principle (rule, constraint, what have you...). 14. Let's suppose that this idealized view of grammar is too simplistic. On the syntax side, for example, sets of features, as opposed to individual features, may operate as atomic units in the syntax (which means that the features aren't combined by "merge" and thus that their combination need not be interpreted at PF or LF). On the phonology side, sets of syntactic features may correspond to single vocabulary items. We call the theory of pre-syntactic "bundling" of features into single atomic units for the syntax the theory of the Lexicon, where the bundles are the Lexical items. The strongest claim we can make (other than the claim implied in 13. 
- that there is no Lexicon in this sense and that therefore every feature is a syntactic atom subject to "merge") is that there is no "bundling" (Fusion) outside the Lexicon and that therefore every syntactic atom is the locus for vocabulary insertion in the phonology. 15. Jackendoff explicitly assumes that "mismatches" between "conceptual structure" and "phonological structure" can include arbitrary many-to-one correspondences wherein whole chunks of hierarchically arranged syntactico-semantic atoms correspond to atomic phonological pieces. So simplex "went" can correspond to [GO + Past] (and simplex "kill" could be CAUSE(X, DIE(Y))). 16. Fodor, on the other hand, supposes (in some sense, to be made clear) a one-to-one correspondence between words and concepts. Note that Fodor acknowledges that "cats" has "cat" in it, and that "book shelf" probably has "book" in it as well. He doesn't dwell on the issue of whether "redden" has "red" in it, or "raise" includes "rise." The issue he's after is whether logical inference is run off constituitive features of (in our terms) roots. Is the conclusion that "John is unmarried" from "John is a bachelor" made on the basis of a decomposition of "bachelor" into features like "unmarried" (with or without remnants) or is this conclusion drawn on the basis of classificatory features (or properties) of "bachelor"? Fodor never considers the possibility that "bachelor" decomposes into the root "bachel" and the suffix "or," where the nominalizing suffix -or here is associated with the meaning of "occupation" or "socially identifying classification of a human" (cf. butcher, pensioner, widower, spinster...). 17. For Jackendoff, in the best of all possible worlds, phonological structure transparently reflects conceptual structure and logical inference can be run formally off conceptual structure. What's interesting is that both Jackendoff and Fodor argue essentially from a "what you see is what you get" perspective. 
Fodor sees words and grapples with what he takes as a transparent truth that linguistic structure doesn't support logical inference. Jackendoff sees just slightly messed up conceptual structures in phonological structures, once the proper form of lexical representations is revealed. Both think they know a word and a noun when they see one (cat). 18. The failure to question the obvious has led morphology into darkness and despair. A. No one takes the serious morphologists seriously. I don't know of a single phonologist or syntactician that has really come to grips with Anderson or Lieber (adopted their theories - I mean really adopted, rather than waved at what they thought the theories were - or rejected the theories on reasoned grounds), for example. Most syntacticians and phonologists operate outside any defensible or well-supported theory of morphology (and I'm willing to name names here). B. The serious morphologists, on the other hand, who generally tend to be phonologists at heart, end up making up their own theories of syntax, by themselves. Anderson and Lieber think they're adopting standard syntactic theories, but they were several years out of date in syntax when they created their theories, and since syntacticians didn't take them seriously anyway, they were never subjected to the type of critique of their understanding of syntax that could have shown them the error of their ways. [Wunderlich and Beard just make the syntax up - as a serious enterprise for Wunderlich, although out of the mainstream.] C. Phonologists generally punt the morphology entirely these days. Standard Optimality Theory is simply [i.e., literally] incoherent from the standpoint of morphology. [It is impossible to tell how standard OT deals with the usual issues of morphological theory.] 19. We know some things about morphology. A. First, there's a paradigmatic dimension to at least non-root morphemes. i. "gaps" are the exception rather than the rule. 
So we expect every noun to be able to appear in every case position, every verb to have a past tense, etc. ii. there is "competition" among forms for the realization of features, a competition that yields syncretism and (at least the appearance of) underspecification. This paradigmatic quality of morphology already disconfirms standard theories of Lexical Morphology in which the feature structure of a word is built up via percolation or other computation over the feature structure of morphemes qua lexical atoms of sound/meaning correspondence. You can't prevent "He comb his hair" for "He combs his hair" unless you have competition, and thus "separation" in Beard's sense. You have to know that you're shooting for third person singular and thus "combs" (really, /z/) can beat out "comb" (really, /ø/) for the realization of third person, being more specific than (default) "comb"/zero. Proponents of contemporary Lexical Morphology reintroduce paradigms in some other way (see Lieber and Wunderlich), but these mechanisms have never seriously been examined by anyone. B. Second, the phonological realizations of syntactic features are pieces with properties (we call them vocabulary items); thus A-Morphous Morphology as a general approach to morphophonology can't be right. In fact, as shown by Halle and Marantz, A-Morphous Morphology as a theory (as opposed to a descriptive framework in which anything could be described) had no properties independent from its principles of disjunction, which are empirically disconfirmed with data that no one has ever disputed. And Anderson's new version of A-Morphous Morphology has pieces - his theory of clitics (which must be treated as the output of morphophonological rules by the general principles of A-Morphous Morphology) requires constraints ordering the clitics as pieces with properties. 20. 
The paradigmatic and piece properties of morphology require something like Distributed Morphology - that is, they require a theory in which phonological pieces are organized into hierarchical tree structures and in which these pieces are underspecified with respect to syntactic features and compete with each other for the realization of syntactic features. All the burning issues in morphology can be stated as disputes within this general framework. There are no alternatives to Distributed Morphology (i.e., a theory with late insertion of piece-like vocabulary items into non-root terminal nodes, where the vocabulary items are underspecified with respect to the syntactic features they realize and where vocabulary insertion is governed by "elsewhere" competition). A real alternative would need some explicit account of the paradigmatic and piece properties of morphology. 21. Nevertheless, we will find some recent proposals for alternative theories in the literature. In particular, we will review some proposals for notions of "lexical relatedness" other than "shares the same pieces," where connections between "stored" derived and/or inflected forms play a role in the grammar. An extreme here is the Bybee/Burzio position that speakers store all tokens of all words (for Bybee, probably all sentences) that they hear and draw generalizations over stored forms. Steriade makes related claims. We have to ask how we can evaluate these claims, which are literally incoherent - i.e., do not cohere with any general theory of grammar. 22. Learning roots like "cat" probably requires a prior conceptual space, such that you know the meaning of "cat" before you learn the word ("oh, that's what we call those things!"). Still, an innate conceptual space may or may not imply a decomposition of meaning into constituitive features (is "fuzzy" a necessary/definitional property of "cat"?). 
In any case, the essential issue for linguistics is whether the feature space that creates the meaning of "cat" plays any grammatical role, similar to that played by features like [+past]. If the features of roots play a role in the grammar, then we might expect syncretism (from underspecification) and/or suppletion (contextual allomorphic vocabulary items) for roots. 23. Suppose true modularity for root semantics. The features of roots, be they constituitive or classificatory, may be irrelevant/invisible to the grammar. This assumption predicts no syncretism/suppletion for roots. 24. General assumptions: The Distributed Morphology (Sept. 99 Marantz version) Framework.
* = inevitable assumption (conceptual necessity)
! = arguable assumption/assumption with testable consequences

There exists a Universal set of grammatical features, U *
A language chooses a subset of U for its grammar !
The language bundles some subsets of its subset into Lexical Items !
The Lexical Items include only a generic root node and no specific roots !
Root nodes aren't bundled with any features into Lexical Items !
Syntactic Merger is the only process that combines Lexical Items *
The phonological word is not a cyclic domain of the syntax !
Re-Merger (Morphological Merger) occurs at PF (i.e., without LF consequences) !
Vocabulary Insertion and phonological realization occurs at the "phase" level *
As McCawley emphasized, the cyclic nature of grammatical derivation may be the most important and least questionable discovery of generative grammar.
Vocabulary items are connections between features of U and phonological forms !/*
Vocabulary insertion works bottom up !
Features from U in Lexical Items may be deleted prior to VI (Impoverishment) !
There may be multiple VI in a single Lexical Item (fission) !
There are Vocabulary Item specific ordering constraints at a single hierarchical level ! 
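[Editorial illustration, not part of the notes.] Several of the assumptions above -- underspecified vocabulary items, "elsewhere" competition at insertion, and Impoverishment deleting features prior to Vocabulary Insertion -- can be sketched as a toy procedure. The feature labels and the /z/ vs. zero agreement items (from the "combs"/"comb" competition in 19A) are assumptions made for the example:

```python
# Toy sketch of late insertion with competition and Impoverishment.
# Feature labels are invented for illustration.
AGR_ITEMS = [
    ("/z/", {"3", "sg", "present"}),  # third singular present
    ("/0/", set()),                   # elsewhere (default) item
]

def impoverish(features, to_delete):
    """Delete features from a terminal node prior to insertion."""
    return features - to_delete

def vocab_insert(features, items):
    """Most highly specified item whose features fit the node wins."""
    fitting = [it for it in items if it[1] <= features]
    return max(fitting, key=lambda it: len(it[1]))[0]

# 3sg present: /z/ fits and is more specified than the default.
print(vocab_insert({"3", "sg", "present"}, AGR_ITEMS))  # -> /z/
# 1sg present: only the elsewhere item fits.
print(vocab_insert({"1", "sg", "present"}, AGR_ITEMS))  # -> /0/
# Impoverishing [sg] bleeds /z/, forcing the default form.
print(vocab_insert(impoverish({"3", "sg", "present"}, {"sg"}),
                   AGR_ITEMS))  # -> /0/
```

The point of the sketch is only that underspecification plus the subset condition yields both syncretism (one default item covers many cells) and the blocking behavior that an "elsewhere" competition requires.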
marantz at mit.edu From mcginnis at ucalgary.ca Tue Jan 11 16:44:20 2000 From: mcginnis at ucalgary.ca (Martha McGinnis) Date: Tue, 11 Jan 2000 09:44:20 -0700 Subject: Alec Marantz: Lecture Notes II Message-ID: Dear List-ers, Without waiting for response to the first set of lecture notes, I present you with set two. Please refer to the readings listed at the beginning of the first set of notes for an indication of what I'm talking about when I discuss "Fodor" and "Pustejovsky." --Alec Morphology II, 9/17/99 Roots and decomposition 1. Fodor asks us to separate the linguistic from the conceptual, and decompositional features from other properties. Fodor agrees that it's odd to say, "John began the rock" (out of context) because rocks aren't events nor imply a characteristic event (as does "book"). However, he asks what reason there is to believe our knowledge about rocks is either linguistic or decompositional. 2. Pustejovsky retorts that Fodor is simply avoiding the interesting question: how can we represent the knowledge that speakers have about "rock" that leads to their judgements about sentences containing "rock"? If the knowledge that rocks don't name/imply events is conceptual, it at least must be represented in a way that interacts with the representation of linguistic semantics to yield sentence meanings from the linguistic system. 3. Fodor points out that inferences based on Pustejovsky-style representations are defeasible ("I baked a cake" sometimes implies I made the cake, but, "I removed the twinkie from its cellophane wrapper and baked it in the sun" is fine with no "creative" implications). On the other hand, if "I baked a cake," then it's true that I baked a cake. (And if "I saw cats" is true, then there is more than one cat such that I saw it.) The question would be, where do Pustejovsky's observations belong, and is his "theory" (simply) an elaborate method to encode some of these generalizations. 4. 
The decompositional nature of Pustejovsky's lexical entries implies that it makes sense to consider, e.g., what a lion would be if it weren't an animal. Or what a cake would be if it weren't made by baking. Or what a hammer would be if it weren't used for hammering. Cf., what's "raising" without agentive cause (=rising). If a "singer" is "one who sings," what is "John sings"? From the standpoint of morphology, one would ask why it is that the categories Pustejovsky deals in aren't morphologically expressed in any language. Why is there no "telos" morpheme indicating the class of something according to what it's for? 5. Three dimensions of controversy over root meanings: A. Linguistic nature of meanings B. Criteria of identity between roots C. Private nature of meanings/source of meanings in individual 6. Linguistic nature of meanings: a. Are there strict meaning/syntax correlations - for the type of meaning categories Pustejovsky uses in his decompositions? Fodor questions whether distributional facts distinguishing "eat" and "devour" represent a true generalization about semantics/syntax mappings. I ate it up/I devoured it up/I finished it up. I ate it raw/I devoured it whole. Cf. Keyser & Roeper on "abstract clitic" hypothesis. b. Are the word-internal decompositional meanings ever expressed systematically in syntactic composition? Compare eat/devour with 'walk around in circles'/'walk to the store'; bake/bake+applicative in languages with overt applicative morphology and double object applicative constructions. c. Are the semantic decompositional categories ever systematically marked by overt morphology in any language? Compare the animate/inanimate distinction with the 'inherently for doing something'/'created in a particular way' distinction; the stage/individual level vs. state/event distinction. 6. Note that this discussion takes place against the bedrock assumption of compositionality in the syntax. 
Everyone agrees that the semantics of "kick the ball" is compositional. One could ask whether monomorphemic "blick" could mean "kick the ball". Cf. Jackendoff on "kick the bucket" = "die". Or one could ask whether there isn't a different sort (different primitives, different means of composition) of compositionality below the word level, where the syntax and the below-the-word level composition mix only at the word level; the syntax never does what the word internal mechanisms do and vice-versa. 7. Criteria of identity between roots polysemy (same thing has multiple meanings) vs. homophony (different things have same sound) Pustejovsky claims that Fodor provides no account of productive polysemy. Fodor claims that Pustejovsky's polysemy is sometimes homophony and sometimes requires no linguistic account (and thus is technically not polysemy or homophony). Is "bake" polysemous or homophonous? Is it the same "cat" in "The cat is on the mat" and "Don't let the cat out of the bag"? bank of a river vs. bank with the money. Clearly these questions can't be asked without a fairly articulated linguistic theory. Morphophonology will be crucial here - where is information about allomorphy stored? With the phonological form of a root? With the meaning/identity of the root as an interpreted object? Somehow in a system of interrelated stored words? Consider the "flied out to center field" type of example. What does this tell us about the "fly" in "fly out"? Is this a question of structure or of identity? Related to allomorphy: conjugational/declension classes of roots/stems. What's the connection between the identity of a root and its (arbitrary) class? See Embick on deponent verbs in Latin. 8. Private nature of meanings/source of meanings in individual What makes your "cat" the same as my "cat"? a. 
Nothing; they're not the same but they're close enough

The biology of cats and the biology/psychology of humans interact to make my concept of "cat" close enough to your concept that communication is possible and almost perfect. If chimps could talk, their "cat" would be more or less the same as our "cat" as well. However, if snakes could talk, their "cat" might be quite different.

b. The nature of cats

"Cat" is a natural class in nature. A more or less "blank slate" perceptual/learning system will discover the class through interaction with the environment. So "cat" is a class of nature and an emergent category of the perceptual/conceptual system. If snakes could talk, their cat category would be the same as ours - or they would be getting things wrong.

c. The structure of the human linguistic and conceptual system

Jackendoff and Pustejovsky more or less adopt this stance. "Cat" is like "plural," only more complex. The feature space of language (of natural language semantics) leaves a "cell" in the paradigm for "cat," waiting to be filled during language acquisition. Here, chimps can't talk so they can't have our concept of "cat." Why chimps seem to be able to do complex things that would implicate complex conceptual structures, yet don't/can't use language, is an interesting question for Jackendoff (along with why most people have a very hard time describing scenes they see and can operate within).

The issues here are tightly bound up with questions of acquisition. Both in a. and in c., the child will know a cat when s/he sees one and will be able to say, oh "cat" means cat. Both scenarios leave open the possibility of suppletion for roots like "cat" (singular, "cat," plural, "blicks"), depending crucially though on the exact structure of the grammar (where do roots enter the grammar and in what form?).

9. Pustejovsky can't win simply by pointing out what we know about the meanings of words in sentential contexts.
Fodor is denying that it makes sense to separate the word from the concept, and he therefore refuses to accept evidence about speakers' knowledge of concepts as evidence for anything other than knowledge of concepts. Fodor is thus committed to a modularity between linguistic structure and conceptual meanings; what's "below" the word level isn't strictly speaking linguistic, and isn't compositional. Pustejovsky must show that there's a strong resemblance between the compositionality below and above the word level.

Fodor accepts the what-you-see evidence of syntactic composition as sufficient for supporting syntactic decomposition; I imagine he could be convinced to see overt morphological composition in the same way, arguing for word decomposition when you can see the composition. Since Pustejovsky is pushing decomposition where there is no overt composition, he needs very strict, reproducible evidence for syntax/semantics correlations (in putative cases of polysemy and for supporting a claim of a semantic feature with syntactic consequences). Fodor says Pustejovsky fails empirically, and I tend to agree.

marantz at mit.edu

From mcginnis at ucalgary.ca Mon Jan 24 18:18:38 2000
From: mcginnis at ucalgary.ca (Martha McGinnis)
Date: Mon, 24 Jan 2000 11:18:38 -0700
Subject: DM list posting re: alec's lecture notes
Message-ID:

dear DM-listers -- I've read Alec's posting now, but not the reading list it came with; here are a few quick questions/comments off the cuff, with caveats about my uneducatedness, late-afternoon-on-fridayness, etc.ness.

A.
w/r to posting one, para 6 and para 19

Ai: seems like the lack of a paradigmatic structure for root morphemes (lack of gaps) is a coherent reason for assuming that roots are not subject to competition/not paradigmatic in any sense relevant to language, contra the representation of Jackendoff 'n pustejovsky in posting two, para 8a, and in favor of Rolf's and my distinction between l- (non-competing roots) and f- (competing non-roots) affixes. Is there any way to get the competes/doesn't-compete distinction out of the stipulative realm and into the principled? I.e., Noyer & Harley argued for a sorting of the Vocabulary Items into roots and everything else, on the basis of competition. What does that correlate with? Can't be association with encyclopedic info (or maybe it can?), because derivational morphemes are in the f-morpheme block too (unless the types of info derivational morphemes seem to refer to ("agent of X action", "state or quality of X..." etc.) have privileged status? e.g. derivational morphemes occur in restricted syntactic frames that encode these meanings, much as we (some o' us) ascribe the appearance of theta-roles to certain frames/f-morphemic relations?)

Question: in all you English-speakers' judgements out there, does the benefactive alternation when used with a potentially creation-implying verb like "bake" force the creation reading? That is, can "I baked Mary a Twinkie" mean "I put a Twinkie in the sun for Mary"? or can it only mean "I created a Twinkie for Mary"? I can't tell; I've convinced myself in both directions in the past hour.

B. You imply (Alec) that Fodor could be convinced of the reality of morphological decomposition, and admit morphologically complex words to contain within them several atomic concepts. Indeed, he's basically said as much, that I've read. Could he, however, be convinced of the existence of zero morphemes? (My personal favorite, of course, being v=CAUSE in English).
Alternatively, could he be convinced of the conceptual complexity of multimorphemic words whose morphemes don't obviously mean anything? (Latinate words in English, I'm mainly thinking of here).

With respect to the last parentheses, in posting two, para 6a, is there supposed to be a star/question mark on "I devoured it up"? Contextually, I assume there's not; it seems to be presented as an argument against the devour=telic generalization, I think. Judgementally, though, it seems pretty lousy to me, and it reminds me of remarks I've heard on some occasions from Alec and one particular discussion w/ rolf and Alec, where the conjecture came up that English speakers aggressively segment multisyllabic words (with the appropriate properties) into morphemes, and that this has syntactic ramifications. In particular, we concluded that the "a-" in "arise" is probably a prepositional head functioning like a particle in a verb-particle construction, forcing the verb into a certain aktionsart. And (many apologies if I'm remembering this wrong) it seems to me that I've heard the suggestion of a similar account of the failure of dative shift in Latinate-sounding ditransitive verbs from you, Alec: they are aggressively segmented (do+nate, e.g.), and have a corresponding syntactic complexity, which forces them into the double complement construction, not permitting whatever the necessary zero-morph is that gets you the double object construction (let's, say, call it G). Seems like a similar story for the failure (if it exists) of "*devour it up" would work nicely.

In this vein, I've recently been talking loosely with Mike Hammond about phonological evidence for aggressive morphological segmentation in English; among other things, if Latinate words are monomorphemic in English, a bunch of medial consonant clusters occur that are otherwise forbidden.
For example, the medial cluster [dh] is forbidden in English -- unless "adhere" is monomorphemic, in which case it occurs in English but only in words of Latinate origin. Mike says that there are a lot of nice phonological generalizations to make if you can claim that, e.g., "honest" is aggressively segmented into "hon+est". The point for current purposes, though, is: could Fodor be convinced of conceptual complexity on the basis of morphological complexity *in the absence of* good clear concept-morpheme matches?

Another Fodorian remark: I've just read his recent book "Concepts: Where Cognitive Science Went Wrong", which is quite the cogent defense of conceptual atomism, and find to my surprise at the end that Fodor's conceptual atomism no longer entails for him radical nativism: he's now prepared to think that we do acquire DOORKNOB and aren't born with it, having come up with a metaphysical out that satisfies him. Just FYI. It's got a very entertaining and cutting chapter in it called "The Demise of Definitions (Part I): The Linguist's Tale."

C: w/r to word-internal morphological rules being different or not from syntactic rules: Pinker presents a bunch of well-known facts in "Words and Rules" about the difference between morphological and syntactic composition; I'd sort of forgotten about them, but I was wondering what people (those of us for whom it's syntax all the way down) thought about 'em.

He says: people have confusion about how to treat sequences like 'hole in one', 'gin and tonic', 'jack-in-the-box', 'mother-in-law'. Some analyse them as regular phrasal idioms, with associated syntactic structure, and inflect them as such: 'holes in one', 'jacks in the box', etc. Others analyse them as complex morphological words, without phrasal components (e.g. as flat structures rather than N+PP), and inflect them like words: 'hole in ones', 'jack in the boxes'. What does a Distributed Morphologist have to say about flat-structure analyses of these sequences?
Are they multimorphemic? If so, why their evident non-phrasalness? I'm sure I'm embarrassingly naive about this. But help is very welcome.

So, that's it from me for the moment. Thoughts? comments? derogatory remarks?

best, hh

---------------------------------------------------------------------
Heidi Harley                      (520) 626-3554
Department of Linguistics         hharley at u.arizona.edu
Douglass 200E                     Fax: (520) 626-9014
University of Arizona
Tucson, AZ 85721

From mcginnis at UCALGARY.CA Thu Jan 27 18:18:00 2000
From: mcginnis at UCALGARY.CA (Martha McGinnis)
Date: Thu, 27 Jan 2000 11:18:00 -0700
Subject: Martha McGinnis: Responses to Alec's and Heidi's postings
In-Reply-To:
Message-ID:

Here are some comments and questions in response to the previous three postings.

Cheers,
Martha

----------------------------------
Alec's first posting:

>8. Anderson: Forms of a word are created from a base form via
>morphophonological rules. On this logic, the base form is the (universal)
>default form, since no morphophonological rules need to apply to a base
>with a full set of syntactic features (assigned to the base by the syntax).

I'm not sure what this means -- what's a universal default form? And if forms of a word are created by the application of morphophonological rules to a base form, why say that no such rules need to apply to a base form with a full set of syntactic features?

>10. Why isn't a whole sentence simply a "form of a word" in some
>paradigm space defined by the syntax?
>
>Everyone assumes that one reaches bedrock at the "content" words. So, "The
>cat is on the mat," can't be a form of the word "cat" since "mat" must be
>an independent word. However, "He's always singing," could be a form of
>the word "sing," and might be realized as a single phonological word in
>some language.

Indeed. A fine observation!

>14. Let's suppose that this idealized view of grammar is too
>simplistic.
>On the syntax side, for example, sets of features, as opposed
>to individual features, may operate as atomic units in the syntax (which
>means that the features aren't combined by "merge" and thus that their
>combination need not be interpreted at PF or LF).

Trying to figure out what this would mean... Say one language lexically bundles together Tense and Agr (or, say, Aspect) into a single head, while another projects them as two separate heads (e.g. along lines suggested in Bobaljik's thesis). Would we expect a difference in LF interpretation?

>23. Suppose true modularity for root semantics. The features of roots,
>be they constituitive or classificatory, may be irrelevant/invisible to the
>grammar. This assumption predicts no syncretism/suppletion for roots.

Does it? As (e.g.) Noyer showed in his thesis, there's both contextual and intrinsic specification of vocabulary items. Even if the intrinsic features of roots are invisible to the grammar, the features of functional categories in their local context are visible -- so we might expect contextually determined root suppletion.

Alec has argued in the past that any specification of root vocabulary items will lead to trouble, under the DM claim that more highly specified vocabulary items will always block less specified vocabulary items if both will fit into a particular context. Suppose, for example, that we had a root vocabulary item specified for nominal plural contexts ("cattle", say), while other root vocabulary items are unspecified for the singular/plural distinction ("cat"). Assuming that Encyclopedic features (distinguishing cattle from cats) are invisible to the grammar, "cattle" will ALWAYS block "cat" from being inserted in a plural nominal context. Thus (the reasoning goes), under DM, "cattle" must not be specified for nominal plural contexts. But, as far as I can see, this is a stipulation -- it doesn't follow from the invisibility of the features of roots.
Moreover, I don't think it's the only logical possibility. Another possibility is that "cattle" can't be inserted into the nominal plural contexts into which "cat" can be inserted because "cattle" is inconsistent with some _other_ feature in these contexts (for argument's sake, say [- herd animal]) -- that is, "cat" and "cattle" belong to distinct root classes. We would then predict other differences in their syntactic distribution. This alternative may be a non-starter for "cat" and "cattle," but it seems more promising for roots that participate in different verb alternations, for example.

----------------------------------
Alec's second posting:

>4. The decompositional nature of Pustejovsky's lexical entries implies
>that it makes sense to consider, e.g., what a lion would be if it weren't
>an animal. Or what a cake would be if it weren't made by baking. Or what
>a hammer would be if it weren't used for hammering.
>
>Cf., what's "raising" without agentive cause (=rising). If a "singer" is
>"one who sings," what is, "John sings"?

OK, I give up. What IS "John sings"? (No idea what this is about.)

(Heidi wrote:
>With respect to the last parentheses, in posting two, para 6a, is
>there supposed to be a star/question mark on "I devoured it up?"
I assumed so, but found this bit hard to follow too.)

>7. Criteria of identity between roots
>
>polysemy (same thing has multiple meanings) vs. homophony (different things
>have same sound)

Eh? I thought polysemy was used for cases where the meanings typically have a shared core (e.g. cases of underspecification in DM), while homophony was used for cases where distinct meanings "accidentally" have the same pronunciation (e.g., separate vocabulary items in DM). Or maybe that's what was meant here...

>8. Private nature of meanings/source of meanings in individual...
>
> a. Nothing; they're not the same but they're close enough
> b. The nature of cats
> c.
>The structure of the human linguistic and conceptual system
>
>The issues here are tightly bound up with questions of acquisition. Both
>in a. and in c., the child will know a cat when s/he sees one and will be
>able to say, oh "cat" means cat. Both scenarios leave open the possibility
>of suppletion for roots like "cat" (singular, "cat," plural, "blicks"),
>depending crucially though on the exact structure of the grammar (where do
>roots enter the grammar and in what form?).

I don't see the connection between these different views and root suppletion. Does (b) rule out the possibility of root suppletion? How?

----------------------------------
Heidi's posting:

>Question: in all you English-speakers' judgements out there, does the
>benefactive alternation when used with a potentially
>creation-implying verb like "bake" force the creation reading? That is,
>can "I baked Mary a Twinkie" mean "I put a Twinkie in the sun for Mary"?
>or can it only mean "I created a Twinkie for Mary?"

I think either is possible, given appropriate context.

>[Pinker] says: people have confusion about how to treat sequences like
>'hole in one', 'gin and tonic', 'jack-in-the-box', 'mother-in-law'. Some
>analyse them as regular phrasal idioms, with associated syntactic
>structure, and inflect them as such: 'holes in one', 'jacks in the box',
>etc. Others analyse them as complex morphological words, without phrasal
>components (e.g. as flat structures rather than N+PP), and inflect them
>like words: 'hole in ones', 'jack in the boxes'. What does a Distributed
>Morphologist have to say about flat-structure analyses of these sequences?

Titles are an even more extreme case -- why don't they make any more "One Flew Over the Cuckoo's Nest"'s or "Guess Who's Coming to Dinner"'s? I don't have anything insightful to say about these myself, but is there any reason why we couldn't add a nominalizing head (little n) to any old phrasal constituent and pluralize it at will?
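[The blocking reasoning in the "cat"/"cattle" discussion above can be made concrete with a toy model. This is only an illustrative sketch -- the `insert` function and the feature names are invented for exposition, not part of any published DM formalism: under a "most highly specified item wins" insertion procedure, a root item specified for plural contexts beats an unspecified competitor in every plural context.]

```python
# Toy model of vocabulary insertion under the "more highly specified
# vocabulary items always block less specified ones" assumption
# discussed above. All names and features are hypothetical.

def insert(context_features, vocabulary_items):
    """Return the name of the most highly specified vocabulary item
    whose features are all present in the insertion context."""
    candidates = [(name, feats) for name, feats in vocabulary_items
                  if feats <= context_features]
    # The item with the largest matching feature set blocks the rest.
    return max(candidates, key=lambda item: len(item[1]))[0]

# "cattle" specified for plural nominal contexts; "cat" unspecified.
items = [("cattle", {"plural"}), ("cat", set())]

print(insert({"plural"}, items))  # "cattle" wins every plural context
print(insert(set(), items))       # only "cat" fits a singular context
```

[With root Encyclopedic features invisible to the competition, nothing in this procedure stops "cattle" from winning every plural nominal context -- which is exactly why the claim that roots must be unspecified looks like a stipulation rather than a consequence.]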
mcginnis at ucalgary.ca

From mcginnis at UCALGARY.CA Fri Jan 28 22:34:28 2000
From: mcginnis at UCALGARY.CA (Martha McGinnis)
Date: Fri, 28 Jan 2000 15:34:28 -0700
Subject: Dan Everett: Response to McGinnis posting
Message-ID:

> Martha McGinnis wrote:
>
> Here are some comments and questions in response to the previous
> three postings.
>
> Cheers,
> Martha
>
> ----------------------------------
> Alec's first posting:
>
> >10. Why isn't a whole sentence simply a "form of a word" in some
> >paradigm space defined by the syntax?
> >
> >Everyone assumes that one reaches bedrock at the "content" words. So, "The
> >cat is on the mat," can't be a form of the word "cat" since "mat" must be
> >an independent word. However, "He's always singing," could be a form of
> >the word "sing," and might be realized as a single phonological word in
> >some language.
>
> Indeed. A fine observation!

In fact, an entire sentence can be a word (although not exactly like the cases mentioned here). In my grammar of Wari' (ROUTLEDGE), co-authored with Barbara Kern, we discuss several cases of 'verbalization', whereby entire sentences are interpreted as and function as verbs, undergoing derivational morphology, etc. Yet the sentences themselves have undergone WH-Movement, etc. prior to verbalization. Moreover, there are co-reference requirements between constituents of the verbalized sentences and the matrix clause. A summary of the facts is found in my chapter in the _HANDBOOK OF MORPHOLOGY_.

I have just emerged from several weeks of fieldwork in the Amazon, so I have missed a lot of the discussion here. But let me second Heidi's recommendation of Fodor's two new excellent books. All linguists oughta read them.

Dan Everett