Prevailing approaches do not have a computational lexicon
Carl Pollard
pollard at ling.ohio-state.edu
Mon Sep 23 00:35:59 UTC 2002
Hi Andrew,
> Yes, the debate is roughly as you've characterized it. "Classical" MP
> (if I may use the term "classical" loosely) holds that lexical items come
> fully formed into the syntax, complete with feature structures (albeit not
> ones as rigorously defined as those in HPSG). The words are then combined
> via merge into phrase markers, where all the relevant features are
> "checked" (very roughly equivalent to unified -- which is why some of us
> lurk on these lists).
That makes sense to me. Except for move (you only mentioned merge),
what you just described (which, with you, I'll refer to as
"lexicalism") is easily translatable into the standard assumptions
about the relationship between lexicon and syntax that have been made
within such frameworks as categorial/type-logical grammar, LFG, GPSG,
and HPSG as long as those frameworks have existed (in the case of
Lambek's kind of categorial grammar, since 1958). It's interesting
that you say "Chomsky and his buds" have come around to this position
too. I suppose some would say this is a logical place to end up,
starting from "Remarks on Nominalization".
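Since "checking" got glossed above as (very roughly) unification, here is
a deliberately naive sketch of what that gloss could mean computationally.
This is plain Python over nested dictionaries, with made-up lexical
entries; it is nothing like HPSG's typed feature structures or any
framework's actual machinery, just the bare idea that combining two words
succeeds when their features unify and fails when they clash:

# Toy feature structures as nested dicts; unification fails on a clash.
def unify(fs1, fs2):
    result = dict(fs1)
    for feat, val2 in fs2.items():
        if feat not in result:
            result[feat] = val2
        elif isinstance(result[feat], dict) and isinstance(val2, dict):
            sub = unify(result[feat], val2)
            if sub is None:
                return None      # clash inside an embedded feature structure
            result[feat] = sub
        elif result[feat] != val2:
            return None          # atomic values clash; "checking" fails
    return result

# Hypothetical lexical entries, fully formed before they enter the syntax.
sleeps = {"cat": "V",  "agr": {"num": "sg", "per": 3}}
kim    = {"cat": "NP", "agr": {"num": "sg", "per": 3}}
dogs   = {"cat": "NP", "agr": {"num": "pl", "per": 3}}

print(unify(sleeps["agr"], kim["agr"]))   # {'num': 'sg', 'per': 3}
print(unify(sleeps["agr"], dogs["agr"]))  # None -- agreement clash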
By contrast, what you call the "alternative view (a la Distributed
morphology)"
> that major-class lexical items such as verbs and nouns come into the
> syntax partly underspecified (including underspecification for syntactic
> category). For example, a root such as "DIE" comes in only partly
> specified; it is by virtue of combining it with functional categories
> such as v (a verbalizer), aspect, tense, etc. that it becomes either the
> verb die or the noun death. The actual lexical item (technically, a
> vocabulary item) is inserted at the end of the derivation.
seems to be essentially the standard assumption made in
transformational generative grammar (and generative semantics) for as
long as THAT framework has existed, retooled slightly to accord with MP
terminology: words are inserted into phrase markers. In short, it is the
view that lexicalism has always argued against.
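If I've understood that alternative correctly, a (surely oversimplified)
caricature of it would look something like the sketch below. The root DIE
and the little vocabulary list are made up purely for illustration, and
this is not DM's actual formalism; the point is only where the word's
form gets fixed:

# Caricature of late insertion: the syntax manipulates a bare root plus
# functional heads; a phonological form is chosen only at spell-out.
VOCABULARY = {
    ("DIE", "v"): "die",    # root categorized by little v -> verb
    ("DIE", "n"): "death",  # the same root nominalized -> noun
}

def derive(root, heads):
    # Build structure with no phonology: just the root and functional heads.
    return {"root": root, "heads": list(heads)}

def spell_out(derivation):
    # Vocabulary insertion happens only at the end of the derivation.
    categorizer = derivation["heads"][0]
    return VOCABULARY[(derivation["root"], categorizer)]

print(spell_out(derive("DIE", ["v", "Asp", "T"])))  # die
print(spell_out(derive("DIE", ["n"])))              # death

The contrast with the earlier sketch is just where form and category are
fixed: in the lexicon, before the syntax sees the word, or only at the
end of the derivation.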
Maybe this is the right time and the right venue for a review of the
arguments for lexicalism (any volunteers? I'm swamped with course prep
right now). For now I'll just ask a question: what are your basic
reasons (in terms that aren't too technical or framework-specific) for
sticking with lexical insertion? Should we all (= everyone but
Distributed morphologists, as far as I can tell) abandon lexicalism,
and if so why? What should we tell our intro. syntax students about
this question? (Okay, so that's three questions, not one.)
Thanks,
Carl