Prevailing approaches do not have a computational lexicon
Carl Pollard
pollard at LING.OHIO-STATE.EDU
Mon Oct 7 06:58:33 UTC 2002
Hi Mark,
>
Perhaps more interestingly, there are a few cases in which feature
cancellation seems to provide a simpler and more natural treatment
(e.g., Ingria's examples of the failure of transitivity of agreement).
Mary and Ron and others came up with some very interesting alternative
analyses that involve making all agreement features into set-valued
features and expressing agreement as subset rather than equality
constraints. To my mind at least this is just a simulation of feature
cancellation in a unification grammar (assuming you think subset
constraints ought to be part of a unification grammar).
>>
Are you referring here to examples of syncretism (or neutrality, or
underspecification, as opposed to ambiguity) like the celebrated
Er findet und hilft Frauen
he finds and helps women-ACC/DAT
If so, what would it mean, in your terms, to analyze this as an
instance of what you call feature cancellation?
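For concreteness, here is a toy Haskell sketch of the contrast as I
understand it (purely illustrative, nobody's actual analysis, and the
names are mine): an atomic CASE value subject to an equality constraint
fails on the coordination, while a set-valued CASE subject to a
membership constraint goes through.

import qualified Data.Set as Set

data Case = Nom | Gen | Dat | Acc deriving (Eq, Ord, Show)

-- Unification-style CASE: a single atomic value.  Coordinating an
-- ACC-governing verb with a DAT-governing verb over one object would
-- force Acc = Dat, which fails.
unifyCase :: Case -> Case -> Maybe Case
unifyCase c1 c2 = if c1 == c2 then Just c1 else Nothing

-- Set-valued CASE: the noun carries the set of cases it can realize,
-- and each governor imposes a membership (subset) constraint on it.
frauenCases :: Set.Set Case          -- "Frauen" is fully syncretic
frauenCases = Set.fromList [Nom, Gen, Dat, Acc]

governs :: Case -> Set.Set Case -> Bool
governs c cs = c `Set.member` cs

-- "findet" governs Acc, "hilft" governs Dat: both constraints are
-- satisfied by the same object, so the coordination goes through,
-- whereas unifyCase Acc Dat = Nothing.
findetUndHilftFrauen :: Bool
findetUndHilftFrauen = governs Acc frauenCases && governs Dat frauenCases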
>
Ultimately I suppose what
matters is whether the formal framework encourages linguists to make
interesting empirical discoveries; in that regard LFG, HPSG and CG
have all been productive, so perhaps it really doesn't matter whether
a theory uses feature unification or feature cancellation.
>>
Under the influence of people like you, Paul King, and Georgia Green,
HPSG-ers and many other feature-logic folk stopped talking about
feature management in terms of unification by the early 90's, in favor
of just talking about (token) equality of substructures accessed along
different feature paths. And what you call feature cancellation
refers, I suppose, to resource-sensitive type logic extended one way
or another with features. So (as I read you) the choice you offer is
between feature logic and resource-sensitive type logic.
However, feature logic can also be embedded inside traditional
NON-resource-sensitive (Church/Curry/Lambek & Scott, etc.) type logic,
and then solutions to the Ingria-type problems become available there
that were not available in standard feature logic. This solves some of
the problems left unsolved by the type-logical approaches to those
problems that you and Sam Bayer proposed. These solutions
depend crucially on the availability of type constructors beyond the
(labelled) products offered by traditional feature logic, especially
coproducts and exponentials. Then the choice becomes one between
resource-sensitive type logic and traditional type logic. Or, to put
it another way, the question is whether the (intuitionistic) logic of
syntactic types has structural rules. Mainstream type-logical grammar
says no. My current work explores the consequences of answering yes.
[Categorically this is saying the category of syntactic types is
cartesian closed, not (merely) monoidal closed.] This is prefigured in
some aspects by a 1961 paper by Curry, in others by Drew Moshier's
mid-90's work on type-theoretic HPSG. I think it is also implicit in
Montague grammar and CCG.
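To give a feel for how structural rules do the work, here is another
toy Haskell sketch, again illustrative only and not my actual formal
system: the syncretic form gets a product type, the verbs get
exponential (function) types, and ordinary variable reuse, i.e.
contraction, lets one token of the object satisfy both governors.

data NPacc = NPacc String   -- a nominal usable in accusative positions
data NPdat = NPdat String   -- a nominal usable in dative positions
newtype S  = S String       -- a sentence

-- The syncretic form gets the product type: it is at once an ACC
-- nominal and a DAT nominal.
frauen :: (NPacc, NPdat)
frauen = (NPacc "Frauen", NPdat "Frauen")

-- The verbs get exponential (function) types.
findet :: NPacc -> S
findet (NPacc o) = S ("findet " ++ o)

hilft :: NPdat -> S
hilft (NPdat o) = S ("hilft " ++ o)

-- The coordination uses its argument twice (contraction); in a
-- resource-sensitive type logic that duplication is not derivable.
findetUndHilft :: (NPacc, NPdat) -> S
findetUndHilft np =
  let S a = findet (fst np)
      S b = hilft  (snd np)
  in  S (a ++ " und " ++ b)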
So then why do syntactic types SEEM resource-sensitive? My answer is
that the appearance arises from the need to associate syntactic terms
(= the linguistic entities that populate the types) with transmissible
signals (roughly, strings). The resource sensitivity arises, therefore,
not in the syntax itself but in the syntax-prosody interface. (To put
it categorically: it is not that syntax is monoidal closed, but just
that prosody is a monoid.)
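A last toy sketch, with the same caveats: the prosodic side is nothing
more than a monoid of strings under concatenation, and each word
contributes its string just once, however many times the corresponding
syntactic term is used in the derivation.

newtype Prosody = Prosody [String]

instance Semigroup Prosody where
  Prosody xs <> Prosody ys = Prosody (xs ++ ys)

instance Monoid Prosody where
  mempty = Prosody []

pronounce :: Prosody -> String
pronounce (Prosody ws) = unwords ws

-- The object string appears once, however many times the syntactic
-- term for "Frauen" was used in the derivation above.
erFindetUndHilftFrauen :: Prosody
erFindetUndHilftFrauen =
  mconcat [ Prosody [w] | w <- ["Er", "findet", "und", "hilft", "Frauen"] ]
-- pronounce erFindetUndHilftFrauen == "Er findet und hilft Frauen"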
Carl