Re: Increasing interest in the HPSG conference

Carl Pollard pollard at ling.ohio-state.edu
Wed Jun 30 20:12:30 EST 2004


Hi Ivan,

>
>   My personal answer to Carl's question is as follows: quite
>   obviously, trees are what McCawley conceived them to be,
>   i.e. structural representations.
>
> But if it were so obvious, it would not be such a bone of contention.
> It SEEMED obvious to me too for many years, then I wondered about it
> for a few years, and now I believe it is false.

I've also come around to this view, but doesn't it entail that grammars cannot
include constraints that are stated over non-local structures?
>>

It entails that there are no structures, period.

> > (The context-
> > freeness of GKPS was bought at the cost of imposing a linguistically
> > unmotivated prohibition on application of a given metarule more than
> > once in a derivation.)
>
> you replied
>
>   You seem to assume that the price to be paid for the restrictiveness was too
>   high.
>
> It is too high because it renders ungrammatical sentences like
>
>   A violin this well-crafted, even the most difficult sonatas are
>   easy [to play __ on __].
>
>   I have a personnel issue that I'm not sure who
>   [to talk to __ about __].
>
Wait a minute... You're going a little too fast for me.  The Peters/Uszkoreit
paper was about general properties of unconstrained `Metarule Phrase Structure
Grammars'. Your argument here seems to be presupposing a certain GPSG
analysis, one where SLASH-introduction is done by metarule, as in GKPS.
>>

Right.

>
You're arguing that GPSG `bought context-freeness' only at the cost of
preventing the SLASH-introduction MR from applying to its own output (by
Finite MR Closure), thus disallowing, e.g.

      VP/{NP_1,NP_2} --> V[4] PP/{NP_1} PP/{NP_2} or maybe
      VP/NP_1/NP_2 --> V[4] PP/NP_1 PP/NP_2
>>

Right.
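To make the at-most-once restriction concrete, here's a toy sketch (my own names and encoding, not actual GKPS notation): a metarule maps rules to rules, and under Finite Metarule Closure each metarule applies to the base rules at most once, so the one-gap output of SLASH-introduction can never be fed back in to yield a two-gap rule.

```typescript
// Toy encoding of finite metarule closure. A "rule" is just a right-hand-side
// string plus a gap count, and SLASH-introduction adds one gap per application.
// (Names and data representation are illustrative, not GKPS's.)
type Rule = { rhs: string; gaps: number };
type Metarule = (r: Rule) => Rule;

const slashIntro: Metarule = (r) => ({ rhs: r.rhs + " /NP", gaps: r.gaps + 1 });

// Finite closure: each metarule applies to the base rules at most once,
// so metarule outputs are never fed back in as inputs.
const finiteClosure = (base: Rule[], mrs: Metarule[]): Rule[] =>
  base.concat(mrs.flatMap((m) => base.map(m)));

const base: Rule[] = [{ rhs: "V[4] PP PP", gaps: 0 }];
const rules = finiteClosure(base, [slashIntro]);
// Every rule in the closure has at most one gap; the two-gap rule
// VP/NP_1/NP_2 --> V[4] PP/NP_1 PP/NP_2 would require reapplying
// slashIntro to its own output, which the closure excludes.
```

Reapplying `slashIntro` by hand does produce a two-gap rule, which is exactly what the closure condition rules out, and exactly what the double-extraction sentences above seem to need.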

>
But the PS-94 analysis of UDCs is GKPS-compatible, isn't it, assuming one adds
set-valued features to GPSG (as proposed by Maling and Zaenen 1983)?  And in
this case, the SLASH-introduction analysis would involve neither a metarule
nor a lexical rule, but it would allow multiple gaps.
>>

Yes, but then you lose context-freeness, since the set of categories
becomes infinite. I think it was conjectured that you end up with the
indexed languages, but I don't know if that was ever proved or disproved.

>
I think the issue of context-freeness in GKPS is more complex.  Let's leave
aside the possible effect of `trans-derivational' well-formedness conditions
(e.g. those involving the notion of `privileged' instantiation of a local
structure, which requires inspection of all possible projections of a given ID
rule). Although GKPS might generalize out of the CFLs via metarules (by some
presumably mild relaxation of the finite closure condition), it could also do
so via the introduction of set-valued features, which is required in order to
make the PS-94 UDC analysis GKPS-compatible. Maybe there's some natural way to
make this generalization `mild'...
>>

Set-valued features are very messy, unless you have functional types
and a boolean type (in which case set-of[A] is just A => Bool). But
I don't know how to make this "mild".
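To spell out the encoding (a sketch only; the names are mine): a set over A is just its characteristic function A => Bool, and the usual set operations are function combinators.

```typescript
// Set-valued features as characteristic functions: set-of[A] is A => Bool.
// (Illustrative encoding; the names are mine.)
type SetOf<A> = (x: A) => boolean;

// The empty SLASH set: nothing is a member.
const emptySlash: SetOf<number> = (_x) => false;

// A singleton set, as the function true only of its one element.
const singleton = <A>(x: A): SetOf<A> => (y: A) => y === x;

// Union is pointwise disjunction.
const union = <A>(s: SetOf<A>, t: SetOf<A>): SetOf<A> => (x: A) =>
  s(x) || t(x);

// E.g. a SLASH value containing two extracted NPs, indexed 1 and 2:
const twoGaps: SetOf<number> = union(singleton(1), singleton(2));
```

Membership is just application: `twoGaps(1)` holds, `twoGaps(3)` doesn't. Part of the messiness is visible even here: with sets as functions, set equality is extensional and not decidable in general, and the category space becomes infinite, which is how context-freeness gets lost.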

>
> But I would not advocate trees qua structural representations either.

By this, do you mean that:

(1) the theory contains identity constraints, period (feature
    values are either atoms or functions),
(2) there are no reentrant structures, and
(3) trees can be constructed, if needed, but there are no constraints
    in the theory that make reference directly to tree structures.

FWIW, these are the assumptions embodied in the formal bits of the SWB
textbook, which is GPSG-like in certain respects....
>>

What I mean is elaborated some in my earlier reply to Ash. In terms of your
(1)-(3):

(1) The theory is stated in a (higher-order) predicate logic, so the things
    it talks about are indeed either constants or functions. And since
    all the logical constants and quantifiers are definable in terms of
    = and lambda, yes, all the assertions of the grammar are equalities if
    you expand out all the definitions.

(2) The closest thing to structures is the proof trees corresponding to
    the terms that denote sentences. What is the analog of re-entrance
    in a natural deduction proof? I suppose the closest thing is a lambda
    that binds two occurrences of the same variable, as in parasitic gaps.

(3) Right, the assertions of the grammar are ABOUT the syntactic entities
    denoted by lambda terms, but THESE ASSERTIONS DO NOT TALK ABOUT THE
    STRUCTURE OF THE TERM ITSELF, e.g. whether such-and-such a variable
    occurrence whatever-commands such-and-such a subterm. The syntactic
    structure of the terms themselves is irrelevant.
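To make the re-entrance analog in (2) concrete, here is a toy term in which one abstraction binds two occurrences of its variable: one filler, two gaps, as in a parasitic gap. (Modeling clauses as strings is purely illustrative.)

```typescript
// A lambda binding two occurrences of the same variable: the proof-term
// analog of re-entrance. Toy "syntactic entities" are strings here.
type NP = string;
type S = string;

// A two-gap abstract: "the report, you file __ without reading __"
const twoGapClause = (x: NP): S => `you file ${x} without reading ${x}`;

// Applying it to one filler uses that argument in both gap positions:
const filled: S = twoGapClause("the report");
```

The single argument `"the report"` is consumed twice, which is the only sense in which the term "shares structure"; nothing in the grammar's assertions inspects the term's internal shape.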


So we seem to have some common ground, quite a bit in fact.

Carl


