Re: Increasing interest in the HPSG conference

Ivan A. Sag sag at csli.stanford.edu
Thu Jul 1 06:11:12 UTC 2004


Hi Carl,

> > But if it were so obvious, it would not be such a bone of contention.
> > It SEEMED obvious to me too for many years, then I wondered about it
> > for a few years, and now I believe it is false.
>
> I've also come around to this view, but doesn't it entail that grammars cannot
> include constraints that are stated over non-local structures?
> >>
>
> It entails there are no structures period.

Not really. In HPSG, the ARG-ST values are `structures', locally encoded in
the signs that are `generated' (or `derivable'). What I meant was, for
example, PHRASE STRUCTURE.  This doesn't exist on the view both you and I are
espousing. Hence notions like c-command or o-command aren't easily definable,
right?

> But the PS-94 analysis of UDCs is GKPS-compatible, isn't it, assuming one adds
> set-valued features to GPSG (as proposed by Maling and Zaenen 1983)?  And in
> this case, the SLASH-introduction analysis would involve neither a metarule
> nor a lexical rule, but it would allow multiple gaps.
> >>
>
> Yes, but then you lose context-freeness, since the set of categories
> becomes infinite. I think it was conjectured that you end up with the
> indexed languages, but I don't know if that was ever proved or disproved.

Right. That's what I meant by the other comments:

> I think the issue of context-freeness in GKPS is more complex.  Let's leave
> aside the possible effect of `trans-derivational' well-formedness conditions
> (e.g. those involving the notion of `privileged' instantiation of a local
> structure, which requires inspection of all possible projections of a given ID
> rule). Although GKPS might generalize out of the CFLs via metarules (by some
> presumably mild relaxation of the finite closure condition), it could also do
> so via the introduction of set-valued features, which is required in order to
> make the PS-94 UDC analysis GKPS-compatible. Maybe there's some natural way to
> make this generalization `mild'...
> >>
>
> Set-valued features are very messy, unless you have functional types
> and a boolean type (in which case set-of[A] is just A => Bool). But
> I don't know how to make this "mild".
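
(Just to make sure I follow: I take the encoding you have in mind to be
something like the toy Haskell sketch below, where a set over A is just its
characteristic function of type A -> Bool. This is my own gloss, not your
formalization.)

-- Sets as characteristic functions: set-of[A] rendered as A -> Bool.
type SetOf a = a -> Bool

-- The empty set contains nothing.
empty :: SetOf a
empty = const False

-- Adding an element, and testing membership.
insert :: Eq a => a -> SetOf a -> SetOf a
insert x s = \y -> y == x || s y

member :: a -> SetOf a -> Bool
member x s = s x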

Well, I could imagine that there would be a (performance-induced?) bound
on the cardinality/length of this kind of structure. So suppose
that SLASH was slist-valued and the subtypes of slist were elist, 1list
and 2list, where:

1list ==> |REST elist|
2list ==> |REST|REST elist|

This would put a bound on multiple extractions and, assuming there's
no recursion in the category space, keep the set of categories finite.
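
(To make the finiteness point concrete, here is a toy Haskell rendering of
that bounded slist space. The three-category inventory is made up purely for
illustration; it isn't meant as an actual HPSG signature.)

-- Hypothetical finite category inventory, for illustration only.
data Cat = NP | S | PP deriving (Eq, Show, Enum, Bounded)

-- slist with cardinality at most two, mirroring elist, 1list and 2list.
data SList = EList            -- elist: nothing extracted
           | OneList Cat      -- 1list: one extraction pending
           | TwoList Cat Cat  -- 2list: two extractions pending
           deriving (Eq, Show)

-- With no recursion elsewhere in the category space, the possible SLASH
-- values can be enumerated outright, so the set of categories stays finite.
allSlash :: [SList]
allSlash = EList
         : [OneList c | c <- cats]
        ++ [TwoList c1 c2 | c1 <- cats, c2 <- cats]
  where cats = [minBound .. maxBound]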

> > But I would not advocate trees qua structural representations either.
>
> Is what you mean by this that:
>
> (1) the theory contains identity constraints, period (feature
>     values are either atoms or functions),
> (2) there are no reentrant structures, and
> (3) trees can be constructed, if needed, but there are no constraints
>     in the theory that make reference directly to tree structures.
>
> FWIW, these are the assumptions embodied in the formal bits of the SWB
> textbook, which is GPSG-like in certain respects....
> >>
>
> What I mean is elaborated some in my earlier reply to Ash. In terms of your
> (1)-(3):
>
> (1) The theory is stated in a (higher-order) predicate logic, so the things
>     it talks about are indeed either constants or functions. And since
>     all the logical constants and quantifiers are definable in terms of
>     = and lambda, yes, all the assertions of the grammar are equalities if
>     you expand out all the definitions.
>
> (2) The closest thing to structures is the proof trees corresponding to
>     the terms that denote sentences. What is the analog of re-entrance
>     in a natural deduction proof? I suppose the closest thing is a lambda
>     that binds two occurrences of the same variable, e.g. parasitic gaps.
>
> (3) Right, the assertions of the grammar are ABOUT the syntactic entities
>     denoted by lambda terms, but THESE ASSERTIONS DO NOT TALK ABOUT THE
>     STRUCTURE OF THE TERM ITSELF, e.g. whether such-and-such a variable
>     occurrence whatever-commands such-and-such a subterm. The syntactic
>     structure of the terms themselves is irrelevant.
>
>
> So we seem to have some common ground, quite a bit in fact.
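
(If I try to picture your point (2) concretely, it's a term like the one
below: a single lambda binding two occurrences of the same variable. A toy
Haskell sketch of my own, nothing more.)

-- The closest analogue of re-entrance in a natural-deduction term:
-- one lambda whose variable occurs twice in the body, as in the
-- parasitic-gap case.
twice :: (a -> a -> b) -> a -> b
twice f = \x -> f x x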

I agree. But I have to confess that I got clear on these concepts
(if in fact I am clear on them :-)) only after hearing your lectures
in Trondheim....

All best,
Ivan


