The syntax-semantics correspondence and underspecification

Shalom Lappin lappin at dcs.kcl.ac.uk
Tue Jul 6 10:17:28 UTC 2004


Hi Carl,
     Thanks for this reply. It is worth noting that the first option
that you mention "(1) give different scopings the same tectostructure
and have interpretations themselves be underspecified entities a la MRS
etc." can be applied within a lambda calculus model of grammar, although
one distinct from current versions of CG. Jan van Eijck (2003), Computational
Semantics and Type Theory (ms, CWI, Amsterdam), constructs such a model,
which he specifies in a functional programming (Haskell) framework. Jan
gives underspecified scope representations as lambda terms in which a
function is applied to a list of scope operators and a relation (the
propositional core of a sentence) to yield the list of all resolved
scope readings. The function does not have to be fully computed at run
time, as Haskell uses lazy evaluation, in which only a particular
element (or elements) of the list of scope readings is computed, and
this computation can be delayed until an appropriate point in the
interpretation process (see the toy sketch in the P.S. below). Chris
Fox and I adopt and develop this approach
in our work on Property Theory with Curry Typing, which is also a lambda
calculus-based theory of semantic representation. Regards.

Shalom
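
P.S. For concreteness, here is a toy Haskell sketch of the kind of thing
Jan's approach involves (my own illustration, not his actual code; the
names and the string representation of formulas are just for the
example). The underspecified representation is a list of scope operators
together with the propositional core, and resolving it yields the list
of all k! scope readings; because the list is lazy, asking for one
reading does not force computation of the rest.

    -- Toy sketch: formulas as strings, scope operators as functions
    -- from formulas to formulas.
    import Data.List (permutations)

    type Formula = String
    type ScopeOp = Formula -> Formula

    -- The underspecified representation: a list of scope operators
    -- plus the propositional core, not yet combined.
    data Underspec = Underspec [ScopeOp] Formula

    -- Resolving it yields the lazy list of all scope readings:
    -- every permutation of the operators wrapped around the core.
    readings :: Underspec -> [Formula]
    readings (Underspec ops core) =
      [ foldr ($) core perm | perm <- permutations ops ]

    -- Example: two scope-taking elements, so 2! = 2 readings.
    every, some :: String -> ScopeOp
    every x body = "forall " ++ x ++ ". " ++ body
    some  x body = "exists " ++ x ++ ". " ++ body

    sentence :: Underspec
    sentence = Underspec [every "x", some "y"] "read(x,y)"

    main :: IO ()
    main = do
      -- Laziness: taking the head forces only one of the k! readings.
      putStrLn (head (readings sentence))

Jan's actual system builds typed lambda terms rather than strings, but
the point about delayed computation of the readings is the same.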

Carl Pollard wrote:

>Hi Shalom,
>
>I am puzzled by Carl's recent replies to Tibor. Carl, you appear to
>have returned to a classical Montague (PTQ) view of the
>syntax-semantics correspondence in which differences in semantic
>representation entail distinctions in syntactic structure. On this
>approach variant scope readings are obtained from alternative
>syntactic derivations/structures.  Have you, then, given up
>underspecified semantic representations of the MRS (or related)
>variety which can be resolved to different interpretations but
>correspond to a single syntactic source? If so, then what of the
>advantages of underspecified representations, such as the avoidance of
>spurious, unmotivated syntactic ambiguity, achieving greater
>computational efficiency in the interpretation process by not
>generating k! syntactic-semantic correspondences for k scope-taking
>elements in a sentence, etc.? If you have not given up underspecified
>representations, how are they accommodated in the Lambek calculus
>type grammar that you have sketched?
>
>The view I sketched is indeed neo-Montagovian, not the way he actually
>did things in PTQ but the way he mentioned (in passing, in PTQ) that
>he COULD have done it. The way he ACTUALLY did it was to make
>translation a RELATION between strings and IL expressions, but he
>pointed out that he COULD just as well have made it a FUNCTION from
>analysis trees (which can be considered tectostructures) to IL
>expressions.
>
>But then as always one gets a choice about how to handle the
>interpretive multiplicities (I resist calling them ambiguities, which
>implies making the first of the following choices): either (1) give
>different scopings the same tectostructure and have interpretations
>themselves be underspecified entities a la MRS etc., or (2) have
>disambiguating tectogrammatical operations (such as the scope
>operators used in some kinds of CG) that make a semantic difference
>but not a phenostructural one. The former has the well-known
>computational advantages you allude to, but I have a hunch the latter
>(the "unmotivated spurious syntactic ambiguity" approach) might make
>linguistic description easier.
>
>[Of course categorial grammarians don't usually consider ambiguity of
>this kind unmotivated and spurious, since they consider the main point
>of tectostructure to be to drive semantic composition. Lambek even
>goes so far as to say that Curry's tectostructure IS semantics not
>syntax, but this seems to me a misreading of Curry.]
>
>An analogous choice arises in the tecto-to-pheno interpretation: you
>can either (1) put less stuff into tecto and make phenostructures
>underspecified, or (2) put more stuff into tecto to make the relation
>to (fully specified) phenostructures a function (the usual categorial
>approach). The Dowty-style minimalism we've been discussing
>is of the former kind: the multisets of words (and frozen word
>sequences) are the analogs of underspecified semantic representations,
>and the LP rules are the analogs of "handle constraints" (or whatever
>you want to call the constraints that embody the options for resolving
>an underspecified semantic representation).
>
>Carl


