Promoting interest in HPSG
Georgia
g-green at uiuc.edu
Fri Jun 25 17:36:46 UTC 2004
I think this idea is right on.
Georgia
----- Original Message -----
From: "Ivan A. Sag" <sag at csli.stanford.edu>
To: <hpsg-l at lists.Stanford.EDU>
Sent: Friday, June 25, 2004 11:00 AM
Subject: Re: Promoting interest in HPSG
> Hi everyone,
>
> Just to throw in my 2 cents worth... I believe people working in HPSG should
> submit their work to mainstream linguistic conferences and journals. I enclose
> a message on this topic (with updated weblinks) that I sent to a number of
> mailing lists in 1998. If a new conference is created, it definitely should
> not have anything like `nonderivational' or `nontransformational' in the
> title. Rather it should simply lead by example, showcasing excellent work in
> nonderivational frameworks...
>
> (People disagree with me about this, but) I personally think that the annual
> HPSG meeting should not be seen as the final publication destination for much
> of anything, and hence that it should be organized less like a conference and
> more like a workshop, with time allocated for discussion, more topical
> sessions, invited presentations, maybe sessions dedicated to nurturing student
> presentations, etc.
>
> Best,
> Ivan
>
> PS: The deadline for the January 2005 LSA meeting in San Francisco is
> September 1.
>
> -------------------------------------------------------------------------
> Ivan A. Sag
> Professor of Linguistics and Symbolic Systems
> Stanford University, Stanford, CA 94305
> Email: sag at csli.stanford.edu
> WWW: http://lingo.stanford.edu/sag/
> -------------------------------------------------------------------------
>
>
> -----------------------------------------------------------------------
> from 1998:
> -----------------------------------------------------------------------
> All,
>
> The deadline for receipt of abstracts for the LSA meeting in January
> is September 1. In a recent study of syntax papers presented at the
> last two LSA meetings (by a colleague who will remain nameless), the
> distribution of frameworks employed in the papers presented looked
> something like this.
>
> -----------------------------------------------
> 1996-1997 LSA Meetings (aggregated)     #     %
> Total theory-specific syntax           61  100%
> Chomskyan (GB/Minimalism)              50   82%
> Non-Chomskyan combined                 11   18%
> -----------------------------------------------
>
> People I talk to frequently express the opinion that it is pointless
> to submit non-GB/Minimalist linguistic papers to linguistics
> conferences (or journals!) because the reviewing is biased. While it
> would be surprising if there were no bias wrt article or abstract
> reviewing in a field with a predominant framework, nonetheless, I
> believe the primary reason there are so few LFG, HPSG, Categorial
> Grammar, or TAG papers at linguistics conferences is simple: there are
> very few submissions.
>
> This impression is confirmed by conversations I have had over the last
> few years with members of various program committees. Moreover, it
> appears that many people who work in these constraint-based,
> lexicalist frameworks, perhaps because they are trained in a different
> field, research tradition, or research culture, do not submit the kind
> of carefully argued abstract that stands a chance of getting accepted
> in a highly competitive review process. One colleague serving on a
> program committee complained to me that (s)he was tired of reading
> abstracts of the form:
>
> 1. Here is some data.
> 2. Here is how X is analyzed in GB.
> 3. Here is how X is analyzed in HPSG.
>
> [end of abstract]
>
> I'd like to suggest that we should all submit more abstracts to
> conferences like LSA, WCCFL, NELS, GLOW, ESCOL, WECOL, CLS, BLS, etc.
> It may surprise some of you to know that some of these conferences are
> so concerned with having some kind of theoretical balance that they
> are tempted to look more sympathetically at the few non-GB abstracts
> they receive. You may also be surprised at how many people working in
> GB/P&P/Minimalism feel that conference review processes are biased
> against them.
>
> Anyway, here are a few suggestions for writing the kind of abstract
> that can be competitive. Doubtless, some of you reading this have
> other suggestions to make. I also enclose (with the permission of the
> authors, who will also remain anonymous) three abstracts that were
> accepted by program committees at recent general linguistics
> conferences. (The residual LaTeX commands are my fault. Sorry - I
> wanted to get this out quickly.)
>
> Reactions welcome.
>
> Ivan
>
> -------------------------------------------------------------------------
> Ivan A. Sag
> Stanford University
> Email: sag at csli.stanford.edu
> Office: 415-725-2323 (Cordura 228)
> WWW: http://hpsg.stanford.edu/hpsg/sag.html
> Fax: 415-725-2166
> -------------------------------------------------------------------------
>
> Suggestions for writing syntax/semantics abstracts:
>
> 1. There are guidelines for abstract writing available from the LSA.
> They are useful. The guidelines and abstract submission forms can
> be found on-line at [updated June 25, 2004]:
>
> http://www.lsadc.org/annmeet/abstractguide.html
>
> See also the discussion of model abstracts at:
>
> http://www.lsadc.org/dec02bulletin/model.html
>
> 2. Deadlines are strictly enforced. Also, you should follow all
> specifications to the letter.
>
> 3. An abstract should succinctly state an empirical problem and the
> nature of the proposed solution. Don't waste any words. You will
> almost always want to use as many words as you are allowed. Learning
> how to state a problem clearly in a few sentences is essential.
>
> 4. An abstract should make reference to relevant previous work. The
> trick is to know how to summarize very succinctly what the
> essence of a previous insight is or what the defect in some previous
> analysis is. It is in your interest to cite relevant current work.
>
> 5. The kiss of death for an abstract is to say something like: a
> solution will be presented to problem X. The feel of the abstract
> has to be more like: `I show that X is really Y (pace
> Sapir 32). Five arguments support this claim: (1) Xs participate
> in nasal assimilation like Ys (`yabayaba'; `yadayada'); (2) the common
> assumption that X is Z (ref) leads to a contradiction about agreement
> in relative clauses (in forms such as `xxx'); (3) only Ys are subject to
> W (ref), yet W also applies to X (cf. `foobar')...'
>
> 6. Keep the scope of the abstract modest. State a smaller problem and
> focus on your solution to that. Abstracts are often rejected on the
> grounds that what the author claims (s)he will present in 15 or 25
> minutes would obviously take 2 hours to present adequately.
>
> 7. Keep your readers in mind. Even if a generalist cannot evaluate
> all the details of your argument, (s)he should appreciate some of
> your points and above all the tightness of your reasoning.
>
> ----------------------------------------------------------------------------
>
> West Greenlandic noun incorporation
> as a mixed category construction
>
> Noun incorporation (NI) in West Greenlandic exhibits a challenging mix of
> syntactic and morphological properties. To account for NI and related
> phenomena, Sadock (1985) proposes that morphology and syntax be treated as
> autonomous and orthogonal modules of grammar, so that "every expression
> will have two distinct representations, one morphological and one
> syntactic" (383). And, he argues, West Greenlandic NI can best be
> accounted for as a mismatch between syntax and morphology. However,
> syntactic and morphological structures are typically homomorphic and even
> in the most extreme cases they diverge in certain highly constrained ways.
> So, Sadock (1991) offers a number of principles governing the kinds of
> structural mismatch allowed.
>
> In this paper, I offer a strictly lexicalist analysis of West Greenlandic
> NI within the framework of Head-driven Phrase Structure Grammar (Pollard
> and Sag 1994) which does not require positing divergent morphological and
> syntactic structures. Furthermore, I show how Sadock's homomorphism
> constraints follow directly from the architecture of the lexicon.
>
> West Greenlandic NI is a category-changing morphological operation that
> converts a noun into a verb by the addition of one of a set of bound
> verbalizing suffixes (Sadock 1991:94):
>
> (1) Marlunnik ammassattorpunga
> marluk-nik ammassak-tor-punga
> two-INST/PL sardine-eat-INDIC/1SG
> `I ate two sardines.'
>
> (2) Ammassannik marlunnik nerivunga
> ammassak-nik marluk-nik neri-vunga
> sardine-INST/PL two-INST/PL eat-INDIC/1SG
> `I ate two sardines.'
>
> The resulting denominal verb (DV) has the full distribution of a verb in
> West Greenlandic. Unlike an ordinary verb, though, a DV can also occur with an
> ergative possessor and a nominal complement which are associated with the
> incorporated nominal. On this basis, Sadock argues that the incorporated
> nominal must have an independent syntactic existence. I show that West
> Greenlandic NI is better viewed as a kind of mixed category construction,
> parallel to the English verbal gerund. The possessor and complement that
> occur with a DV are not stranded by incorporation, nor do they bear a
> relation to the incorporated nominal directly. Instead, the DV inherits
> its subcategorization requirements from both the verbalizing suffix and the
> incorporated nominal. This is exactly parallel to mixed category
> constructions in other languages, e.g., the English verbal gerund
> "devouring" in "Pat's devouring the pancakes," which occurs with both a
> specifier (like a noun) and a direct object (like a verb).
>
> Verbalizing suffixes can be accounted for in HPSG by a lexical rule which
> combines the valence properties of the verbal suffix with the valence
> properties of the `incorporated' noun stem. A verb derived by this lexical
> rule will project a verb phrase following general principles of X-bar
> theory and argument saturation. In addition, any constraints which the
> noun stem places on its specifier and complement will be inherited by the
> DV. This accounts for the fact that the external specifier and complement
> have the properties they would have had if they had appeared with the
> incorporated nominal alone.
>
> This analysis does involve a limited kind of mismatch: a DV projects a VP
> but has noun-like valence requirements. However, the unusual properties of
> DVs are restricted to the lexicon, and HPSG's independently motivated theory
> of lexical information places strong restrictions on the kinds of
> mismatches that can be induced. So, there is no need for additional
> construction-specific stipulations limiting the degree of mismatch between
> syntax and morphology.
>
> ----------------------------------------------------------------------------
> Understanding Mandarin \ba as a verb
> Syntax
>
> Many researchers (e.g., Li (1990)) have analyzed the morpheme \ba of
> the Mandarin \ba construction as a direct object marker. Their
> evidence is two-fold: \ba does not pass the three established tests
> for verbhood, and, in examples like (1), \ba does appear to
> mark the direct object of the following verb.
>
> (1)a.
> L\v{\i} S\`{\i} b\v{a} n\`{a} ji\={a}n f\'{a}ngzi ch\={a}i le
> Li Si \emph{ba} that CL(assifier) house demolish perf(ective)
> `Li Si demolished that house'
>
> b.
> L\v{\i} S\`{\i} ch\={a}i le n\`{a} ji\={a}n f\'{a}ngzi
> Li Si demolish perf that CL house
> `Li Si demolished that house'
>
> In this paper, I argue that \ba is a verb. First, the verbhood tests
> are unreliable, since, for each one, there is a class of
> uncontroversial verbs which also fail. Second, there is a wide range
> of data that is highly problematic for other accounts, but expected if
> \ba is a verb and its valence orchestrates the sentence. This data
> includes: sentences where the subject of the sentence is not
> interpreted as the subject of the post-\ba verb; sentences where the
> NP following \ba is interpreted as the patient of a verb embedded one
> clause down; and sentences where the post-\ba verb has another object,
> in addition to the one that \ba is marking. (2), from Li
> (1990:184), gives an example of the final type.
>
> (2) W\v{o} b\v{a} j\'{u}zi b\={o} le p\'{\i}
> I {\em ba} orange peel ASPECT skin
> `I peeled the skin off the orange'
>
> (3) [_{\tn{IP}} [_{\tn{NP:subj}} w\v{o}]
> [_{\tn{VP}} [_{\tn{\={V}}} b\v{a}
> [_{\tn{NP:obj}} j\'{u}zi]] [_{\tn{IP:comp}} b\={o} le p\'{\i}]]]
>
> I argue that \ba takes a subject, an object and a complement clause
> (as in (3)) and stipulates that its object is interpreted as
> the (sentential) topic of its complement clause. This analysis predicts
> sentences like (2), because \nba{'s} valence
> and that of its complement are independent and may be instantiated
> by different NPs. That \nba{'s} object is interpreted as the topic of
> the complement clause explains why \ba has so many topic properties
> (Tsao 1986) and why the two objects in sentences like (2)
> cannot be completely independent: there are restrictions on how a
> topic may be related to its clause (Tsao 1986).
>
> This analysis brings into question conclusions about Mandarin syntax
> based on treating \ba as a direct object marker, such as C.\ Li
> and Thompson's (1974) claim that Mandarin is becoming an SOV language,
> Huang's (1990) claim that Mandarin has prepositions, and Travis's (1989)
> word order parameters.
>
>
> {\bf Bibliography}
>
> Li, Audrey Yen-Hui. 1990. {\em Order and Constituency in Mandarin
> Chinese}. Boston: Kluwer.
>
> Li, Charles N.\ and Sandra A. Thompson. 1974. Historical change of
> word order: A case study in Chinese and its implications. In
> Anderson, J.\ and C.\ Jones, eds. {\em Historical Linguistics}.
> Amsterdam: North-Holland. 199--217.
>
> Travis, Lisa. 1989. Parameters of phrase structure. In Baltin, M.\
> and A.\ Kroch, eds. {\em Alternative Conceptions of Phrase
> Structure.} Chicago: Univ.\ of Chicago Press. 263--279.
>
> Tsao, Feng-fu. 1986. A topic-comment approach to the {\em ba}
> construction. {\em Journal of Chinese Linguistics.} 15(1):1--55.
>
> ---------------------------------------------------------------------
>
> {\bf `Extraction' Without Traces}
>
>
> Although more than one traceless analysis of unbounded filler-gap
> constructions has been presented in the recent syntactic literature
> (Steedman 87,88; Kaplan and Zaenen 89; Pollard and Sag 93), it is
> widely believed that the phonetically empty categories -- traces --
> that are profligate in modern GB analyses of the same phenomena (inter
> alia) have empirical motivation outside the technical apparatus of GB
> Theory. In this paper, we critically examine essentially every
> independent argument offered for the existence of traces, arguing that
> none of these is satisfactory from the perspective of modern
> linguistic theory. We also offer positive arguments for the position
> that filler-gap dependencies are terminated not by empty constituents,
> but rather by the lexical heads that are normally thought of as
> properly governing the trace.
>
> (i) It is generally assumed that the impossibility of auxiliary
> contraction in examples like *How tall do you think he's? is to be
> explained by some appeal to the presence of a trace in the
> post-auxiliary position. However, in the analysis of these phenomena
> provided by Selkirk (84), a morpheme cannot contract if it is needed
> to bear stress, a matter determined by the ABSENCE of other
> constituents in the prosodic phrase, not the PRESENCE of empty
> constituents. This approach is clearly superior to trace-based
> accounts in that it extends to further data, e.g. contraction failure
> around parentheticals.
>
> (ii) Wanna contraction is also thought to motivate traces whose
> presence blocks application of a contraction rule. But, quite apart
> from issues raised by Postal and Pullum (86) and Lightfoot (86), wanna
> is best treated (synchronically) as a defective verb, not the output
> of a contraction rule (see Fodor (in pr)).
>
> (iii) Floated quantifiers cannot appear at extraction sites (*What
> color are the cars all?), and this too has been accounted for by
> appeal to traces (Sag 78,80). Yet Dowty and Brodie (84) defend an
> analysis of floated quantifiers as base-generated VP modifiers that
> immediately explains this fact, under the crucial assumption that
> there are no traces, i.e. no following phrase for the quantifiers to
> be attached to in these examples.
>
> (iv) Experimental psycholinguistic data have shown that the meaning
> of a filler phrase is mentally activated at a gap position (Bever and
> McElree (88); MacDonald (89)), and the appropriate theory of this has
> been assumed to require traces. But this finding is compatible with
> ANY theory that establishes a semantic link between a filler and the
> position of a missing argument - empty constituents are not required.
> Re-activation studies can show that dependencies are computed on-line,
> but no experimental technique is yet known that can distinguish
> between empty and absent constituents.
>
> By contrast, Pickering and Barry 91 present data suggesting that
> hearers seek verbs to terminate the processing of filler-gap
> sentences, not traces. For this reason, the heavy NP in (1)
> encountered before the preposition terminator causes processing
> difficulty, but the verb terminator encountered before the heavy NP in
> (2) causes no such difficulty:
>
> (1) Which box did you put the very large and beautifully decorated
> wedding cake bought from the expensive bakery in?
> (2) In which box did you put the very large and beautifully
> decorated wedding cake bought from the expensive bakery?
>
> In the lexically based, traceless analysis of extraction constructions
> we present, such contrasts are precisely the expected result. A
> lexical rule applies to a lexical head, moving a complement from its
> COMPS list to its SLASH list. `Slashed' verbs project feature
> specifications about missing elements up the tree in accordance with
> universal principles of HPSG theory. We treat extraction irregularities
> in the lexicon, e.g. unextractable objects (Who did you allow to/*let
> leave early?) and Kayne's (80): Who did you assure us to be good?/*I
> assure you them to be good (see Postal 93). Our analysis of `crossover'
> phenomena also makes no appeal to traces.
>
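For readers who find it easier to think of the COMPS-to-SLASH lexical rule in
the last abstract as a data-structure operation, here is a minimal sketch in
Python. It is not the authors' implementation: the feature structures are
deliberately toy-sized (just COMPS and SLASH lists), and the names
LexicalEntry and complement_extraction are invented for illustration. The
point it shows is simply that the lexical head itself, rather than an empty
trace, can carry the information about a missing complement.

# Toy sketch (assumptions noted above): the complement-extraction lexical
# rule removes one complement from a head's COMPS list and records it on
# its SLASH list, so the head terminates the filler-gap dependency.

from dataclasses import dataclass, replace


@dataclass(frozen=True)
class LexicalEntry:
    phon: str                 # phonological form, e.g. "put"
    comps: tuple = ()         # complements still to be realized locally
    slash: tuple = ()         # specifications of missing (extracted) elements


def complement_extraction(entry: LexicalEntry, i: int) -> LexicalEntry:
    """Move the i-th complement from COMPS to SLASH, yielding a new entry."""
    gap = entry.comps[i]
    return replace(
        entry,
        comps=entry.comps[:i] + entry.comps[i + 1:],
        slash=entry.slash + (gap,),
    )


# "put" selects an NP object and a locative PP; slashing the PP gives the
# verb that heads "In which box did you put the ... wedding cake?", where
# SLASH inheritance passes the missing-PP information up to the filler.
put = LexicalEntry(phon="put", comps=("NP", "PP[loc]"))
put_slashed = complement_extraction(put, 1)
print(put_slashed)
# LexicalEntry(phon='put', comps=('NP',), slash=('PP[loc]',))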