25.5038, Review: Cognitive Sci; Ling Theories; Syntax: Culicover (2013)

The LINGUIST List via LINGUIST linguist at listserv.linguistlist.org
Thu Dec 11 22:08:11 UTC 2014


LINGUIST List: Vol-25-5038. Thu Dec 11 2014. ISSN: 1069-4875.


Moderators: Damir Cavar, Indiana U <damir at linguistlist.org>
            Malgorzata E. Cavar, Indiana U <gosia at linguistlist.org>

Reviews: reviews at linguistlist.org
Anthony Aristar <aristar at linguistlist.org>
Helen Aristar-Dry <hdry at linguistlist.org>
Sara Couture, Indiana U <sara at linguistlist.org>

Homepage: http://linguistlist.org


Editor for this issue: Sara Couture <sara at linguistlist.org>
================================================================


Date: Thu, 11 Dec 2014 17:07:11
From: Luis Vicente [vicente at uni-potsdam.de]
Subject: Grammar & Complexity



Book announced at http://linguistlist.org/issues/24/24-1602.html

AUTHOR: Peter W. Culicover
TITLE: Grammar & Complexity
SUBTITLE: Language at the Interface of Competence and Performance
PUBLISHER: Oxford University Press
YEAR: 2013

REVIEWER: Luis Vicente, Universität Potsdam

Review's Editors: Malgorzata Cavar and Sara Couture

SUMMARY

Suppose that you have a set of lexical items and you want to create a grammar.
You are going to need at least two types of rules, viz. syntactic rules (which
tell you how to combine lexical items with each other in order to produce
larger structures) and semantic mapping rules (which tell you how to interpret
the structures produced by your syntactic rules). Within the framework of
generative grammar, the overall trend since the 1970s has been to postulate
fewer rules, but each with a wider domain of application. In turn, this
requires treating idiosyncrasies as properties of lexical items, including
accepting the existence of non-trivial amounts of phonetically null functional
structure. For example, Potts 2002 argues that a number of apparent
exceptional properties of ''as''-parentheticals and ''which''-appositives
follow from a totally regular syntax and semantics, so long as one accepts
that ''as'' and ''which'' have certain, very specific lexical entries.
Similarly, Harley 1996 argues that various intra- and cross-linguistic
asymmetries of ditransitive predicates can largely be traced to one specific
difference in the functional structure of such predicates, i.e., whether they
contain a functional head that expresses transfer of possession or one that
expresses location. As a result of this trend, minimalist syntax has been
reduced to Merge and Agree. This trend is also apparent in semantics, where
the contention is that all structures can be assigned a compositional meaning
through Functional Application, Predicate Modification, and Predicate
Abstraction (possibly supplemented with Restrict, see Chung and Ladusaw 2003).
One might want to add some linearization rules and focus alignment rules to
deal with the PF interface (Kayne 1994, Zubizarreta 1998, Nunes 2002), but the
general direction of research is clear: people working in these frameworks
strive towards a total number of general-purpose rules in the single-digit
region -- or, in a worst-case scenario, in the low-tens region.

''Grammar and complexity,'' which is heavily based on previous works by
Culicover and colleagues (especially Culicover 1999, “Syntactic nuts”;
Culicover and Nowak 2003, “Dynamical grammar”; and Culicover and Jackendoff
2005, “Simpler syntax”), is a reaction against this trend. Culicover's main
hypothesis is that a small number of general-purpose rules are insufficient to
account for the whole range of phenomena one finds in natural language syntax
and semantics. Rather, in order to provide adequate empirical coverage, a
theory of grammar has to include the concept of ''construction''. A
construction, as Culicover explains in Chapter 1 (“Theoretical background”),
is essentially a Saussurean sign, i.e., an arbitrary, irreducible
correspondence between form and meaning. Therefore, syntax-semantics
mismatches need not necessarily be covered by appeal to a more complex lexicon
(including a number of phonetically null functional heads); instead, one can
assume a more parsimonious syntax and handle the syntax-semantics mapping
through a dedicated constructional rule. 

Additionally (and importantly), Culicover proposes that constructions do not
exist in isolation from each other, but rather are arranged in a dependency
tree where the more specific constructions inherit the properties of the less
specific ones. For example, [main] and [embedded] are subconstructions within
the more general construction [clause]; as such, they both share all the
properties of [clause], and each one contributes certain additional properties
not specified in [clause]. For example, [main] only has the single
subconstruction [finite], whereas [embedded] is partitioned into [finite] and
[infinitival]; in turn, [finite] and [infinitival] are each subpartitioned
into various sub-subconstructions. Note that a node in the dependency tree may
be a subconstruction of several different superconstructions. For example, the
[zero] construction is specified as being a dependent of both
[embedded:finite] and [embedded:infinitival]. In constructional parlance,
this means that the properties that [zero] contributes (i.e., that the
embedded sentence in question has a null complementizer) are added to the
properties contributed by the relevant superconstructions in each individual
sentence. Thus, “the man he talked to” is an instance of
[clause:embedded:finite:zero], whereas “the man to talk to” is an instance of
[clause:embedded:infinitival:zero]. Generalizing this idea, the syntax of a
language can be modeled as a set of constructions arranged in a complex
dependency tree such that, for each grammatical sentence in the language, we
can specify a chain of constructions (from the least to the most specific one)
that define the properties of that sentence.
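The chain-of-constructions idea can be made concrete with a small sketch. The following toy encoding is my own illustration, not Culicover's formalism: each construction contributes a bundle of properties, and a sentence is licensed by a chain of constructions ordered from least to most specific, whose contributions are merged.

```python
# Toy sketch of construction chains (illustrative encoding only, not
# Culicover's own formalism). Each construction is a (name, properties)
# pair; a sentence is characterized by merging the contributions of a
# chain of constructions, from least to most specific.

clause = ("clause", {"category": "S"})
embedded = ("embedded", {"main_clause": False})
finite = ("finite", {"tensed": True})
infinitival = ("infinitival", {"tensed": False})
zero = ("zero", {"complementizer": None})  # null complementizer

def properties_of(chain):
    """Merge property contributions along a chain; more specific
    constructions (later in the chain) override less specific ones."""
    merged = {}
    for _name, props in chain:
        merged.update(props)
    return merged

# "the man he talked to" ~ [clause:embedded:finite:zero]
print(properties_of([clause, embedded, finite, zero]))
# {'category': 'S', 'main_clause': False, 'tensed': True, 'complementizer': None}

# [zero] can appear under both [finite] and [infinitival]:
# "the man to talk to" ~ [clause:embedded:infinitival:zero]
print(properties_of([clause, embedded, infinitival, zero]))
```

Note that nothing in this encoding prevents a construction like [zero] from depending on several superconstructions, which is what makes the structure a dependency hierarchy rather than a strict tree.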

The ideas summarized in the previous paragraph lead to a discussion of
complexity in Chapter 2 (“The architecture of constructions”). Here, Culicover
argues that each postulated construction has a cost, which negatively
correlates with the generality of the construction, where generality is in
turn defined in terms of the formal similarity of the elements the
construction makes reference to and the evenness of the coverage of the
construction. The toy example that Culicover provides on p. 48 is that “a
construction that is restricted to the words ‘girl’ and ‘television’ is more
costly than one that is restricted to the words ‘girl’ and ‘boy’ [LV: because
‘girl’ and ‘boy’ are more similar to each other than ‘girl’ and ‘television’
are, in the sense of having features in common]. A construction that is
restricted to the words ‘girl’ and ‘boy’ is more costly than one that is
restricted to all of the words that refer generically to male and female
humans [LV: because the former needs to be restricted to a very specific
subset of humans, whereas the latter does not]”. In principle, the fact that
costlier constructions are dispreferred relative to cheaper ones can be used
as a measure against the postulation of superfluous constructions (e.g., one
can imagine that the best constructional grammar for a given language is the
one with the minimal overall cost). In practice, though, this is a difficult
idea to implement, as it involves measuring the density of regions defined by
each construction in a multidimensional feature space.
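Culicover's toy example from p. 48 can be sketched in miniature. The feature sets and the similarity measure below (Jaccard similarity over binary features) are my own simplifying assumptions, not Culicover's actual metric; the sketch only illustrates the direction of the correlation, namely that a construction restricted to less similar items is costlier.

```python
# Toy illustration of construction cost (my own simplification, not
# Culicover's metric): cost is inversely related to how similar the
# items a construction mentions are, measured here as Jaccard
# similarity over hand-picked binary feature sets.

FEATURES = {
    "girl": {"human", "female", "young"},
    "boy": {"human", "male", "young"},
    "television": {"artifact", "electronic"},
}

def similarity(w1, w2):
    """Jaccard similarity of two words' feature sets."""
    f1, f2 = FEATURES[w1], FEATURES[w2]
    return len(f1 & f2) / len(f1 | f2)

def cost(words):
    """Cost of a construction restricted to `words`: one minus the mean
    pairwise similarity, so less similar items yield a higher cost."""
    pairs = [(a, b) for i, a in enumerate(words) for b in words[i + 1:]]
    mean_sim = sum(similarity(a, b) for a, b in pairs) / len(pairs)
    return 1 - mean_sim

# A construction restricted to {girl, boy} is cheaper than one
# restricted to {girl, television}:
print(cost(["girl", "boy"]))         # lower cost
print(cost(["girl", "television"]))  # higher cost
```

The difficulty the review mentions is visible even here: the result depends entirely on which features populate the space and how density over that space is measured, choices that are far from obvious for a full lexicon.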

The rest of the book offers a lengthy discussion of various consequences of
this approach to syntax. Specifically, Chapters 3 and 4 in Part II develop a
constructional analysis for a number of English phenomena, namely, relative
clauses, focus inversion, sluicing, comparative correlatives, concessives,
imperatives, and ''not''-topics. Part III, which consists of the single
Chapter 5 (''Reflexes of processing complexity''), switches from formal
complexity to processing complexity as a means to deal with the idiosyncrasies
of island restrictions (drawing heavily on research by Hofmeister 2007 and
Hofmeister and Sag 2010) and parasitic gaps. Finally, the three chapters of
Part IV apply this line of reasoning to various aspects of language
acquisition and change. As I mentioned above, these case studies are mostly
updated versions of selected previous work by Culicover and colleagues. As a
consequence, readers familiar with Culicover's research will find few new
things in this book. Readers who are not might find it useful as an
introduction to his research program;
however, unless they are pressed for time, they should supplement their
reading of ''Grammar and complexity'' with reading of the referenced works
(alternatively, they might also want to consult the existing reviews of
''Syntactic Nuts'' (Fodor 2001) and/or ''Simpler Syntax'' (McDonald 2005)).

EVALUATION

The attentive reader might have noticed that, by defining ''construction'' in
the way Culicover does, one effectively places no upper bound on the number of
possible constructions. In fact, it is possible to define a language that is
purely constructional, in that each sequence of sounds is arbitrarily (i.e.,
non-compositionally) mapped to some specific meaning. Fortunately, work on
computational language evolution has shown that, given certain reasonable
assumptions about the language faculty, this class of languages is highly
unstable and quickly decays into a largely compositional language (cf. chapter
2 of Brighton 2003 for an illustrative summary). Thus, although in principle
there are infinitely many possible constructions, the number that one might
actually find in any given language is considerably smaller (although, in any
case, the number of constructions will still be considerably larger than the
number of operations postulated in minimalist syntax and semantics). 

More specifically, Culicover's proposal can be seen as a qualified appeal to
Occam's Razor. As discussed above, his general claim is that the degree of
complexity of a grammar depends on the specificity of its rules
(chapters 1 and 2). That is, a grammar with a small number of general-purpose
rules is less complex than a grammar with a large number of idiosyncratic
rules. This approach is illustrated throughout the book with the idea that
constructions exist in a dependency hierarchy, with constructions at the
bottom of the hierarchy being special cases of those constructions higher up.
In this sense, Culicover seems to agree with the standard generative view that
construction- or language-specific rules/constraints are to be avoided.
However, this position is supplemented with statements like ''when we see a
construction with idiosyncratic properties, a plausible analysis is that it is
a construction with idiosyncratic properties'' (p. 129), which suggest that
Culicover is willing to take at least some idiosyncrasies at face value --
i.e., properties that appear to be idiosyncratic reflect an actual underlying
idiosyncrasy in grammar, rather than a failure of analytic insight on the part
of the linguist. One might wonder to what extent this intuition is justified,
rather than being a reflection of Culicover’s theoretical bias. Consider, for
illustration, the discussion of comparative correlatives (''the more I know
people, the more I love my dog'') in chapter 4; here, Culicover argues that
this class of sentences exhibits a number of idiosyncrasies that suggest a
constructional analysis. The issue here is not whether a constructional
analysis captures the properties of the comparative correlative (it does, as
one can always define constructions in whichever way is necessary to capture
the relevant properties); the issue is that, in line with his general
approach, Culicover is implicitly asking the reader to accept that the
properties of the comparative correlative are constructionally irreducible,
i.e., not derivable from a more parsimonious syntax/semantics mapping. A
number of arguments to the effect that such an alternative is possible (e.g.,
den Dikken 2005, Taylor 2009, Smith 2011, etc.) are dismissed almost without
comment. Most of the space of chapters 3 and 4 is taken up by this type of
argumentation: Culicover argues that a certain construction (focus inversion,
sluicing, imperatives, etc.) is irreducibly constructional, with little to no
mention of published arguments to the effect that a reduction to more general
operations is feasible.

Because of his willingness to rely on a relatively large (effectively
unbounded) set of constructions, I find that Culicover's proposals lack
predictive bite. For example, chapter 4, which has the suggestive title
''Constructions and the notion of possible human language'', concludes with
''this perspective does not promise to distinguish between possible and
impossible languages'' (p. 135). I find this conclusion surprising, given that
other (non-constructional) proposals make very clear predictions about which
types of languages are possible and which are not. For example, Kayne 1994
predicts the impossibility of certain word orders (e.g., penultimate position
effects), and Hornstein 1999 predicts the impossibility of certain types of
control relations (e.g., obligatory control with split antecedents). We know
which kinds of data support these analyses and which kinds would break them;
if we find good evidence that data of the latter kind exists, then these
analyses would have to undergo significant amendment, or even be rejected
outright. Culicover’s proposals are not easily amenable to this kind
of reasoning. Compare, for example, the standard Ross/Merchant analysis of
sluicing with Culicover's (chapter 4, section 1). Ross and Merchant propose
that an example like ''someone arrived, but I don't know who'' is derived from
''someone arrived, but I don't know who arrived'' by PF deletion
(non-pronunciation) of the embedded interrogative. In contrast, Culicover
proposes that there is no unpronounced structure in the sluiced sentence:
''who'', a bare wh- word, is the immediate complement of ''know''. Note that
Ross and Merchant provide several theory-independent arguments in favor of a
clausal-complement-plus-deletion analysis -- i.e., they point out that, with
respect to several tests (word order, agreement patterns, selectional
requirements, etc.), ''who'' behaves as part of an embedded (but superficially
invisible) clause. These results might seem to directly falsify Culicover's
analysis, but he gets around them by appealing to the power of constructions:
specifically, he proposes that, in “someone arrived, but I don’t know who”,
the complement of “know” is a bare wh- word, but it is imbued with certain
clause-like properties (e.g., it is exceptionally assigned category S, rather
than NP) that make it behave as if it were a sub-constituent of a silent
embedded interrogative. More generally, any property of language, no matter
how problematic for other theories, falls under the scope of Culicover's
analysis, so long as we are willing to grant it constructional status.

The ultimate question then, is whether Culicover's proposals merit further
attention from linguists who are committed to a different view of syntax and
semantics. As a member of this group of linguists, my opinion is arguably
somewhat biased against Culicover's proposals, and as such I feel that the
best way to conclude this review is by referring to the conclusion of Fodor's
2001 review of ''Syntactic Nuts'': we cannot deny that the relevant phenomena
are worthy of investigation, but it is at least questionable whether a
constructional theory like Culicover's is the best way to approach them.

REFERENCES

Brighton, Henry. 2003. Simplicity as a driving force in linguistic evolution.
Doctoral dissertation, University of Edinburgh.

Culicover, Peter. 1999. Syntactic Nuts. Oxford: Oxford University Press.

Culicover, Peter, and Andrzej Nowak. 2003. Dynamical grammar. Oxford: Oxford
University Press.

Culicover, Peter, and Ray Jackendoff. 2005. Simpler syntax. Oxford: Blackwell.

den Dikken, Marcel. 2005. Comparative correlatives crosslinguistically.
Linguistic Inquiry 36:497-532.

Fodor, Janet Dean. 2001. Parameters and the periphery: reflections on
syntactic nuts. Journal of Linguistics 37:367-392.

Harley, Heidi. 1996. If you have, you can give. In Agbayani and Tang (eds.)
''Proceedings of WCCFL 15''.

Hofmeister, Philip. 2007. Representational complexity and memory retrieval in
language comprehension. Doctoral dissertation, Stanford University.

Hofmeister, Philip, and Ivan Sag. 2010. Cognitive constraints and island
effects. Language 86:366-415.

Hornstein, Norbert. 1999. Movement and control. Linguistic Inquiry 30:69-96.

Kayne, Richard. 1994. The antisymmetry of syntax. Cambridge: MIT Press.

McDonald, Edward. 2005. Review of Simpler Syntax. LINGUIST List 17.718.

Merchant, Jason. 2001. The syntax of silence: Sluicing, islands, and the
theory of ellipsis. Oxford: Oxford University Press.

Potts, Christopher. 2002. The lexical syntax and lexical semantics of
parenthetical 'as' and appositive 'which'. Syntax 5:55-88.

Ross, John Robert. 1969. Guess who? The Fifth Regional Meeting of the Chicago
Linguistic Society (CLS 5), 252–286. Chicago, IL: Chicago Linguistic Society.

Smith, E. Allyn. 2011. English comparative correlatives, conditionals, and
adverbs of quantification. In Reich et al. (eds.) ''Proceedings of Sinn und
Bedeutung 15''.

Taylor, Heather Lee. 2009. The syntactically well-behaved comparative
correlative. In Brucart et al. (eds.), ''Merging features: computation,
interpretation, and acquisition''. Oxford: Oxford University Press.


ABOUT THE REVIEWER

I'm a lecturer and researcher at the University of Potsdam (Germany),
specializing in theoretical syntax and semantics. I work primarily on
ellipsis, parentheticals, and coordinate structures, with occasional forays
into other topics.







