<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1">
<title>[evol-psych] On Grammar vs. Language in Neurolinguistics</title>
</head>
<body>--This is the Language List--<br>
<BR>
<br>
<br>
-------- Original Message --------
<table cellpadding="0" cellspacing="0" border="0">
<tbody>
<tr>
<th valign="baseline" align="right" nowrap="nowrap">Subject: </th>
<td>[evol-psych] On Grammar vs. Language in Neurolinguistics</td>
</tr>
<tr>
<th valign="baseline" align="right" nowrap="nowrap">Date: </th>
<td>Fri, 18 Oct 2002 18:26:56 -0500</td>
</tr>
<tr>
<th valign="baseline" align="right" nowrap="nowrap">From: </th>
<td>Ian Pitchford <a class="moz-txt-link-rfc2396E" href="mailto:ian.pitchford@scientist.com">&lt;ian.pitchford@scientist.com&gt;</a></td>
</tr>
<tr>
<th valign="baseline" align="right" nowrap="nowrap">Reply-To: </th>
<td>Ian Pitchford <a class="moz-txt-link-rfc2396E" href="mailto:ian.pitchford@scientist.com">&lt;ian.pitchford@scientist.com&gt;</a></td>
</tr>
<tr>
<th valign="baseline" align="right" nowrap="nowrap">Organization: </th>
<td><a class="moz-txt-link-freetext" href="http://human-nature.com/">http://human-nature.com/</a></td>
</tr>
<tr>
<th valign="baseline" align="right" nowrap="nowrap">To: </th>
<td><a class="moz-txt-link-abbreviated" href="mailto:evolutionary-psychology@yahoogroups.com">evolutionary-psychology@yahoogroups.com</a></td>
</tr>
</tbody>
</table>
<br>
<br>
<font face="Arial">_________________________________________________________________<br>
<br>
<br>
From ScienceWeek October 4, 2002 Vol. 6 Number 40<br>
<br>
<br>
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=<br>
<br>
5. On Grammar vs. Language in Neurolinguistics<br>
<br>
Massimo Piattelli-Palmarini (University of Arizona, US) discusses<br>
grammar vs. language, the author making the following points:<br>
<br>
1) Two styles of explanation in the science of mind and behavior<br>
have been competing for as long as anyone cares to remember:<br>
empiricist, centering on habit formation, statistical learning,<br>
imitation and association; and rationalist, focusing on the<br>
projection of internally represented rules. Despite relentless<br>
effort, the former has delivered rather meager results, whereas<br>
the latter, with its pivotal concept of an internally<br>
represented grammar, has produced the solid "conceptual<br>
cognitive revolution".<br>
<br>
2) For a rationalist cognitive scientist, a grammar is a finite<br>
mental object, systematically assigning abstract structures to<br>
all the well-formed expressions of a language --that is, to each<br>
member of a set that, for natural languages (such as Chinese or<br>
Italian), is infinite and discrete. Infinite, because every<br>
speaker of a language can produce and understand an unlimited<br>
number of new grammatical sentences. Discrete, because<br>
continuous modification of a sentence to change it into another<br>
is impossible. No sentence could be halfway between "It's a good<br>
car, but they don't sell it" and "It's a good car, but they<br>
don't tell it."<br>
<br>
3) A grammar capable of generating complex structures for all<br>
well-formed sentences of a natural language must have recursive<br>
rules, because phrasal constituents can contain other phrasal<br>
constituents of the same or higher kinds ("The young doctor's<br>
three beautiful sisters" is a noun phrase containing another<br>
noun phrase; "The spy who came in from the cold" is a noun<br>
phrase containing a sentence). Moreover, structural rules of<br>
sentence formation can be applied recursively to embed relative<br>
clauses embedding other relative clauses, without limit (as in<br>
"This is the cat that killed the rat that ate the malt that lay<br>
in the house that Jack built"). Because such grammars are<br>
finite, whereas the languages they generate are infinite and<br>
contingently shaped by use, it is advantageous, and<br>
methodologically cogent, to consider the concept of grammar as<br>
primary, and that of language as derived.<br>
<br>
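The recursion described in point 3 is easy to make concrete. What follows is a minimal sketch in Python (an editor's illustration, not part of the article): because the noun-phrase rule can rewrite to a form containing another noun phrase, a finite set of rules generates an unbounded, discrete set of sentences.<br>
<pre>
import random

# Toy context-free grammar (illustrative only; the words echo the
# "house that Jack built" example above). "NP" can rewrite to a form
# containing a relative clause, whose "VP" contains another "NP",
# so this finite rule set generates infinitely many sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"],
           ["the", "N", "that", "VP"]],  # recursive rule
    "VP": [["V", "NP"]],
    "N":  [["cat"], ["rat"], ["malt"]],
    "V":  [["killed"], ["ate"], ["saw"]],
}

def generate(symbol="S"):
    """Expand a symbol by recursively expanding one of its rewrite rules."""
    if symbol not in GRAMMAR:  # a terminal word: emit it as-is
        return [symbol]
    rule = random.choice(GRAMMAR[symbol])
    # (the recursive NP option is chosen half the time, so random
    # generation halts almost surely)
    return [word for part in rule for word in generate(part)]

print(" ".join(generate()))
# e.g. "the cat that killed the rat ate the malt"
</pre>
<br>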
4) Since the mid-1950s, powerful formal criteria, derived from<br>
analysis of the artificial languages of mathematics and computer<br>
programming, have been applied to the study of natural languages<br>
to determine principles by which a given class of grammars can<br>
generate a given target language. A universal ('Chomsky')<br>
hierarchy of grammars (automata) was established: the most<br>
powerful class contains as a subclass the immediately less<br>
powerful one, and so on. In tune with the dominant<br>
empiricist-inductivist tradition of the 1950s, the first<br>
grammars to be explored at the lowest level in the hierarchy<br>
were probabilistic and finite-state. From a very large corpus of<br>
ascertained utterances of the language, one can compute the<br>
conditional probability that a word (or string of words) will<br>
follow another.<br>
<br>
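That last computation can be sketched directly (again an editor's illustration; the one-line corpus below merely stands in for the "very large corpus of ascertained utterances"): estimate the conditional probability of a word given its predecessor from bigram counts.<br>
<pre>
from collections import Counter

# Stand-in corpus; a real estimate would need a very large one.
corpus = "the cat saw the rat and the rat saw the malt".split()

bigram_counts = Counter(zip(corpus, corpus[1:]))  # adjacent word pairs
word_counts = Counter(corpus[:-1])                # each predecessor word

def p_next(word, nxt):
    """Estimate P(nxt | word) as freq(word nxt) / freq(word)."""
    return bigram_counts[(word, nxt)] / word_counts[word]

print(p_next("the", "rat"))  # "the rat" occurs 2 times, "the" 4 times: 0.5
</pre>
<br>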
References:<br>
<br>
1. Wasow, T. in Foundations of Cognitive Science (ed. Posner,<br>
M.) 161-205 (MIT Press, Cambridge, Massachusetts, 1991)<br>
<br>
2. Chomsky, N. The Minimalist Program (MIT Press, Cambridge,<br>
Massachusetts, 1995)<br>
<br>
3. Pullum, G. K. & Scholz, B. C. Nature 413, 367 (2001)<br>
<br>
Nature 2002 416:129<br>
<br>
Related Background Brief:<br>
<br>
MORE THAN WORDS. In the popular view, a language is merely a<br>
fixed stock of words. Purists worry about foreign loanwords;<br>
conservatives decry slang; and groundless claims that there are<br>
hundreds of Eskimo words for snow are constantly made in popular<br>
writing, as if nothing matters about languages but their<br>
lexicons. But the popular view cannot be right, because (as<br>
linguist Paul Postal has observed) membership in the word stock<br>
of a natural language is open. Consider this example: "GM's new<br>
Zabundra makes even the massive Ford Expedition look<br>
economical." If English had an antecedently given set of words,<br>
then this expression would not be an English sentence at all,<br>
because 'Zabundra' is not a word (we just invented it). Yet the<br>
sentence is not just grammatical English, it is readily<br>
interpretable (it clearly implies that the Zabundra is a large,<br>
fuel-hungry sports utility vehicle produced by General Motors).<br>
Similar points could be made regarding word borrowing, personal<br>
names, scientific nomenclature, onomatopoeia, acronyms, and so<br>
on; English is not a fixed set of words. A more<br>
fundamental reason that a language cannot just be a word stock<br>
is that expressions have syntactic structure. For example, in<br>
most languages, the order of words can be significant: "Mohammed<br>
will come to the mountain" contains the same words as "The<br>
mountain will come to Mohammed", but the expressions are very<br>
different. Geoffrey K. Pullum: Nature 2001 413:367.<br>
<br>
Related Background:<br>
<br>
ON THE ACQUISITION OF LANGUAGE BY CHILDREN<br>
<br>
J. R. Saffran et al. (University of Wisconsin-Madison, US) discuss<br>
the acquisition of language by children, the authors making the<br>
following points:<br>
<br>
1) Before infants can begin to map words onto objects in the<br>
world, they must determine which sound sequences are words. To<br>
do so, infants must uncover at least some of the units that<br>
belong to their native language from a largely continuous stream<br>
of sounds in which words are seldom surrounded by pauses.<br>
Despite the difficulty of this reverse-engineering problem,<br>
infants successfully segment words from fluent speech from<br>
approximately 7 months of age.<br>
<br>
2) How do infants learn the units of their native language so<br>
rapidly? One fruitful approach to answering this question has<br>
been to present infants with miniature artificial languages that<br>
embody specific aspects of natural language structure. Once an<br>
infant has been familiarized with a sample of this language, a<br>
new sample, or a sample from a different language, is presented<br>
to the infant. Subtle measures of surprise (e.g., duration of<br>
looking toward the new sounds) are then used to assess whether<br>
the infant perceives the new sample as more of the same or<br>
something different. In this fashion, we can ask what the infant<br>
extracted from the artificial language, which can lead to<br>
insights regarding the learning mechanisms underlying the<br>
earliest stages of language acquisition.<br>
<br>
3) Syllables that are part of the same word tend to follow one<br>
another predictably, whereas syllables that span word boundaries<br>
do not. In a series of experiments, it has been found that<br>
infants can detect and use the statistical properties of<br>
syllable co-occurrence to segment novel words. More<br>
specifically, infants do not detect merely how frequently<br>
syllable pairs occur, but rather the probabilities with which<br>
one syllable predicts another. Thus, infants may find word<br>
boundaries by detecting syllable pairs with low transitional<br>
probabilities. What makes this finding astonishing is that<br>
infants as young as 8 months begin to perform these computations<br>
with as little as 2 minutes of exposure. By soaking up the<br>
statistical regularities of seemingly meaningless acoustic<br>
events, infants are able to rapidly structure linguistic input<br>
into relevant and ultimately meaningful units.<br>
<br>
Proc. Nat. Acad. Sci. 2001 98:12874<br>
<br>
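The computation described in point 3 can be sketched in a few lines (an editor's illustration; the syllable stream below is constructed for the example, not taken from the study's stimuli). Within-word syllable pairs are fully predictable, pairs spanning a word boundary are not, so positing a boundary wherever the transitional probability dips recovers the "words":<br>
<pre>
from collections import Counter

# Invented continuous stream built from three nonsense "words":
# golabu, tupiro, bidaku (no pauses between them).
stream = ("go la bu tu pi ro bi da ku go la bu "
          "bi da ku tu pi ro go la bu tu pi ro").split()

pair_counts = Counter(zip(stream, stream[1:]))
syll_counts = Counter(stream[:-1])

def tp(a, b):
    """Transitional probability P(b | a) = freq(ab) / freq(a)."""
    return pair_counts[(a, b)] / syll_counts[a]

def segment(syllables, threshold=0.9):
    """Start a new word wherever the transitional probability dips."""
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if tp(a, b) >= threshold:  # predictable: same word continues
            current.append(b)
        else:                      # TP dips: posit a word boundary
            words.append("".join(current))
            current = [b]
    words.append("".join(current))
    return words

print(tp("la", "bu"), tp("bu", "tu"))  # 1.0 within-word vs ~0.67 at boundary
print(segment(stream))
# ['golabu', 'tupiro', 'bidaku', 'golabu', 'bidaku', 'tupiro', 'golabu', 'tupiro']
</pre>
<br>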
ScienceWeek </font><a href="http://www.scienceweek.com"><font
face="Arial">http://www.scienceweek.com</font></a><br>
<br>
<font face="Arial">=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=<br>
<br>
</font> <br>
</BODY>
</html>