long-distance control

Brian MacWhinney macw at cmu.edu
Thu Dec 24 02:35:56 UTC 2009


Dan,

    Our current understanding of neural processing requires that recursion in the brain must precede recursion in grammar in evolutionary terms.  Recursion requires a stack and a processor that operates on the stack.  The brain provides a (highly limited) simulation of the operation of a stack in the form of item-based memory.  The items in the pseudo-stack can be syllables, words, phrases, or propositions.  Activation of items in this stack can be achieved by the establishment of cortical loops that create working memories, often with support from the hippocampus.  The contents of these memories can then be inserted in a second concept being formulated.  Voilà, you have recursion. 
    Because the properties of cortical areas and the patterns of connectivity vary, it is certainly true that memory works in slightly different ways to store the results of these various levels of linguistic structure, just as it works in slightly different ways to store the results of spatial navigation, visual search, musical themes, and so on.  But, in each case, the operation is one of storage and then reinsertion of the stored product in the next unit.  Recursion is available in any system that is able to simulate stacks and a processor that works on the stacks.  
   Of the four views you note, I have the greatest difficulty with the idea that recursion has to be "reinvented" evolutionarily inside each module.  Why would a module reinvent a general process to which it already has access?  In the case of gap filling in questions with initial wh fillers, the actual processing involves communication between anterior syntactic and posterior lexical areas.  So, even in this parade example of sentence-internal recursion, the stack is relying on one "module" and the processor on another.   
   Recursion arises from pre-existing mnemonic and processing methods, but how these methods are used is up to language, thought, and culture, as you are arguing.  Again, it seems that the interesting issue with Pirahã and languages of this type is about the methods they choose for creating fillers and filling gaps (i.e. push and pop from stacks) and which methods they seem to avoid.
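The stack-and-processor picture above (store an item in working memory, then pop it back for reinsertion into the next unit being formulated) can be rendered as a toy sketch. This is purely illustrative, not a cognitive model; the function name and clause fragments are invented for the example:

```python
# Minimal sketch of the claim that recursion is available to any system
# that can simulate a stack plus a processor operating on that stack.
# Nested embedding is built iteratively with an explicit stack, with no
# recursive function calls at all.

def embed(fragments):
    """Nest clause fragments by storing each on a stack ("item-based
    memory"), then popping stored items and reinserting the stored
    product into the next, enclosing unit."""
    stack = []
    for frag in fragments:
        stack.append(frag)              # store the item (push)
    result = stack.pop()                # begin with the innermost unit
    while stack:
        outer = stack.pop()             # retrieve a stored item (pop)...
        result = f"{outer} [{result}]"  # ...and reinsert the stored product
    return result

print(embed(["Mary said", "John thinks", "it rained"]))
# -> Mary said [John thinks [it rained]]
```

The point of the sketch is that nothing in it is specific to sentences: the fragments on the stack could as easily stand for propositions in a discourse as for clauses in a sentence.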

-- Brian MacWhinney

On Dec 23, 2009, at 7:57 PM, Daniel Everett wrote:

> Brian,
> 
> The reason that languages like Piraha, perhaps Hixkaryana, and perhaps Warlpiri and others, are important to the discussion is that if you can show that a language lacks recursion in the sentential syntax, but has it in the discourse, then it becomes possible to argue empirically for the idea that recursion in the brain precedes recursion in grammar.
> 
> There is an alternative, though, which is that each cognitive module has its own recursion. Some researchers, such as Tom Roeper, have suggested this, at least informally, in questions to me after my presentation at the recursion conference last May at U Mass (the first ever conference on recursion was held at Illinois State in 2007, co-sponsored by ISU and the Max Planck Institute for Ev Anthro in Leipzig).
> 
> A lot of research would need to be done to sort out the different possibilities, but here they are:
> 
> 1. Innate recursion in language leads to more intelligent primates by jumping into general cognition.
> 
> 2. Innate recursion exists in various cognitive modules (vision, language, etc.)
> 
> 3. Innate recursion is a property of general cognition and can be 'delegated' to discourse and/or sentential syntax according to different principles (my hypothesis is that culture plays a role).
> 
> 4. Recursion is not innate so much as a solution that the brain must adopt (a la Herbert Simon) in order to compete with conspecifics in the management of information.
> 
> Dan 
> 
> 
> On 23 Dec 2009, at 17:03, Brian MacWhinney wrote:
> 
>> Dan,
>> 
>>   Good reply and clarification.  The idea of focusing on the process of gap filling helps a lot.  Perhaps we could phrase the question this way:  how do people/languages fill gaps?  English speakers tend to fill them from other material inside the sentence.  We could think of languages or people that do this as "word string oriented gap fillers".  Chinese and Pirahã fill gaps not from the word strings that are in their echoic memory, but directly from the mental models they have constructed to support their own discourse or from their interlocutors' discourses.  I think all of us (Givon, Everett, Fauconnier, Simon ...) would agree that recursion preexists language (although HCF apparently do not).  So, the issue seems to be about whether and how a language comes to "download" the natural (probably mammalian) recursive processes utilized by mental models into grammaticalized forms that operate first on word strings and then, only at the second remove, on the mental models.  
>> 
>> -- Brian MacWhinney
>> 
>> On Dec 23, 2009, at 4:04 PM, Daniel Everett wrote:
>> 
>>> Brian,
>>> 
>>> Hauser, Chomsky, and Fitch make a big deal of long-distance dependencies:
>>> "At the lowest level of the hierarchy are rule systems that are limited to local dependencies, a subcategory of so-called “finite-state grammars.” Despite their attractive simplicity, such rule systems are inadequate to capture any human language. Natural languages go beyond purely local structure by including a capacity for recursive embedding of phrases within phrases, which can lead to statistical regularities that are separated by an arbitrary number of words or phrases. Such long-distance, hierarchical relationships are found in all natural languages for which, at a minimum, a “phrase-structure grammar” is necessary. It is a foundational observation of modern generative linguistics that, to capture a natural language, a grammar must include such capabilities." HC&F (2002:1577)
>>> 
>>> I reply to this in the paper just put up on my webpage, mentioned earlier today, as follows:
>>> 
>>> "Whether humans choose a finite-state vs. a phrase structure grammar is precisely the empirical point that Pirahã raises. The 'infinity' of the Pirahã language, for example, might lie outside the grammar in the Chomskyan sense - in discourse - via the ability to fashion stories out of sentences rather than sentences out of phrases. There could, in other words, be a longest sentence in Pirahã, yet not a longest story. If that were the case, then NP&R would be wrong, since Merge applies only to form sentences and phrases from lexical items.  And HC&F would be misguided by failing to relate the general property of recursion to stories in lieu of or in addition to recursion in sentences. Theories that do not have anything to say about facts external to sentences (e.g. all versions of Chomskyan theory) cannot appeal to discourse, thought, etc. for support for their theory of grammar, e.g. the role that recursion plays in the FLN.  To beat this horse another way, recursion could be responsible for the infinitude of natural languages in a way unanticipated by Chomskyan theory, by allowing infinity to be a property of discourses, rather than sentences."
>>> 
>>> The kinds of examples that are standardly adduced for long-distance dependencies include:
>>> 
>>> (1) a. 'Who do you think John believes __ (that Bill saw__)?'
>>> b. 'Ann, I think he told me he tried to like ___' 
>>> 
>>> Piraha does not have structures like this. However, Piraha does have gaps. There are places where pronouns are 'understood' (what generative theory calls, or called, 'empty categories'), and there is some displacement of constituents (all described by me in various places). But the gaps are not like those in (1). There are structures like:
>>> 
>>> (2) 'Dan did not use to speak Piraha. His children did not know either. Nevertheless __ knows well now.', where the gap refers back to 'Dan'. 
>>> 
>>> This example involves a long-distance dependency with a gap, but it is a fact about discourse structure, not sentential structure. Generative theory itself recognizes the difference, e.g. work by Huang in the mid-80s on the difference between pro-drop and missing objects in languages like Brazilian Portuguese:
>>> 
>>> (3) __ Coloca __ ai. '(You) put (it) there.' 
>>> 
>>> In (3) the subject gap behaves like a regular pronoun but the object gap like a discourse variable, according to some analyses. 
>>> 
>>> Whether or not long-distance dependencies are different kinds of things within and without the sentence depends on the functional strategies for questions, topics, etc. exercised by a particular language. But there aren't studies comparing and contrasting the different notions that I know of (which probably only shows my ignorance).
>>> 
>>> For functionalists, the Piraha data might be rare, but it shouldn't be a shock, because - if I am right - these data simply suggest that the 'infinitude' produced by the computational system of Piraha is found in discourses, not in sentences. 
>>> 
>>> Dan 
>>> 
>>> 
>>> 
>>> On 23 Dec 2009, at 15:27, Brian MacWhinney wrote:
>>> 
>>>> Dear FunkNetters,
>>>> 
>>>> I am puzzled that no one has questioned the merger of long-distance relations into a single type or a single process, as advocated by Givon and Everett.  I tend to think of pronominal anaphora as the prototypical case of a long-distance discourse process, although devices such as class inclusion ("the bus" referring anaphorically or even cataphorically to "the vehicle") are relevant too.  For sentence-internal long-distance processes, I assume that we are thinking about things like the placement of wh-words at the beginning of English sentences.  If we look at just these two cases, do we really want to say that the two domains/processes are cognitively equivalent?  In the discourse case, the reference is established in mental space and processing involves re-invoking that referent.  In the within-sentence case for English wh placement, processing is radically different.  The referent is unknown.  In fact, there is no problem or issue with referent identification.  Instead, we are typically trying to fill verb argument structures that have been morphed around, possibly to serve the interests of focal marking (as opposed to methods such as wh in situ or sentence-final question markers, etc.).   I see good reasons to link up the discourse anaphoric processes to relativization and complementation, but not to things like wh placement.  If we focus on relativization, is the non-repetition of the head what makes it long-distance?  But the distance in that case is not the same as for wh, right? Perhaps we don't want to talk about long distance, but just about gaps.  But then does the existence of resumptive pronouns in the relative clause mean that the language is no longer "long distance" for that construction?  
>>>> 
>>>> Basically, I am not sure that the notion of "long distance" in itself provides any leverage, and I am curious whether the real issue with Pirahã-type languages is the non-presence of gaps (whatever those are).
>>>> 
>>>> -- Puzzled in Pittsburgh  (Brian MacWhinney)
>>>> 
>>> 
>>> 
>> 
> 
> 
