storage versus computation

Dick Hudson dick at LINGUISTICS.UCL.AC.UK
Wed Oct 14 10:07:09 UTC 1998


A message from Liz Bates, for which I'm just acting as postman:

I won't be able to send this reply to the funknet, because I'm in Rome for
the month and they won't let me "in" from this foreign site, believe it or
not!  But I have two quick reactions to your note:

1) There is no such thing as an overgeneralization stage, where "goed"
replaces "went".  This is one point on which Pinker and I agree, because
the data for English are crystal clear no matter what horse you are betting
on: children typically go from error-free production of a handful of
(probably rote?) high-frequency irregular forms, to a long, long phase of
OCCASIONAL overgeneralizations.  But "goed" always coexists with "went" in
every child who has ever been studied.  The highest rate of
over-generalization on public record was by Stan Kuczaj's son Abe (I'm
probably spelling Stan's name wrong, again, by the way), who reached a high
of 50% overgeneralization.  Most kids are in the 10-17% range.  By the way,
my daughter was rather peculiar: she started producing an interchangeable
array of overgeneralizations vs. correct regular past tense forms without
ever passing through a rote irregular phase.  That is, there was no "went"
before "goed".  Just an unmarked "go" until such a time as the past tense
started to be marked, whereupon the vacillation began.

2) In all of this discussion of storage vs. rules (a.k.a. rote vs. rules)
people seem to be unaware of the third possibility: analogy.  A single
device based on analogy (generalization from partial similarity) can give
you both the rote-looking behaviors and the rule-like overgeneralizations
that are assumed by two-mechanism theories.  This is, of course, the basis
of connectionist claims, since networks are essentially analogy-making
devices that operate across distributed representations.  Whether or not
children are neural nets is another question, but it is important to at
least be open to the LOGICAL possibility of a third solution that is
neither rote nor rules.  Pinker has indirectly acknowledged this in the
most recent version of his theory: he still insists on two mechanisms, and
one of them makes rules without regard for internal similarity or frequency,
but the other one is an analogy-making device.  That's required in order
to account for 'irregularizations' (e.g. children who make "pat" the past
tense of "put" and so on, generalizing from an irregular form).  In this
regard, it is important to note that any of three different sources of
similarity are sufficient to support novel generalizations in an
analogy-making device: (1) similarity in physical form (e.g. wug --> rug
--> rugs --> wugs), (2) similarity of meaning (e.g. wug --> little animal
--> little animals --> wugs), or (3) common fate or similarity in contexts
of occurrence (e.g. "wug" appears in a discourse slot that seems to be
occupied by a class of items that a LINGUIST would call "nouns", so do the
nouniest thing with them....).  There are existing simulations showing that
any of these three sources of similarity can give rise to novel
overgeneralizations in a neural network.
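[Editor's illustration] To make the analogy idea concrete, here is a toy sketch -- not any published model, and the lexicon, similarity measure, and function names are all invented for illustration. Past tenses are produced by copying the stem change of the most similar stored verb, with similarity crudely measured as shared word endings (source (1) above, similarity in physical form). A single device of this kind yields both overgeneralizations like "goed" and irregularizations like "brang":

```python
# Toy analogy-based past-tense device (illustrative sketch only:
# the lexicon, similarity measure, and names are invented here).

LEXICON = {                 # stored present -> past pairs
    "walk": "walked", "hug": "hugged", "play": "played",
    "sing": "sang", "ring": "rang", "put": "put",
}

def similarity(a, b):
    """Length of the longest shared suffix -- a crude stand-in for
    'similarity in physical form' (source (1) above)."""
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

def past_by_analogy(verb):
    """Copy the stem change of the most form-similar stored verb."""
    model = max(LEXICON, key=lambda v: similarity(v, verb))
    past = LEXICON[model]
    # split the model pair at its longest common prefix
    p = 0
    while p < min(len(model), len(past)) and model[p] == past[p]:
        p += 1
    r1, r2 = model[p:], past[p:]         # e.g. ring/rang -> "ing"/"ang"
    if verb.endswith(r1):
        return verb[:len(verb) - len(r1)] + r2
    return verb + r2                     # fallback: just append the change

print(past_by_analogy("wug"))    # wugged  (regular-looking, via "hug")
print(past_by_analogy("go"))     # goed    (overgeneralization)
print(past_by_analogy("bring"))  # brang   (irregularization, via "ring")
```

Swapping the suffix measure for a meaning-based or context-based similarity would implement sources (2) and (3) in the same single mechanism; the point is only that no separate rule device is logically required.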

If you think this is helpful, feel free to pass it on to Funknet, but I'm
happy to stick with a private interchange too. -liz




 ==============================================================================
Richard (=Dick) Hudson
Department of Phonetics and Linguistics,
University College London,
Gower Street,
London WC1E 6BT
work phone: +171 419 3152; work fax: +171 383 4108
email: dick at ling.ucl.ac.uk
web-sites:
  home page = http://www.phon.ucl.ac.uk/home/dick/home.htm
  unpublished papers available by ftp = ....uk/home/dick/papers.htm


