Rules vs. Lists
lists at chaoticlanguage.com
Wed Jul 2 02:23:03 UTC 2008
I agree you can see the now extensive "usage-based" literature as a
historical example of using "lists" to specify grammar. That is where
this thread came from, after all.
But there is surely something missing. As you say, your own papers go
back 20 or more years now. Why is there still argument?
And the "rules" folks are right. A description of language which
ignores generalizations is clearly incomplete. We don't say only what
has been said before.
And yet if you can abstract all the important information from lists
of examples using rules, why keep the lists?
So the argument goes round and round.
What I think is missing is an understanding of the power lists can
give you... to represent rules. That's why I asked how many rules you
could abstract from a given list.
So long as we believe abstracting rules will always be an economy,
there will be people who argue lists are unnecessary. And quite
rightly: if it were possible to abstract everything about a list of
examples into a smaller set of rules, there would be no need to keep
the examples.
The point I'm trying to make here is that, when placed in contrast
with rules, what is important about lists may be not that you do not
want to generalize over them (as rules), but that a list can be a more
compact representation of all the generalizations which can be made
over it than those generalizations (rules) are themselves.
You express it perfectly:
"...if you have examples A, B, C and D and extract a schema or rule
capturing what is common to each pair, you have 6 potential rules,
(AB, AC, AD, BC, BD, and CD), so sure, in theory you could have more
rules than subcases."
It is this power which, I suspect, has never been used to specify grammar.
You say you wouldn't "expect" to find this true in practice: "I
wouldn't expect to find more rules than examples". But has anyone
looked into it? It is possible in theory. Has anyone demonstrated it
is not the case for real language data?
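To make the arithmetic concrete, here is a minimal sketch (the function names are mine, not anything from the thread): with n examples there are C(n, 2) pairwise schemas, and 2^n - n - 1 schemas over every subset of two or more examples, so the number of abstractable rules can exceed the number of examples already at n = 4.

```python
from math import comb

def pairwise_rules(n):
    """Rules capturing what is common to each pair of examples: C(n, 2)."""
    return comb(n, 2)

def all_possible_rules(n):
    """Schemas over every subset of two or more examples: 2^n - n - 1."""
    return 2**n - n - 1

# With David's four examples A, B, C, D:
print(pairwise_rules(4))      # 6 pairwise rules (AB, AC, AD, BC, BD, CD)
print(all_possible_rules(4))  # 11 schemas once higher levels are added in
```

Whether real language data ever realizes this many entrenched schemas is exactly the empirical question asked above; the sketch only shows the upper bound grows much faster than the list itself.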
Consider for a minute it might be true. What would that mean for the
way we need to model language?
I'll leave aside for a moment the other point about the importance of
the concept of entrenchment from Cognitive Grammar. I think the raw
point about the complexity, power, or number, of rules which can be
generalized from a list of examples is the more important for now.
I'd like to see arguments against this.
On Tue, Jul 1, 2008 at 10:49 PM, David Tuggy <david_tuggy at sil.org> wrote:
> Thanks for the kind words, Rob.
> Right—I am not saying the number is totally unconstrained by anything,
> though I agree it will be very, very large. What constrains it is whether
> people learn it or not.
> Counting these things is actually pretty iffy, because (1) it implies a
> discreteness or clear distinction of one from another that doesn't match the
> reality, (2) it depends on the level of delicacy with which one examines the
> phenomena, and (3) these things undoubtedly vary from one user to another
> and even for the same user from time to time. The convention of representing
> the generalization in a separate box from the subcase(s) is misleading in
> certain ways: any schema (generalization or rule) is immanent to all its
> subcases (i.e. all its specifications are also specifications of the
> subcases), so a specific case cannot be activated without activating all the
> specifications of all the schemas above it. The relationship is as close to
> an identity relationship as you can get without full identity. (It is a, if
> not the, major meaning of the verb "is" in English: a dog *is* a mammal,
> running *is* bipedal locomotion, etc.)
> Langacker (2007: 433) says that "counting the senses of a lexical item is
> analogous to counting the peaks in a mountain range: how many there are
> depends on how salient they have to be before we count them; they appear
> discrete only if we ignore how they grade into one another at lower
> altitudes. The uncertainty often experienced in determining which particular
> sense an expression instantiates on a given occasion is thus to be expected."
> If you do a topographical map at altitude intervals of one inch you will have
> an awful lot of peaks. Perhaps even more rules than the number of examples
> they're abstracted from. But normally, no, I wouldn't expect to find more
> rules than examples, rather, fewer. It generally takes at least two examples
> to clue us (more importantly as users than as linguists, but in either case)
> in to the need for a rule, and the supposition that our interlocutors will
> realize the need for that rule as well, and establish it (entrench it) in
> their minds. Of course, as you point out, if you have examples A, B, C and D
> and extract a schema or rule capturing what is common to each pair, you have
> 6 potential rules, (AB, AC, AD, BC, BD, and CD), so sure, in theory you
> could have more rules than subcases. Add in levels of schemas (rules
> capturing what's common to AB-CD, AB-AC, ...) and you can get plenty of rules.
> You wrote: In the light of [the possibility of more rules than
> subcases] lists seem a very powerful way to specify grammar to me. Not to
> mention explaining certain idiosyncratic and inconsistent aspects of
> grammar. In practice we have not used lists in this way. Any idea why not?
> I'm not sure what you are saying here. If you're saying that listing
> specific cases and ignoring or omitting rules is enough, I disagree. If you're
> saying that trying to specify grammar while ignoring specific cases won't
> work, I agree strongly. Listing specific cases is very important, as you
> say, for explaining idiosyncratic and inconsistent aspects of the grammar
> (as well as for other things, I would maintain). I and many others have in
> practice used lists in this way. (Read any of the Artes of Nahuatl or other
> indigenous languages of Mexico from the XVI-XVII centuries: they have lots
> of lists used in this way.) So I'm confused by what you're saying.
> The reason that generalizations must be entrenched is that (the grammar of)
> a language consists of what has been entrenched in (learned by) the minds of
> its users. If a linguist thinks of a rule, it has some place in his
> cognition, but unless it corresponds to something in the minds of the
> language's users, that is a relatively irrelevant fact. Cognitive Grammar
> was important in that it affirmed this fact and in other ways provided a
> framework in which the analysis was natural.
> --David Tuggy
More information about the Funknet mailing list