complexity measures
bwald
bwald at HUMnet.UCLA.EDU
Tue Jan 20 15:07:28 UTC 1998
----------------------------Original message----------------------------
A few additional comments on the complexity issue.
First, if I had read the ensuing postings before sending my previous
message I might have more thoughtfully rephrased my use of the term
"dogma". I wrote:
"My understanding is that linguists make the "equal complexity" assertion
because of the analytical equality of languages principle, fashioned into a
dogma for pedagogical purposes"
Various other respondents made points similar to mine about the motivation
for the principle, having to do with avoiding ethnocentrism and unfounded
prejudice about what constitutes "complexity".
(Actually, I was also saying that underlying the principle is a stance
claiming the need for the SAME linguistic resources to analyze any
language.) The "dogma" part simply alludes to the internalization of the
principle so that it becomes dissociated from its purpose and is simply a
reflex reaction to any proposal to the contrary, no matter how reasoned. I
would not accuse any linguist of doing this, but from my self-monitoring of
how I react to statements or questions about relative complexity of
different languages by non-linguists, I recognize that the dogma alerts me
to the principle before I question the non-linguists further to see what
they mean. I hasten to add that I do not dismiss everything a non-linguist
says about language, far from it. So the dogma acts as a flag, not
necessarily a bad thing, once you get beyond it to the principle it is
intended to flag.
Next, I found the discussion stimulating, particularly Fertig's musings,
and Carrasquer Vidal's
citations from Malcolm Ross. Fertig's leading suggestion is intriguing, to
the effect that languages generally work according to some principle of
entropy at near maximal brain capacity, and therefore they are all
equivalent in overall complexity. If this is interpretable, and if that
were the case, then indeed as soon as something was simplified in the
grammar, something else would more or less instantaneously occupy the brain
"space" and maintain a constant overall complexity. I find the idea
interesting, but wanting clarity for any kind of operationalization of the
brain mechanism or mental activity, so that it might become clear how it
could possibly be empirically tested.
Both Fertig and Carrasquer Vidal mentioned PIDGINS, as candidates for
"simpler" languages. This is tricky for a number of reasons. First,
pidgins are not first languages, and they are known to be parasitic on
first languages for their overall complexity. For one thing, obviously the
phonological complexity of the pidgin as spoken by any particular speaker
is to some extent parasitic on other languages spoken by the speaker. So,
it is not a foregone conclusion that pidgins can be examined as independent
languages for purposes of comparison for complexity with other kinds of
languages, most notably the languages of monolinguals or late
multilinguals.
Second, as far as comparison or status as independent languages is
concerned, phenomena which are called "pidgins" vary greatly in complexity and
conventionalization -- from the kind of on-the-spot makeshift forms of
communication, for which my first point applies most unproblematically, to
highly conventionalized and complex auxiliary languages, such as the
Neo-Melanesian which has a close creole counterpart in Tok Pisin, or some
quite stable varieties of West African pidgin English (which are
historically related to such creoles as Sierra Leonean Krio). To the
extent that "learnability" can be taken as criterial of "complexity", there
have been provocative statements by some linguists, e.g., the creolist
Derek Bickerton, that creoles are maximally learnable because they are only
minimally arbitrary, with regard to grammar. However, such a claim has
generated great controversy, to say the least. And it has certainly not
been empirically demonstrated that they are more learnable than any other
languages. Bickerton's arguments came from certain theoretical assumptions
he made. They are no less suggestive than Fertig's proposal, mentioned
above, but they are no more well founded in terms of direct empirical
support.
Finally, consider make-shift pidgins -- and I have been in situations
where I had to try to create one with the cooperation of interlocutors
(maybe most people have, if they have tried to communicate without a common
language). They are very stressful and hardly economical from a
production-perception point of view. From this I detect an unclarity in
the concept "complexity". Part of the stress has to do with trying to
communicate with a limited vocabulary, which leads to a lot of longer
circumlocutions according to some syntactic principles. If we take
"complexity" to be a measure of abstract language knowledge, competence, or
whatever you want to call it, then the burden of using multi-word phrases
instead of semantically more complex (!?) single words is ignored, because
the same syntactic resources may be used in a fluent language as in a
make-shift pidgin. Nevertheless, the constant appeal to multi-word phrases
in the makeshift pidgin instead of phonologically more compact single words
should count for something "complicated", shouldn't it? After all, single
words are syntactically less complex than phrases. How do we measure the
complexity of a language which has a limited vocabulary and constant appeal
to circumlocution involving more complex syntactic phrases against a
language which has a larger vocabulary and (for the sake of argument) the
same syntactic resources? Should we say that it is "theoretically simpler"
but "more complex in practice"? To be sure, no make-shift pidgin has
anywhere near the syntactic complexity of a first language; it's not even
close. But no language belabors its limited resources more than a
makeshift pidgin. From this at least one thing is clear: the amount of
effort involved is no measure of linguistic complexity.
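The trade-off just described can be given a toy numerical form (my own
illustration, not anything proposed in the discussion): take "competence
complexity" as the size of the stored lexicon, and "complexity in
practice" as the number of words spent per message, with the arbitrary
assumption that a concept lacking a dedicated word costs a three-word
circumlocution.

```python
# Toy sketch (hypothetical numbers throughout): a smaller lexicon means
# less stored "abstract knowledge" but longer utterances in practice.

def utterance_length(concepts, lexicon):
    """Words needed to express each concept: 1 if the lexicon has a
    dedicated word, else an assumed fixed circumlocution cost."""
    CIRCUMLOCUTION_WORDS = 3  # arbitrary assumption for illustration
    return sum(1 if c in lexicon else CIRCUMLOCUTION_WORDS
               for c in concepts)

# A hypothetical message touching five concepts.
concepts = ["dog", "run", "yesterday", "ancestor", "jealousy"]

rich_lexicon = set(concepts)        # a dedicated word for everything
pidgin_lexicon = {"dog", "run"}     # limited vocabulary

rich_cost = utterance_length(concepts, rich_lexicon)      # 5 words
pidgin_cost = utterance_length(concepts, pidgin_lexicon)  # 2 + 3*3 = 11 words
```

The smaller lexicon is "theoretically simpler" (two stored words against
five) yet costs more than twice as many words per message -- which is
exactly the ambiguity in "complexity" the passage above points to.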
P.S. With respect to language acquisition, let me end with a reminder that
"language" is an abstraction to begin with. Thus, who would deny that
child language is less complex than adult language, for the "same"
language? OK. Where do we stop? Wouldn't a 40-year-old's language be
less complex than a 60-year-old's? Why not? Because we're not interested
in the additional complexity, say, additional vocabulary acquired in living
that long? If we are, do we have to compare age-mates across languages to
compare language complexity (assuming we could do it, to begin with)?
Then, given individual and cultural differences, is the claim of equal
complexity one that asserts that for any speaker of a language of any age
we can find some speaker of another language of that age who has equivalent
complexity? And, if linguists consider additional complexity, most likely
additional lexicon, trivial with regard to the problem, beyond a certain
age, as I suppose they would, then what precisely is non-trivial in
comparing languages "globally" for relative complexity? This might have a
bearing on whether Fertig's entropy theory is as interesting as it first
appears. I think it also has a bearing on how interesting, or not, the
notion of "global linguistic complexity" actually is.
More information about the Histling
mailing list