Complexity in language

Isidore Dyen isidore.dyen at yale.edu
Wed Jun 24 19:54:42 UTC 1998


----------------------------Original message----------------------------
I regard a theory as a proposition that is assumptive in nature and
concerns change in the universe. Of course in linguistic science we
restrict our interest to the changes of languages. I recognize that for
some, perhaps all linguistic scientists, and perhaps all scientists, my
view of the nature of a theory may be regarded as heretical, useless, or
worse, but I believe it is helpful to distinguish between theory as I have
defined it, and hypothesis.
I define a hypothesis as a proposition that explains data on the basis of
a theory, and go on to distinguish hypotheses that are inferences from
those which are speculative. An inference is a hypothesis that is superior
to any other hypothesis that is claimed to explain the same data, whereas
a speculation does not have this characteristic.
 
It is in this sense that the equicomplexity of (natural) languages is a
theory. It can serve as the basis for explaining the type of phenomenon that was
brought up for discussion. Its general utility needs to be
tested in the form of the hypotheses that can be based on it, one of which
you have suggested. I should add that, as I see it, theories cannot be
tested--they can only be revised or replaced--but the hypotheses based on
a theory can be tested. Obviously a theory on which no hypothesis can be
based is not worth proposing.
 
The measurement of a language for any purpose--i.e. regardless of the
theory of equicomplexity--is a complex matter in itself. We think of some
languages as being more complex than others, usually the ones we think of
as more difficult than others, so that it is very common to think of one's
own language as easy and others as difficult.
 
Perhaps the greatest complication in measuring a language directly is the
apparent incommensurability of its parts. How can the inventory and
distribution of the phonemes, which appear to be measurable, be measured
so that they are commensurable with the morphology, the syntax, the lexicon,
and/or the semantics, and how are the latter four themselves to be reduced to
commensurability? The theory of equicomplexity implies that these
structures, when measured in different languages, will somehow form an
equation.
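(To sketch what such an equation might look like, using hypothetical
component measures C_phon, C_morph, C_syn, C_lex, C_sem rather than any
established metrics:

    C_phon(L) + C_morph(L) + C_syn(L) + C_lex(L) + C_sem(L) = K
    for every natural language L,

so that an increase in any one term would have to be offset by
compensating decreases in the others.)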
 
Your question about bilingualism should go on to raise the question of
trilingualism, quadrilingualism, and so on. But then there is no test by
which we can find out whether a bilingual's control of his two
languages is equal, or, for example, whether the complexity that the brain
is dealing with is double that for a monolingual, or less, or, for that
matter, more.
At the same time it should be remembered that the natural languages that we
are dealing with are the product of a long period of evolution that did
not produce better languages, as far as we can tell, or, for that matter,
worse languages. What we do have are languages that have fared more
successfully in competition with other languages, but are no better than
the less successful languages. On this basis we could form a theory of the
equioptimality (or equipessimality) of languages, to which in any case I
subscribe.
 
 
 
On Wed, 24 Jun 1998, Wouter Kusters wrote:
 
>
>
> On Fri, 19 Jun 1998, Isidore Dyen wrote:
>
> > ----------------------------Original message----------------------------
> > I am responding to what appears below. I gave a paper some time ago at a
> > Lacus forum that got published in which I spoke of the equicomplexity of
> > languages. The paper proposed the theory that all natural languages were
> > equally complex. The consequence is that any change that
> > introduces complication anywhere requires a compensatory simplification
> > elsewhere and vice versa. A simple name for what is involved might be the
> > equicomplexity principle, but, as I see it, what is involved is a theory,
> > since the proposition is an assumption;
>
> I wonder in what sense this can be called a theory when what is proposed
> is no more than a kind of dogma: all languages must be equally complex.
> If it were a theory, it would be embedded in and connected to other
> theories; further, it should be falsifiable, which it probably is not. There
> are many examples of language changes which are obviously simplifying (cf.
> Trudgill 1992, etc. on Scandinavian cases, Werner 1987 on Germanic,
> Andersen 1988 on diverse European languages and dialects, Muhlhausler,
> Thurston 1992(?) on Melanesian, Versteegh on Arabic varieties, and so on).
> But of course if you want to stick to such a kind of 'theory' you can
> always claim that 'somewhere' in the grammar, phonology, semantics or
> even pragmatics there MUST be an opposite change towards more complexity.
> So the theory of equicomplexity is either unfalsifiable or false.
>
> Further, I would be very interested in a mechanism which can measure the
> amount of complexity in a whole language and which can cause the
> same amount of complexity to appear or disappear elsewhere. If this
> mechanism is in any way connected to the capacity of the brain, I would be
> interested in how proponents of the equicomplexity theory handle
> bilingualism.
>
>
>
> Wouter Kusters
> University of Amsterdam
>


