complexity measures
Gregory (Greg) Downing
downingg at is2.nyu.edu
Sat Jan 17 20:17:16 UTC 1998
----------------------------Original message----------------------------
At 11:52 AM 1/16/98 EST, you (David Lightfoot <dlight at deans.umd.edu>) wrote:
> Recent postings suggest that some people believe that languages are all
>equally complex (although this is not entailed by Larry Trask's original
>question, giving rise to this discussion).
> One possibility is that this belief is an empirical finding. In which
>case, there must be a way of measuring the overall complexity of a language and
>somebody has found that languages all emerge with the same index.
>Alternatively, it might follow from some basic principles or some
>theory that languages must be equally complex. I know of no such
>empirical support nor of any theoretical underpinning for such an idea.
>What am I missing?
> I should have thought that if there is a simplification in some part
>of a system, there doesn't necessarily have to be compensating
>complexification elsewhere.
>
I don't think it has been measured, and to do so in a genuine fashion would
require the most careful calibration and integration of every aspect of a
lot of languages, which I don't believe anyone has done. So it is not a
hypothesis based on all the empirical heavy lifting that would truly be
involved. I'd account for it in cultural-history or professional-history terms.
I'm not a linguistics professor but a professor of literature and cultural
history -- though my research area for several years has been ideas about
language in the second half of the nineteenth century and the early
twentieth century, and how those ideas might be related to literary uses of
language in the later-nineteenth and early-twentieth centuries.
As everyone probably knows, nineteenth-century writers on language of all
stripes tended to assert that some languages were more complex and
sophisticated, and others simpler and more primitive. When "comparative
philology" was becoming, or being replaced by, "modern linguistics" in the
very late nineteenth and early twentieth centuries, there was a felt
conceptual and methodological need to avoid making assumptions of this kind,
which seemed not to be based on anything empirical and instead began to be
seen as quite possibly reflections of nonscientific attitudes about which
cultures were prima facie superior and inferior. These attitudes were seen
as standing in the way of taking all languages seriously, and thus as
hampering sound study of the ones taken less seriously.
To get rid of this attitude, the idea was formulated that no language should
be assumed prima facie to be superior or inferior to another, or more
complex or less complex than another. So it's not an empirical hypothesis
grounded in heavy lifting -- more like a methodological axiom intended by
linguists to keep themselves from making unscientific assumptions.
Obviously, for many this has evolved into "all languages are equally
complex," in the way axioms and rules have of tending toward absoluteness.
Not infrequently I heard the assumption stated in just that absolute a way
in undergrad and grad linguistics classes at the University of Michigan
fifteen years ago.
So my guess would be that the "all languages are equally complex" idea is an
absolutized version of a very useful methodological axiom -- unfortunately
restated in a positive, flat-out fashion. The more unexceptionable form of
the axiom is "no language should be assumed prima facie to be superior or
inferior to another, or more complex or less complex than another," until
and unless someone has produced a solid comparison of lg x and lg y that
makes such an argument and finds general assent -- and during the century
now closing there seem to have been more pressing projects to deal with in
lx than global comparisons of lg x and lg y in all their details.
Greg Downing/NYU, at greg.downing at nyu.edu or downingg at is2.nyu.edu