11.602, Disc: Phonemic Analysis



LINGUIST List:  Vol-11-602. Fri Mar 17 2000. ISSN: 1068-4875.

Subject: 11.602, Disc: Phonemic Analysis

Moderators: Anthony Rodrigues Aristar, Wayne State U. <aristar at linguistlist.org>
            Helen Dry, Eastern Michigan U. <hdry at linguistlist.org>
            Andrew Carnie, U. of Arizona <carnie at linguistlist.org>

Reviews: Andrew Carnie: U. of Arizona <carnie at linguistlist.org>

Associate Editors:  Ljuba Veselinova, Stockholm U. <ljuba at linguistlist.org>
		    Scott Fults, E. Michigan U. <scott at linguistlist.org>
		    Jody Huellmantel, Wayne State U. <jody at linguistlist.org>
		    Karen Milligan, Wayne State U. <karen at linguistlist.org>

Assistant Editors:  Lydia Grebenyova, E. Michigan U. <lydia at linguistlist.org>
		    Naomi Ogasawara, E. Michigan U. <naomi at linguistlist.org>
		    James Yuells, Wayne State U. <james at linguistlist.org>

Software development: John Remmers, E. Michigan U. <remmers at emunix.emich.edu>
                      Sudheendra Adiga, Wayne State U. <sudhi at linguistlist.org>
                      Qian Liao, E. Michigan U. <qian at linguistlist.org>

Home Page:  http://linguistlist.org/

The LINGUIST List is funded jointly by Eastern Michigan University,
Wayne State University, and donations from subscribers and publishers.

Editor for this issue: Karen Milligan <karen at linguistlist.org>

=================================Directory=================================

1)
Date:  Fri, 17 Mar 2000 17:40:40 +0000
From:  larryt at cogs.susx.ac.uk (Larry Trask)
Subject:  Re: 11.429, Disc: Phonemic Analysis

-------------------------------- Message 1 -------------------------------

Date:  Fri, 17 Mar 2000 17:40:40 +0000
From:  larryt at cogs.susx.ac.uk (Larry Trask)
Subject:  Re: 11.429, Disc: Phonemic Analysis

Peter Menzel writes:

>  Dear Fellow Linguists,
>
>  concerning the present discussion about 'phonemic analysis', it seems to me
>  that one major point is being ignored; namely, that of the implications of
>  philosophy of science (or perhaps I ought to say: what we should learn from
>  it).  Now, while I'm going to take Larry Trask's (LT) latest contribution to
>  this discussion as my point of departure, I don't want this to be
>  (mis)construed as an attack upon him.

OK.  I won't take it as an attack.  But I *will* be disagreeing fundamentally
with much of what Peter says.

Since this discussion is going to be about science, maybe I should put a few
cards on the table.  I was originally trained as a scientist.  I did my
first two degrees in chemistry, and I did a lot of chemistry, physics, and
math, as well as some biology, geology, biochemistry, and history of science.
I worked for a while as a research chemist, and then for several years as
a teacher of chemistry and physics, before moving into linguistics.

Since it will be relevant below, I might comment on my experience of
teaching physics to beginners.  The chief problems I faced were not in
getting the students to understand the theory, but rather in getting
them to do the basics right.

For example, surprising as this may seem, it is not obvious to all
beginners that an equation should have a quantity on each side of the
equal sign, and that those two quantities should moreover be equal.
This has to be learned.

Also, beginners have to learn the importance of units, which very many
of them regard as so much window-dressing.  They tend to insert the data in
whatever units they have been given, and to assume that the answer will
automatically come out in whatever units they would like.  In fact, of
course, they get gibberish.

Beginning students have to learn this stuff, so well that it becomes
second nature, before they are ready to tackle theory.  The most profound
understanding of the most powerful and beautiful theory is useless to you
if you can't write and solve an equation correctly.

>  In fact, I have noticed in this whole
>  discussion (and in a number of others, not only before this forum), a
>  certain tendency among us linguists to "backslide into Logical Positivism".
>  Thus, e.g., when LT says things like

>  "I don't present it as a theory, but only as a useful way of getting to
>  grips with the data.  As always, I prefer to teach students to look at
>  linguistic data, and to see terminology, notational devices and theoretical
>  concepts as tools that can aid them in this.  In my experience, a heavily
>  theoretical approach can all too easily lead to a state of mind in which
>  the theory becomes paramount, and the data become little more than grist
>  for a theoretical mill.  I have seen the consequences of this for myself,
>  and I don't like them."

Indeed I have, and I don't.

>  I get worried.  This sounds to me very much like the position Logical
>  Positivism held; namely, that there are data 'out there' *independent of any
>  theory*, and that 'scientists generalize upon these data' and thus 'slowly,
>  step by step, as it were, build up their theories'.  I had thought that this
>  naive view of what science is all about went out with empiricism.

Oh, no: far from it.  See below.

>  What
>  philosophy of science (POS) has been telling us for at least *twenty-five*
>  years, is that there are no data without theory (the theory tells us what
>  'out there' counts as data);

Sorry; I can't accept this.

The view just described has indeed been prominent in POS for quite a while
now.  But Peter seems to be telling us that this position is now universally
accepted, that all philosophers firmly believe it, that the issue is closed
and beyond discussion.  Not so.

The "no data without theory" idea has been pursued, in various directions
and to varying degrees, by quite a number of philosophers.  But it has been
a conspicuous failure.  Its proponents, after increasingly desperate
maneuvers, have signally failed to come up with an adequate account of
how science proceeds.  In fact, no version of it has yet been able to
explain how scientists ever make any progress at all.

And this is hardly surprising.  While the view obviously contains some
truth, as a general statement of how science works, it is simply false.

Consider thermionic emission, one of the most important (in the real world)
scientific discoveries of the last century or so.  Thermionic emission was
discovered by Thomas Edison -- the only significant scientific discovery
ever made by that distinguished inventor -- and it was at first called
'the Edison effect', until people had figured it out.

Now, Edison was not even a scientist.  He had *no* theory to guide him
in his discovery.  He wasn't looking for it; he wasn't expecting it;
he had no reason to suppose that such a thing might exist.  He stumbled
across it entirely by accident, while fooling around.  Having discovered
it, he didn't understand it.  He had no theory on which to base any
account, and he had no idea what was going on.  Puzzled, he put it aside
as a curiosity, and forgot about it.  It was left for the theorists to
make sense of this novel and unanticipated phenomenon -- which they did,
after which the engineers were able to start building radio tubes (valves),
and thus to lay the foundations of the entire electronics industry.

It is cases like this -- which are not rare -- that cause the greatest
difficulty for the "no data without theory" crowd, though there are many
other problems with this approach.

Given the failure of this view, the doubtless inevitable reaction has now
set in.  A number of philosophers of science are now challenging the
"no data without theory" view by trying to develop a clearer understanding
of the role of data, observation and experiment in science.  Most
prominent here, perhaps, is the philosopher Deborah Mayo, who has developed
a vigorous interpretation of the role of experiments and data collection
in science, *independent* of the part played by theories.  This view
now commands a good deal of support in the field.

Accordingly, the "no data without theory" view is just that: a view.
It is not a piece of truth; it has failed to work in practice; and it is
now being superseded by more sophisticated interpretations which expressly
*do* allow significant data to be obtained without the guidance of theory.

Hence Peter's stance must be rejected.

I close this section with a quote from the latest (1999) edition of Alan
Chalmers's well-known textbook of the philosophy of science:

"I reaffirm that there is no general account of science and scientific
method to be had that applies to all sciences at all historical stages
in their development.  Certainly philosophy does not have the resources
to provide such an account."

In other words, we linguists have our own agenda and our own problems,
and it is up to us to figure out how best to deal with them.  We cannot
slavishly ape what we think the physicists (or anybody else) might be
doing, and we cannot allow ourselves to be hypnotized by the ever-so-
magisterial pronouncements of some group of philosophers who happen
to be in fashion at the moment.  Nobody is going to tell us how to do
linguistics, or how to teach it, and we shouldn't let them try.

>  what philosophy of psychology (POP) has been
>  telling us for the last *ten* years (or more) is that even perceptions are
>  'theory laden' (our theories about the world around us tell us what possible
>  and likely perceptions are/can be).

I know little about POP.  But, even given the essential truth of the view
described, I cannot see that it denies the possibility of obtaining useful
data without theory.

Take a linguistic example: ergativity.  When European linguists first
stumbled across ergative languages, they were dumbfounded.  They had no
reason to suspect such a thing, and no theoretical framework within
which it could be accommodated.  But they still noticed that there
was an issue there, something that needed to be accounted for.  And,
after a good deal of fumbling around with inadequate Latin-based views
of what languages were supposed to be like, they eventually managed
to come up with tolerably adequate descriptive accounts.  Even today,
we still don't have a good general theory of ergativity, but that
doesn't stop us from working on the problem.

Isn't this a good example of data preceding theory?

>  There is an important point that needs to be made here:  Regardless of
>  whether LT holds this position or not, if he teaches the course the way he
>  describes (as data quasi-independent from *later* theorizing), his students
>  will *think* he does, and will most likely conclude that this is the correct
>  way of approaching scientific (or other) explanations.

>  Given the insights into the workings of the human mind I presented above, it
>  seems to me that  approaches like that of classical phonemics, which BTW
>  sprang from Logical Positivism,

I confess I am astonished to read this.

The phoneme principle was known to the ancient Indian grammarian Patanjali,
and to the 12th-century Icelandic First Grammarian.  In Europe, it was
slowly worked out during the 19th century.  More or less explicit
understandings of the phoneme principle can be found, for example, in
Odell (1806), Whitney (1843), Müller (1843), Ellis (1844), Pitman (1846),
Winteler (1876), and Sweet (1877).  By most accounts, the first fully
explicit statement of the phoneme principle was developed by Baudouin de
Courtenay and Kruszewski at Kazan in the late 19th century.  From Kazan,
the principle was carried west to Britain by Shcherba, reaching Daniel
Jones in 1911.  Jones and his colleagues began regularly teaching phoneme
theory in London in 1915 (by Jones's own account), and Jones made use of
the idea in his subsequent publications, and published an explicit account
in 1929.

Meanwhile, the Prague School linguists were developing their own view of the
phoneme, and the Americans were developing theirs.  Sapir's 1925 paper
is usually considered the classic American presentation, and Bloomfield
adopted phonemes unhesitatingly in his 1933 book.  By 1934 Twaddell was
writing that the phoneme principle was generally accepted.

And I'm not aware that *any* of these people were significantly influenced
by Logical Positivism.  Most of them had never even heard of it.

Logical Positivism was invented by the Vienna Circle, which didn't begin
meeting before the 1920s, which hadn't really formulated its LP doctrines
before the late 1920s, and which didn't become influential outside a small
circle before the 1930s.  LP was really only introduced to the English-
speaking world when its first British disciple, Freddie Ayer, published
his famous book, *Language, Truth and Logic*, in 1936.

So how can phonemics be viewed as having sprung from Logical Positivism?
The dates are all wrong.

>  are seriously misguided, because they are
>  based on the latter's view of scientific inquiry, which, to put it bluntly,
>  has been proven inadequate.

This last may be so, but two things need saying.

First, even if it *were* true that phoneme theory sprang from LP -- which
I do not believe -- this is no argument against it.  Concluding that LP is
generally inadequate is no argument that its entire program is devoid of
value, or that anything which arose out of it is therefore worthless.
One might as well argue that, because Newton's alchemical investigations
were worthless -- which they were -- *all* of his work was worthless --
which it certainly was not.

Second, this reasoning constitutes a kind of group version of the *ad
hominem* argument: your case is no good because you are not fit people
to put it forward.  Phoneme theory, like anything, must stand or fall
on its own merits, and not on the general merits of any intellectual
position from which it may (or may not) have sprung.

>  Can you imagine a present-day college physics
>  course teaching a Newtonian, or even an Aristotelian, view of the universe?

I don't have to imagine it: I can *see* it.

It is a grave error to assume that Newtonian physics is today dismissed
by physicists as a laughable relic.  This is the very opposite of the
truth.

Look at *any* current university-level textbook of physics.  (There are many
and they are all big.)  You will find that Newtonian physics is covered.
And not just in passing, out of historical respect.  There is normally
a whole chapter on Newtonian mechanics, and another on Newtonian gravitation,
not to mention Newton's optics.  Students are required to master this
material, to learn to use it to solve real physical problems.

Why?  Because physicists still use Newton's physics today.

When NASA scientists send a spacecraft off to the outer reaches of the solar
system, what do they use to calculate its trajectory?  Good old Newtonian
physics, that's what.  They use it because it works perfectly well for
their purposes.  There is simply no need to invoke relativistic physics,
still less quantum mechanics, in solving this kind of problem.
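
To give a feel for the kind of arithmetic involved (a toy Python
calculation, with approximate figures for the Earth), the speed needed to
escape the Earth's gravity falls straight out of Newton's law of
gravitation:

    import math

    # Escape speed under Newtonian gravitation: v = sqrt(2*G*M / r).
    GM_EARTH = 3.986e14   # G times the Earth's mass, in m^3/s^2 (approximate)
    R_EARTH = 6.371e6     # the Earth's mean radius, in metres (approximate)

    v_escape = math.sqrt(2 * GM_EARTH / R_EARTH)
    print(round(v_escape))   # about 11200 m/s -- pure seventeenth-century physics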

Recently NASA lost a craft sent to Mars.  (Two, actually, but I'm thinking
of the first one.)  What went wrong?  A failure of theory?  Certainly not.
The mission went wrong, in spite of careful preparation and the most
immaculate theory, because somebody inserted into the calculation
quantities *in the wrong units*.  So, millions of dollars' worth of
spacecraft crashed into the planet, instead of going into orbit around it.

What was I saying earlier about the need to get the basics right before
you start worrying about theory? ;-)

In fact, about 90% of the content of any physics textbook consists of
physics which was firmly in place before 1900 -- often long before 1900.
Newtonian mechanics, Newtonian gravity, electromagnetism, wave theory,
classical thermodynamics, geometrical optics, and so on.

The large and important body of physics done since 1900 is invariably
relegated to a final section called 'modern physics'.  And even this
section is almost entirely devoted to work done before 1950: special
and general relativity, quantum mechanics, nuclear reactions, and the
like.

Physics done since 1950 is generally absent.  Even such important
contributions of the last 30 or 40 years as the electroweak unification
and GUTs (grand unified theories) are ignored, or at best receive only a
brief passing mention
on the last page or two of the book.  And such major topics as Bell's
inequality, the Aspect experiment, superstring theory, and the many-worlds
interpretation are not so much as mentioned in any physics textbook I've
ever seen -- even though these are among the hot theoretical issues
which physicists work on and discuss eagerly every day.

The exciting theoretical issues are left for later, and more advanced,
courses, aimed at students who already know their way around the field.

So, if the physicists have a lesson for us linguists, on the teaching
of beginning students, it is this:

"Learn to walk before you try to run.  Learn what has already been
achieved by your predecessors.  Learn the basics of your craft.  Learn them
so well that they become second nature to you, so that you don't ruin your
later work with childish blunders.  In short, learn how to do physics.
Then, and only then, will you be ready to tackle the hot theoretical issues
of the day."

I think the physicists have got it right.  And their practice contrasts
glaringly with what happens in linguistics departments in which students
are taught, from day one, *only* the currently fashionable theoretical
ideas.

>  (yes, I know, it gets tiresome to always cite physics as *the* science to
>  be emulated, but still, this branch of science has been more successful at
>  explaining complex aspects of the universe than many others. It has
>  certainly been more successful at providing explanations in its own field
>  than linguistics has been.)

It has, yes, but look.

First, physics has been up and running since the 16th century, at least.
Linguistics is much newer, and hardly any branches of it -- apart from
historical linguistics -- are even a century old.

Second, the phenomena examined by linguists are arguably much more complex
than those examined by physicists.  They look at the behavior of electrons
and crystals.  We look at certain aspects of the behavior of people.

There may be another reason why progress in linguistics is slower than
we might wish.

When I was a student, in the 1970s, the dominant theoretical approaches
were classical generative phonology -- in phonology -- and the EST (the
Extended Standard Theory), just giving way to the REST (its Revised
version) -- in syntax.  The theoreticians of the day were
often loud in their claims that they were doing cutting-edge scientific
research on language, and at times contemptuous of those linguists who
declined to hop onto their theoretical bandwagons.

Well, I'm a historical linguist, and recently I gave a talk to the research
students at my old university.  I illustrated my talk with a number of
examples of phonological change, and I presented these in the notation
invented by the classical generative phonologists -- you know, stuff
like this:

     l --> r / V ___ V
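
For anyone who has never met the notation: the rule simply says that /l/
becomes /r/ whenever it stands between two vowels.  Mechanically, it is
nothing more than a string rewrite, as this toy Python sketch (applied to
made-up forms) shows:

    import re

    VOWELS = "aeiou"

    def l_to_r_between_vowels(form):
        # Rewrite every 'l' flanked by vowels as 'r'; leave everything else alone.
        return re.sub("(?<=[%s])l(?=[%s])" % (VOWELS, VOWELS), "r", form)

    for form in ["malu", "filo", "alta", "lana"]:
        print(form, "->", l_to_r_between_vowels(form))
    # malu -> maru and filo -> firo; 'alta' and 'lana' are untouched,
    # because their /l/ is not intervocalic.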

We historical linguists still use this notation, not for any theoretical
reason, but merely because we find it convenient.  But what did I discover?
Many of the research students there *could not read* this notation!  I was
flabbergasted.  But they don't get taught this stuff, because that's 1970s
linguistics, and they're all busy learning the currently fashionable
non-linear phonology instead.

So, all that 1970s work, which at the time was cutting-edge research,
is now so much wastepaper.  If contemporary students can't read the
notation, then they can't read the work, either, and so presumably
they don't.

How, then, can they possibly have any understanding of what their
predecessors might have achieved?  What is there to stop them, and
their successors, from inventing the wheel, over and over again, in
one theoretical framework after another?  How can the field maintain
any continuity, and how can progress ever be cumulative -- as it is in
physics -- if each generation is incapable of understanding the work of
the preceding generation?

And what's going to happen to today's cutting-edge research, if the
students twenty years from now can't even read it?

Isn't there something wrong here?

>  The question, then, as far as I can see, comes down not to whether we want
>  to teach classical phonemics in intro linguistics courses, but whether we
>  want to confront our beginning students with important questions about the
>  nature of scientific inquiry, and of human inquiry in general.  And, if we
>  don't trust our beginning students to come to grips with such a complex
>  question, how long do we want to wait?  Until grad school?

Well, far be it from me to denigrate the teaching of scientific inquiry.
But there are ways and ways of doing that, and teaching only the currently
fashionable theories is not the only way, nor perhaps even the best way.

Most of my linguistics students come to university with solid arts
backgrounds.  They have done no math beyond the required minimum, and
they have done no science at all.  Most of them are shocked by the kind
of tight analytical thinking we require in linguistics, and by the use
of explicit formal notations, because they've never encountered anything
like that before.

So, the approach I focus on is one much less ambitious than a dive
straight into hot theoretical issues.  I try to teach my students what
I regard as the basics.  Learn to spot patterns.  Learn to express those
patterns explicitly and accurately.  Learn how to spot shortcomings in
your analysis.  Learn how to modify your analysis so as to overcome
those shortcomings, while at the same time checking that your modified
analysis doesn't bugger up things you were getting right before.
(Beginners find this last point particularly difficult: they are
constantly inclined to think "OK; I've got that bit right, so now
I'll change my analysis to cope with these other troublesome cases,
and I won't worry about the earlier parts, because I've already dealt with
them.")

These, I think, are among the fundamental skills which linguists
must acquire early, and which they will need later in their careers
*regardless* of what theoretical frameworks they may choose to use.

Like the physicists, I believe that students must learn to walk before
they can run.  There's plenty of time for theory later, *after* they've
learned the basics of their field, *after* they've learned how to do
linguistics.

>  As LT says:  "Phonemics is a theory", albeit, I say, a sorely inadequate
>  one.  (As are most likely *all* of our present theories.)  If you think that
>  it is an appropriate theory with which to introduce students to the
>  complexities of phonology (and thereby those of linguistics), then you ought
>  to do so with what I should call "the proper respect for theories"; that is,
>  with an adequate explanation of the importance of theories in our quest of
>  understanding the world around us.  Of course, this is no easy task, and
>  I've often despaired of ever getting this point across to my students.

I am not surprised.

>  And
>  I have also witnessed teaching where "data become little more than grist for
>  a theoretical mill", and I'm not about to condone this kind of teaching.

Good.

>  As I've said:  It's certainly a difficult question to decide how much theory
>  to teach and how early or late to teach it.  But it's one every teacher,
>  even a linguist, has to face.  For me, personally, the theory laden aspect
>  of all human endeavor at explaining the world around us has always been one
>  of the few ways in which I could tie my introductory courses to the
>  interests of the (general) students:  By showing them that what we were
>  doing here was, in essence, no different from what they (we all) were doing
>  every day in our attempts at explaining the world around us to ourselves.

Well, at this point, I think we'll have to agree to differ.  I don't
avoid theoretical interpretations, when I think they're appropriate
and helpful, but I think I serve the interests of my students better
by teaching them data and ways of analyzing data.

We might make a little bet.  Of all the work done in linguistics
in the last ten years or so, which has the best chance of still
being regarded as important in 20 years?

My bet is that the currently fashionable theories won't be there:
they will have given way to other fashionable theories.  Among the
stuff that I think *will* stand up are the work on social networks,
the work on creoles and creolization, and, perhaps most importantly
of all, the work on sign languages.  I rank all this stuff far, far
ahead of Optimality Theory, non-linear phonology, and Minimalism.
But then everybody knows I'm a tedious old grouch with his head
in the sand. ;-)

>  Again, LT, your contribution was only what got me to thinking about this
>  whole question.

And many thanks for giving me the chance to get a few things off my
chest.


Larry Trask
COGS
University of Sussex
Brighton BN1 9QH
UK

larryt at cogs.susx.ac.uk

---------------------------------------------------------------------------
LINGUIST List: Vol-11-602


