Reflections on Grammaticalization, Epiphenomena, etc....

Alexander Gross
Wed Mar 15 18:28:05 UTC 2006

Well, that was one of the most remarkable dustups I've ever seen here on FUNKNET, so remarkable that some awards are surely in order.  Unless I'm mistaken, one of those awards would have to be for sheer length: what I've seen so far in the archives suggests that it's the second longest thread in FUNKNET history since September of 1994, measuring some 45 messages.  It's also the second most verbose, coming close to 18,000 words, which is one reason it's taken me a while to absorb; I've even had to print it out.  Plus which, it's truly profound stuff.  But then it was meant to be, wasn't it?

Some of it was positively brilliant.  And just so it doesn't sound as though I'm merely being satirical, I'd also like to present a few awards for some of the very best comments by those taking part, at least from my point of view.  Here they come now:

"The plethora of contradictory labeling systems with which linguistics is plagued suggest that perhaps no single complete system is to be found." 
--Rob Freeman

"I hope this word [epiphenomenon] falls out of fashion in linguistics sooner rather than later."  --Suzanne Kemmer

"We probably should assume that the rules of effective information exchange are at least as rigid as putting up a barn.  There is not a lot of wiggle room."  --Steve Long

"Seems to me that 'paradigm shift' is more like an earthquake or someone popping your balloon."  --Jess Tauber 

"Insularity/isolation, rather than multidisciplinary interfacing, is the norm in the publish or perish world (speaking as a member of the latter realm). Who has time and resources for anything else?"  --Jess Tauber 

"I think that in linguistics and in phonology this is a big problem. I used to get a lot of pressure not to read--just get in there and crack phonemes.  It is part of the crisis of linguistics as a science--which I find to be quite Balkanized, and waters are tricky to navigate.  When that hurdle is overcome there is the aspect of competition and professional jealousy--a minefield for a newcomer in the field."  --Diane Lesley-Neuman

"I think that the apparent crisis of linguistics as a science looks a lot less dangerous if we realize that not all linguists are actually practicing science, nor even want to practice science -- even if some of them may think and say otherwise."  --Mark P. Line

I'm particularly pleased by these barbed remarks since they seem to be going at least part-way in the direction I set forth at last year's LACUS Conference.  

But I was also a bit dismayed by how much else I had to wade through to find these gems, and by how much of that material couldn't possibly qualify as functional linguistics; on the contrary, it was mostly an apologia for structural linguistics, even (yugh!) the generative kind.

I'm playing with the idea that linguistics has truly come no further than where it was in 1983, except that back then Geoffrey Pullum was at least able to deliver a pep talk to the troops containing two inspiring arguments that provided a ray of hope.

Even though both of those arguments have long since fallen flat on their faces.  It was in that year that he first published the essay that was republished in 1991 as "The Stranger in the Bar," part of that delightfully written book "The Great Eskimo Vocabulary Hoax," which for all its geniality contained a great deal of misinformation about language and linguistics.

Pullum wanted to help linguists raise their own self-esteem by being able to persuade laypersons, even someone they might meet in a bar, that there is after all a useful, practical point to the work they do. His first ploy was to suggest to such strangers that linguists are busily at work trying "to program a computer to understand plain English, like the HAL 9000 computer in Stanley Kubrick's film `2001: A Space Odyssey.'  Linguistics is the subject that figures out what you'd need to know about language in order to do that, for English or for any other language, in a general and theoretically principled way."

And just in case linguists themselves were less than convinced by this explanation, in a further pep talk he urged them to attend a conference of the Association for Computational Linguistics, as he had just done, where they would soon discover that the work of theoretical linguistics is so advanced that it will soon launch machine translation systems so unbelievably powerful and successful that it is difficult to "take in how fast the technology is progressing, or how much money will be made by the companies involved in producing it, or how many diverse sorts of people it is going to put out of work."

It ought to be obvious that both these arguments have failed.  Were a project to construct a HAL-like computer launched today, does anyone still imagine it might be feasible, much less a good idea?  And far from putting anyone--read translators--out of work, it also ought to be obvious that those companies still active in what is left of the MT field, after they and their colleagues spent decades squandering billions* in public and private funding, are now desperately trying to entice translators to strap themselves into the demanding translation memory systems they have finally blundered their way into devising.

Of course these failures go even deeper and date back well before 1983, spanning almost a full fifty years.  They spring from two major errors subscribed to by far too many linguists, one altogether abstract and theoretical, the other altogether practical.

The first error lay in the unfounded and unprovable belief that somehow, despite the obvious diversity of languages, there simply _had_ to be a unifying pattern, a unifying principle that would tie all these seemingly disparate phenomena together, even an ultimate "universal" solution.  And having evolved such an all-embracing concept, it was naturally only a short step for these same linguists to assign themselves the unerring ability to ferret out and define this unifying principle in great detail.

After all, this unifying principle simply had to exist; there could be no other possible solution--how could there be?

The second, practical error lay in the willingness of the US military not merely to believe this theoretical approach had merit but to provide vast funding for it over several decades.  To his credit, Pullum provides at least a partial critique of this aspect in the same piece.

And there you have it, fifty years of linguistics--or most of it--in a nutshell.  

Under these circumstances, it is not at all surprising that a far larger truth about language has gone largely unexamined during the same time period.  

Namely that language--any language, all language--may not truly be a system of communication at all but functions in large measure as a part of our biological defense system, intended not so much to inform us about the nature of reality as to blind us to that reality and protect us from it whenever that becomes necessary.

And that different peoples have created their own systems for doing so in different ways under different circumstances in various cultures over the centuries, even over the millennia.

And that this process may in fact furnish at least part of the reason, part of the underlying cause, for all the conflicting religious, political, and social ideologies we find on all sides of us.

And finally, if any of the foregoing is true, how do we break out of it and move on to something less artificial and more real?

Such a study of linguistics would be truly worthy of the time we spend studying it, and once we had even a few solutions in hand it would be fairly easy to explain to others, even to "the stranger in the bar."

all the best to all!


*In his extremely thorough analysis of MT for WIRED Magazine in 2000, Sheldon Silverman credited this project with having "burned through billions of dollars."  Among all the Ph.D. theses in linguistics approved each year, it might be useful if one could finally come up with a definitive total for the amount spent.  Whoever authored it would require far more than mere knowledge of linguistics and MT--such a researcher would also need the accounting skills to search through national and institutional spending going back decades and ferret out funding for MT and related NLP projects hidden within our nation's military balance sheet, itself hidden within the total national budget.

Some relevant references:

Pullum, Geoffrey.  1991 [1983].  "The stranger in the bar."  Pp. 17-22 in "The Great Eskimo Vocabulary Hoax" (University of Chicago Press).  Originally published as "Linguists and computers" in the journal "Natural Language and Linguistic Theory" (D. Reidel Publishing Company).

Silverman, Sheldon. "Machine Translation Today," WIRED Magazine, May, 2000.

Two presentations on evidence based linguistics at last year's LACUS conference, online at:

Also: "Suggested Minimal Requirements for the Advanced Study of Linguistics," online at:
