for FUNKNET (fwd)

Spike L Gildea spikeg at OWLNET.RICE.EDU
Mon Mar 29 02:01:33 UTC 1999


from: T. Givon
RE: FUNCTIONALIST PHONOLOGY

I think Scott DeLancey's historical comment was very perceptive. But
perhaps something else could be added:

Phonology has always been much more adaptively transparent. To begin
with, two of its coding processes (speech perception, articulation)
are relatively concrete, & their adaptive value is rather transparent.
But even the third, more abstract functional dimension of phonology
("phonology proper")--the neurological coding of conceptual meaning--
has been implicit in much of the traditional (pre-Chomsky) work on
**minimal pairs**, **phonemic contrasts**, **complementary distribution**
etc. So much so that even the grand abstract edifice of ordered rules
(that deft abduction of diachrony into synchrony...) and other gratuitous
abstractions could not quite succeed in obliterating the manifest
adaptive dimensions of phonetics/phonology.

The bane of syntax/grammar has been, of course, that (i) the adaptive
('functional') dimensions associated with it are much less transparent,
inasmuch as they are not easily discoverable by the traditional
clause-level (reflective, conscious, speculative) methodology. And (ii),
the code itself is so much more abstract. This latter fact has tended
to yield two curious results:
**Among formalists, the genuine abstraction of the grammatical code
  has licensed unmotivated, excessively abstract descriptions
  ('generalizations'), constrained primarily by (a species of) formal
  economy rather than by the data.
**Among functionalists, there is an unfortunate tendency to ignore the
  genuine, manifest abstract dimensions of grammar. What we do then
  is either focus solely on grammar's more concrete dimensions (morphology,
  word-order, intonation); or, worse, we deny the reality of grammar
  altogether (often denying the relevance of the notion "code" to grammar).

Much of the recent discussion on "Grammar with G" seems to have fallen
prey to some version of these attitudes. Which is, leastwise from where
I stand, rather unfortunate. Grammar is the most complex domain of human
language. On the functional side, it ranges over (i.e. 'interacts with')
a big chunk of lexical semantics, particularly of verbs (event frames,
argument structure); over all of propositional (combinatorial) semantics;
and over most of **systematically coded** discourse pragmatics (here
excepting gesturally-, intonationally- & facially-coded pragmatics).
It has massive interaction with both episodic memory & working memory
(thus attention). It is thus hardly surprising that the coding instrument
itself is complex and (partially) abstract. Complexity & abstraction are
Siamese twins in systems design.

The thing that worries me most, in science in general but in linguistics
in particular, is how prone we all are--again & again--to seek simple
models ('solutions') to complex domains ('problems'). This intellectual
scourge is called **reductionism**, and it is killing our science just
as conspicuously as its twin scourge--**ideological nationalism**--is
killing people. If our preoccupation with iconicity should have taught us
anything at all, it is that reductionism, anywhere except perhaps in logic,
is a very dubious methodological maneuver (Occam's Razor notwithstanding).

Y'all be good, y'hear.   TG
