Senses of "formal"
lakoff at COGSCI.BERKELEY.EDU
Wed Mar 8 08:19:25 UTC 2000
I've been observing the discussion of what "formal" means. I
generally draw a three-way distinction:
1. Pertaining to form.
2. Formal systems, in the technical mathematical sense. This is the sense
intended in the Chomskyan tradition and more recent variants on that tradition.
3. A new sense coming out of work on the Neural Theory of Language,
referring to neural parameterizations that can be given a symbolic notation
useful in linguistic descriptions and natural language processing.
Sense 1: Pertaining to form.
Meaning is expressed in terms of form; e.g., phonological and morphological
form, surface orderings, agreement, and so on. This is just part of
language. All serious linguists have to describe how form and meaning are
related. This has nothing to do with "formal linguistics" per se.
Sense 2: At the beginning of Syntactic Structures, Chomsky sets out a
metaphor characterizing what has commonly come to be called "formal
linguistics":
Sentences are strings of abstract symbols.
A language is a set of such strings.
A grammar is an algorithmic mechanism for generating such sets of strings
of abstract symbols.
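The metaphor above can be made concrete with a toy rewrite system. This is an illustrative sketch only: the grammar below is invented for the example, not taken from Syntactic Structures.

```python
# A minimal sketch of Chomsky's Metaphor: a grammar as an algorithmic
# mechanism generating a set of strings of abstract symbols.
# The rules here are an invented toy grammar (kept finite on purpose).
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["det", "n"]],
    "VP": [["v", "NP"], ["v"]],
}

def generate(symbols=("S",)):
    """Yield every terminal string this grammar derives."""
    for i, sym in enumerate(symbols):
        if sym in RULES:  # leftmost nonterminal: try each expansion
            for expansion in RULES[sym]:
                yield from generate(
                    symbols[:i] + tuple(expansion) + symbols[i + 1:]
                )
            return
    # No nonterminals left: the string is in the "language".
    yield " ".join(symbols)

# On the metaphor, the language just IS this set of strings:
language = set(generate())
print(language)  # {'det n v', 'det n v det n'}
```

Note that nothing in the mechanism refers to meaning, use, or anything outside the symbol system, which is exactly the autonomy entailment discussed next.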
Chomsky's Metaphor had many entailments:
1. Autonomy: Within a mathematical formal system, generative rules or other
algorithmic mechanisms can only refer to the abstract symbols inside the
system. This eliminates many things from the content of syntactic rules:
the meaning of the symbols (all of semantics and conceptual systems), the
use of the symbols in context (all of pragmatics), communicative function
(old and new information, topicality, etc.), degrees of conventionality
(and hence, grammaticalization in process), anything from the sensory-motor
system, cognitive mechanisms, memory, anything at the neural level, and so on.
2. Data restrictions: Any real linguistic phenomena having to do with the
causal effect of any of the above on the distribution of surface forms
cannot be characterized within a formal system proper, unless the
information is somehow coded in appropriate symbolic terms. An example of
such a coding was the coding of minuscule aspects of semantics and
pragmatics in terms of logical forms introduced by myself and Jim McCawley
back in Generative Semantics days and since adopted by Chomsky and others.
3. Disembodied theoretical constraints: Ideas like "generative power" only
make sense within Chomsky's Metaphor. Thus, arguments of the form "such and
such a type of rule is too powerful" presuppose that metaphor.
I gave up on Chomsky's Metaphor back in the 1960s, because I saw language as
a product of embodied human minds, not of disembodied formal systems.
However, before then, I and many of my close friends learned a lot
about language using that metaphor, as limiting and distorting as I now
think it is. Smart people with good linguistic intuitions can do
interesting research despite that metaphor.
Sense 3: NTL formal notation
One of the insights coming out of the neural theory of language is
the distinction made by Feldman between dynamic neural processes and the
neural parameterizations that trigger them. This has enabled our group at
ICSI in Berkeley to develop symbolic notations for both parameterizations
and dynamic processes that can be mapped to neural models (structured, not
PDP). We are developing notations for parameterizations of cognitive
semantics - image-schemas, force-dynamic schemas, frames, metaphorical
maps, blends, and so on. Ben Bergen is in the process of developing such
notations for phonology and morphology.
We see grammar as consisting of constructions - neural maps
connecting parameterized semantics and parameterized phonology, with
constraints provided by all aspects of context, background knowledge, and
communicative function. In short, grammar is in the connections. NTL has no
autonomous syntax and doesn't miss it. Grammatical generalizations, so far
as we can tell to date, can be stated perfectly well without it. And a
neural learning theory is being developed to accommodate real acquisition data.
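The idea of a construction as a mapping between parameterized semantics and parameterized form, with contextual constraints, can be sketched as a data structure. All names below (`Construction`, the "ditransitive" example, its features) are invented for illustration; this is not the actual NTL/ICSI notation.

```python
# Illustrative sketch only: a construction pairs a semantic
# parameterization with a form parameterization, and the grammar
# lives in the connection between them, constrained by context.
from dataclasses import dataclass, field

@dataclass
class Construction:
    name: str
    semantics: dict          # semantic parameterization (frame, roles, ...)
    form: dict               # phonological / ordering parameterization
    constraints: list = field(default_factory=list)  # contextual predicates

    def licenses(self, context: dict) -> bool:
        """A construction applies only if every contextual constraint holds."""
        return all(check(context) for check in self.constraints)

# A hypothetical ditransitive construction: a transfer frame linked
# to a surface ordering, with one (invented) contextual constraint.
ditransitive = Construction(
    name="ditransitive",
    semantics={"frame": "transfer", "roles": ["agent", "recipient", "theme"]},
    form={"order": ["subj", "verb", "obj1", "obj2"]},
    constraints=[lambda ctx: ctx.get("recipient_is_animate", False)],
)

print(ditransitive.licenses({"recipient_is_animate": True}))   # True
print(ditransitive.licenses({"recipient_is_animate": False}))  # False
```

The point of the sketch is structural: there is no autonomous syntactic module here, only the mapping itself plus whatever constraints context, background knowledge, and communicative function impose.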
These symbolic notations are constrained in three ways:
1. They must be reducible to plausible neural models.
2. They must be able to state real linguistic generalizations.
3. They must be usable in natural language processing models.
We call these notations "formalisms." They are precise - precise enough to
be used in natural language computing. Yet they are capable of representing
semantics, pragmatics, context, communicative function, iconicity,
probabilistic phenomena, degrees of entrenchment (and hence
grammaticalization in process), and so on.
NTL formalisms differ in a deep way from the formalisms used in the
Chomskyan tradition:
They are not constrained by purely formal ideas like generative power.
The constraints on such systems are biological; they have to be reducible
to neural systems.
They are embodied, not disembodied. They have to be grounded ultimately
in the sensory-motor system or other bodily systems.
What this shows is that the issue in "formalism" is not whether linguistic
form must be described; it must. It is not whether precise notations can be
used in real linguistic descriptions; they can. The issue, at least for us,
is whether we accept Chomsky's Metaphor and all the limitations that go
with it (we don't), or whether we seek to create a precise way of
symbolically notating both the static neural parameterizations and the
neural dynamics in such a way that we accurately characterize real natural
language.
I hope this helps a bit to clarify issues.