14.202, Review: Language Description: Mesch (2002)
LINGUIST List: Vol-14-202. Tue Jan 21 2003. ISSN: 1068-4875.
Subject: 14.202, Review: Language Description: Mesch (2002)
Moderators: Anthony Aristar, Wayne State U.<aristar at linguistlist.org>
Helen Dry, Eastern Michigan U. <hdry at linguistlist.org>
Reviews (reviews at linguistlist.org):
Simin Karimi, U. of Arizona
Terence Langendoen, U. of Arizona
Home Page: http://linguistlist.org/
The LINGUIST List is funded by Eastern Michigan University, Wayne
State University, and donations from subscribers and publishers.
Editor for this issue: Naomi Ogasawara <naomi at linguistlist.org>
==========================================================================
What follows is a review or discussion note contributed to our Book
Discussion Forum. We expect discussions to be informal and
interactive; and the author of the book discussed is cordially invited
to join in.
If you are interested in leading a book discussion, look for books
announced on LINGUIST as "available for review." Then contact
Simin Karimi at simin at linguistlist.org.
=================================Directory=================================
1)
Date: Mon, 20 Jan 2003 23:40:23 +0000
From: Nicla Rossini <tattvamasi at libero.it>
Subject: Mesch (2002), Tactile Sign Language
-------------------------------- Message 1 -------------------------------
Date: Mon, 20 Jan 2003 23:40:23 +0000
From: Nicla Rossini <tattvamasi at libero.it>
Subject: Mesch (2002), Tactile Sign Language
Mesch, Johanna (2002) Tactile Sign Language: Turn Taking and Questions
in Signed Conversations of Deaf-Blind People. Signum, paperback ISBN
3-927731-80-3, EUR 23, International Studies on Sign Language and
Communication of the Deaf, Volume 38.
Book Announcement on Linguist:
http://linguistlist.org/get-book.html?BookID=3518
http://linguistlist.org/issues/13/13-2014.html
Nicla Rossini, Department of Linguistics, University of Pavia (Italy)
This monograph offers an interesting look at communication
among deaf-blind subjects. Deaf-blindness may be caused by several
factors: the most common is the age-related deterioration of vision
and hearing. Another common cause is Usher Syndrome, which is
hereditary and has at least eight different variants. Persons with
this syndrome may be born deaf with a visual impairment, and the
symptoms worsen over the years. Although sign language is not used
by all deaf-blind people (some of them may use a home-made
communication code), it is usually used by those subjects who are born
deaf. When these subjects' visual impairment no longer allows the
use of conventional sign language, they adopt a special kind of
sign language called tactile sign language. This version of sign
language is based on tactile reading of the speaker's signs. The
reading can be one-handed (in dialogues between a sighted deaf
subject and a deaf-blind subject) or two-handed (in dialogues
between deaf-blind subjects). In two-handed reading, both the
monologue position and the dialogue position are possible. In monologue
position, the speaker has both hands under the hands of the listener;
in dialogue position, the speaker holds his right hand under the
listener's left hand and his left hand over the listener's right
hand. Of course, this mechanism implies new rules for conversation
regulation; this issue is treated extensively in the book. In
particular, the author addresses turn change, feedback, and
questions (yes/no questions, alternative questions, and
wh-questions). Nine deaf-blind subjects were video-recorded while
having spontaneous conversations. None of the subjects examined was
born deaf-blind.
Apart from some limitations in the theoretical background (nodding is
not a facial expression, p. 32, but rather a gesture; see Morris,
1977, and Davis & Vaks, 2001) and in the structure of the book (the
description of the monologue and dialogue positions is certainly
important, although describing the perspective of the speaker and
that of the listener in both monologue and dialogue is, in fact, a
repetition), Mesch's work provides a captivating insight into the
rules used in tactile sign language to regulate dialogue; indeed, the
chapters focusing on dialogue are the most interesting of the
book. Conversational turns are well addressed and
explained: the conversation begins with both speakers holding each
other's hands in rest position (located in the lower chest
area). When a turn begins, the hands are raised to the upper
chest level. In case of hesitation, the speaker holds the listener's
hands at the upper chest level and closer to himself, in a
''half-complete'' sign; when the speaker intends to give the turn to
the listener, he lowers his hands and the hands of the listener to the
rest position. Feedback may be provided by the listener to the speaker
in various ways: the listener may use linguistic or non-linguistic
feedback. As linguistic feedback, the receiver may spell a YES sign
on the speaker's palm; as non-linguistic feedback, the listener may
use a simple YES-TAP with all fingers or only the thumb
on the speaker's hand. In case of monologue, the YES-TAP is made with
both hands.
This way of providing feedback to the speaker is defined by the
author as non-linguistic (p. 105) (I would rather call it
''gestural''). In any case, this analysis provides important
information about gestures in sign language, and a more detailed
study of this issue would be welcome. I also find interesting the note
that ''as for non-manual feedback, deaf-blind people nod sometimes
even though they are aware that the other deaf-blind receiver can not
read this.'' This observation bears on the debate over the
communicative function of gesture: since nodding is considered a
gesture (see above), its performance when the receiver is not able to
read it may be taken as a further piece of evidence that gestures are
not communicative (see Rime, 1982). Further research on this
particular problem would also be needed. The chapters concerning
question-making in tactile sign language begin with a brief survey of
functions and forms of questions. The author then proceeds to
analyze the different forms of questions.
The problem with tactile sign language is that non-manual facial
signals cannot be used to differentiate questions from declarative
clauses. Nonetheless, tactile sign language avoids ambiguity by
means of manual signals, which can be listed as follows:
- the WHAT gesture (in utterances like ''do you want to follow
along, or what?'', p. 131);
- the extended duration of the last sign;
- pointing to the addressee (in final position).
Note that the WHAT gesture is lexicalized and functions as a question
marker even in alternative questions, such as ''which hand do you use,
the left or the right?''.
In addition to these manual signals for questions, question words
(such as the signs for ''which'', ''how'', ''what'') are used. The
question words usually appear in initial position, but may be repeated
several times in an utterance. Support questions (which are meant to
support turns by requesting feedback in the case of the speaker asking
these questions, or to request clarification in the case of the
listener asking these questions) are also analyzed. Usually, the
signer's support questions occur at the end of a long utterance, while
the receiver's support questions occur during or after the signer's
turn. These questions are characterized by an extended duration of the
beginning sign of the utterance. The receiver can ask for
clarification by using non-linguistic signals (which, again, I would
rather call gestures), such as waving and thumb pressure.
In general, the book is well structured and easy to read. The
information provided on deaf-blind communication is valuable,
although, as stated above, this research was conducted on subjects
who were not born deaf-blind: a similar study on congenitally
deaf-blind subjects would also be interesting.
REFERENCES
Davis, J. W. & Vaks, S. 2001. A Perceptual User Interface for
Recognizing Head Gesture Acknowledgements. ACM Workshop on
Perceptual User Interfaces, Orlando, Florida.
Morris, D. 1977. Manwatching. A Field Guide to Human Behavior.
Harry N. Abrams, Inc., Publishers, New York.
Rime, B. 1982. The Elimination of Visible Behavior from Social
Interactions: Effects of Verbal, Nonverbal and Interpersonal
Variables. European Journal of Social Psychology, 73: 113-129.
ABOUT THE REVIEWER
Nicla Rossini is currently a Ph.D. student in Linguistics. She is also
Professor of Nonverbal Communication at the S.I.L.S.I.S., University
of Pavia. Her research spans fields such as Gesture and its
Cognitive Origin, Gesture and Handicap, Gesture and Second Language
Acquisition, and Gesture and Sociolinguistics. Her new
interpretation of the Gesture Category by means of Prototype Theory
has recently been published.
---------------------------------------------------------------------------
If you buy this book please tell the publisher or author
that you saw it reviewed on the LINGUIST list.
---------------------------------------------------------------------------
LINGUIST List: Vol-14-202