FASTEN SEATBELTS - A guided tour of the research on deafblind communication
in 45 min.
Plenary presentation at the International Symposium on Development
and Innovations in Interpreting for Deafblind People, Netherlands.
By Ole E. Mortensen, Information Center for Acquired Deafblindness, Denmark.
In this presentation I will try to give you an overview
of the research that has been undertaken with relevance for communication
with the deafblind. I will present the studies that I have found, and some
of the findings in them. The purpose is to give an idea of who has
been doing what, with which results. A list of references to the
literature I will be talking about is included, so that you may read
further if you are interested.
I will focus on tactual communication, as the one mode of communication
that is unique to deafblind people. But I do realise that most deafblind
people communicate through speech, and that some deafblind people communicate
using visual modes. Research on hearing tactics, speech recognition and visual
sign language will therefore not be included in this presentation.

Internationally, not very much research has been published specifically
about communication for people with acquired deafblindness. Most of the
research that has been undertaken has been on tactual sign language.
Among the reasons for this, I think, are the following:
In the past 30 years or so many studies have focused
on different linguistic aspects of visual sign language, such as phonology,
morphology, syntax and discourse. With this growing knowledge it seems
obvious to ask "How does tactile sign language differ from this?", especially
following the realisation that many elements in visual sign language are
expressed by facial or body movement, and not by the hands, which are the
channel through which tactile sign language is transferred.

Tactile sign language is a working tool for
interpreters for the deafblind. A theoretical knowledge of the linguistic
and functional aspects of tactile sign language is very important for interpreters
and for educators of interpreters for the deafblind. This forms a motivation
to conduct or initiate research to develop this knowledge. As a research
field tactile sign language is a new and easily defined one, with a
lot of obvious and important questions just waiting to be answered.
In addition to the more scientific research and
studies that I will get back to in a moment, there are quite a few publications
with explanations and guidelines for communicating with deafblind people.
In many cases it is practitioners who take the time and effort to put down
their personal experience and knowledge for the benefit of others. Examples
of this are:
Center for Sign Language and Sign Supported Communication, Copenhagen:
"Taktil Tegnsprogskommunikation" (Tactual sign language communication),
KC, København, 1997. A study on different aspects of tactual communication
for deafblind sign language users, resulting in a book (in Danish) used
in the deafblind curriculum of the Center's education of sign language interpreters.
This project has been extended until the end of this year.

Nordisk Uddannelsescenter for Døvblindepersonale: "Kommunikation
med døvblindblevne" (Communication with people with acquired deafblindness),
Nordisk Vejleder nr. 22, Forlaget Nord-Press, Dronninglund, 1998. 11 articles
(in Danish, Swedish and Norwegian) covering different aspects of communication
with people with acquired deafblindness.

Dona Sauerburger: "Independence without Sight or Sound", American Foundation
for the Blind, USA, 1993.

Theresa Smith: "Guidelines: Practical Tips for Working and Socializing
with Deaf-Blind People", Sign Media Inc., Maryland, USA, 1994.

Bruce D. Snider (ed.): "Being in Touch: Communication and Other Issues
in the Lives of People who are Deafblind", Gallaudet University Press.

Susie Morgan: "Sign Language with People who are Deaf-Blind: Suggestions
for Tactile and Visual Modifications" in Deaf-Blind Perspectives, Fall.

At some of the schools for sign language interpreters and interpreters
for the deafblind, students have to do written assignments as part of their
curriculum. This is the case in Finland, Denmark and Iceland, to name a
few places that I know of personally. These assignments may give insight
into a new, unexplored area. For instance, I have a report done by Finnish
students that focuses on how much it matters what size, shape and temperature
the hands of the interpreter are. These assignments are very seldom available
to a wider international audience. Considering how small a professional
field this is and how little new knowledge is generated, we might want
to consider ways of sharing this knowledge as well.
As part of our work at the Information Center for
Acquired Deafblindness we monitor the research on communication for people
with acquired deafblindness using the Internet, literature databases and personal
contacts. We focus on publications in languages that we are able to read,
so most of what I will be talking about here is written in
English. Let's look at the major studies that have been published in this
decade (that we know of at the moment).

Some of the studies have touched upon the same areas,
but I will not discuss the findings in relation to each other. Time does
not permit that here.
The published studies we have found are these:

On sign language:

Johanna Mesch: "Dövblindas teckenspråk" (Sign language of the
deafblind), in Forskning om teckenspråk XVIII, Stockholms Universitet,
Institutionen för lingvistik, 1994. A pilot study of the Swedish sign
language used by deafblind people. Report in Swedish. Followed by a Ph.D.
thesis (see below).

Charlotte M. Reed, Lorraine A. Delhorne, Nathaniel I. Durlach (MIT) and
Susan D. Fischer (NTID): "A Study of the Tactual Reception of Sign Language",
Journal of Speech and Hearing Research, p. 477-489, USA, 1995.

Johanna Mesch: "Teckenspråk i taktil form" (Sign language in its
tactual form), Institutionen för lingvistik, avd. för teckenspråk,
Stockholms Universitet, 1998. Ph.D. thesis on tactual Swedish sign language.
In Swedish, with an abstract in English.

Sarah Reed: "Communication through touch". Report from an 18 month project
with two aims: to gain a better understanding of the transition from visual
to tactual reception of sign language, and to pilot a training program in
tactual sign language and make recommendations on a national training strategy.
Sarah Reed will be presenting this work herself later at this symposium.

Steven Collins and Karen Petronio: "What Happens in Tactile ASL?", in "Pinky
Extension and Eye Gaze – Language Use in Deaf Communities", Gallaudet
University Press, Washington D.C., 1998.

On sign language and finger spelling:

Leena Hassinen, Finland, in association with Bencie Woll, UK:
"A preliminary study of tactile forms of communication" (1990). This preliminary
study focused on the following categories in material including both fingerspelling
and sign language:
- Body posture and hand arrangements
- Space and location
- Turn taking
- Signalling of grammatical structure

On finger spelling:

Charlotte M. Reed, Lorraine A. Delhorne, Nathaniel I. Durlach (MIT) and
Susan D. Fischer (NTID): "A study of tactual and visual reception of finger
spelling", Journal of Speech and Hearing Research, p. 786-797, USA, 1990.

Live Fuglesang: "Kommunikasjon med døvblinde. Kan jeg bruke en av
døvblindes viktigste taktil-kinetiske kommunikasjonsformer, to-hånds
alfabetet, på talespråkets premisser?" (Communication with the deafblind.
Can I use one of deafblind people's most important tactile-kinetic forms
of communication, the two-handed alphabet, on the premises of spoken
language?), Universitetet i Oslo, 1987.

Live Fuglesang: "Kommunikasjon med døvblinde. Non-verbale aspekter
ved bruk av to-hånds alfabetet" (Communication with the deafblind.
Non-verbal aspects of the use of the two-handed alphabet), Universitetet
i Oslo, 1988.

On computer recognition of sign language:

Thad Starner, Joshua Weaver and Alex Pentland: "Real-Time American Sign
Language Recognition Using Desk and Wearable Computer Based Video", MIT Media
Laboratory Perceptual Computing Section Technical Report no. 466, Boston, USA.

On transition from one communication mode to another, and on communicative
strategy:

Live Fuglesang and Ole E. Mortensen: "Communicative Strategy –
Including Transfer to Tactile Mode", plenary presentation at the 4th European
Conference on Deafblindness, Madrid, Spain, July 1997.
(We know of more studies on computer recognition of sign language and of
several studies on the Tadoma method, but they are not included on this
list. For further information on these you are welcome to contact us at
the Information Center.)
I will now briefly present a few of the main points in three of
the most comprehensive works on tactual sign language, each focusing on
different aspects.
What Happens in Tactile ASL?
In their study (see reference) Steven Collins and Karen Petronio focused
on four linguistic aspects of tactile ASL and how they differ from
visual ASL. The questions that were addressed were:
On morphology: Many adjectives and adverbs in visual ASL are composed
of nonmanual facial expressions. How are these morphemes conveyed in tactual
ASL, when the receiver cannot see them?
On phonology: Signs can be broken into smaller parts (phonemes).
The basic parameters include handshape, movement, localisation and orientation.
In tactile ASL the receiver's hand is placed on the signer's hand. Does
this result in changes in these four parameters?
On syntax: In visual ASL questions may occur with a variety
of word orders. What word orders occur in questions in tactual ASL?
On discourse: In visual ASL feedback is given by head nods and
different facial expressions. How is feedback given in tactual ASL?
Among the findings were:
Regarding phonology: The two persons hold hands during conversation.
This means that the sign space is limited compared to visual ASL. The result
is that signs that are located near the outer edges of the signing space
in visual ASL are articulated within a smaller space in tactual ASL.
In signs with body contact, the part of the body where the sign should
be located moves toward the signing hand to make it easier to make contact.
This is not the case in visual ASL.
Some modifications of the signs, for instance in relation to orientation,
are made to accommodate the flexibility and comfort of the listener's hand.

Regarding questions: Questions have to be marked in a more explicit
way in tactual sign language. For instance, in visual ASL the sign "question"
may be used at the end of a yes/no-question. In tactual ASL, the sign "question"
occurred after each yes/no-question.

Another interesting finding had to do with marking who a question is
directed to. In visual sign language you will look directly at the person
you are directing a question to. In tactual ASL the receiver cannot see who the
question is directed toward, so the sign "you" is used at the beginning
of the question to mark that what follows is directed to the receiver.
Regarding morphology: There are nonmanual adverbs in visual
sign language that combine meaning with the sign that is produced. In the
study two adverbs "ee" and "mm" are used in connection with the verb "Drive".
These two adverbs - expressed by the face - describe different ways of
driving a car. In tactual ASL these (and other) adverbs were substituted
by small differences in the way the sign was produced. When the meaning
was "drive in a casual manner", the movement of the sign became slower
and muscle tension was lax. When the meaning was "drive in a very intense
manner", the movement of the sign became quicker and more tense.
Regarding discourse: In visual sign language, as in spoken
language, feedback can occur while the sender is giving his message. In
tactual ASL the receiver needs to hold the hand of the sender, and in order
to send feedback such as "Oh-I-see" the sender and the receiver would
need to change hand positions for this message to be signed. (The analysed
material was conversations between deafblind people!) In these instances
feedback is given without changing hand positions by tapping on the hand
of the signer to tell that you understand, that you agree. The tapping
could be with one finger or all four fingers. Another way was to "nod"
tactually by pushing the sender's hand gently up and down. And yet another
way identified was to squeeze the hand of the sender lightly. (In this
material all participants used only one hand for receiving the signs.)
What was particularly interesting was that none of the deafblind people
in this study were aware that they were using these ways of giving tactual
feedback.
Charlotte M. Reed, Lorraine A. Delhorne, Nathaniel I. Durlach (MIT)
and Susan D. Fischer (NTID) did a study (see reference) in 1995 of the
effectiveness of tactual sign language as a communication method, measured
as accuracy of perception relative to communication speed. 10 experienced
users of tactual sign language were tested for their ability to receive
both isolated signs and whole sentences.

122 isolated signs, both one-handed and two-handed, and both symmetrical
and asymmetrical, were received with an average accuracy of 87% correct,
with scores ranging from 75% to 98% for the 10 individuals.

100 sign language sentences, representative of normal conversation,
were adapted to tactual sign language taking into consideration the lack
of possibilities for perceiving visual elements such as facial expression,
head and body posture etc. The average score here was 78% correct perception,
with individual scores ranging from 60% to 85%. The production rate,
i.e. the speed of signing, ranged from slow to normal, and the results were
more or less independent of the speed.
It seems that isolated signs were easier to receive than sentences,
contrary to what is true for the reception of, for instance, speech, where
the context helps the understanding. The study showed that the largest part
of the errors in perceiving isolated signs were due to a misperception of
the sign's location. Two signs can be identical with the exception of
location, i.e. the place where the signs are articulated.

This indicates that one of the major difficulties in receiving tactual
sign language for a deafblind person lies in determining exactly where on
the signer's body the sign is produced. In Leena Hassinen's study at Bristol
University (see reference) she describes a deafblind person who had developed
a special communication technique with his family. Signs that would normally
be articulated near the signer's body were articulated near the receiver's
body instead. This made it much easier for the deafblind person to determine
the exact location of the sign. Johanna Mesch from the University of Stockholm
noted in her pilot study (see reference) that it seemed that deafblind
people use fewer pointing gestures and make less use of the communication
space in front of them in connection with nouns and as person deixis.
This might also be related to the same difficulty of determining the exact
locations of signs.
Johanna Mesch's Ph.D. thesis from 1998 (see reference) was published
as a book of approx. 240 pages, so I will not be able to do it justice in
this short time. In her study Johanna Mesch focuses on turn-taking and
questions in conversations between deafblind persons using tactile sign
language. Her material was video recordings of six conversations, four
with two deafblind persons and two where one person was deaf and the other
was deafblind.

The study shows that deafblind signers use their hands in two different
positions. In the monologue position both the signer's hands are held under
the hands of the listener, whereas in the dialogue position both participants
hold their hands in identical ways: the right hand under the other person's
left hand, and the left hand on top of the other person's right hand. It
is also described how these two positions affect two-handed signs, and
how feedback is given in the two positions.
The study also shows how differences in the vertical and horizontal
planes between the two persons are used in turn-taking regulation. In the
vertical plane different conversational levels were identified, i.e. heights
at which the hands are held during conversation:

Resting level (neutral)
Turn change level
Turn level
Hesitation level

The speaker (turn holder) may signal that he is ready to end his turn
by lowering the hands from the turn level to the turn change level. Or
he may signal that he is not ready to give up his turn, but needs a moment
to think before continuing, by holding the hands at the hesitation level.

In the horizontal plane three different turn zones were identified.
Closest to the speakers are their own turn zones, and in the middle is
the joint zone. When finishing his turn, the speaker moves the hands to
the joint zone. The study also analyses 137 questions in the material,
both yes/no-questions and wh-questions, to determine what elements in
tactile sign language make up for the lack of the interrogative facial
expressions used in visual sign language. Here the findings were along
the lines of the results from the study by Collins and Petronio.
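The turn-taking regulation described above can be read as a small protocol: a change of hand level carries a conversational meaning. As a purely illustrative sketch (the level names follow the study, but the mapping itself is my simplification, not part of the thesis):

```python
# Toy model of the vertical-plane turn-taking signals: the meaning of a
# movement depends on which levels the speaker's hands move between.
# Illustrative simplification only, not a mechanism from the study.
SIGNALS = {
    ("turn", "turn change"): "ready to end the turn",
    ("turn", "hesitation"): "keeping the turn, pausing to think",
    ("turn change", "resting"): "turn ended",
}

def interpret(from_level, to_level):
    """Return the conversational meaning of a hand-level change."""
    return SIGNALS.get((from_level, to_level), "no turn signal")

print(interpret("turn", "turn change"))  # -> ready to end the turn
print(interpret("turn", "hesitation"))   # -> keeping the turn, pausing to think
```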
Johanna Mesch's pilot study from 1994 focused on some of the same elements
as in her doctoral thesis.
To summarise, the topics that have received special focus in the research
on tactual sign language I have mentioned here are:

Marking of questions
Transformation of nonmanual grammatical elements
Effectiveness (how fast and accurate) – and where signs are articulated
I have looked briefly at some main points from the research on tactual
sign language until now. The findings are very interesting and useful,
which I hope you will agree with. But all the researchers whose work we
have seen express the need for further research and development work in
the field of tactual sign language, and in deafblind communication in general.
There are lots of different manual alphabets that can be used by touch,
but few studies on finger spelling have been carried out so far. One of
them is the study of tactual and visual reception of finger spelling by
Charlotte M. Reed, Lorraine A. Delhorne, Nathaniel I. Durlach (MIT) and
Susan D. Fischer (NTID) in 1990 (see reference).
The purpose of the study was to examine the ability of experienced
deafblind subjects to receive finger spelled materials, including sentences
and connected text through the tactual sense. A parallel study of the reception
of finger spelling through the visual sense was also conducted using sighted
subjects. Accuracy of reception was examined as a function of rate
of presentation. In the tactual study rates were limited to those that
could be produced
naturally by an experienced interpreter. In the tactual reception of
finger spelling, the hand (or hands) of the receiver is placed on the
hand of the sender to monitor the hand shapes and movements associated
with the letters of a manual alphabet. The subjects of this study were
5 deafblind individuals who were highly experienced in the tactual reception
of the American One-Hand Manual Alphabet, having used it for 10-40 years.
A certified interpreter finger spelled directly into their palms, or the
deafblind person wrapped their fingers and palm around the side and back
of the interpreter's hand. The subjects responded orally. There were lists
of sentences, some representative of everyday conversation and some more
difficult. These lists were finger spelled at various rates ranging from
"slow" to "very fast".
Concerning the everyday sentences, the subjects scored 80-100%, but
lower for the more difficult sentences. There was also a tendency toward
a gradual decrease in performance with increasing presentation rate.
In general, the reception of tactual finger spelling appears to be accurate
at normal rates of presentation that are considered comfortable for the
finger speller (i.e. roughly 5 letters/s). The results with sped-up finger
spelling suggest that sighted deaf subjects can understand substantial
amounts of information at rates as high as two or three times the normal
finger spelling rate.
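To put the reported letter rates into more familiar terms, they can be converted to rough words-per-minute figures. The conversion and the five-letters-per-word assumption below are mine, not the study's:

```python
# Convert finger spelling rates (letters per second) to approximate
# words per minute, assuming an average word length of 5 letters.
# Both the helper and the assumption are illustrative.
def letters_per_sec_to_wpm(rate, letters_per_word=5):
    return rate * 60 / letters_per_word

print(letters_per_sec_to_wpm(5))      # comfortable rate: 60.0 wpm
print(letters_per_sec_to_wpm(5 * 3))  # three times normal: 180.0 wpm
```

At the comfortable 5 letters/s this comes to roughly 60 words per minute, well below normal speech rates.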
How should one deal with the words when using a manual alphabet, then?
Is it more like speaking or more like writing, which are quite different!
Live Fuglesang of Norway did two studies on this in 1987 and 1988 (see
references), exploring pragmatic aspects of using the Norwegian two-handed
alphabet in communication with the deafblind. These studies are only
available in Norwegian, but more about the findings can be read in Live
Fuglesang's and Ole E. Mortensen's presentation "Communicative Strategy
– Including Transfer to Tactile Mode".
OTHER RELEVANT STUDIES:
Live Fuglesang, Norway, and Ole E. Mortensen, Denmark, did a survey of
78 deafblind persons from 11 countries on their experience of the transition
to tactile mode with regard to areas such as:

the physical contact with strangers that is necessary
the lack of speed when using a new method
the accuracy of the method
how hard and exhausting it is to use
how difficult it is to learn

We did this survey for a plenary presentation at the European conference
on deafblindness in 1997 (see reference). It was not a scientific approach,
we do know that! These are straightforward answers from the deafblind persons
who were willing to participate. For those who answered it is representative,
but probably not for the whole deafblind population in Europe! On the other
hand, we have not seen a survey like this done before. This could be a
small step further towards getting to know this area better.
Tadoma is a very rarely used method of communication, but it
has been well studied and documented by Charlotte Reed and colleagues at
MIT. In this method the deafblind person holds his hand on the face and
neck of the talker. By feeling the movement of the mouth and the
jaw, feeling the air from the mouth and the vibrations on the neck and
so on, normal speech can be received through the sense of touch.
As far as we know, only very few persons in the world are using the
Tadoma method today. About 20 people in USA and maybe the same number in
the rest of the world, according to Charlotte Reed's own estimate. Although
there is some discussion about this, the method is generally believed
to have been developed by a Norwegian teacher at the end of the 19th
century. Around 1920 two children called Tad and Oma were the first to
be trained successfully in this method in the USA, and the method was named
after them. Tadoma can be a fairly accurate method of speech perception.
Reception of about 40% of isolated words, and about 80% of sentences,
can be achieved by experienced users. But the implications of having to
put your hands on the face of the person who speaks may be an important
reason why the method is not more widely spread.
RALPH is short for Robotic Alphabet Hand. It is an artificial
hand – the size of a ten-year-old's – that is capable
of forming the letters of the international manual alphabet on the basis
of electronic input. According to one of its inventors, David L. Jaffe,
it may be used in situations like these:
Person-to-person communication using a small keyboard and display
connected to Ralph. This works when the two communicators are
in physical proximity to each other.
Person-to-person communication via modem (or Internet). The
telephone-based alternative would require a text telephone or computer/modem
at one end and Ralph on the other end.
Reading email - as a "display" device for a computer.
With TV - the serial output from a closed-caption device
would drive Ralph.
Telephone answering machine - a device would record TDD
or modem messages for later playback on Ralph.
In a classroom, conference, or courtroom situation. A
stenographer would translate the conversation into text which would
be displayed on Ralph.
Text-to-fingerspelling. A scanner/computer with OCR capabilities
could convert printed material to fingerspelling.
Speech-to-fingerspelling. Speech recognition software
could convert spoken words to fingerspelling.
Its top speed is 4 letters per second, but the speed may be slowed down
by including longer intervals when going from one letter to the next. RALPH
does not go to a neutral position between letters, but has been programmed
to move smoothly from one letter to the next.
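The talk does not describe RALPH's actual control interface, so the following is only a hypothetical sketch of the pacing idea: a fixed top rate, slowed by lengthening the interval between letters. The `send_letter` callback stands in for whatever command actually drives the hand:

```python
import time

TOP_SPEED = 4.0  # the stated maximum, in letters per second

def spell(text, rate=TOP_SPEED, send_letter=print):
    """Send one letter at a time; lowering `rate` lengthens the pause
    inserted between letters, slowing the fingerspelling down."""
    interval = 1.0 / min(rate, TOP_SPEED)  # never exceed the hardware limit
    for ch in text:
        if ch.isalpha():
            send_letter(ch.lower())
            time.sleep(interval)

spell("Hello", rate=2.0)  # spells h-e-l-l-o at 2 letters per second
```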
The hand has been tested with very good results by deafblind people.
It does take a little training, however, to learn RALPH's accent, so to
speak: the letters are not produced exactly the way a human hand produces
them. However, it is not in mass production yet, due to problems with
finding a company that will manufacture and distribute it. The inventors
are employed at a government research institution and are not allowed to
do this themselves. We have been in contact with the project leader,
David L. Jaffe, at the Veterans Affairs Medical Center, Rehabilitation
Research and Development Center, California (e-mail:
firstname.lastname@example.org). He is very interested
in hearing from anyone at this symposium who is interested in RALPH.
Computer recognition of sign language
Computer recognition of sign language is a relatively new area of research.
It takes place in different places around the world, and is at a stage
where developments and breakthroughs occur, or may occur, constantly.
Therefore the best way to keep informed is through the Internet.
The purpose of this research is to develop ways of interacting with
the computer using sign language. This may be the basis for –
for instance – the development of automatic translation to and
from sign language in the future. There are two basically different ways
of registering the signs. One is where the signer wears an electronic
glove that is connected to the computer and that registers the movements,
hand shapes etc. The other is registration of the signs using a video camera.
Some of the most promising results have been reached by Thad Starner, Joshua
Weaver and Alex Pentland at MIT in Boston (see reference) using a video
camera. They have developed a setup that is capable of recognising sentences
of ASL with between 92 and 98% accuracy (depending on where the camera
is situated). This test used a 40 word lexicon, that is, the computer was
able to recognise 40 different signs which, when put together randomly
following the word order "personal pronoun, verb, noun, adjective, (the
same) personal pronoun", would form coherent sentences. The 40 words (signs)
were:
pronouns: I, you, he, we, you (pl), they
verbs: want, like, lose, don't want, don't like, love, pack, hit, loan
nouns: box, car, book, table, paper, pants, bicycle, bottle, can,
wristwatch, umbrella, coat, pencil, shoes, food, magazine, fish, mouse,
pill, bowl
adjectives: red, brown, black, gray, yellow

This study shows that recognition of sentences of this kind is possible,
depending on the size of the vocabulary. However, one important point is
that the "ASL" used in this test consisted only of the signs. Other
grammatical features, such as pointing to objects or persons that are
"stored" in the signing space, or facial expressions, were not included.
A very complex issue will be to correctly recognise nonmanual grammatical
elements that add meaning to the signs.
Computer sign language recognition is based on the same principles as
computer recognition of speech and handwriting, and has much to do with
techniques such as "fuzzy logic" and "neural networks", where the software
is capable of learning as it works.
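To illustrate the "learning from examples" principle in the simplest possible terms, here is a toy nearest-neighbour sketch. It is my own illustration, not the hidden-Markov-model and neural-network machinery the actual systems use:

```python
# Toy illustration of learning-based sign recognition: each sign is
# reduced to a numeric feature vector (e.g. hand position, shape,
# motion), and an unknown sign is labelled after its closest training
# example. Real systems such as Starner et al.'s use hidden Markov
# models over video features; this only shows the example-driven idea.

def train(examples):
    """examples: list of (feature_vector, sign_label) pairs."""
    return list(examples)

def classify(model, features):
    """Label an unknown feature vector by its nearest neighbour."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda ex: dist(ex[0], features))[1]

# Two made-up feature vectors standing in for two signs from the lexicon.
model = train([((0.1, 0.9), "want"), ((0.8, 0.2), "box")])
print(classify(model, (0.15, 0.85)))  # -> want
```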
Holistic and interactive communication
As you have heard from Riitta Lahtinen during this symposium, there
is a very interesting project going on in Finland on holistic and interactive
communication. For more information contact Riitta Lahtinen.
Two Nordic projects
Two projects are about to be started in the Nordic countries. One is
a Ph.D. study of Norwegian tactual sign language by Eli Raanes, who is
a teacher and head of the education of sign language interpreters in Norway.

The other is a joint venture between all the Nordic countries. We have
applied for approx. 260,000 pounds for a three year project in two phases
that should provide the answers to the following questions:

Which communication methods and forms are used by people with acquired
deafblindness in the Nordic countries?
Which possibilities and limitations does the individual method provide?
How does the transition from one communication method to another take place?
How do deafblind people experience this transition?

The first phase is a questionnaire survey covering all deafblind people
in the Nordic countries, and the second phase is an interview survey
following up on the first one and providing more in-depth answers. The
decision about the grant will be made in September. The head of the project
is Professor Claes Möller, a Swedish audiologist who is very much involved
in research on Usher syndrome. For more information contact us at the
Information Center.
COMMUNICATION SPEED AND ACCURACY
Finally, I would like to say a few words about one aspect that these
studies of deafblind communication have made clear – the effectiveness
of different tactual methods, measured as speed compared to accuracy.
According to Charlotte Reed and colleagues, the natural speed of tactual
communication – Tadoma, finger spelling and sign language
– ranges from one fourth to three fourths that of normal speech
and visually received sign language. (The normal communication speed of
speech and visually received sign language is the same with regard to
the time it takes in the two codes to transfer the same information,
for instance the contents of a sentence.) The slowest but most accurate
reception was obtained in finger spelling, with near perfect reception at
a natural speed approximately one fourth that of normal speech and sign
language. For both Tadoma and tactual sign language the average
score is around 80% reception, at a speed of approximately three fourths
of normal speech and visually received sign language. It should be stressed
that these figures are averages of the results obtained in the different
studies. Individual users of these methods may accomplish results different
from these! They are merely examples to show the correlation between speed
and accuracy.
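One way to see the trade-off is to fold speed and accuracy into a single rough figure. The product below is my own illustrative yardstick, not a measure used in the studies:

```python
# Combining the reported speed (as a fraction of normal speech / visual
# sign language) with reported reception accuracy into a rough
# "effective throughput". The product is an illustrative comparison
# of averages only; individual users vary considerably.
methods = {
    "finger spelling": (0.25, 1.00),        # near perfect reception
    "Tadoma": (0.75, 0.80),
    "tactual sign language": (0.75, 0.80),
}

for name, (speed, accuracy) in methods.items():
    effective = speed * accuracy
    print(f"{name}: about {effective:.0%} of normal-rate information transfer")
```

On this crude yardstick the faster but less accurate methods still come out ahead of finger spelling, which helps explain why near-perfect accuracy does not by itself make a method the most effective.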
Research and development work like the studies I have mentioned here
is crucial for the continuing work to build a base of knowledge, or a
theoretical framework if you will, regarding the communication of people
with acquired deafblindness. And it is just as important to make the work
that has been carried out accessible to as many people as possible, for
instance by distributing and – if possible and necessary –
translating it. This is absolutely essential for the professional level
and expertise of interpreters and educators of interpreters, who must be
able to communicate fluently in the preferred code and mode of the deafblind
person. As one of the deafblind respondents wrote in a comment to the
survey that Live Fuglesang and I did in 1997:

"For any deafblind person being able to fluently communicate, is the
doorway to life itself, and as individuals we need to have a choose in
the communication methods that suit us best as individuals."