Open Access
Widening the lens: what the manual modality reveals about language, learning and cognition
Author(s) - Susan Goldin-Meadow
Publication year - 2014
Publication title - Philosophical Transactions of the Royal Society B: Biological Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.753
H-Index - 272
eISSN - 1471-2970
pISSN - 0962-8436
DOI - 10.1098/rstb.2013.0295
Subject(s) - gesture, sign language, spoken language, manual communication, language acquisition, modality, linguistics, psychology, communication
The goal of this paper is to widen the lens on language to include the manual modality. We look first at hearing children who are acquiring language from a spoken language model and find that even before they use speech to communicate, they use gesture. Moreover, those gestures precede, and predict, the acquisition of structures in speech. We look next at deaf children whose hearing losses prevent them from using the oral modality, and whose hearing parents have not presented them with a language model in the manual modality. These children fall back on the manual modality to communicate and use gestures, which take on many of the forms and functions of natural language. These homemade gesture systems constitute the first step in the emergence of manual sign systems that are shared within deaf communities and are full-fledged languages. We end by widening the lens on sign language to include gesture and find that signers not only gesture, but they also use gesture in learning contexts just as speakers do. These findings suggest that what is key in gesture's ability to predict learning is its ability to add a second representational format to communication, rather than a second modality. Gesture can thus be language, assuming linguistic forms and functions, when other vehicles are not available; but when speech or sign is possible, gesture works along with language, providing an additional representational format that can promote learning.
