Open Access
Rapid perceptual learning of noise-vocoded speech requires attention
Author(s) - Julia Jones Huyck, Ingrid S. Johnsrude
Publication year - 2012
Publication title - The Journal of the Acoustical Society of America
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.619
H-Index - 187
eISSN - 1520-8524
pISSN - 0001-4966
DOI - 10.1121/1.3685511
Subject(s) - active listening , speech perception , perceptual learning , perception , speech recognition , background noise , audiology , psychology , neuroscience
Humans are able to adapt to unfamiliar forms of speech (such as accented, time-compressed, or noise-vocoded speech) quite rapidly. Can such perceptual learning occur when attention is directed away from the speech signal? Here, participants were simultaneously exposed to noise-vocoded sentences, auditory distractors, and visual distractors. One group attended to the speech, listening to each sentence and reporting what they heard. Two other groups attended to either the auditory or visual distractors, performing a target-detection task. Only the attend-speech group benefited from the exposure when subsequently reporting noise-vocoded sentences. Thus, attention to noise-vocoded speech appears necessary for learning.
