Audiovisual integration improves task performance in AD and bvFTD
Author(s) - McLoughlin Bethany, Benhamou Elia, Sivasathiaseelan Harri, Hardy Chris J.D., Bond Rebecca L., Russell Lucy L., Rohrer Jonathan D., Warren Jason D., Agustus Jennifer L.
Publication year - 2020
Publication title - Alzheimer's & Dementia
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.713
H-Index - 118
eISSN - 1552-5279
pISSN - 1552-5260
DOI - 10.1002/alz.042118
Subject(s) - stimulus (psychology), psychology, audiology, cued speech, frontotemporal dementia, sensory system, cognition, multisensory integration, stimulation, visual perception, dementia, perception, cognitive psychology, neuroscience, medicine, disease, pathology
Abstract

Background: Complex analysis of sensory inputs, such as auditory scene analysis and spatial localisation, is impaired in Alzheimer's disease (AD) and behavioural variant frontotemporal dementia (bvFTD). Multisensory integration research demonstrates that synchronous stimulation in one sense (e.g. vision) can modify processing of suboptimal sensory inputs from another sense (e.g. hearing) in the healthy brain. However, the potential benefit of multisensory stimulation for reducing common symptoms of dementia has not been fully examined.

Method: Patients living with AD (n = 20) or bvFTD (n = 12) and healthy age-matched controls (n = 18) were recruited to perform three brief cognitive tasks: 1. cocktail party effect, identifying a name spoken in a noisy environment; 2. spatial ventriloquism, discriminating the location of an auditory stimulus (left/right of centre); and 3. detection of a suboptimal visual stimulus. Task performance was compared between unisensory, audiovisual congruent, and audiovisual incongruent conditions, and between groups.

Result: 1. Identification of spoken names in a noisy auditory environment was significantly worse for both patient groups than for controls. All groups performed significantly better when viewing an accompanying video of congruent lip movements. The AD group showed some improvement for incongruent videos that cued the onset of the spoken name. 2. Auditory spatial discrimination was significantly worse in the AD group than in controls, but both groups were significantly biased by the locations of the accompanying visual stimulus. The bvFTD group was influenced only by spatially incongruent visual stimulation. 3. Detection of a near-threshold peripheral visual stimulus was improved by a synchronous auditory stimulus, irrespective of spatial congruency, in the bvFTD and control groups. In the AD group, only spatially congruent or neutral auditory stimuli benefitted visual stimulus detection.

Conclusion: Congruent audiovisual stimulation improves the ability of people living with AD or bvFTD to understand voices in a noisy environment, locate sounds, and detect unreliable visual events. The dependence of these benefits on audiovisual congruency in different domains (e.g. semantic, spatial) provides evidence for disease-specific stratification of multisensory integration profiles. As such, multisensory integration offers a mechanism to improve the reliability of sensory inputs and the potential to alleviate symptoms in daily life if tailored according to disease.
