Open Access
Selective detrending method for reducing task‐correlated motion artifact during speech in event‐related FMRI
Author(s) -
Gopinath Kaundinya,
Crosson Bruce,
McGregor Keith,
Peck Kyung K.,
Chang YuLing,
Moore Anna,
Sherod Megan,
Cavanagh Christy,
Wabnitz Ashley,
Wierenga Christina,
White Keith,
Cheshkov Sergey,
Krishnamurthy Venkatagiri,
Briggs Richard W.
Publication year - 2009
Publication title - Human Brain Mapping
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.005
H-Index - 191
eISSN - 1097-0193
pISSN - 1065-9471
DOI - 10.1002/hbm.20572
Subject(s) - artifact (error) , functional magnetic resonance imaging , artificial intelligence , computer science , voxel , pattern recognition (psychology) , signal (programming language) , motion (physics) , task (project management) , speech recognition , computer vision , neuroscience , psychology , management , economics , programming language
Task‐correlated motion artifacts that occur during functional magnetic resonance imaging can be mistaken for brain activity. In this work, a new selective detrending method for reducing artifacts associated with task‐correlated motion (TCM) during speech in event‐related functional magnetic resonance imaging is introduced and demonstrated in an overt word generation paradigm. The performance of this new method is compared with that of three existing methods for reducing artifacts due to TCM: (1) motion parameter regression, (2) ignoring images acquired during speech, and (3) detrending TCM‐related signal components (deduced from artifact‐corrupted voxels) from the time course datasets. The selective detrending method outperforms the other three methods in reducing TCM artifacts and in retaining blood oxygenation level‐dependent signal. Hum Brain Mapp 2009. © 2008 Wiley‐Liss, Inc.
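For orientation, the sketch below illustrates the first comparison method named in the abstract, motion parameter regression, in which rigid‐body motion estimates from image realignment are regressed out of each voxel time series by ordinary least squares. This is not the selective detrending method the paper introduces; the function name, array shapes, and synthetic data are illustrative assumptions only.

```python
# Minimal sketch of motion parameter regression (comparison method 1),
# NOT the paper's selective detrending method. Names/shapes are assumptions.
import numpy as np

def regress_out_motion(voxel_ts, motion_params):
    """voxel_ts: (T,) fMRI time series for one voxel.
    motion_params: (T, 6) rigid-body motion estimates from realignment."""
    T = voxel_ts.shape[0]
    # Design matrix: intercept plus demeaned motion regressors.
    X = np.column_stack([np.ones(T),
                         motion_params - motion_params.mean(axis=0)])
    beta, *_ = np.linalg.lstsq(X, voxel_ts, rcond=None)
    fitted_motion = X[:, 1:] @ beta[1:]   # motion-related component only
    return voxel_ts - fitted_motion       # residuals keep the series mean

# Example with synthetic data (illustrative only)
rng = np.random.default_rng(0)
ts = rng.standard_normal(120) + 100.0      # one voxel, 120 volumes
mp = rng.standard_normal((120, 6)) * 0.1   # six motion parameters
cleaned = regress_out_motion(ts, mp)
```

A known limitation of this baseline, and part of the motivation for alternatives such as selective detrending, is that when motion is correlated with the task, regressing out motion parameters can also remove genuine task‐related signal.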
