Phosphene Object Perception Employs Holistic Processing During Early Visual Processing Stage
Author(s) - Guo Hong, Yang Yuan, Gu Guan, Zhu Yisheng, Qiu Yihong
Publication year - 2013
Publication title - Artificial Organs
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.684
H-Index - 76
eISSN - 1525-1594
pISSN - 0160-564X
DOI - 10.1111/aor.12005
Subject(s) - phosphene , visual processing , perception , visual prosthesis , artificial intelligence , computer science , visual perception , computer vision , image processing , feature (linguistics) , psychology , pattern recognition (psychology) , neuroscience , image (mathematics) , linguistics , philosophy , stimulation , transcranial magnetic stimulation
Psychophysical studies have verified the possibility of restoring visual ability through low‐resolution image representations, that is, phosphene‐based representations. Our previous study found that early visual processing of phosphene patterns is configuration based. This study further investigated the configural processing mechanisms of prosthetic vision by analyzing the event‐related potential components (P1 and N170) in response to phosphene face and non‐face stimuli. The results reveal that coarse processing of phosphenes involves phosphene‐specific holistic processing that recovers the separated phosphenes into a gestalt; low‐level feature processing of phosphenes is also enhanced relative to normal stimuli because of the additional contrast borders introduced by the phosphenes; fine processing of phosphene stimuli, however, is impaired, as reflected by a reduced N170 amplitude, owing to the degraded detailed features of the low‐resolution representations. We therefore suggest that, when designing a visual prosthesis system, strategies that facilitate the phosphene‐specific holistic processing stage should be considered in order to improve performance.
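As an illustration of the phosphene‐based low‐resolution representation the abstract refers to, the following minimal Python sketch downsamples a grayscale image onto a coarse grid and renders each sample as a Gaussian phosphene dot. The grid size, dot spread, and number of gray levels are illustrative assumptions for a generic simulated‐prosthetic‐vision rendering, not parameters taken from this study.

import numpy as np


def phosphene_render(image, grid=24, sigma_frac=0.35, levels=8):
    """Render a 2-D grayscale image (uint8, values 0-255, larger than the grid)
    as a grid x grid array of Gaussian phosphene dots on a dark background.
    Grid size, dot spread, and gray-level count are illustrative assumptions."""
    h, w = image.shape
    cell_h, cell_w = h / grid, w / grid

    # Mean intensity per grid cell, quantized to a few gray levels
    # (simulated phosphenes typically carry limited brightness resolution).
    samples = np.zeros((grid, grid))
    for i in range(grid):
        for j in range(grid):
            block = image[int(i * cell_h):int((i + 1) * cell_h),
                          int(j * cell_w):int((j + 1) * cell_w)]
            samples[i, j] = np.round(block.mean() / 255 * (levels - 1)) / (levels - 1)

    # Render each nonzero sample as a Gaussian blob centered in its cell,
    # so separated dots approximate the original shape at coarse resolution.
    out = np.zeros((h, w))
    yy, xx = np.mgrid[0:h, 0:w]
    sigma = sigma_frac * min(cell_h, cell_w)
    for i in range(grid):
        for j in range(grid):
            if samples[i, j] == 0:
                continue
            cy, cx = (i + 0.5) * cell_h, (j + 0.5) * cell_w
            blob = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
            out += samples[i, j] * blob

    return np.clip(out, 0.0, 1.0)

Such a rendering preserves the overall configuration of a face while discarding fine detail, which is consistent with the abstract's account of holistic processing being preserved (or enhanced) while fine, detail-based processing is impaired.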
