Open Access
Estimating Critical Stimulus Features from Psychophysical Data: The Decision-Image Technique Applied to Human Faces
Author(s) -
Jakob H. Macke,
Felix A. Wichmann
Publication year - 2010
Publication title -
Journal of Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.126
H-Index - 113
ISSN - 1534-7362
DOI - 10.1167/9.8.31
Subject(s) - computer science, artificial intelligence, stimulus (psychology), perception, pattern recognition (psychology), sensory system, embedding, psychophysics, machine learning, computer vision, psychology, cognitive psychology, neuroscience
One of the main challenges in the sensory sciences is to identify the stimulus features on which sensory systems base their computations: they are a prerequisite for computational models of perception. We describe a technique, decision-images, for extracting critical stimulus features based on logistic regression. Rather than embedding the stimuli in noise, as is done in classification-image analysis, we infer the important features directly from physically heterogeneous stimuli. A decision-image not only defines the critical region-of-interest within a stimulus but is a quantitative template that defines a direction in stimulus space. Decision-images thus enable the development of predictive models as well as the generation of optimized stimuli for subsequent psychophysical investigations. Here we describe our method and apply it to data from a human face discrimination experiment. We show that decision-images predict human responses not only in terms of overall percent correct but also, for individual observers, in terms of the probabilities with which individual faces are (mis-)classified. We then test the predictions of the models using optimized stimuli. Finally, we discuss possible generalizations of the approach and its relationship to other models.
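The core idea lends itself to a short illustration. Below is a minimal sketch, assuming vectorized face stimuli and binary observer responses; the toy data, variable names, and the use of scikit-learn's LogisticRegression are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of decision-image estimation via logistic regression.
# Assumptions (not from the paper): stimuli are vectorized grayscale
# face images in X (n_trials x n_pixels), and y holds the observer's
# binary responses (e.g., 0 = "female", 1 = "male").
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, h, w = 500, 16, 16             # toy dimensions, hypothetical
X = rng.normal(size=(n_trials, h * w))   # stand-in for real face stimuli
true_template = rng.normal(size=h * w)   # hidden "ground truth" direction
p = 1.0 / (1.0 + np.exp(-0.5 * (X @ true_template)))
y = rng.binomial(1, p)                   # simulated observer responses

# Fit a regularized logistic regression; the weight vector is the
# decision-image, i.e., a direction in stimulus (pixel) space.
model = LogisticRegression(C=1.0, max_iter=1000)
model.fit(X, y)
decision_image = model.coef_.reshape(h, w)

# The fitted model yields trial-by-trial response probabilities, which
# is what allows predicting how often individual faces are
# (mis-)classified, not just overall percent correct.
probs = model.predict_proba(X)[:, 1]
```

Because the decision-image defines a direction in stimulus space, optimized stimuli for follow-up experiments could, under the same assumptions, be generated by perturbing a face along that direction (e.g., `x_new = x + alpha * decision_image.ravel()`).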
