Open Access
Can you see what you feel? Color and folding properties affect visual–tactile material discrimination of fabrics
Author(s) -
Bei Xiao,
Wenyan Bi,
Xiaodan Jia,
Hanhan Wei,
Edward H. Adelson
Publication year - 2016
Publication title -
Journal of Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.126
H-Index - 113
ISSN - 1534-7362
DOI - 10.1167/16.3.34
Subject(s) - gloss (optics), computer vision, artificial intelligence, perception, computer science, chromatic scale, matching (statistics), tactile perception, ground truth, mathematics, psychology, acoustics, materials science, physics, statistics, neuroscience, composite material, coating
Humans can often estimate tactile properties of objects from vision alone. For example, during online shopping, we can often infer material properties of clothing from images and judge how the material would feel against our skin. What visual information is important for tactile perception? Previous studies in material perception have focused on measuring surface appearance, such as gloss and roughness, and on using verbal reports of material attributes and categories. However, in real life, predicting the tactile properties of an object might not require accurate verbal descriptions of its surface attributes or categories. In this paper, we use tactile perception as ground truth to measure visual material perception. Using fabrics as our stimuli, we measure how observers match what they see (photographs of fabric samples) with what they feel (physical fabric samples). The data show a significant main effect of color: removing color significantly reduces matching accuracy, especially when the images contain 3-D folds. We also find that images of draped fabrics, which reveal 3-D shape information, yield better matching accuracy than images of flattened fabrics. The data also show a strong interaction between color and folding conditions on matching accuracy, suggesting that the visual system takes advantage of chromatic gradients to infer tactile properties in the 3-D folded conditions but not in the flattened conditions. Together, using a visual–tactile matching task, we show that humans use folding and color information in matching the visual and tactile properties of fabrics.
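The color × folding interaction described in the abstract is the kind of effect typically assessed with a two-way ANOVA on matching accuracy. The sketch below is purely illustrative: it uses simulated data and hypothetical column names (color, folding, accuracy), and is not the authors' analysis code or dataset.

```python
# Illustrative only: a two-way ANOVA testing a color x folding interaction
# on matching accuracy, using simulated (made-up) data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for color in ("color", "grayscale"):
    for folding in ("draped", "flattened"):
        # Hypothetical accuracies; colored, draped images assumed most accurate,
        # loosely following the pattern reported in the abstract.
        base = 0.75 if (color == "color" and folding == "draped") else 0.60
        for _ in range(20):  # 20 simulated observers per condition
            rows.append({
                "color": color,
                "folding": folding,
                "accuracy": float(np.clip(base + rng.normal(0, 0.08), 0, 1)),
            })
df = pd.DataFrame(rows)

# Fit a linear model with both main effects and their interaction,
# then print the type-II ANOVA table.
model = smf.ols("accuracy ~ C(color) * C(folding)", data=df).fit()
print(anova_lm(model, typ=2))
```

The C(color):C(folding) row of the resulting table is what would indicate an interaction between color and folding conditions in this kind of design.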
