Open Access
Research on Fabric Image Retrieval Method Based on Multi-feature Layered Fusion
Author(s) -
Yunrong Ji,
Weidong Wang,
Yamin Lv,
Weirun Zhou
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1549/5/052038
Subject(s) - clothing , computer science , the internet , image retrieval , feature (linguistics) , database transaction , variety (cybernetics) , maturity (psychological) , image (mathematics) , information retrieval , artificial intelligence , computer vision , world wide web , database , psychology , developmental psychology , linguistics , philosophy , archaeology , history
In recent years, with the maturation of computer technology and the rapid development of the Internet, online transactions have become an important and popular sales channel. As a necessity of life, clothing accounts for a considerable proportion of online transactions. Whether clothing manufacturers are buying fabrics or customers are buying garments online, both rely on fabric images on the Internet to browse, compare, and select satisfactory products. Such manual comparison is both time-consuming and error-prone, so establishing a fabric image retrieval system is essential for managing and using fabric data effectively. Fabrics come in many varieties and styles, and fabric images themselves carry large amounts of data and information, so traditional retrieval methods cannot retrieve fabric images quickly and accurately. This paper proposes a retrieval method based on multi-feature fusion that can accurately characterize fabric images. Experiments show that retrieving fabric images with this method achieves good results.
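The abstract describes the approach only at a high level and the full text is not shown on this page, so the following is a minimal sketch of a generic multi-feature fusion retrieval pipeline, not the paper's actual algorithm: a joint RGB color histogram and a simple gradient-magnitude texture histogram are fused by weighted concatenation, and database images are ranked by cosine similarity to the query. The feature choices, the weights, and the helper names are all illustrative assumptions.

```python
import numpy as np

def color_histogram(img, bins=8):
    """Global color feature: joint RGB histogram, L1-normalized.
    img is an HxWx3 uint8 array. (Assumed descriptor, not the paper's.)"""
    h, _ = np.histogramdd(
        img.reshape(-1, 3),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    h = h.ravel().astype(np.float64)
    return h / (h.sum() + 1e-12)

def texture_histogram(img, bins=16):
    """Crude texture feature: histogram of gradient magnitudes on the
    gray-scale image (a stand-in for whatever texture descriptor the
    paper actually uses)."""
    gray = img.astype(np.float64).mean(axis=2)
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy)
    h, _ = np.histogram(mag, bins=bins, range=(0.0, mag.max() + 1e-12))
    h = h.astype(np.float64)
    return h / (h.sum() + 1e-12)

def fused_feature(img, w_color=0.6, w_texture=0.4):
    """Multi-feature fusion by weighted concatenation; the weights are
    illustrative guesses, not values from the paper."""
    return np.concatenate([w_color * color_histogram(img),
                           w_texture * texture_histogram(img)])

def retrieve(query_img, database, top_k=5):
    """Rank database images by cosine similarity to the query feature.
    database is a list of (name, image) pairs."""
    q = fused_feature(query_img)
    results = []
    for name, img in database:
        f = fused_feature(img)
        sim = float(q @ f / (np.linalg.norm(q) * np.linalg.norm(f) + 1e-12))
        results.append((sim, name))
    return sorted(results, reverse=True)[:top_k]
```

Weighted concatenation is the simplest fusion scheme; a layered variant, as the paper's title suggests, might instead shortlist candidates with one feature type (e.g., color) and then re-rank the shortlist with another (e.g., texture), but without the full text that structure is only a guess.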
