Hierarchical attention model for personalized tag recommendation
Author(s) - Sun Jianshan, Zhu Mingyue, Jiang Yuanchun, Liu Yezheng, Wu Le
Publication year - 2021
Publication title - Journal of the Association for Information Science and Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.903
H-Index - 145
eISSN - 2330-1643
pISSN - 2330-1635
DOI - 10.1002/asi.24400
Subject(s) - computer science, information retrieval, recommender systems, data mining, artificial intelligence, machine learning
Abstract - With the development of Web-based social networks, many personalized tag recommendation approaches that draw on multiple kinds of information have been proposed. Because users' preferences differ, different users attend to different kinds of information. At the same time, the elements within each kind of information are not equally informative about a user's tagging behavior. In this context, effectively integrating the elements within each kind of information, and then the different kinds of information themselves, becomes a key part of tag recommendation, yet existing methods overlook it. To address this problem, we propose a deep neural network for tag recommendation that captures these two attentive aspects with a hierarchical attention model. For each user-item pair, the bottom-layer attention network models the influence of individual elements on the feature representation of each kind of information, while the top-layer attention network models the attentive scores of the different kinds of information. To verify the effectiveness of the proposed method, we conduct extensive experiments on two real-world data sets. The results show that using attention networks and multiple kinds of information significantly improves recommendation performance, confirming the effectiveness and superiority of the proposed model.
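
The abstract only outlines the two-level attention structure. The following is a minimal PyTorch sketch of an architecture of that general shape, for illustration only: the class names, the additive (tanh-based) scoring functions, and the way the user-item pair conditions the top layer are assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ElementAttention(nn.Module):
    """Bottom-layer attention: weights the elements within one kind of
    information (e.g., words of an item description, a user's historical
    tags) and pools them into that information's feature representation."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.score = nn.Linear(dim, 1, bias=False)

    def forward(self, elements: torch.Tensor) -> torch.Tensor:
        # elements: (batch, num_elements, dim)
        weights = F.softmax(self.score(torch.tanh(self.proj(elements))), dim=1)
        return (weights * elements).sum(dim=1)          # (batch, dim)


class HierarchicalAttention(nn.Module):
    """Top-layer attention: weights the per-information representations
    produced by the bottom layer, conditioned on the user-item pair."""

    def __init__(self, dim: int, num_info: int):
        super().__init__()
        self.element_attn = nn.ModuleList(
            [ElementAttention(dim) for _ in range(num_info)]
        )
        self.context = nn.Linear(2 * dim, dim)          # user-item conditioning (assumed)
        self.score = nn.Linear(dim, 1, bias=False)

    def forward(self, info_elements, user_emb, item_emb):
        # info_elements: list of (batch, n_i, dim) tensors, one per kind of information
        reps = torch.stack(
            [attn(x) for attn, x in zip(self.element_attn, info_elements)], dim=1
        )                                               # (batch, num_info, dim)
        ctx = torch.tanh(self.context(torch.cat([user_emb, item_emb], dim=-1)))
        weights = F.softmax(self.score(torch.tanh(reps + ctx.unsqueeze(1))), dim=1)
        return (weights * reps).sum(dim=1)              # fused representation, (batch, dim)
```

In a tag-recommendation setting, the fused vector would then be scored against candidate tag embeddings (for example, logits = fused @ tag_emb.T) and trained with a ranking or cross-entropy objective; the specific loss and the kinds of information used are determined by the paper itself and are not reproduced here.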