
Multi-task continuous learning model
Author(s) -
Zhengbing Guo,
Meng Wang
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1873/1/012093
Subject(s) - forgetting , computer science , bottleneck , task (project management) , routing (electronic design automation) , artificial intelligence , salient , cluster analysis , machine learning
In continual learning, previously learned knowledge tends to be overwritten by subsequent training tasks. This bottleneck, known as catastrophic forgetting (CF), has recently been alleviated for sequences of simple vision tasks. Nevertheless, the challenge remains in the continual classification of sequential sets that differ by global transformations, such as large spatial rotations. To address this, a novel dynamic memory routing strategy is proposed to govern the forward paths of a capsule network (CapsNet) according to the current input set. To recall previous knowledge, a binary routing table is maintained across the sequential tasks. An incremental competitive prototype clustering procedure is then integrated to update the routing of the current task. Moreover, a sparsity measure is employed to decouple the salient routings of the different learned tasks. Experimental results demonstrate the superiority of the proposed memory network over state-of-the-art approaches in recall evaluations on extended versions of SVHN, CIFAR-100, CelebA, and other datasets.
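The abstract's core idea of a per-task binary routing table with competitive, sparsity-constrained route selection can be illustrated with a minimal sketch. This is not the paper's implementation; all class and method names (`BinaryRoutingTable`, `register_task`, `route`) and the specific selection rule are illustrative assumptions.

```python
import numpy as np

class BinaryRoutingTable:
    """Hypothetical sketch: each task owns a sparse 0/1 mask over candidate
    capsule routes, so an old task's forward path can be recalled later
    without interference from routes claimed by newer tasks."""

    def __init__(self, num_routes, sparsity=0.25):
        self.num_routes = num_routes
        self.sparsity = sparsity   # fraction of routes one task may claim
        self.tables = {}           # task_id -> binary mask of shape (num_routes,)

    def register_task(self, task_id, route_scores):
        """Competitively keep only the most salient routes for this task,
        penalizing routes already claimed by earlier tasks (decoupling)."""
        scores = np.asarray(route_scores, dtype=float)
        used = np.zeros(self.num_routes)
        for mask in self.tables.values():
            used = np.maximum(used, mask)
        # push already-used routes to the bottom of the ranking
        scores = scores - used * (scores.max() + 1.0)
        k = max(1, int(self.sparsity * self.num_routes))
        top = np.argsort(scores)[-k:]
        mask = np.zeros(self.num_routes)
        mask[top] = 1.0
        self.tables[task_id] = mask
        return mask

    def route(self, task_id, coupling_logits):
        """Gate coupling logits with the stored task mask, then normalize
        with a softmax over the surviving routes."""
        mask = self.tables[task_id]
        gated = np.where(mask > 0, np.asarray(coupling_logits, float), -1e9)
        e = np.exp(gated - gated.max())
        return e / e.sum()
```

Registering two tasks with the same saliency scores yields disjoint route masks, and `route` returns a probability distribution supported only on the current task's routes, which is the qualitative behavior the abstract attributes to the routing table.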