Training Method and Device of Chemical Industry Chinese Language Model Based on Knowledge Distillation
Author(s) -
Wenting Li,
Shangbing Gao,
Junqiang Zhang,
Shu-Xing Guo
Publication year - 2021
Publication title -
Scientific Programming
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.269
H-Index - 36
eISSN - 1875-919X
pISSN - 1058-9244
DOI - 10.1155/2021/5753693
Subject(s) - computer science , language model , layer (electronics) , distillation , transformer , artificial intelligence , domain knowledge , domain (mathematical analysis) , feature (linguistics) , natural language processing , natural language , machine learning , engineering , mathematical analysis , linguistics , chemistry , philosophy , mathematics , organic chemistry , voltage , electrical engineering
Recent advances in pretrained language models have achieved state-of-the-art results on a variety of natural language processing tasks. However, these huge pretrained language models are difficult to deploy in practical settings such as mobile and embedded devices. Moreover, no pretrained language model exists for the chemical industry. In this work, we propose a method to pretrain a smaller language representation model for the chemical industry domain. First, a large corpus of chemical-industry texts is used for pretraining, and a nontraditional knowledge distillation technique is used to build a simplified model that learns the knowledge contained in the BERT model. By learning the embedding layer, the middle layers, and the prediction layer at different stages, the simplified model learns not only the probability distribution of the prediction layer but also the embedding and middle layers, thereby acquiring the learning ability of the BERT model. Finally, the model is applied to downstream tasks. Experiments show that, compared with current BERT distillation methods, our method makes full use of the rich feature knowledge in the middle layers of the teacher model while building a student model on a BiLSTM architecture; this effectively avoids the excessive size of traditional Transformer-based student models and improves the accuracy of the language model in the chemical domain.
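To make the three-stage distillation described above concrete, the sketch below shows one plausible formulation in PyTorch: a BiLSTM student whose embedding layer, intermediate representation, and prediction-layer distribution are each matched against a BERT teacher. This is a minimal illustration under our own assumptions (module names, projection layers, and dimensions such as the teacher's 768-dimensional hidden size are illustrative), not the authors' released implementation.

# Minimal sketch of embedding-, middle-, and prediction-layer distillation
# for a BiLSTM student and a BERT-style teacher. All names and sizes here
# are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMStudent(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, num_labels=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)
        # Linear projections so the smaller student tensors can be compared
        # with the teacher's (e.g., 768-dimensional BERT-base) tensors.
        self.emb_proj = nn.Linear(emb_dim, 768)
        self.hid_proj = nn.Linear(2 * hidden_dim, 768)

    def forward(self, input_ids):
        emb = self.embedding(input_ids)        # (batch, seq, emb_dim)
        hidden, _ = self.bilstm(emb)           # (batch, seq, 2*hidden_dim)
        logits = self.classifier(hidden.mean(dim=1))
        return emb, hidden, logits

def distillation_loss(student, student_out, teacher_emb, teacher_hidden,
                      teacher_logits, temperature=2.0):
    """Sum of embedding-layer, middle-layer, and prediction-layer losses."""
    s_emb, s_hid, s_logits = student_out
    emb_loss = F.mse_loss(student.emb_proj(s_emb), teacher_emb)
    hid_loss = F.mse_loss(student.hid_proj(s_hid), teacher_hidden)
    # Soft-target loss on the prediction layer (temperature-scaled KL).
    pred_loss = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean") * (temperature ** 2)
    return emb_loss + hid_loss + pred_loss

In a staged training schedule of the kind the abstract describes, one would optimize the embedding and middle-layer terms first and add the prediction-layer term in a later stage; the single combined loss above is only a compact way to show all three components.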