Open Access
Investigating Back-Translation in Tibetan-Chinese Neural Machine Translation
Author(s) - Ding Liu, Yachao Li, Dachang Zhu, Xuan Li, Ning Ma, Ao Zhu
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1651/1/012122
Subject(s) - machine translation, computer science, natural language processing, artificial intelligence, artificial neural network, deep learning, machine learning, example-based machine translation, transfer-based machine translation, rule-based machine translation
In recent years, neural networks have provided new ideas for natural language processing, and neural machine translation has become the leading approach to machine translation. For low-resource languages, bilingual data are sparse, the model needs more high-quality data, and translation quality falls short of the desired effect. In this paper, experiments on attention-based neural machine translation are conducted on the Tibetan-Chinese language pair, and a transfer learning method combined with back-translation is used to alleviate the shortage of Tibetan-Chinese parallel corpora. Experimental results show that the proposed combination of transfer learning and back-translation is simple and effective: compared with traditional translation methods, translation quality is significantly improved. Analysis of the output shows that Tibetan-Chinese neural machine translations are more fluent, a clear improvement over translations produced without back-translation. At the same time, the output exhibits deficiencies common to neural machine translation, such as under-translation and low faithfulness to the source.
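To make the back-translation step concrete, here is a minimal sketch of the data-augmentation idea the abstract describes: a reverse (Chinese-to-Tibetan) model turns monolingual Chinese text into synthetic Tibetan-Chinese pairs, which are then mixed with the real parallel corpus. The function names, the mixing ratio, and the stand-in reverse model are illustrative assumptions, not details from the paper.

```python
# Sketch of back-translation data augmentation for Tibetan-Chinese NMT.
# Assumes a trained reverse (Chinese -> Tibetan) model is available behind
# `reverse_translate`; all names here are hypothetical, not from the paper.

from typing import Callable, List, Tuple

def back_translate(
    mono_zh: List[str],
    reverse_translate: Callable[[str], str],
) -> List[Tuple[str, str]]:
    """Turn monolingual Chinese sentences into synthetic (Tibetan, Chinese) pairs.

    The Tibetan side is machine-generated by the reverse model; the Chinese
    side is real text, so the forward model still trains on clean targets.
    """
    return [(reverse_translate(zh), zh) for zh in mono_zh]

def augment_corpus(
    parallel: List[Tuple[str, str]],   # real (Tibetan, Chinese) pairs
    synthetic: List[Tuple[str, str]],  # back-translated pairs
    ratio: float = 1.0,                # synthetic-to-real mixing cap
) -> List[Tuple[str, str]]:
    """Mix real and synthetic pairs, capping synthetic data by `ratio`."""
    cap = int(len(parallel) * ratio)
    return parallel + synthetic[:cap]

if __name__ == "__main__":
    # Stand-in reverse model for demonstration only.
    dummy_reverse = lambda zh: f"<synthetic Tibetan for: {zh}>"
    mono = ["今天天气很好。", "我正在学习藏语。"]
    synthetic_pairs = back_translate(mono, dummy_reverse)
    corpus = augment_corpus([("བོད་སྐད།", "藏语。")], synthetic_pairs, ratio=2.0)
    for bo, zh in corpus:
        print(bo, "|||", zh)
```

The augmented corpus would then be used to train the forward Tibetan-to-Chinese model in the usual way; the `ratio` parameter reflects the common practice of limiting how much synthetic data is mixed in, though the paper's actual mixing scheme is not specified here.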
