Open Access
Sequence-to-sequence neural machine translation for English-Malay
Author(s) -
Yeong Tsann Phua,
Sujata Navaratnam,
Chon-Moy Kang,
Wai-Seong Che
Publication year - 2022
Publication title -
IAES International Journal of Artificial Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.341
H-Index - 7
eISSN - 2252-8938
pISSN - 2089-4872
DOI - 10.11591/ijai.v11.i2.pp658-665
Subject(s) - machine translation , computer science , artificial intelligence , natural language processing , malay , speech recognition , language model , artificial neural network , linguistics
Machine translation aims to translate text from one language into another using computer software. In this work, we performed neural machine translation with attention on an English-Malay parallel corpus. We attempt to improve model performance by using a rectified linear unit (ReLU) activation in the attention alignment. Several sequence-to-sequence models were trained: long short-term memory (LSTM), gated recurrent unit (GRU), bidirectional LSTM (Bi-LSTM), and bidirectional GRU (Bi-GRU). In the experiment, both bidirectional models, Bi-LSTM and Bi-GRU, converged in under 30 epochs. Our study shows that the ReLU attention alignment improves the bilingual evaluation understudy (BLEU) translation score by between 0.26 and 1.12 across all models compared to the original tanh models.
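The change the abstract describes, swapping tanh for ReLU in the attention alignment, can be illustrated with a minimal NumPy sketch of Bahdanau-style additive attention. The function names, dimensions, and random initialization below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v, activation=np.tanh):
    """Bahdanau-style additive attention: score_i = v . act(W_q q + W_k k_i).

    `activation` is tanh in the original formulation; the paper's variant
    replaces it with ReLU in this alignment step.
    """
    scores = np.array([v @ activation(W_q @ query + W_k @ k) for k in keys])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()          # softmax over source positions
    context = weights @ keys          # weighted sum of encoder states
    return weights, context

def relu(x):
    return np.maximum(x, 0.0)

# Toy decoder state and encoder states (shapes are assumptions).
rng = np.random.default_rng(0)
d = 4
query = rng.standard_normal(d)        # current decoder hidden state
keys = rng.standard_normal((5, d))    # encoder hidden states
W_q = rng.standard_normal((d, d))
W_k = rng.standard_normal((d, d))
v = rng.standard_normal(d)

w_tanh, ctx_tanh = additive_attention(query, keys, W_q, W_k, v, np.tanh)
w_relu, ctx_relu = additive_attention(query, keys, W_q, W_k, v, relu)
```

Either way, the softmax yields a valid distribution over source positions; only the pre-softmax alignment scores differ between the tanh and ReLU variants.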
