
Recurrent Neural Network based Models for Word Prediction
Author(s) -
Ms. S. Ramya*,
Dr. C. S. Kanimozhi Selvi
Publication year - 2019
Publication title -
International Journal of Recent Technology and Engineering
Language(s) - English
Resource type - Journals
ISSN - 2277-3878
DOI - 10.35940/ijrte.d5313.118419
Subject(s) - computer science , recurrent neural network , laptop , word (group theory) , artificial neural network , artificial intelligence , language model , mobile device , natural language processing , sentiment analysis , speech recognition , machine learning , world wide web , linguistics , philosophy , operating system
Globally, people spend a significant cumulative amount of time on mobile devices, laptops, tablets, desktops, and similar devices for messaging, sending emails, banking, interacting through social media, and other activities. It is necessary to reduce the time spent typing on these devices. This can be achieved when the device offers the user suggestions for what the next word might be, given the currently typed word; this also increases typing speed. In this paper, we present a comparative study of four models that address this problem: the Recurrent Neural Network (RNN), the Stacked Recurrent Neural Network, the Long Short-Term Memory network (LSTM), and the Bi-directional LSTM. Our primary goal is to identify the best of the four models for predicting the next word given the current word in the English language. Our study shows that for next-word prediction the RNN achieves 60% accuracy (40% loss), the Stacked RNN 62% accuracy (38% loss), the LSTM 64% accuracy (36% loss), and the Bi-directional LSTM 72% accuracy (28% loss).
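To illustrate the core mechanism the abstract describes, the sketch below shows the forward pass of a simple recurrent network producing a probability distribution over a toy vocabulary, from which the most likely next word is suggested. The vocabulary, dimensions, and weights here are hypothetical (the weights are random and untrained); the paper's actual models are trained on an English corpus and use LSTM or Bi-LSTM layers rather than this plain RNN cell.

```python
import numpy as np

# Toy setup: a tiny vocabulary and an untrained single-layer RNN.
# In the paper's setting, the network would be trained so that the
# output distribution ranks plausible next words highly.
vocab = ["the", "cat", "sat", "on", "mat"]
V = len(vocab)          # vocabulary size
H = 8                   # hidden-state size (assumed for illustration)

rng = np.random.default_rng(0)
Wxh = rng.normal(scale=0.1, size=(H, V))   # input-to-hidden weights
Whh = rng.normal(scale=0.1, size=(H, H))   # hidden-to-hidden weights
Why = rng.normal(scale=0.1, size=(V, H))   # hidden-to-output weights

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())                # numerically stable softmax
    return e / e.sum()

def predict_next(words):
    """Run the RNN over the typed words; return P(next word) over vocab."""
    h = np.zeros(H)
    for w in words:
        x = one_hot(vocab.index(w))
        h = np.tanh(Wxh @ x + Whh @ h)     # recurrent state update
    return softmax(Why @ h)                # distribution over vocabulary

probs = predict_next(["the", "cat"])
suggestion = vocab[int(np.argmax(probs))]  # word offered to the user
```

An LSTM replaces the `tanh` update with gated cell-state updates, and a Bi-directional LSTM additionally processes the sequence in reverse, which is consistent with the higher accuracy the paper reports for those models.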