A scaled‐down neural conversational model for chatbots
Author(s) - Saurabh Mathur, Daphne Lopez
Publication year - 2018
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.4761
Subject(s) - conversation, computer science, encode, vocabulary, field (mathematics), artificial intelligence, quality (philosophy), natural language processing, machine learning, speech recognition, linguistics, communication, psychology, biochemistry, chemistry, philosophy, mathematics, epistemology, pure mathematics, gene
Summary - Deep learning has revolutionized the field of conversation modeling. Much of the research has been directed toward making conversational agents more human-like, which in turn drives up model size. Bigger models require more data and are costly to build and maintain, yet for many tasks high-quality responses are not necessary. This paper proposes a model that consumes fewer resources, along with a way to augment conversation data without increasing the size of the vocabulary. The proposed model uses a modified version of the GRU instead of the LSTM to encode and decode sequences of text.
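
The abstract does not describe the paper's specific GRU modification, so the following is only a minimal sketch of the general architecture it refers to: a GRU-based encoder-decoder (sequence-to-sequence) model. The use of PyTorch, the class name Seq2Seq, and all dimensions are illustrative assumptions, not the authors' implementation.

    # Minimal GRU encoder-decoder sketch (illustrative only; the paper's
    # modified GRU is not specified in this abstract, so standard nn.GRU
    # is used here, and all sizes are assumed values).
    import torch
    import torch.nn as nn

    class Seq2Seq(nn.Module):
        def __init__(self, vocab_size=8000, embed_dim=128, hidden_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, src, tgt):
            # Encode the input utterance into a final hidden state.
            _, h = self.encoder(self.embed(src))
            # Decode the response conditioned on that state (teacher forcing).
            dec_out, _ = self.decoder(self.embed(tgt), h)
            return self.out(dec_out)  # per-step logits over the vocabulary

    model = Seq2Seq()
    src = torch.randint(0, 8000, (2, 10))  # batch of 2 input sequences
    tgt = torch.randint(0, 8000, (2, 12))  # corresponding target sequences
    logits = model(src, tgt)               # shape: (2, 12, 8000)

Swapping the LSTM for a GRU reduces the recurrent parameter count (three weight matrices per layer instead of four), which is consistent with the paper's stated goal of a smaller, cheaper model.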
