Open Access
Multi-turn Dialogue Generation Using Self-attention and Nonnegative Matrix Factorization
Author(s) - Chen Hu, Neng Wan, Songtao Cai, Guangping Zeng
Publication year - 2021
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1924/1/012028
Subject(s) - nonnegative matrix factorization, matrix decomposition, computer science, artificial intelligence, machine learning, natural language processing, generative model, mathematics
Recently, neural generative models have shown significant potential in human-computer interaction, especially in dialogue systems. In this paper, we propose a new model for multi-turn dialogue generation that uses a self-attention mechanism to extract relevant information from the dialogue history and nonnegative matrix factorization (NMF) to learn topic vectors from an external corpus. The generated response is conditioned not only on the dialogue context but also on the corresponding topic vectors. We evaluate the model on a public dataset, and the experimental results demonstrate that it generates more diverse and context-sensitive responses.
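
The abstract names two components: NMF over an external corpus to obtain topic vectors, and self-attention over the dialogue history, with the response conditioned on both. Below is a minimal Python sketch of that pairing, assuming scikit-learn for NMF and PyTorch for attention; the module and names (TopicAwareEncoder, topic_proj, d_model, the toy corpus) are illustrative assumptions, not the authors' architecture.

# Hedged sketch: NMF topic vectors + self-attention over dialogue history.
# All names below are illustrative; the paper's exact design is not shown here.
import torch
import torch.nn as nn
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

# --- Topic vectors via NMF on an external corpus (term-document factorization) ---
corpus = ["weather is nice today", "deep learning for dialogue systems",
          "dialogue generation with topic models", "it may rain tomorrow"]
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(corpus)            # shape: (n_docs, n_terms), nonnegative
nmf = NMF(n_components=2, init="nndsvda", random_state=0)
doc_topics = nmf.fit_transform(X)          # W: per-document topic weights
topic_terms = nmf.components_              # H: per-topic term weights

# --- Self-attention over dialogue history, biased by a topic vector ---
class TopicAwareEncoder(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_topics=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.topic_proj = nn.Linear(n_topics, d_model)  # inject topic weights

    def forward(self, history, topic_vec):
        # history: (batch, turns, d_model); topic_vec: (batch, n_topics)
        ctx, _ = self.attn(history, history, history)   # self-attention over turns
        topic = self.topic_proj(topic_vec).unsqueeze(1) # (batch, 1, d_model)
        return ctx + topic   # context representation a decoder would attend to

enc = TopicAwareEncoder()
history = torch.randn(1, 5, 64)                          # 5 encoded dialogue turns
topic_vec = torch.tensor(doc_topics[1:2], dtype=torch.float32)
print(enc(history, topic_vec).shape)                     # torch.Size([1, 5, 64])

Adding the projected topic vector to the attended context is one simple way to make the decoder's output depend on both the dialogue history and the topic, matching the conditioning the abstract describes.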
