
Multi‐layered attentional peephole convolutional LSTM for abstractive text summarization
Author(s) -
Rahman Md. Motiur,
Siddiqui Fazlul Hasan
Publication year - 2021
Publication title -
ETRI Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.295
H-Index - 46
eISSN - 2233-7326
pISSN - 1225-6463
DOI - 10.4218/etrij.2019-0016
Subject(s) - automatic summarization, computer science, natural language processing, artificial intelligence, coherence, semantics, text generation, convolutional neural network
Abstractive text summarization is the process of producing a summary of a given text by paraphrasing its facts while keeping the meaning intact. Manual summary generation is laborious and time-consuming. We present a summary generation model based on a multilayered attentional peephole convolutional long short-term memory (LSTM) network, called MAPCoL, that generates abstractive summaries of large texts in an automated manner. We add attention to a peephole convolutional LSTM to improve the overall quality of a summary by giving weight to important parts of the source text during training. We evaluated the semantic coherence of MAPCoL on the popular CNN/Daily Mail dataset and found that MAPCoL outperformed other traditional LSTM-based models. We also found performance improvements for MAPCoL under different internal settings when compared with state-of-the-art abstractive text summarization models.
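The two mechanisms named in the abstract can be illustrated with a minimal NumPy sketch: a peephole LSTM step, in which each gate also "peeks" at the cell state, followed by dot-product attention that weights source positions by their relevance to the current decoder state. All dimensions, weight initializations, and function names here are illustrative assumptions; the paper's actual MAPCoL model is multilayered and convolutional, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def peephole_lstm_step(x, h, c, Wx, Wh, b, p_i, p_f, p_o):
    """One peephole LSTM step: each gate also sees the cell state."""
    z = (Wx @ x + Wh @ h + b).reshape(4, -1)
    i = sigmoid(z[0] + p_i * c)       # input gate peeks at previous cell state
    f = sigmoid(z[1] + p_f * c)       # forget gate peeks at previous cell state
    g = np.tanh(z[2])                 # candidate cell update
    c_new = f * c + i * g
    o = sigmoid(z[3] + p_o * c_new)   # output gate peeks at the updated cell state
    return o * np.tanh(c_new), c_new

def attend(enc_states, h):
    """Dot-product attention: weight source positions by relevance to h."""
    alpha = softmax(enc_states @ h)   # one weight per source position
    return alpha, alpha @ enc_states  # attention weights and context vector

# Toy dimensions (hypothetical, for illustration only).
d_in, d_hid, T = 5, 4, 6
Wx = rng.normal(size=(4 * d_hid, d_in))
Wh = rng.normal(size=(4 * d_hid, d_hid))
b = np.zeros(4 * d_hid)
p_i, p_f, p_o = (rng.normal(size=d_hid) for _ in range(3))

h, c = np.zeros(d_hid), np.zeros(d_hid)
enc_states = rng.normal(size=(T, d_hid))  # stand-in encoder outputs
x = rng.normal(size=d_in)                 # stand-in input embedding
h, c = peephole_lstm_step(x, h, c, Wx, Wh, b, p_i, p_f, p_o)
alpha, context = attend(enc_states, h)
```

The attention weights `alpha` sum to one, so the context vector is a convex combination of the encoder states; during training, positions important to the summary receive larger weights, which is the effect the abstract describes.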