Open Access
Abstractive Summarization of Document using Dual Encoding Framework
Author(s) -
Monika H. Rajput,
B. R. Mandre
Publication year - 2020
Publication title - International Journal of Computer Applications
Language(s) - English
Resource type - Journals
ISSN - 0975-8887
DOI - 10.5120/ijca2020920544
Subject(s) - automatic summarization, computer science, encoding, information retrieval, natural language processing, artificial intelligence, linguistics
The popularity of the web is increasing day by day, and social media has become a huge source of information. Analyzing this enormous volume of information quickly is difficult. Text summarization addresses this problem: it condenses text so that repeated content is removed and the important information is extracted and presented concisely, helping readers grasp it instantly. Summarizing all of this information manually is impossible, as it consists of a huge amount of unstructured text and reviews, and manual summarization is a tedious, monotonous, and time-consuming task. A method is therefore needed to mine and summarize information and reviews and to produce representative summaries. To address this problem, an abstractive summarization of documents using an encoder-decoder based approach is proposed. Abstractive text summarization captures the most essential content of a text corpus, compresses it into a shorter text, and preserves its original meaning as well as its semantic and grammatical correctness. For this, it uses a deep learning architecture for natural language processing: recurrent neural networks connect the input and output in an encoder-decoder architecture, with an added attention mechanism for better results. The proposed work is evaluated on two datasets, CNN/DailyMail and DUC 2004. Notably, the model achieves better performance than existing models, improving the ROUGE-1 score to 41.75 on CNN/DailyMail and 35.12 on DUC 2004. The experimental results show that the model produces highly coherent, concise, and grammatically correct summaries.

General Terms: Natural Language Processing, Encoder-Decoder, Recurrent Neural Network, Summarization, Abstractive Summarization, Deep Learning
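Since the reported results rely on the ROUGE-1 metric, which measures clipped unigram overlap between a candidate summary and a reference summary, a minimal illustrative implementation is sketched below. The function name and the toy sentences are invented for illustration and are not taken from the paper.

```python
from collections import Counter

def rouge_1(reference: str, candidate: str):
    """Compute ROUGE-1 precision, recall, and F1.

    ROUGE-1 counts unigrams shared by the candidate and reference,
    with each unigram's contribution clipped by its reference count.
    """
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Counter intersection keeps the minimum count per token (the clip).
    overlap = sum((ref_counts & cand_counts).values())
    precision = overlap / max(sum(cand_counts.values()), 1)
    recall = overlap / max(sum(ref_counts.values()), 1)
    f1 = 0.0 if precision + recall == 0 else (
        2 * precision * recall / (precision + recall))
    return precision, recall, f1

# Toy example: 5 of 6 candidate unigrams also appear in the reference.
p, r, f = rouge_1("the cat sat on the mat", "the cat lay on the mat")
```

Here both precision and recall are 5/6 (only "sat"/"lay" differ), so F1 is also 5/6. Production evaluations typically add stemming and multi-reference handling, which this sketch omits.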
