Open Access
FFCI: A Framework for Interpretable Automatic Evaluation of Summarization
Author(s) - Fajri Koto, Jey Han Lau, Timothy Baldwin
Publication year - 2022
Publication title - Journal of Artificial Intelligence Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.79
H-Index - 123
eISSN - 1943-5037
pISSN - 1076-9757
DOI - 10.1613/jair.1.13167
Subject(s) - automatic summarization, computer science, consistency (knowledge bases), natural language processing, coherence, precision and recall, sentence, artificial intelligence, information retrieval, fluency, semantic similarity, linguistics, mathematics, statistics
In this paper, we propose FFCI, a framework for fine-grained summarization evaluation that comprises four elements: faithfulness (degree of factual consistency with the source), focus (precision of summary content relative to the reference), coverage (recall of summary content relative to the reference), and inter-sentential coherence (document fluency between adjacent sentences). We construct a novel dataset for focus, coverage, and inter-sentential coherence, and develop automatic methods for evaluating each of the four dimensions of FFCI based on cross-comparison of evaluation metrics and model-based evaluation methods, including question answering (QA) approaches, semantic textual similarity (STS), next-sentence prediction (NSP), and scores derived from 19 pre-trained language models. We then apply the developed metrics in evaluating a broad range of summarization models across two datasets, with some surprising findings.
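As the abstract notes, focus and coverage are defined as the precision and recall of summary content against the reference. A minimal illustration of that precision/recall framing, using simple unigram overlap rather than the STS- and QA-based scorers the paper actually evaluates (the function name and tokenization here are illustrative assumptions, not the authors' implementation):

```python
from collections import Counter

def overlap_scores(summary: str, reference: str):
    """Toy unigram-overlap scorer: 'focus' is the precision of summary
    tokens against the reference; 'coverage' is the recall. Stands in
    for the model-based metrics studied in the paper."""
    sum_tokens = Counter(summary.lower().split())
    ref_tokens = Counter(reference.lower().split())
    # Counter intersection takes the minimum count of each shared token.
    overlap = sum((sum_tokens & ref_tokens).values())
    focus = overlap / max(sum(sum_tokens.values()), 1)     # precision
    coverage = overlap / max(sum(ref_tokens.values()), 1)  # recall
    return focus, coverage

focus, coverage = overlap_scores(
    "the cat sat on the mat",
    "the cat sat on the mat yesterday afternoon",
)
# Every summary token appears in the reference (focus = 1.0),
# but the reference is only partly covered (coverage = 0.75).
```

In the same spirit, faithfulness would be scored against the source document rather than the reference, and inter-sentential coherence over adjacent sentence pairs (e.g. via next-sentence prediction), per the framework's definitions.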
