Incremental inconsistency detection with low memory overhead
Author(s) -
Falleri Jean-Rémy,
Blanc Xavier,
Bendraou Reda,
Silva Marcos Aurélio Almeida,
Teyton Cédric
Publication year - 2014
Publication title -
Software: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.437
H-Index - 70
eISSN - 1097-024X
pISSN - 0038-0644
DOI - 10.1002/spe.2171
Subject(s) - consistency (knowledge bases), computer science, scalability, sequential consistency, overhead (engineering), consistency model, key (lock), weak consistency, scale (ratio), data mining, data consistency, artificial intelligence, distributed computing, strong consistency, computer security, database, mathematics, statistics, programming language, physics, quantum mechanics, estimator
SUMMARY Ensuring model consistency is a key concern when using a model-based development approach. Model inconsistency detection has therefore received significant attention in recent years. To be useful, inconsistency detection has to be sound, efficient, and scalable. Incremental detection is one way to achieve efficiency in the presence of large models. In most existing approaches, however, incrementality comes at the expense of memory consumption, which grows proportionally to the model size and the number of consistency rules. In this paper, we propose a new incremental inconsistency detection approach that consumes only a small, model-size-independent amount of memory. It therefore scales better to projects that use large models and many consistency rules. Copyright © 2012 John Wiley & Sons, Ltd.
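To make the memory trade-off concrete, the following is a minimal, hypothetical Python sketch of incremental checking that keeps memory independent of model size: it records only the identifiers of elements changed since the last check and re-evaluates rules on those elements, rather than caching per-element rule results. The Model class, the two rules, and all names are illustrative assumptions, not the algorithm proposed in the paper.

```python
class Model:
    """A toy model: elements are dicts with an id, a type, and attributes."""
    def __init__(self):
        self.elements = {}      # id -> element dict (the model itself)
        self.dirty = set()      # ids changed since the last check; grows with edits, not model size

    def update(self, elem_id, **attrs):
        elem = self.elements.setdefault(elem_id, {"id": elem_id})
        elem.update(attrs)
        self.dirty.add(elem_id)  # remember only what changed, never rule results


# Illustrative consistency rules: each checks one element and returns a message or None.
def rule_class_named(elem):
    if elem.get("type") == "Class" and not elem.get("name"):
        return f"Class {elem['id']} has no name"

def rule_attribute_typed(elem):
    if elem.get("type") == "Attribute" and not elem.get("attr_type"):
        return f"Attribute {elem['id']} has no type"

RULES = [rule_class_named, rule_attribute_typed]


def incremental_check(model):
    """Re-evaluate rules only on elements touched since the last check.

    Memory overhead is the dirty set plus the violations of this pass,
    independent of the total number of elements in the model.
    """
    violations = []
    for elem_id in model.dirty:
        elem = model.elements.get(elem_id)
        if elem is None:
            continue  # element was deleted; nothing left to check
        for rule in RULES:
            msg = rule(elem)
            if msg:
                violations.append(msg)
    model.dirty.clear()  # reset the change log after the pass
    return violations


if __name__ == "__main__":
    m = Model()
    m.update("c1", type="Class", name="Order")
    m.update("c2", type="Class")            # missing name -> violation
    m.update("a1", type="Attribute")        # missing type -> violation
    print(incremental_check(m))             # two violations reported
    m.update("c2", name="Customer")         # fix one issue
    print(incremental_check(m))             # only c2 re-checked; now clean
```

In a cache-based incremental checker, by contrast, the evaluation state kept per element and per rule is what makes memory grow with the model size and the number of rules; the sketch above trades that cache for re-evaluation of the affected rules on demand.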