A general metric and parallel framework for adaptive image fusion in clusters
Author(s) -
Wei Jingbo,
Liu Dingsheng,
Wang Lizhe
Publication year - 2014
Publication title -
Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.3037
Subject(s) - image fusion , distortion , computer science , metric , artificial intelligence , image (mathematics) , acceleration , algorithm , pattern recognition , data mining , parallel computing
SUMMARY This article addresses techniques and theories for automating image fusion, focusing on two issues: parameter setting and quality assessment. Optimal parameters are needed both for specific applications and for fair comparison between fusion methods, because different parameter values produce fusion results that vary over a wide range. In this paper, we propose a general framework of online parameter training that searches for the optimal values best suited to the input images. We further accelerate the compute‐intensive training process through parallelization, a genetic algorithm, and patch extraction. We also propose a metric, spatial and spectral distortion, as the learning target: it is a fuzzy combination of mean potential energy, which measures spatial distortion, and Q4, which measures spectral distortion. Validation of the optimization on weighted Gram–Schmidt fusion showed linear or superlinear speedup, demonstrating that the proposed learning framework reduces the training time of image fusion to an acceptable level and can therefore be applied on high‐performance platforms to process large volumes of data. Copyright © 2013 John Wiley & Sons, Ltd.
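The training loop described in the abstract, a genetic search over fusion parameters that minimizes a combined spatial/spectral distortion score, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the distortion functions below are hypothetical stand-ins (the paper's mean potential energy and Q4 metrics are not reproduced here), and all names and constants are assumptions for the demo.

```python
import random

# Hypothetical stand-in metrics. The paper combines mean potential energy
# (spatial distortion) and Q4 (spectral distortion); here each candidate
# parameter vector is simply scored by its distance from an assumed
# optimum so that the structure of the search loop is visible.
TRUE_OPT = [0.4, 0.7]  # assumed best fusion parameters for this demo

def spatial_distortion(params):
    return abs(params[0] - TRUE_OPT[0])

def spectral_distortion(params):
    return abs(params[1] - TRUE_OPT[1])

def combined_distortion(params, alpha=0.5):
    # Weighted (fuzzy-style) combination of the two distortion terms.
    return alpha * spatial_distortion(params) + (1 - alpha) * spectral_distortion(params)

def genetic_search(pop_size=30, generations=50, seed=0):
    """Elitist genetic search minimizing the combined distortion."""
    rng = random.Random(seed)
    pop = [[rng.random(), rng.random()] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=combined_distortion)
        elite = pop[: pop_size // 2]  # selection: keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # crossover: average
            i = rng.randrange(len(child))
            child[i] += rng.gauss(0, 0.05)               # mutation: small jitter
            children.append([min(1.0, max(0.0, v)) for v in child])
        pop = elite + children
    return min(pop, key=combined_distortion)

best = genetic_search()
print("best parameters:", best)
print("combined distortion:", combined_distortion(best))
```

In the paper's framework the fitness evaluations, each a full fusion of image patches followed by metric computation, are the expensive step, which is why they parallelize the population evaluation across cluster nodes; the serial loop above only shows the search logic itself.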