Open Access
Artificial Intelligence Registration of Image Series Based on Multiple Features
Author(s) - Zhixin Li, Degang Kong, Yongchun Zheng
Publication year - 2022
Publication title - Traitement du Signal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.279
H-Index - 11
eISSN - 1958-5608
pISSN - 0765-0019
DOI - 10.18280/ts.390122
Subject(s) - artificial intelligence, computer science, computer vision, image registration, image fusion, feature detection (computer vision), convolutional neural network, pattern recognition, image processing
Multi-source image series vary in quality. To fuse the feature information of multi-source image series, the relevant registration and fusion techniques must be explored in depth. Existing image registration and fusion techniques lack a unified multi-feature algorithm framework and fail to achieve real-time, accurate registration. To solve these problems, this paper investigates the artificial intelligence (AI) registration of image series based on multiple features. First, the Harris corner detector was selected to extract corners from the multi-source image series, and its workflow was explained and improved. Next, the deep convolutional neural network (DCNN) VGG16 was modified to extract features from the multi-source image series. Finally, a spatial transformer network was adopted to pre-register the image series, and the series was deformed and restored with region-constrained moving least squares (MLS). Experiments confirm the effectiveness of the proposed registration algorithm. The sketches below illustrate, under stated assumptions, what each stage of such a pipeline might look like in code.
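Corner extraction of this kind is commonly implemented with OpenCV's Harris detector. The following is a minimal sketch, assuming grayscale input; the block size, Sobel aperture, Harris k, and the 0.01 relative threshold are illustrative defaults rather than values from the paper, and the filename is hypothetical.

```python
import cv2
import numpy as np

def harris_corners(gray, block_size=2, ksize=3, k=0.04, rel_thresh=0.01):
    """Return (row, col) coordinates of Harris corners in a grayscale image."""
    response = cv2.cornerHarris(np.float32(gray), block_size, ksize, k)
    # Keep only responses that are strong relative to the global maximum.
    return np.argwhere(response > rel_thresh * response.max())

gray = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
corners = harris_corners(gray)
print(f"{len(corners)} corner pixels detected")
```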
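The abstract does not detail how VGG16 is modified; as a stand-in, the sketch below (assuming a recent PyTorch/torchvision) truncates a pretrained VGG16 at an intermediate convolutional stage and uses its activations as image features.

```python
import torch
from torchvision import models, transforms

# Truncate pretrained VGG16 after the third conv block (an illustrative cut).
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16].eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(pil_image):
    """Return a (1, C, H, W) feature map for one image."""
    return vgg(preprocess(pil_image).unsqueeze(0))
```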
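Pre-registration with a spatial transformer network typically means regressing a global transform from the image pair and resampling the moving image. A minimal affine variant is sketched below; the localization-network layer sizes are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AffinePreRegistration(nn.Module):
    """Regress a 2x3 affine matrix from a (moving, fixed) pair and warp."""
    def __init__(self):
        super().__init__()
        self.localization = nn.Sequential(
            nn.Conv2d(2, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.fc = nn.Linear(32 * 4 * 4, 6)
        # Start at the identity transform so early training is stable.
        nn.init.zeros_(self.fc.weight)
        self.fc.bias.data.copy_(torch.tensor([1., 0., 0., 0., 1., 0.]))

    def forward(self, moving, fixed):
        x = torch.cat([moving, fixed], dim=1)                 # (N, 2, H, W)
        theta = self.fc(self.localization(x).flatten(1)).view(-1, 2, 3)
        grid = F.affine_grid(theta, moving.size(), align_corners=False)
        return F.grid_sample(moving, grid, align_corners=False)
```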
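For the final deformation step, the classic affine moving-least-squares warp (Schaefer et al.) is sketched below; the paper's region constraint is omitted, so this is only the unconstrained baseline. Here p and q are matched control points before and after deformation.

```python
import numpy as np

def mls_affine_warp(points, p, q, alpha=1.0, eps=1e-8):
    """Warp query `points` (M, 2) given control points p -> q (each K, 2)."""
    out = np.empty_like(points, dtype=float)
    for i, v in enumerate(points):
        w = 1.0 / (np.sum((p - v) ** 2, axis=1) ** alpha + eps)  # distance weights
        p_star = (w[:, None] * p).sum(0) / w.sum()  # weighted centroids
        q_star = (w[:, None] * q).sum(0) / w.sum()
        ph, qh = p - p_star, q - q_star
        # Affine matrix minimizing the weighted least-squares fitting error.
        M = np.linalg.solve(ph.T @ (w[:, None] * ph), ph.T @ (w[:, None] * qh))
        out[i] = (v - p_star) @ M + q_star
    return out
```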
