
Recurrent learning with clique structures for prostate sparse‐view CT artifacts reduction
Author(s) -
Shen Tiancheng,
Yang Yibo,
Lin Zhouchen,
Zhang Mingbin
Publication year - 2021
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/ipr2.12048
Subject(s) - artificial intelligence , computer science , reduction (mathematics) , feature (linguistics) , clique , pattern recognition (psychology) , block (permutation group theory) , convolutional neural network , deep learning , iterative reconstruction , computer vision , mathematics , geometry , combinatorics , philosophy , linguistics
In recent years, convolutional neural networks have achieved great success in streak artifact reduction. However, no method has been designed specifically for artifact reduction in prostate CT. To address this, the artifact-reduction CliqueNet (ARCliqueNet) is proposed to reconstruct dense‐view computed tomography (CT) images from sparse‐view CT images. In detail, the proposed ARCliqueNet first extracts a set of feature maps from the prostate sparse‐view CT image with a Clique Block. Second, the feature maps are refined by ASPP (atrous spatial pyramid pooling) with memory. Then another Clique Block is applied to the output of ASPP with memory to reconstruct the dense‐view CT image. The reconstructed dense‐view CT image is then used as a new input to the original network, and this process is repeated recurrently, with the memory delivering information between recurrent stages. The final reconstructed dense‐view CT image is the output of the last recurrent stage. Our proposed ARCliqueNet outperforms state‐of‐the‐art (SOTA) general artifact-reduction methods on the prostate dataset in terms of PSNR (peak signal‐to‐noise ratio) and SSIM (structural similarity). Therefore, we conclude that Clique structures, ASPP with memory, and recurrent learning are useful for prostate sparse‐view CT artifact reduction.
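The recurrent pipeline the abstract describes (Clique Block → ASPP with memory → Clique Block, with the output fed back as the next stage's input) can be sketched as follows. This is a minimal illustration of the control flow only: `clique_block` and `aspp_with_memory` are hypothetical stand-ins, not the paper's actual layers, and the arithmetic inside them is placeholder smoothing rather than learned convolutions.

```python
import numpy as np

def clique_block(x):
    # Stand-in for a Clique Block (densely connected convolutional layers
    # with bidirectional refinement in CliqueNet); here a simple smoothing
    # placeholder that preserves the image shape.
    return 0.5 * x + 0.5 * np.roll(x, 1, axis=-1)

def aspp_with_memory(feat, memory):
    # Stand-in for "ASPP with memory": fuse the current feature maps with
    # the memory state carried over from the previous recurrent stage.
    fused = 0.7 * feat + 0.3 * memory
    return fused, fused  # (refined features, updated memory)

def arcliquenet(sparse_ct, num_stages=3):
    """Recurrent reconstruction loop: each stage's reconstructed
    dense-view image becomes the input of the next stage, while the
    memory passes information between stages."""
    x = sparse_ct
    memory = np.zeros_like(sparse_ct)
    for _ in range(num_stages):
        feat = clique_block(x)                          # feature extraction
        feat, memory = aspp_with_memory(feat, memory)   # refinement + memory
        x = clique_block(feat)                          # reconstruct dense-view image
    return x  # output of the last recurrent stage
```

The key design point mirrored here is that the network weights are shared across stages (one `arcliquenet` loop body), so depth comes from recurrence rather than from stacking new parameters.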