Open Access
3D Densely Connected Convolution Neural Networks for Pulmonary Parenchyma Segmentation from CT Images
Author(s) - Liang Zhao
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1631/1/012049
Subject(s) - segmentation , artificial intelligence , computer science , convolution (computer science) , set (abstract data type) , test set , artificial neural network , data set , pattern recognition (psychology) , convolutional neural network , deconvolution , parenchyma , image segmentation , process (computing) , computer vision , medicine , algorithm , pathology , programming language , operating system
Lung cancer is one of the deadliest diseases in the world today, killing many people every year. Accurate segmentation of lung parenchyma from CT images is an important step in the diagnosis and treatment of lung cancer, so a fast and accurate segmentation method is needed. In traditional computer-aided diagnosis systems, segmentation of the lung parenchyma is a complex process, and the segmentation result depends on the parameters set in the preceding stage. To address these problems, we propose a 3D densely connected convolutional neural network based on deep learning. It has three densely connected blocks and three deconvolution layers. The experimental data were taken from the public LIDC-IDRI database: a total of 888 samples with slice thickness less than 2.5 mm were selected, with 708, 90 and 90 samples in the training, test and validation sets respectively. The experimental results show that our method is more accurate than 3D U-Net while requiring fewer training parameters.
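
The abstract only states that the network consists of three densely connected blocks followed by three deconvolution (transposed-convolution) layers. Below is a minimal PyTorch sketch of such an architecture, not the authors' exact model: the growth rate, number of layers per dense block, channel widths, and patch size are illustrative assumptions, and class names such as Dense3DSegNet are hypothetical.

# Minimal sketch (assumed hyperparameters, not the paper's exact configuration)
# of a 3D densely connected segmentation network: three dense blocks with
# downsampling, then three transposed convolutions back to input resolution.
import torch
import torch.nn as nn


class DenseLayer3D(nn.Module):
    """BN -> ReLU -> 3x3x3 Conv; output is concatenated with the input."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm3d(in_channels)
        self.conv = nn.Conv3d(in_channels, growth_rate, kernel_size=3, padding=1)

    def forward(self, x):
        out = self.conv(torch.relu(self.bn(x)))
        return torch.cat([x, out], dim=1)  # dense connectivity


class DenseBlock3D(nn.Module):
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        layers, channels = [], in_channels
        for _ in range(num_layers):
            layers.append(DenseLayer3D(channels, growth_rate))
            channels += growth_rate
        self.block = nn.Sequential(*layers)
        self.out_channels = channels

    def forward(self, x):
        return self.block(x)


class Dense3DSegNet(nn.Module):
    """Three dense blocks with strided downsampling, then three deconvolution
    layers that restore resolution and predict a per-voxel lung mask."""
    def __init__(self, in_channels=1, num_classes=2, growth_rate=12):
        super().__init__()
        self.stem = nn.Conv3d(in_channels, 32, kernel_size=3, padding=1)

        self.block1 = DenseBlock3D(32, growth_rate, num_layers=4)
        self.down1 = nn.Conv3d(self.block1.out_channels, 64, kernel_size=2, stride=2)

        self.block2 = DenseBlock3D(64, growth_rate, num_layers=4)
        self.down2 = nn.Conv3d(self.block2.out_channels, 128, kernel_size=2, stride=2)

        self.block3 = DenseBlock3D(128, growth_rate, num_layers=4)
        self.down3 = nn.Conv3d(self.block3.out_channels, 256, kernel_size=2, stride=2)

        # Three deconvolution (transposed-convolution) layers.
        self.up1 = nn.ConvTranspose3d(256, 128, kernel_size=2, stride=2)
        self.up2 = nn.ConvTranspose3d(128, 64, kernel_size=2, stride=2)
        self.up3 = nn.ConvTranspose3d(64, 32, kernel_size=2, stride=2)

        self.head = nn.Conv3d(32, num_classes, kernel_size=1)

    def forward(self, x):
        x = self.stem(x)
        x = self.down1(self.block1(x))
        x = self.down2(self.block2(x))
        x = self.down3(self.block3(x))
        x = torch.relu(self.up1(x))
        x = torch.relu(self.up2(x))
        x = torch.relu(self.up3(x))
        return self.head(x)  # raw logits; apply softmax for probabilities


if __name__ == "__main__":
    # A single-channel CT sub-volume, e.g. 64x64x64 voxels.
    volume = torch.randn(1, 1, 64, 64, 64)
    logits = Dense3DSegNet()(volume)
    print(logits.shape)  # torch.Size([1, 2, 64, 64, 64])

The dense connectivity reuses feature maps across layers, which is one common reason such networks need fewer trainable parameters than encoder-decoder designs like 3D U-Net at comparable depth.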
