Open Access
ResNet Based Feature Extraction with Decision Tree Classifier for Classification of Mammogram Images
Author(s) -
T. Sathya Priya
Publication year - 2021
Publication title -
Türk Bilgisayar ve Matematik Eğitimi Dergisi (Turkish Journal of Computer and Mathematics Education)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.218
H-Index - 3
ISSN - 1309-4653
DOI - 10.17762/turcomat.v12i2.1136
Subject(s) - artificial intelligence , computer science , preprocessor , decision tree , pattern recognition (psychology) , segmentation , classifier (uml) , residual , mammography , breast cancer , feature extraction , convolutional neural network , artificial neural network , decision tree learning , cancer , medicine , algorithm
Breast cancer is currently one of the most serious health problems among women worldwide. Detecting breast cancer at an early stage can reduce the mortality rate considerably. Mammography is an effective and widely used technique for the detection and screening of breast cancer, and radiologists increasingly rely on advanced deep learning (DL) techniques for accurate detection and classification of medical images. This paper develops a new deep segmentation with residual network (DS-RN) based breast cancer diagnosis model using mammogram images. The presented DS-RN model involves preprocessing, segmentation with a Faster Region-based Convolutional Neural Network (Faster R-CNN) using an Inception v2 backbone, feature extraction, and classification. A decision tree (DT) classifier is used to classify the mammogram images. A detailed simulation is performed on the Mini-MIAS dataset to assess the presented model. The experimental results show that the DS-RN model reaches a maximum classification performance, with sensitivity, specificity, accuracy, and F-measure of 98.15%, 100%, 98.86%, and 99.07%, respectively.
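The final stage of the pipeline described above feeds deep (ResNet-style) feature vectors into a decision tree classifier. A minimal sketch of that stage is shown below, using scikit-learn's `DecisionTreeClassifier`; the feature vectors here are synthetic stand-ins (random 512-dimensional vectors with a class-dependent shift), since the paper's actual features come from a residual network applied to segmented mammograms. The dimensions, tree depth, and class labels are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples, n_features = 200, 512  # 512 ~ a typical globally pooled ResNet feature size

# Synthetic stand-ins for "benign" (0) vs "malignant" (1) feature vectors;
# the malignant class is shifted so the classes are separable.
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)
X[y == 1] += 0.5

# Train a decision tree on the extracted features and evaluate on held-out data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

In the actual system, `X` would be replaced by feature vectors extracted from the Faster R-CNN-segmented mammogram regions, and performance would be reported with the sensitivity, specificity, accuracy, and F-measure figures given in the abstract.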
