Automated techniques for blood vessels segmentation through fundus retinal images: A review
Author(s) -
Akbar Shahzad,
Sharif Muhammad,
Akram Muhammad Usman,
Saba Tanzila,
Mahmood Toqeer,
Kolivand Mahyar
Publication year - 2019
Publication title -
Microscopy Research and Technique
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.536
H-Index - 118
eISSN - 1097-0029
pISSN - 1059-910X
DOI - 10.1002/jemt.23172
Subject(s) - fundus (eye) , retinal , segmentation , ophthalmology , artificial intelligence , optometry , computer science , anatomy , computer vision , medicine
The retina, the interior part of the human eye, plays a vital role in vision. Digital images captured by a fundus camera are very useful for analyzing abnormalities in the retina, especially in the retinal blood vessels. To extract information about the blood vessels from a fundus retinal image, a precise and accurate vessel segmentation is required. The segmented blood vessel image is highly beneficial for detecting retinal diseases. Many automated techniques are widely used for retinal vessel segmentation, which is a primary element of computerized diagnostic systems for retinal diseases. Automatic vessel segmentation becomes more challenging in the presence of lesions and abnormalities. This paper briefly describes the various publicly available retinal image databases and various machine learning techniques. The state of the art shows that researchers have proposed several vessel segmentation methods based on supervised and unsupervised techniques and have evaluated their results mostly on publicly available datasets such as Digital Retinal Images for Vessel Extraction (DRIVE) and Structured Analysis of the Retina (STARE). A comprehensive review of existing supervised and unsupervised vessel segmentation techniques and algorithms is presented, describing the philosophy of each algorithm. This review will be useful for readers in their future research.
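To make the idea of unsupervised vessel segmentation concrete, the following is a minimal sketch, not any specific method from the reviewed literature: vessels appear dark and elongated in the green channel of a fundus image, so a morphological black top-hat (grayscale closing minus the original) highlights them, after which a simple threshold produces a binary vessel mask. The function name, window `size`, and threshold factor `k` are all illustrative assumptions.

```python
# Illustrative unsupervised vessel-enhancement sketch (black top-hat +
# threshold); parameter names and values are assumptions for demonstration.
import numpy as np
from scipy.ndimage import grey_closing

def segment_vessels(green_channel: np.ndarray, size: int = 7, k: float = 0.5) -> np.ndarray:
    """Return a boolean vessel mask from a single (green) channel."""
    img = green_channel.astype(float)
    # Black top-hat: the grayscale closing fills in thin dark structures,
    # so (closing - image) responds strongly exactly where vessels were.
    tophat = grey_closing(img, size=(size, size)) - img
    # Threshold relative to the strongest response (a stand-in for a
    # proper method such as Otsu thresholding).
    return tophat > k * tophat.max()

# Synthetic example: a bright background crossed by one dark "vessel".
img = np.full((40, 40), 200.0)
img[:, 18:21] = 60.0              # dark vertical stripe playing the vessel
mask = segment_vessels(img)
print(mask[:, 19].all(), mask[:, 5].any())  # vessel columns flagged, background not
```

Real pipelines in the reviewed literature add steps this sketch omits, such as contrast normalization, matched filtering at multiple orientations, and removal of small spurious components.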
