Sci-Fri AM: Imaging - 03: Automated Registration of X‐Ray Mammograms and Magnetic Resonance Breast Images
Author(s) - Curtis C, Frayne R, Fear E
Publication year - 2010
Publication title - Medical Physics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.473
H-Index - 180
eISSN - 2473-4209
pISSN - 0094-2405
DOI - 10.1118/1.3476182
Subject(s) - mammography, magnetic resonance imaging, projection (relational algebra), artificial intelligence, image registration, breast cancer, computer vision, computer science, breast mri, medical imaging, breast imaging, medicine, pattern recognition (psychology), radiology, cancer, image (mathematics), algorithm
Breast cancer is a common and devastating form of cancer, with an estimated 22,700 new cases in 2009 in Canada alone. X‐ray mammography is the most commonly used imaging technique for detection and diagnosis, while magnetic resonance imaging (MRI) is used in some challenging cases. The two modalities rely on different tissue properties to form images, and thus contribute different and complementary information about the breast. However, due to geometric distortions during the acquisition processes, it is difficult to identify and compare the same anatomical location on both modalities. In this work, a method to register 2D mammograms to projection images of MRI volumes is presented. In order to compare the 3D MRI to the 2D mammogram, a “simulated mammogram” is formed from the MRI volume. Three anatomical landmarks on the surface of the breast are identified on each image and aligned to distort the general shape of the mammogram to match that of the MR projection image. Final registration is then achieved by iteratively applying a non‐linear transformation to the mammogram until the mutual information of the two images is maximized. The registration method was tested on eight pairs of images from two volunteers (two mammographic views from each breast). Results from this small dataset are promising, with an average alignment error of 3.9%, measured as the difference in area between the two registered images. Future work will examine a larger dataset, including pathological cases, and quantification of internal alignment errors.
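
The following is a minimal Python sketch of the registration idea outlined in the abstract. It assumes a ray-sum projection stands in for the "simulated mammogram" and a simple affine warp stands in for the paper's non-linear transformation; the landmark-based pre-alignment step is omitted, and the array shapes, histogram bin count, and optimizer are illustrative assumptions rather than the authors' settings.

import numpy as np
from scipy import ndimage
from scipy.optimize import minimize

def simulated_mammogram(mri_volume, axis=0):
    # Ray-sum projection of the 3D MR volume onto a 2D plane.
    return mri_volume.sum(axis=axis)

def mutual_information(img_a, img_b, bins=32):
    # Mutual information estimated from a joint intensity histogram.
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def warp(image, params):
    # Small affine warp (scale_y, scale_x, shift_y, shift_x) of the mammogram;
    # a placeholder for the non-linear transformation described in the abstract.
    sy, sx, ty, tx = params
    return ndimage.affine_transform(image, np.diag([sy, sx]), offset=(ty, tx), order=1)

def register(mammogram, mr_projection):
    # Iteratively adjust the warp until mutual information with the MR projection is maximal.
    cost = lambda p: -mutual_information(warp(mammogram, p), mr_projection)
    result = minimize(cost, x0=[1.0, 1.0, 0.0, 0.0], method="Nelder-Mead")
    return warp(mammogram, result.x), result

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    volume = ndimage.gaussian_filter(rng.random((32, 64, 64)), sigma=3)  # stand-in MR volume
    projection = simulated_mammogram(volume)                             # 2D "simulated mammogram"
    mammogram = ndimage.shift(projection, (3, -2))                       # misaligned stand-in mammogram
    registered, res = register(mammogram, projection)
    print("mutual information after registration:", -res.fun)

The synthetic volume, the misalignment applied with ndimage.shift, and the Nelder-Mead optimizer are all hypothetical choices made so the sketch runs end to end on its own; real use would substitute the acquired mammogram, the MR projection, and the landmark-initialized non-linear warp described above.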
