A fast, accurate, and automatic 2D–3D image registration for image‐guided cranial radiosurgery
Author(s) - Fu Dongshan, Kuduvalli Gopinath
Publication year - 2008
Publication title - Medical Physics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.473
H-Index - 180
eISSN - 2473-4209
pISSN - 0094-2405
DOI - 10.1118/1.2903431
Subject(s) - image registration, fiducial marker, computer vision, artificial intelligence, rotation (mathematics), radiosurgery, imaging phantom, rigid transformation, projection (relational algebra), radiography, computer science, medical imaging, transformation (genetics), image guided radiation therapy, translation (biology), cone beam computed tomography, geometric transformation, mathematics, nuclear medicine, algorithm, computed tomography, image (mathematics), medicine, radiology, radiation therapy, gene, biochemistry, chemistry, messenger rna
The authors developed a fast and accurate two-dimensional (2D)–three-dimensional (3D) image registration method to perform precise initial patient setup and frequent detection and correction of patient movement during image-guided cranial radiosurgery treatment. In this method, an approximate geometric relationship is first established to decompose a 3D rigid transformation in the 3D patient coordinate system into in-plane transformations and out-of-plane rotations in two orthogonal 2D projections. Digitally reconstructed radiographs are generated offline from a preoperative computed tomography volume prior to treatment and used as the reference for patient position. A multiphase framework is designed to register the digitally reconstructed radiographs with the x-ray images periodically acquired during patient setup and treatment. The registration in each projection is performed independently; the results in the two projections are then combined and converted to a 3D rigid transformation by 2D–3D geometric backprojection. The in-plane transformation and the out-of-plane rotation are estimated using different search methods, including multiresolution matching, steepest descent minimization, and one-dimensional search. Two similarity measures, optimized pattern intensity and sum of squared difference, are applied at different registration phases to optimize accuracy and computation speed. Various experiments on an anthropomorphic head-and-neck phantom showed that, using fiducial registration as a gold standard, the registration errors were 0.33 ± 0.16 mm (s.d.) in overall translation and 0.29° ± 0.11° (s.d.) in overall rotation. The total targeting errors were 0.34 ± 0.16 mm (s.d.), 0.40 ± 0.2 mm (s.d.), and 0.51 ± 0.26 mm (s.d.) for targets at distances of 2, 6, and 10 cm from the rotation center, respectively. The computation time was less than 3 s on a computer with a dual 3.0 GHz Intel Pentium processor.
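To illustrate the two similarity measures named in the abstract, the following minimal Python sketch computes a sum of squared difference and a classic pattern-intensity score between a DRR and an x-ray image of the same projection. It assumes both images are grayscale NumPy arrays of equal shape; the function names and the radius and sigma parameters are illustrative, and the paper's "optimized" pattern-intensity variant is not reproduced here.

import numpy as np

def ssd(drr, xray):
    """Sum of squared difference between a DRR and an x-ray image
    (lower values indicate better alignment)."""
    d = xray.astype(np.float64) - drr.astype(np.float64)
    return float(np.sum(d * d))

def pattern_intensity(drr, xray, radius=3, sigma=10.0):
    """Classic pattern-intensity score on the difference image
    (higher values indicate better alignment). Illustrative sketch,
    not the optimized variant used in the paper."""
    diff = xray.astype(np.float64) - drr.astype(np.float64)
    sigma2 = sigma * sigma
    score = 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if (dy == 0 and dx == 0) or dy * dy + dx * dx > radius * radius:
                continue
            # Compare each pixel of the difference image with its neighbor
            # at offset (dy, dx); wrap-around at the image border is
            # tolerated here for brevity.
            neighbor = np.roll(diff, (dy, dx), axis=(0, 1))
            d = diff - neighbor
            score += float(np.sum(sigma2 / (sigma2 + d * d)))
    return score

In a registration loop of the kind described above, such a measure would be evaluated between the acquired x-ray image and DRRs corresponding to candidate in-plane transformations and out-of-plane rotations, with the faster sum of squared difference typically suited to coarse phases and pattern intensity to refinement.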