Interactive initialization of 2D/3D rigid registration
Author(s) -
Gong Ren Hui,
Güler Özgür,
Kürklüoglu Mustafa,
Lovejoy John,
Yaniv Ziv
Publication year - 2013
Publication title - Medical Physics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.473
H-Index - 180
eISSN - 2473-4209
pISSN - 0094-2405
DOI - 10.1118/1.4830428
Subject(s) - initialization , computer science , computer vision , image registration , artificial intelligence , volume rendering , rendering (computer graphics) , patient registration , rigid transformation , augmented reality , computer graphics (images) , image (mathematics) , programming language
Purpose: Registration is one of the key technical components in an image‐guided navigation system. A large number of 2D/3D registration algorithms have been previously proposed, but have not been able to transition into clinical practice. The authors identify the primary reason for the lack of adoption with the prerequisite for a sufficiently accurate initial transformation, mean target registration error of about 10 mm or less. In this paper, the authors present two interactive initialization approaches that provide the desired accuracy for x‐ray/MR and x‐ray/CT registration in the operating room setting.Methods: The authors have developed two interactive registration methods based on visual alignment of a preoperative image, MR, or CT to intraoperative x‐rays. In the first approach, the operator uses a gesture based interface to align a volume rendering of the preoperative image to multiple x‐rays. The second approach uses a tracked tool available as part of a navigation system. Preoperatively, a virtual replica of the tool is positioned next to the anatomical structures visible in the volumetric data. Intraoperatively, the physical tool is positioned in a similar manner and subsequently used to align a volume rendering to the x‐ray images using an augmented reality (AR) approach. Both methods were assessed using three publicly available reference data sets for 2D/3D registration evaluation.Results: In the authorsˈ experiments, the authors show that for x‐ray/MR registration, the gesture based method resulted in a mean target registration error (mTRE) of 9.3 ± 5.0 mm with an average interaction time of 146.3 ± 73.0 s, and the AR‐based method had mTREs of 7.2 ± 3.2 mm with interaction times of 44 ± 32 s. For x‐ray/CT registration, the gesture based method resulted in a mTRE of 7.4 ± 5.0 mm with an average interaction time of 132.1 ± 66.4 s, and the AR‐based method had mTREs of 8.3 ± 5.0 mm with interaction times of 58 ± 52 s.Conclusions: Based on the authorsˈ evaluation, the authors conclude that the registration approaches are sufficiently accurate for initializing 2D/3D registration in the OR setting, both when a tracking system is not in use (gesture based approach), and when a tracking system is already in use (AR based approach).