WE‐G‐BRD‐04: BEST IN PHYSICS (JOINT IMAGING‐THERAPY): An Integrated Model‐Based Intrafractional Organ Motion Tracking Approach with Dynamic MRI in Head and Neck Radiotherapy
Author(s) - Chen H, Dolly S, Victoria J, Anastasio M, Ruan S, Low D, Li H, Wooten H, Dempsey J, Gay H, Mutic S, Thorstad W, Li H
Publication year - 2015
Publication title - Medical Physics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.473
H-Index - 180
eISSN - 2473-4209
pISSN - 0094-2405
DOI - 10.1118/1.4926060
Subject(s) - radiation therapy, artificial intelligence, computer vision, isocenter, image registration, medical imaging, computer science, sagittal plane, nuclear medicine, medicine, radiology, image (mathematics)
Purpose: In-treatment dynamic cine images, provided by the first commercially available MRI-guided radiotherapy system, allow physicians to observe intrafractional motion of head and neck (H&N) internal structures. However, high anatomical complexity and the relatively poor contrast and resolution of cine images have complicated automatic intrafractional motion evaluation. We proposed an integrated model-based approach to automatically delineate and analyze moving structures in on-board cine images.

Methods: The H&N upper airway, a complex and highly deformable region in which severe internal motion often occurs, was selected as the target to be tracked. To reliably capture its motion, a hierarchical structure model containing three statistical shapes (face, face-jaw, and face-jaw-palate) was first built from a set of manually delineated shapes using principal component analysis. An integrated model-fitting algorithm was then employed to align the statistical shapes to the first cine frame to be tracked, and multi-feature level-set contour propagation was performed to identify the airway shape change in the remaining frames. Ninety sagittal cine MR image sets, acquired from three H&N cancer patients, were used to demonstrate this approach.

Results: Tracking accuracy was validated by comparing the results to the average of two manual delineations in 20 randomly selected images from each patient. The resulting Dice similarity coefficient (93.28 ± 1.46%) and margin error (0.49 ± 0.12 mm) showed good agreement with the manual results. Intrafractional displacements of the anterior, posterior, inferior, and superior airway boundaries were observed, with values of 2.62 ± 2.92, 1.78 ± 1.43, 3.51 ± 3.99, and 0.68 ± 0.89 mm, respectively. The H&N airway motion was found to vary across directions, fractions, and patients, and to be highly correlated with patients' respiratory frequency.

Conclusion: We proposed an integrated computational approach that, for the first time, automatically identifies the H&N upper airway and quantifies in-treatment H&N internal motion in real time. This approach can be applied to track the motion of other structures and to provide guidance for patient-specific prediction of intra- and inter-fractional structure displacements.
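The abstract's two most code-friendly building blocks are the PCA-based statistical shape model built from manually delineated contours and the Dice similarity coefficient used for validation. The sketch below illustrates both in Python; it is not the authors' implementation, and all names (build_shape_model, reconstruct_shape, dice_coefficient, variance_kept) are illustrative assumptions. The integrated model-fitting step and the multi-feature level-set propagation described in the Methods are not reproduced here.

```python
# Illustrative sketch only: a PCA point-distribution shape model and the Dice
# metric, as generically described in the abstract. Not the authors' code.
import numpy as np


def build_shape_model(training_shapes, variance_kept=0.95):
    """Build a statistical shape model via PCA.

    training_shapes: (n_shapes, n_points, 2) array of corresponding landmark
    coordinates from manually delineated, pre-aligned contours (an assumed
    input format). Returns the mean shape, the retained principal modes of
    variation, and their variances.
    """
    n_shapes = training_shapes.shape[0]
    X = training_shapes.reshape(n_shapes, -1)      # flatten (x, y) pairs
    mean_shape = X.mean(axis=0)
    centered = X - mean_shape
    # Principal components of the training shapes via SVD
    _, S, Vt = np.linalg.svd(centered, full_matrices=False)
    variances = (S ** 2) / (n_shapes - 1)
    cum_ratio = np.cumsum(variances) / variances.sum()
    n_modes = int(np.searchsorted(cum_ratio, variance_kept)) + 1
    return mean_shape, Vt[:n_modes], variances[:n_modes]


def reconstruct_shape(mean_shape, modes, coefficients):
    """Generate a plausible shape as the mean plus weighted PCA modes."""
    return (mean_shape + coefficients @ modes).reshape(-1, 2)


def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * intersection / (mask_a.sum() + mask_b.sum())
```

In a pipeline of this kind, constraining a fitted contour to the span of the retained PCA modes is what keeps the tracked airway shape anatomically plausible despite the low contrast of the cine frames; the Dice coefficient then compares the automatic masks against manual delineations, as in the reported 93.28 ± 1.46% agreement.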