SU‐FF‐J‐12: Computed 4D Patient Models for Motion Compensation in Radiotherapy
Author(s) - Berlinger K, Sauer O, Vences L, Roth M, Dötter M, Schweikard A
Publication year - 2005
Publication title - Medical Physics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.473
H-Index - 180
eISSN - 2473-4209
pISSN - 0094-2405
DOI - 10.1118/1.1997558
Subject(s) - computer science, tracking, motion compensation, medical imaging, computer vision, artificial intelligence, exhalation, nuclear medicine, medicine, radiology
Purpose: We designed a system that tracks tumors moving under respiratory motion, allowing more specific and effective irradiation of the tumor.

Method and Materials: First, two CT scans are taken, one at maximal inhalation and one at maximal exhalation. Several synthetic intermediate scans are then computed by morphing methods, yielding a 3D motion picture (the 4D patient model). Before treatment, x-ray images are taken periodically and compared to the 4D model. Registering an x-ray against the model identifies the best-matching stack and therefore the tumor position and respiratory state, which establishes a correlation between respiratory state and tumor position. Because registration takes about 10 seconds per stack, this provides only intermittent information about the target location, and the target may already have moved. We therefore use an infrared tracking system, with emitters attached to significant positions on the patient's body, to report the current state of respiration in real time. The sensor signal is correlated with the target location computed by comparing the live shot with the model.

Results: The model is built with a thin-plate spline-based method; 47 corresponding control points were manually selected to deform a lung. To evaluate the results, transverse snapshots were compared with the model, yielding the corresponding respiratory state. We also tested the 2D/4D registration process by generating two mutually orthogonal DRRs and matching them to the model. The best match was the stack from which the DRRs were generated, and the neighboring stacks gave the next-best results.

Conclusion: We have shown that the respiratory state can be determined by matching a synthetic x-ray to a generated 4D model of an internal organ. Next steps are to improve the model-generation method and to test the registration with real x-rays.
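
As a rough illustration of the model-generation step, the following Python sketch interpolates the sparse displacement field given by the manually selected control points with a thin-plate spline and warps the exhale scan toward the inhale scan by a chosen respiratory phase. The use of SciPy's RBFInterpolator and all function and variable names are assumptions for illustration, not the authors' implementation.

import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import map_coordinates

def intermediate_stack(exhale_ct, pts_exhale, pts_inhale, phase):
    """Warp the exhale CT toward the inhale CT by a fraction `phase` in [0, 1].

    pts_exhale, pts_inhale: (N, 3) voxel coordinates of manually selected
    corresponding control points (47 in the abstract).
    """
    # Thin-plate spline interpolation of the sparse exhale->inhale
    # displacement field defined at the control points.
    tps = RBFInterpolator(pts_exhale, pts_inhale - pts_exhale,
                          kernel="thin_plate_spline")

    # Dense voxel grid of the output volume; for a full-resolution CT the
    # spline would in practice be evaluated on a coarse grid and upsampled.
    grid = np.indices(exhale_ct.shape).reshape(3, -1).T.astype(float)

    # Backward warp: approximate the inverse of the phase-scaled deformation
    # by sampling the exhale volume at x - phase * d(x).
    sample_coords = grid - phase * tps(grid)
    warped = map_coordinates(exhale_ct, sample_coords.T, order=1)
    return warped.reshape(exhale_ct.shape)

# Ten synthetic stacks spanning exhalation (phase 0) to inhalation (phase 1):
# stacks = [intermediate_stack(ct_ex, p_ex, p_in, t) for t in np.linspace(0, 1, 10)]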
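
The 2D/4D registration step can likewise be sketched: DRRs are rendered from every stack of the model and compared against the live orthogonal x-ray pair, and the best-matching stack gives the current respiratory state. Here a simple axis-aligned ray sum stands in for proper DRR rendering and normalized cross-correlation for the similarity metric; both are placeholders, not the authors' exact choices.

import numpy as np

def drr_pair(volume):
    # Two mutually orthogonal projections (crude DRRs) obtained by summing
    # attenuation along two axes of the CT stack.
    return volume.sum(axis=0), volume.sum(axis=1)

def ncc(a, b):
    # Normalized cross-correlation between two images of equal shape.
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def respiratory_state(xray_ap, xray_lat, stacks):
    """Return the index of the 4D-model stack that best matches the live
    orthogonal x-ray pair (assumed resampled to the DRR geometry)."""
    scores = []
    for volume in stacks:
        drr_ap, drr_lat = drr_pair(volume)
        scores.append(ncc(xray_ap, drr_ap) + ncc(xray_lat, drr_lat))
    return int(np.argmax(scores))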
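
Finally, because each registration takes on the order of 10 seconds per stack, the internal target position is known only intermittently; the abstract correlates the continuous infrared surrogate signal with those sparse internal positions so the target can be estimated in real time between registrations. The linear least-squares correlation model below is assumed purely for illustration and is not stated in the abstract.

import numpy as np

class SurrogateModel:
    def fit(self, ir_amplitudes, target_positions):
        # ir_amplitudes: (K,) external IR marker signal at registration times.
        # target_positions: (K, 3) internal target positions from 2D/4D matching.
        A = np.column_stack([ir_amplitudes, np.ones_like(ir_amplitudes)])
        self.coeffs, *_ = np.linalg.lstsq(A, target_positions, rcond=None)
        return self

    def predict(self, ir_amplitude):
        # Real-time estimate of the internal target position from the
        # current IR reading.
        return np.array([ir_amplitude, 1.0]) @ self.coeffs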