SU‐E‐T‐488: Incorporating In‐House Treatment Planning Software Into Clinical Auto‐Planning Workflow
Author(s)
Lu W,
Chen M,
Tan J,
Jiang S
Publication year: 2015
Publication title: Medical Physics
Resource type: Journals
Publisher: American Association of Physicists in Medicine
Purpose: To report the process of incorporating in‐house software tools into a clinical auto‐planning workflow, with the objectives of achieving efficiency and quality plans. Through multi‐year research and development, we built in‐house software suites comprising various GPU‐based components of a treatment planning system (TPS), including DICOM import/export, various deformable image registration (DIR) and contour‐generation algorithms, multiple dose calculators and optimizers, and plan generation with or without reference guidance. However, to meet clinical quality requirements, we needed to further test the robustness of these tools and streamline the processes, including the front‐end and back‐end processes.

Methods: We built an in‐house patient database management system and collected patient data spanning various delivery modalities (e.g., IMRT, VMAT) and various disease sites (e.g., head and neck, breast, lung, prostate) to serve as a repository for simulating the clinical workflow and for subsequent testing. A unit‐testing tool was developed in which any input parameter can be loaded from a configuration file for batch processing. Modular inputs and outputs were part of the design, so button clicking and other human interaction can be bypassed.

Results: The database, covering various delivery modalities and disease sites, allows the auto‐planning tools to be tested on a wide variety of cases. The unit‐testing tools allow mass data processing and analysis. Testing showed that the whole auto‐planning process can be completed in a couple of minutes without any human interaction.

Conclusion: A database and unit‐testing tools are essential to incorporating the in‐house software into the clinical auto‐planning workflow, and they greatly facilitate the implementation by enabling component testing and mass data processing and analysis. In this way, desired functional behavior can be ensured, the impact of any code modification can be more easily identified, and unintended alterations can be avoided.
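The unit-testing approach described in Methods, where every input parameter is loaded from a configuration file so whole batches of cases run without button clicks, can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual software: the function names (`run_case`, `run_batch`) and the JSON config schema are assumptions for the example.

```python
import json
import tempfile

def run_case(case):
    """Stand-in for one auto-planning component (e.g., DIR or dose calculation).

    A real component would import DICOM data and run the GPU pipeline; here we
    only echo the case parameters to show the batch-driven control flow.
    """
    return {"patient": case["patient"], "modality": case["modality"], "status": "ok"}

def run_batch(config_path):
    """Load all case parameters from a configuration file and process each case.

    Modular inputs/outputs: every case is independent, so results can be
    collected for mass analysis afterwards, with no human interaction.
    """
    with open(config_path) as f:
        config = json.load(f)
    return [run_case(case) for case in config["cases"]]

# Minimal demonstration with a throwaway configuration file covering two
# delivery modalities, mirroring the repository of varied cases in the text.
config = {"cases": [
    {"patient": "P001", "modality": "IMRT"},
    {"patient": "P002", "modality": "VMAT"},
]}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(config, f)
    config_path = f.name

results = run_batch(config_path)
```

The design choice this illustrates is that once parameters live in a configuration file rather than a GUI, the same components can be exercised one at a time (unit testing) or over the whole patient repository (mass processing) with no change to the code under test.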
Subject(s): archaeology, artificial intelligence, computer science, database, DICOM, engineering, external quality assessment, history, modular design, navy, operating system, operations management, quality assurance, software, software engineering, test plan, workflow
Language(s): English
SCImago Journal Rank: 1.473
H-Index: 180
eISSN: 2473-4209
pISSN: 0094-2405
DOI: 10.1118/1.4924850
