
Attention YOLACT++ for real-time instance segmentation of medical instruments in endoscopic procedures
Author(s) -
Juan Carlos Angeles-Ceron,
Gilberto Ochoa-Ruiz,
Leonardo Chang,
S. Hyder Ali
Publication year - 2021
Language(s) - English
Resource type - Conference proceedings
DOI - 10.52591/lxai2021062511
Subject(s) - robustness , computer science , segmentation , artificial intelligence , computer vision , image segmentation
Image-based tracking of laparoscopic instruments plays a fundamental role in computer- and robotic-assisted surgeries by aiding surgeons and increasing patient safety. Computer vision contests, such as the 2019 Robust Medical Instrument Segmentation (ROBUST-MIS) Challenge, encourage the development of robust models for surgical instrument segmentation and provide large, diverse, and extensively annotated datasets. To date, most existing models for instance segmentation of medical instruments are based on two-stage detectors, which provide robust results but are far from real-time performance (at most 5 frames per second (fps)). However, for a method to be clinically applicable, real-time capability is required along with high accuracy. In this paper, we propose the addition of attention mechanisms to the YOLACT architecture that allows real-time instance segmentation of instruments with improved accuracy on the ROBUST-MIS dataset. Our proposed approach outperforms the winner of the 2019 ROBUST-MIS challenge in terms of robustness scores, obtaining 0.338 MI DSC and 0.383 MI NSD, while achieving real-time performance (37 fps).
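The abstract does not specify which attention mechanism is added to YOLACT. As an illustration only, the sketch below shows a generic squeeze-and-excitation style channel-attention block of the kind commonly inserted into detection backbones; the function name, weight shapes, and reduction ratio are assumptions for this example, not the paper's actual module.

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Illustrative SE-style channel attention (not the paper's exact module).

    x:  feature map of shape (C, H, W)
    w1: bottleneck weights, shape (C // r, C)
    w2: expansion weights, shape (C, C // r)
    """
    # Squeeze: global average pool over spatial dimensions -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP with ReLU, then sigmoid gating in (0, 1)
    h = np.maximum(w1 @ z, 0.0)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))
    # Reweight each channel by its attention score
    return x * s[:, None, None]

rng = np.random.default_rng(0)
C, r = 8, 2                      # channels and reduction ratio (assumed values)
x = rng.standard_normal((C, 16, 16))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
y = channel_attention(x, w1, w2)
print(y.shape)  # (8, 16, 16) -- same shape as the input feature map
```

Because the sigmoid gate lies strictly in (0, 1), the block can only attenuate channels, leaving the feature map's shape unchanged, which is what makes such modules cheap to drop into a single-stage architecture like YOLACT without hurting its real-time throughput.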