Open Access
2.5D-UNet-HC: 2.5D-UNet Based on Hybrid Convolution for Prostate Ultrasound Image Segmentation
Author(s) -
Siwei Xing,
Yating Jiang,
Yue Teng,
Xinyu Wang,
Wenjun Jiang,
Yaqin Zhou,
Cheng Li
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3620348
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
In the clinical diagnosis of prostate cancer, transrectal ultrasound (TRUS) is a commonly used examination method, and accurate segmentation of the prostate from TRUS images is crucial for physicians' diagnosis. However, 2D segmentation networks struggle to capture spatial information and fail to model dependencies between different prostate slices, while 3D segmentation networks demand substantial computational resources and hardware capabilities. Additionally, because prostate ultrasound data are acquired with thick slices, where most of the information lies in the slice plane, traditional 2.5D segmentation networks inadequately integrate inter-slice information. This paper proposes a 2.5D segmentation network, 2.5D-UNet-HC, based on hybrid convolutions. The network replaces all convolutions in U-Net with hybrid convolutions, using a 2D branch to extract regional features of the prostate gland and a 3D branch to capture spatial information. The proposed method retains the advantage of 2D convolutions for recognizing local features while effectively integrating contour information from prostate TRUS images through 3D convolutions. Moreover, an FMR block is designed to comprehensively integrate features obtained at different multidimensional levels. Experimental results demonstrate that the proposed 2.5D-UNet-HC outperforms other segmentation methods on the prostate TRUS dataset, achieving a Dice coefficient of 91.34%.
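
To make the hybrid-convolution idea concrete, the following is a minimal PyTorch sketch of a block that pairs a per-slice 2D branch with a 3D branch over the same slice stack. The channel sizes, kernel shapes, class name `HybridConv`, and the element-wise fusion are illustrative assumptions based only on the abstract, not the paper's exact design (in particular, the paper's FMR block performs a more elaborate multi-dimensional feature fusion than the simple addition shown here).

```python
# Sketch of a hybrid convolution block: a 2D branch for in-plane features
# plus a 3D branch for inter-slice context. Details are assumptions.
import torch
import torch.nn as nn


class HybridConv(nn.Module):
    """Hypothetical hybrid convolution combining 2D and 3D branches."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # 2D branch: applied slice-by-slice to capture in-plane prostate features.
        self.conv2d = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
        # 3D branch: captures dependencies across neighbouring TRUS slices.
        self.conv3d = nn.Sequential(
            nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm3d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, D, H, W) -- D is the number of slices in the 2.5D stack.
        b, c, d, h, w = x.shape

        # 2D branch: fold the slice dimension into the batch dimension.
        x2d = x.permute(0, 2, 1, 3, 4).reshape(b * d, c, h, w)
        f2d = self.conv2d(x2d)
        f2d = f2d.reshape(b, d, -1, h, w).permute(0, 2, 1, 3, 4)

        # 3D branch: operate on the full volume.
        f3d = self.conv3d(x)

        # Simple element-wise fusion (an assumption; the paper uses an FMR
        # block to integrate multi-dimensional features).
        return f2d + f3d


if __name__ == "__main__":
    block = HybridConv(in_ch=1, out_ch=16)
    volume = torch.randn(2, 1, 8, 64, 64)  # (batch, channel, slices, H, W)
    print(block(volume).shape)             # torch.Size([2, 16, 8, 64, 64])
```

In a U-Net-style encoder-decoder, a block like this could stand in wherever a plain convolution pair would normally appear, so the 2D branch handles local gland texture while the 3D branch contributes slice-to-slice contour continuity.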
