RUnT: A Network Combining Residual U-Net and Transformer for Vertebral Edge Feature Fusion Constrained Spine CT Image Segmentation
Authors
Abstract
Scoliosis, spinal deformity, and vertebral spondylolisthesis are disorders with high incidence that seriously affect people's health. CT is an important medical imaging tool for their detection and diagnosis, and it provides a large amount of pathologically valid information in clinical practices such as spine pathology assessment and computer-assisted surgical intervention. The spine presents a long span, a complex biologically curved shape, and multi-stage similarity between vertebrae in sagittal-plane images. Fast and accurate segmentation technology has therefore become an important research direction in computer-aided diagnosis. We propose RUnT, a network based on the combination of residual U-Net feature extraction and a Vision Transformer structure, for efficient automatic segmentation of the multiple vertebrae of the spine. Deep features are first extracted with residual units to prevent gradient diffusion while improving the accuracy of contour segmentation. Multi-scale feature maps containing rich superficial information are then fed into an edge module, designed to refine vertebral boundaries and ensure the consistency of each vertebra by combining deconvolution and convolution operations over features at three different scales. Finally, a global module is combined with local features to blend location information through a self-attentive feature-map volume. By mixing global and local semantic features, the confusion arising from inter-vertebral similarity when the decoder extracts features is reduced. The model is evaluated on the CTSpine1K and VerSe 20 public datasets. The results show that RUnT obtains state-of-the-art performance, with average DSC scores of 88.4% on CTSpine1K and 81.5% on VerSe 20, respectively, while reducing the HD95 distance to 4.86 and 3.88.
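The reported DSC scores measure the overlap between a predicted mask and the ground-truth segmentation. A minimal sketch of how the metric is computed, on toy 4×4 masks (this is an illustration of the standard Dice formula, not the paper's evaluation code):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks:
    2 * |pred AND target| / (|pred| + |target|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    if denom == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return 2.0 * intersection / denom

# Toy vertebra masks: prediction has 4 foreground pixels, target has 3,
# and they agree on 3 of them.
pred = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
target = np.array([[0, 1, 1, 0], [0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
print(round(dice_score(pred, target), 4))  # 2*3/(4+3) ≈ 0.8571
```

HD95, the other reported metric, is complementary: Dice rewards volume overlap, while the 95th-percentile Hausdorff distance penalizes boundary outliers, which is why papers on vertebra segmentation usually report both.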
Similar resources
Recurrent Residual Convolutional Neural Network based on U-Net (R2U-Net) for Medical Image Segmentation
Deep learning (DL) based semantic segmentation methods have been providing state-of-the-art performance in the last few years. More specifically, these techniques have been successfully applied to medical image classification, segmentation, and detection tasks. One deep learning technique, U-Net, has become one of the most popular for these applications. In this paper, we propose a Recurrent Co...
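The recurrent convolutional layers in R2U-Net repeatedly re-convolve a feature map so that each unit accumulates context over several time steps. A toy 1-D NumPy sketch of that idea, with an assumed smoothing kernel and an assumed number of steps (the real network uses learned 2-D kernels):

```python
import numpy as np

def conv1d_same(x: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """1-D convolution with zero padding so the output keeps x's length."""
    pad = len(kernel) // 2
    xp = np.pad(x, pad)
    return np.array([np.dot(xp[i:i + len(kernel)], kernel)
                     for i in range(len(x))])

def recurrent_conv_unit(x: np.ndarray, kernel: np.ndarray,
                        steps: int = 2) -> np.ndarray:
    """Recurrent convolution: the state h is refined over `steps`
    iterations, each step re-convolving the input plus the current state."""
    h = conv1d_same(x, kernel)
    for _ in range(steps):
        h = np.maximum(conv1d_same(x + h, kernel), 0.0)  # ReLU activation
    return h

x = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
k = np.array([0.25, 0.5, 0.25])   # assumed toy kernel
out = recurrent_conv_unit(x, k)
print(out.shape)                  # output keeps the input length
```

The "residual" part of R2U-Net additionally adds the unit's input back onto this output, combining the two ideas in one block.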
Automatic segmentation of glioma tumors from BraTS 2018 challenge dataset using a 2D U-Net network
Background: Glioma is the most common primary brain tumor, and early detection of tumors is important in treatment planning for the patient. Precise segmentation of the tumor and intratumoral areas on MRI by a radiologist is the first step in diagnosis, which, in addition to being time-consuming, can also yield different diagnoses from different physicians. The aim of this study...
U-Net: Convolutional Networks for Biomedical Image Segmentation
There is large consensus that successful training of deep networks requires many thousand annotated training samples. In this paper, we present a network and training strategy that relies on the strong use of data augmentation to use the available annotated samples more efficiently. The architecture consists of a contracting path to capture context and a symmetric expanding path that enables prec...
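The contracting/expanding symmetry means each decoder level must produce a feature map whose spatial size matches the encoder level it receives a skip connection from. A small sketch that tracks feature-map side lengths through an assumed 4-level U-Net (shape bookkeeping only, no actual convolutions):

```python
def unet_shapes(size: int, depth: int = 4):
    """Track the feature-map side length through a U-Net style
    encoder/decoder. Each encoder level halves the spatial size
    (2x2 max-pooling); each decoder level doubles it (up-convolution)
    and concatenates the matching encoder (skip) feature map."""
    encoder = [size]
    for _ in range(depth):
        size //= 2                  # contracting path: halve
        encoder.append(size)
    decoder = []
    for skip in reversed(encoder[:-1]):
        size *= 2                   # expanding path: double
        assert size == skip, "skip connection needs matching spatial size"
        decoder.append(size)
    return encoder, decoder

enc, dec = unet_shapes(256)
print(enc)  # [256, 128, 64, 32, 16]
print(dec)  # [32, 64, 128, 256]
```

This is also why U-Net inputs are usually sized so that every pooling step divides evenly; an odd side length at any level would break the skip-connection size match.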
DR2-Net: Deep Residual Reconstruction Network for Image Compressive Sensing
Most traditional algorithms for compressive sensing image reconstruction suffer from intensive computation. Recently, deep learning-based reconstruction algorithms have been reported, which dramatically reduce the time complexity compared with iterative reconstruction algorithms. In this paper, we propose a novel Deep Residual Reconstruction Network (DR2-Net) to reconstruct the image from its Compress...
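DR2-Net first forms a preliminary reconstruction with a learned linear mapping and then regresses the residual between that estimate and the ground truth. A NumPy sketch of the decomposition, using a pseudo-inverse in place of the learned linear mapping and the ideal residual in place of the learned network (all sizes are assumed toy values):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 32                                     # signal length, measurements
phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random measurement matrix
x = rng.standard_normal(n)                        # original signal

y = phi @ x                                       # compressed measurements
x_linear = np.linalg.pinv(phi) @ y                # preliminary linear estimate
residual = x - x_linear                           # ideal target a residual
                                                  # network would learn to predict
x_hat = x_linear + residual                       # residual reconstruction
print(np.allclose(x_hat, x))                      # exact here by construction
```

Since m < n, the linear estimate alone cannot recover x; the residual branch carries the missing detail, which is exactly the part DR2-Net trains a deep network to predict.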
Road Extraction by Deep Residual U-Net
Road extraction from aerial images has been a hot research topic in the field of remote sensing image analysis. In this letter, a semantic segmentation neural network which combines the strengths of residual learning and U-Net is proposed for road area extraction. The network is built with residual units and has an architecture similar to that of U-Net. The benefits of this model are two-fold: firs...
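Residual units give each block the form output = x + F(x), so a block can fall back to an identity mapping when F contributes nothing, which keeps gradients flowing through deep stacks. A toy sketch with an assumed scalar-weight F (real residual units use stacked convolutions):

```python
import numpy as np

def residual_unit(x: np.ndarray, weight: float) -> np.ndarray:
    """Residual unit: output = x + F(x), with a toy F(x) = ReLU(weight * x).
    With weight = 0, F vanishes and the unit is an exact identity mapping,
    which is what makes very deep residual networks easy to optimize."""
    fx = np.maximum(weight * x, 0.0)
    return x + fx

x = np.linspace(-1.0, 1.0, 5)
print(residual_unit(x, 0.0))   # identity: equals x exactly
print(residual_unit(x, 0.5))   # refined features: x plus a learned delta
```

In a residual U-Net, such units replace the plain convolution blocks at every encoder and decoder level, while the U-Net skip connections between levels are kept unchanged.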
Journal
Journal title: IEEE Access
سال: 2023
ISSN: 2169-3536
DOI: https://doi.org/10.1109/access.2023.3281468