Multi-Scale Fusion U-Net for the Segmentation of Breast Lesions

Authors

Abstract

A breast lesion is a malignant tumor that occurs in the epithelial tissue of the breast. Early detection of breast lesions enables timely treatment and improves the survival rate, so accurate automatic segmentation of lesions from ultrasound images is a fundamental task. However, effective segmentation still faces two challenges: lesions vary widely in scale, and their blurred edges make delineation difficult. To address these problems, we propose a deep learning architecture, named Multi-scale Fusion U-Net (MF U-Net), that extracts the texture and edge features of the image. It introduces two novel modules and a new focal loss: 1) the WFM, for segmenting irregular and fuzzy lesions; 2) the Multi-Scale Dilated Convolutions Module (MDCM), for overcoming the difficulties caused by the large-scale variation of lesions; and 3) the focal-DSC loss, proposed to address the class-imbalance problem in segmentation. In addition, the MDCM contains several convolutional layers with different receptive fields, which improves the network's ability to extract multi-scale features. Comparative experiments show that the MF U-Net proposed in this paper outperforms existing methods and achieves state-of-the-art results of 0.9421 Recall, 0.9345 Precision, 0.0694 FPs/image, 0.9535 DSC, and 0.9112 IOU on the Breast Ultrasound Image Segmentation (BUSIS) benchmark dataset.
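The abstract only names the building blocks; the exact layer configuration and loss formulation are not given on this page. As a rough illustration, the sketch below shows, in PyTorch, the two ideas the abstract describes: a block of parallel dilated convolutions with different receptive fields (in the spirit of the MDCM) and a combined focal + Dice ("focal-DSC") loss for class-imbalanced segmentation. The dilation rates, channel counts, and loss weighting are assumptions, not the paper's values.

```python
# Hypothetical sketch of the two ideas described in the abstract.
# Dilation rates, channel counts, and loss weights are assumptions,
# not the values used in the MF U-Net paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleDilatedBlock(nn.Module):
    """Parallel dilated convolutions with different receptive fields,
    fused by a 1x1 convolution (an MDCM-style block, for illustration)."""

    def __init__(self, in_ch: int, out_ch: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        )
        self.fuse = nn.Conv2d(out_ch * len(dilations), out_ch, kernel_size=1)

    def forward(self, x):
        feats = [F.relu(branch(x)) for branch in self.branches]
        return F.relu(self.fuse(torch.cat(feats, dim=1)))


def focal_dsc_loss(logits, target, gamma=2.0, alpha=0.5, eps=1e-6):
    """Combined focal + Dice (DSC) loss for binary segmentation.
    The balance between the two terms (alpha) is an assumption."""
    prob = torch.sigmoid(logits)
    # Focal term: down-weights easy pixels to counter class imbalance.
    bce = F.binary_cross_entropy_with_logits(logits, target, reduction="none")
    p_t = prob * target + (1 - prob) * (1 - target)
    focal = ((1 - p_t) ** gamma * bce).mean()
    # Soft Dice term: directly optimizes region overlap.
    inter = (prob * target).sum()
    dice = 1 - (2 * inter + eps) / (prob.sum() + target.sum() + eps)
    return alpha * focal + (1 - alpha) * dice
```

Running several dilation rates in parallel lets a single block respond to both small and large lesions at the same feature resolution, which is the intuition behind combining convolutions with different receptive fields for multi-scale feature extraction.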


Similar Articles

Automatic segmentation of glioma tumors from BraTS 2018 challenge dataset using a 2D U-Net network

Background: Glioma is the most common primary brain tumor, and early detection is important for treatment planning. Precise segmentation of the tumor and intratumoral areas on MRI by a radiologist is the first step in diagnosis; besides being time-consuming, it can also yield different diagnoses from different physicians. The aim of this study...

U-Net: Convolutional Networks for Biomedical Image Segmentation

There is large consent that successful training of deep networks requires many thousand annotated training samples. In this paper, we present a network and training strategy that relies on the strong use of data augmentation to use the available annotated samples more efficiently. The architecture consists of a contracting path to capture context and a symmetric expanding path that enables prec...
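The contracting/expanding structure described above can be written compactly. The sketch below is a minimal illustrative U-Net in PyTorch, not the original configuration: the depth, channel widths, and the valid-convolution cropping of the 2015 paper are simplified or assumed.

```python
# Minimal illustrative U-Net: contracting path + symmetric expanding
# path with skip connections. Depth and channel widths are assumptions.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class TinyUNet(nn.Module):
    def __init__(self, in_ch=1, n_classes=1, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)          # contracting path
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)   # expanding path
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv2d(base, n_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)
```

The skip connections are what carry the fine localization detail from the contracting path into the expanding path, so the decoder can produce precise boundaries despite the pooling.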

Deep Fusion Net for Multi-atlas Segmentation: Application to Cardiac MR Images

Atlas selection and label fusion are two major challenges in multi-atlas segmentation. In this paper, we propose a novel deep fusion net for better solving these challenges. The deep fusion net is a deep architecture formed by concatenating a feature extraction subnet and a non-local patch-based label fusion (NL-PLF) subnet in a single network. This network is trained end-to-end for automatically learning ...

Recurrent Residual Convolutional Neural Network based on U-Net (R2U-Net) for Medical Image Segmentation

Deep learning (DL) based semantic segmentation methods have been providing state-of-the-art performance in the last few years. More specifically, these techniques have been successfully applied to medical image classification, segmentation, and detection tasks. One deep learning technique, U-Net, has become one of the most popular for these applications. In this paper, we propose a Recurrent Co...

3D U-Net: Learning Dense Volumetric Segmentation from Sparse Annotation

This paper introduces a network for volumetric segmentation that learns from sparsely annotated volumetric images. We outline two attractive use cases of this method: (1) In a semi-automated setup, the user annotates some slices in the volume to be segmented. The network learns from these sparse annotations and provides a dense 3D segmentation. (2) In a fully-automated setup, we assume that a r...
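A common way to learn a dense segmentation from sparsely annotated slices, as described above, is to evaluate the training loss only on annotated voxels. The sketch below illustrates that masking idea in PyTorch; it is an assumption-level illustration and does not reproduce the weighting scheme of the original 3D U-Net.

```python
# Sketch: training from sparsely annotated volumes by masking the loss
# so that unlabeled voxels contribute nothing. The weighting details of
# the original 3D U-Net are not reproduced here.
import torch
import torch.nn.functional as F


def sparse_annotation_loss(logits, labels, labeled_mask):
    """logits: (N, C, D, H, W); labels: (N, D, H, W) int64;
    labeled_mask: (N, D, H, W) bool, True where a voxel was annotated."""
    per_voxel = F.cross_entropy(logits, labels, reduction="none")
    per_voxel = per_voxel * labeled_mask.float()
    # Average only over the annotated voxels.
    return per_voxel.sum() / labeled_mask.float().sum().clamp(min=1)
```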


Journal

Journal title: IEEE Access

Year: 2021

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2021.3117578