Deep Learning-assisted Diagnosis of Breast Lesions on Ultrasound (Dual Attention DenseNet-121)
Type: model
Date: 2025-12-05
DOI: https://doi.org/10.1148/atlas.1764971955087

Overview

Schema Version

https://atlas.rsna.org/schemas/2025-11/model.json

Name

Deep Learning-assisted Diagnosis of Breast Lesions on Ultrasound (Dual Attention DenseNet-121)

Link

https://pubmed.ncbi.nlm.nih.gov/37795135/

Indexing

Keywords: Ultrasound, Breast, Diagnosis, Breast Cancer, Deep Learning, Ultrasonography
Content: BR, US
RadLex: RID39055, RID10904, RID10909, RID10321

Author(s)

Huiling Xiang
Xi Wang
Min Xu
Yuhua Zhang
Shue Zeng
Chunyan Li
Lixian Liu
Tingting Deng
Guoxue Tang
Cuiju Yan
Jinjing Ou
Qingguang Lin
Jiehua He
Peng Sun
Anhua Li
Hao Chen
Pheng-Ann Heng
Xi Lin

Organization(s)

Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Guangzhou, China
Zhejiang Laboratory, Hangzhou, China
Department of Radiation Oncology, Stanford University School of Medicine, Palo Alto, California, USA
Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
The First Affiliated Hospital, College of Medicine, Zhejiang University, Hangzhou, China
Key Laboratory of Precision Diagnosis and Treatment for Hepatobiliary and Pancreatic Tumor of Zhejiang Province, Hangzhou, China
The Third People's Hospital of Zhengzhou, Cancer Hospital of Henan University, Zhengzhou, China
Hubei Cancer Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
The Hong Kong University of Science and Technology, Hong Kong, China

Version

1.0

Contact

linxi@sysucc.org.cn

Funding

Supported by the National Natural Science Foundation of China (grant no. 82171955) and the Natural Science Foundation of Guangdong Province (grant no. 2021A1515012476).

Ethical review

Ethical approval was obtained from all participating hospitals; written informed consent was waived as the study presented no more than minimal risk.

Date

Updated: 2023-09-01
Published: 2023-07-12
Created: 2011-03-01

References

[1] Xiang H, Wang X, Xu M, et al. "Deep Learning-assisted Diagnosis of Breast Lesions on US Images: A Multivendor, Multicenter Study." Radiology: Artificial Intelligence. 2023 Sep;5(5):e220185. doi:10.1148/ryai.220185. PMID: 37795135. PMCID: PMC10546363.

Model

Architecture

Dual Attention DenseNet-121 convolutional neural network (DenseNet-121 backbone with position and channel attention modules; binary classification).
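The channel-attention computation used by dual-attention designs can be sketched as follows. This is a minimal NumPy illustration, assuming the common formulation (channel affinity via a Gram matrix, softmax normalization, and a learnable residual scale `gamma`); the authors' exact module may differ in detail.

```python
import numpy as np

def channel_attention(x, gamma=1.0):
    """Sketch of a channel-attention module on a flattened feature map.

    x: feature map of shape (C, N), where N = H * W.
    gamma: learnable residual scale (a scalar here for simplicity).
    Returns a reweighted feature map of the same shape.
    """
    energy = x @ x.T                                        # (C, C) channel affinity
    energy = energy - energy.max(axis=-1, keepdims=True)    # softmax stability
    attn = np.exp(energy)
    attn = attn / attn.sum(axis=-1, keepdims=True)          # row-wise softmax
    out = attn @ x                                          # mix channels by attention
    return gamma * out + x                                  # residual connection

# Toy example: 4 channels over a 3x3 spatial map, flattened to (4, 9).
feat = np.random.default_rng(0).normal(size=(4, 9))
out = channel_attention(feat)
```

With `gamma = 0` the module reduces to the identity, which is how such residual attention blocks are typically initialized.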

Availability

Code available at https://github.com/vancywx/Ultrasound-breast-cancer-diagnosis

Clinical benefit

Assists radiologists in discriminating malignant from benign breast lesions on ultrasound; improves diagnostic accuracy and interobserver agreement; reduces false-positive readings.

Clinical workflow phase

Clinical decision support during image interpretation.

Decision threshold

Operating points were selected by maximizing the Youden index on the validation set. Two reported examples: a low-threshold operating point achieved 100% NPV with 18.6% specificity, and a high-threshold operating point achieved 86.4% PPV with 89.9% sensitivity.
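Youden-index threshold selection can be sketched as below. This is an illustrative stdlib-only implementation of J = sensitivity + specificity − 1, not the paper's exact procedure; function and variable names are ours.

```python
def youden_threshold(scores, labels):
    """Pick the decision threshold that maximizes the Youden index.

    scores: predicted malignancy probabilities.
    labels: ground truth, 1 = malignant, 0 = benign (both classes required).
    Returns (best_threshold, best_J).
    """
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):                 # candidate thresholds
        tp = sum(s >= t and y == 1 for s, y in zip(scores, labels))
        fn = sum(s < t and y == 1 for s, y in zip(scores, labels))
        tn = sum(s < t and y == 0 for s, y in zip(scores, labels))
        fp = sum(s >= t and y == 0 for s, y in zip(scores, labels))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

t, j = youden_threshold([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1])  # perfectly separable toy case
```

On the separable toy data the selected threshold is 0.8 with J = 1 (sensitivity and specificity both 100%).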

Degree of automation

Assistive decision support providing malignancy probability and classification; radiologist-in-the-loop.

Indications for use

Diagnosis to discriminate malignant versus benign breast lesions in patients undergoing diagnostic breast ultrasound in hospital settings across multiple vendors; applicable to static B-mode and color Doppler images.

Input

Static B-mode and color Doppler ultrasound images (multiview; typically two orthogonal images per examination).

Instructions

Provide multiview B-mode and color Doppler images per lesion; the model outputs image-level probabilities, which are averaged to a lesion-level probability. In the reader study, readers first interpreted images unaided and then with access to the model-predicted malignancy probability.
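The lesion-level aggregation described above can be sketched in a few lines. The averaging step follows the described pipeline; the 0.5 threshold here is a placeholder, not one of the study's reported operating points.

```python
def lesion_probability(image_probs, threshold=0.5):
    """Aggregate multiview image-level probabilities to a lesion-level call.

    image_probs: per-image predicted malignancy probabilities for one lesion.
    threshold: illustrative decision cutoff (placeholder value).
    Returns (lesion_probability, "malignant" | "benign").
    """
    p = sum(image_probs) / len(image_probs)      # mean over views
    return p, ("malignant" if p >= threshold else "benign")

prob, call = lesion_probability([0.9, 0.7])     # two orthogonal views
```

For the two-view example the lesion-level probability is the mean, 0.8, classified as malignant under the placeholder cutoff.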

Limitations

Retrospective diagnostic cohort with high malignancy rate; requires prospective screening validation. Limited follow-up for biopsy-proven high-risk lesions. Readers interpreted static images, not dynamic videos; model performance on video not assessed. Included both mass and non-mass lesions with varied appearances; subgroup performance for non-mass lesions requires further study. Some cancers correctly identified by readers were downgraded when DL was added (average 0.84% of lesions).

Output

CDEs: RDE2074, RDE2077
Description: Image-level and lesion-level predicted probability of malignancy and binary classification (benign vs malignant).

Recommendation

Use as an assistive tool to support breast ultrasound diagnosis, particularly beneficial for novice readers to improve accuracy and agreement and to reduce false positives; final decisions remain with radiologists.

Reproducibility

Implemented in PyTorch 1.9.0 with DenseNet-121 pretrained on ImageNet; trained with Adam (lr=1e-4, β1=0.9, β2=0.999), step decay by 0.1 every 2500 iterations, batch size 12, binary cross-entropy; data augmentation with random horizontal/vertical flips (p=0.5); trained/evaluated on NVIDIA TITAN Xp GPU; lesion-level scores computed by averaging multiview image probabilities.
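The learning-rate schedule stated above (step decay by 0.1 every 2500 iterations from a base of 1e-4) can be written out directly; this mirrors PyTorch's `StepLR` behavior as a plain function.

```python
def step_decay_lr(iteration, base_lr=1e-4, gamma=0.1, step=2500):
    """Step-decay schedule: lr is multiplied by gamma every `step` iterations."""
    return base_lr * gamma ** (iteration // step)

lr0 = step_decay_lr(0)        # 1e-4 for iterations 0..2499
lr1 = step_decay_lr(2500)     # 1e-5 for iterations 2500..4999
lr2 = step_decay_lr(5000)     # 1e-6 thereafter, and so on
```

In PyTorch this corresponds to `torch.optim.lr_scheduler.StepLR(optimizer, step_size=2500, gamma=0.1)` stepped once per iteration.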

Sustainability

Single-GPU training/inference reported (NVIDIA TITAN Xp); no energy use or runtime benchmarks provided.

Use

Intended: Decision support, Detection and diagnosis
Out-of-scope: Detection, Other, Noise reduction
Excluded: Diagnosis

User

Intended: Radiology technologist, Radiologist
Out-of-scope: Patient
Excluded: Other