WISDOM: Weakly supervISed model DevelOpment fraMework for lymph node diagnosis in rectal cancer at MRI
model · 2025-11-30 · https://doi.org/10.1148/atlas.1764460430494

Overview

Schema Version

https://atlas.rsna.org/schemas/2025-11/model.json

Name

WISDOM: Weakly supervISed model DevelOpment fraMework for lymph node diagnosis in rectal cancer at MRI

Link

https://dx.doi.org/10.1148/ryai.230152

Indexing

Keywords: WISDOM, weakly supervised learning, multiple-instance learning, rectal cancer, lymph node diagnosis, MRI, T2-weighted, DWI, ADC, N staging
Content: GI, MR, OI
RadLex: RID12698, RID163, RID28919, RID10799, RID39275

Author(s)

Wei Xia
Dandan Li
Wenguang He
Perry J. Pickhardt
Junming Jian
Rui Zhang
Junjie Zhang
Ruirui Song
Tong Tong
Xiaotang Yang
Xin Gao
Yanfen Cui

Organization(s)

Department of Medical Imaging, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
Shanxi Province Cancer Hospital/Shanxi Hospital Affiliated to Cancer Hospital, Chinese Academy of Medical Sciences/Cancer Hospital Affiliated to Shanxi Medical University, Taiyuan, China
Department of Radiology, The First Affiliated Hospital, Zhejiang University School of Medicine, Hangzhou, China
Department of Radiology, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin, USA
Department of Radiology, Fudan University Shanghai Cancer Center, Shanghai, China
Department of Oncology, Shanghai Medical College, Fudan University, Shanghai, China
Guangdong Provincial Key Laboratory of Artificial Intelligence in Medical Image Analysis and Application, Guangzhou, China

Version

1.0

License

Text: CC BY 4.0
URL: https://creativecommons.org/licenses/by/4.0/

Contact

yanfen210@126.com

Funding

National Natural Science Foundation of China (81871439, 82001789, 82171923, 81971687, 82271946); Key Research and Development Program of Shandong Province (2021SFGC0104); Key Research and Development Program of Jiangsu Province (BE2021663); Suzhou Science and Technology Plan Project (SJC2021014); China Postdoctoral Science Foundation (2021M700897); Applied Basic Research Projects of Shanxi Province, Outstanding Youth Foundation (202103021222014); Suzhou Association for Science and Technology Youth Science and Technology Talent Support Project; Taishan Industrial Experts Program (tscx202312131).

Ethical review

Institutional review board approval and waiver of informed consent were obtained for this retrospective multicenter study.

Date

Published: 2024-02-14
Created: 2024-01-24

References

[1] Xia W, Li D, He W, Pickhardt PJ, Jian J, Zhang R, Zhang J, Song R, Tong T, Yang X, Gao X, Cui Y. "Multicenter Evaluation of a Weakly Supervised Deep Learning Model for Lymph Node Diagnosis in Rectal Cancer at MRI". Radiology: Artificial Intelligence. 2024;6(2):e230152. 2024-02-14. doi:10.1148/ryai.230152. PMID: 38353633. PMCID: PMC10982819.

Model

Architecture

Weakly supervised deep learning framework combining (1) a ResNet-based intensity diagnostic network with Grad-CAM visualizations and (2) a multilayer perceptron–based integrated diagnostic network that fuses the intensity-based metastatic probability with LN size features and ADC values. Weak supervision is applied via multiple-instance learning (global max pooling) and learning from label proportions (global average pooling).
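
The two weak-supervision pooling schemes named above can be sketched as follows. This is an illustrative sketch, not the authors' code: `mil_bag_probability` and `llp_bag_proportion` are hypothetical helper names, and the per-LN probabilities are assumed to come from the intensity diagnostic network.

```python
# Illustrative sketch of the two pooling schemes used for weak supervision,
# assuming per-LN metastatic probabilities p_i in [0, 1] from the network.
import numpy as np

def mil_bag_probability(node_probs):
    """Multiple-instance learning: a patient is positive if any LN is
    metastatic, so global max pooling yields the patient-level probability."""
    return float(np.max(node_probs))

def llp_bag_proportion(node_probs):
    """Learning from label proportions: global average pooling estimates the
    fraction of metastatic LNs, which can be trained against the pathologic
    proportion (metastatic / total resected LNs)."""
    return float(np.mean(node_probs))

probs = np.array([0.1, 0.8, 0.3])
print(mil_bag_probability(probs))  # 0.8
print(llp_bag_proportion(probs))   # ~0.4
```

Both quantities are differentiable in the per-node probabilities, which is what lets patient-level pathologic labels supervise per-node predictions without node-level annotations.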

Availability

Open-source code: https://github.com/xiawei999000/WISDOM

Clinical benefit

Assists preoperative lymph node diagnosis and N staging in rectal cancer at MRI; significantly improves radiologist performance, especially for junior readers.

Clinical workflow phase

Clinical decision support systems for preoperative staging and treatment planning.

Degree of automation

Model generates per-LN metastatic probabilities and patient-level staging automatically from MRI-derived LN patches/features; can be combined with a fully automated LN detection/segmentation model (auto-LNDS) for end-to-end automation.

Indications for use

Preoperative assessment of lymph node metastasis and N stage (N0/N1/N2) in patients with pathology-confirmed rectal adenocarcinoma undergoing pelvic MRI (T2-weighted and DWI/ADC) within 1 week prior to total mesorectal excision in hospital imaging settings.

Input

Axial T2-weighted MRI and diffusion-weighted imaging with derived ADC maps; LN masks/patches and automatically computed LN size features (long/short axis, ratio) and mean ADC. Training uses patient-level pathologic labels (number of metastatic and total resected LNs).
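
A minimal sketch of deriving the size features and mean ADC named above from a binary LN mask. This is not the authors' implementation: `ln_features` is a hypothetical helper, the axis estimate is a crude PCA-based approximation, and a 2D axial mask with isotropic in-plane spacing is assumed.

```python
# Hedged sketch (not the published code): LN size features and mean ADC
# from a binary 2D mask and a co-registered ADC map on the same grid.
import numpy as np

def ln_features(mask, adc_map, spacing_mm=1.0):
    ys, xs = np.nonzero(mask)
    coords = np.stack([ys, xs], axis=1).astype(float)
    coords -= coords.mean(axis=0)
    # PCA of voxel coordinates: eigenvalues of the covariance give the
    # spread along the principal axes (illustrative approximation only).
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(coords.T)))[::-1]
    long_axis = 4.0 * np.sqrt(eigvals[0]) * spacing_mm   # ~2 SD per side
    short_axis = 4.0 * np.sqrt(eigvals[1]) * spacing_mm
    ratio = short_axis / long_axis if long_axis > 0 else 0.0
    mean_adc = float(adc_map[mask > 0].mean())
    return long_axis, short_axis, ratio, mean_adc
```

For a roughly isotropic node the short/long axis ratio approaches 1; elongated (often benign-appearing) nodes give a lower ratio.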

Instructions

Provide preprocessed axial T2-weighted and DWI/ADC MRI with LN masks/patches (e.g., from auto-LNDS or manual review). Review model-predicted per-LN probabilities and heatmaps; incorporate patient-level binary and ternary N staging outputs during interpretation.

Limitations

No per-node pathologic reference standard; small metastatic LNs may be missed owing to limited MRI resolution, lowering sensitivity; the model did not assess proximity of suspicious LNs to the mesorectal fascia; rare tumor types (e.g., mucinous adenocarcinoma, signet ring cell carcinoma) were not included; the model was trained only on patients undergoing TME and only on mesorectal LNs, so it does not evaluate nonregional LNs or posttreatment follow-up.

Output

CDEs: RDE2630, RDE114, RDE2034, RDE2077
Description: Per–lymph node metastatic probability with attention heatmaps; patient-level binary N status (LNM vs non-LNM), ternary N stage (N0/N1/N2), and estimated number of metastatic LNs.
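
The mapping from estimated metastatic-LN count to the ternary N stage reported above can be sketched using the standard AJCC TNM cutoffs for rectal cancer (N0: 0 positive nodes; N1: 1–3; N2: ≥4). The function name is illustrative, not from the published code.

```python
# Illustrative sketch: ternary N stage from the model's estimated number
# of metastatic LNs, using standard AJCC TNM cutoffs for rectal cancer.
def n_stage(num_metastatic_lns: int) -> str:
    if num_metastatic_lns == 0:
        return "N0"
    if num_metastatic_lns <= 3:
        return "N1"
    return "N2"

print([n_stage(n) for n in (0, 2, 5)])  # ['N0', 'N1', 'N2']
```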

Recommendation

Use as adjunct to radiologist interpretation for MRI-based LN staging in rectal cancer; demonstrated improvement in accuracy for both junior and senior radiologists.

Reproducibility

Multicenter evaluation with internal and two external test cohorts (three centers; total n=1014). Code publicly available for replication; image preprocessing steps and training procedures described in supplemental material.

Use

Intended: Decision support, Detection and diagnosis
Out-of-scope: Prognosis, Detection and diagnosis
Excluded: Detection and diagnosis

User

Intended: Physician, Radiologist, Subspecialist diagnostic radiologist
Out-of-scope: Layperson
Excluded: Patient, Layperson