LION abdominal MRI multiorgan nnU-Net segmentation model
2025-11-23
https://doi.org/10.1148/atlas.1763916894204

Overview

Schema Version

https://atlas.rsna.org/schemas/2025-11/model.json

Name

LION abdominal MRI multiorgan nnU-Net segmentation

Link

https://pubs.rsna.org/doi/10.1148/ryai.230471

Indexing

Keywords: nnU-Net, 3D U-Net, chemical shift–encoded MRI, water–fat MRI, PDFF, visceral adipose tissue, subcutaneous adipose tissue, liver, psoas muscle, erector spinae, body composition, segmentation, weight loss intervention
Content: MR, GI
RadLex: RID30192, RID78, RID7780, RID29380, RID50162, RID28832, RID50366, RID2625
SNOMED: 414916001

Author(s)

Arun Somasundaram
Mingming Wu
Anna Reik
Selina Rupp
Jessie Han
Stella Naebauer
Daniela Junker
Lisa Patzelt
Meike Wiechert
Yu Zhao
Daniel Rueckert
Hans Hauner
Christina Holzapfel
Dimitrios C. Karampinos

Organization(s)

Technical University of Munich
Imperial College London
Fulda University of Applied Sciences
Else Kröner Fresenius Center for Nutritional Medicine, School of Medicine, TUM
Munich Institute of Biomedical Engineering, TUM
Munich Data Science Institute, TUM

Version

1.0

License

Text: CC BY 4.0
URL: https://creativecommons.org/licenses/by/4.0/

Contact

mingming.wu@tum.de

Funding

German Federal Ministry of Education and Research (grant no. 01EA1709, PeNut/enable; IMaGENE grant no. 16DKWN075); German Research Foundation (project no. 450799851 and 455422993/FOR 5298-iMAGO-P1); support to institution for D.R. from the German Federal Ministry of Education and Research, the European Research Council, and the Alexander von Humboldt Foundation.

Ethical review

Study protocol approved by the ethical committee of the Technical University of Munich (project no. 69/19S). Written informed consent obtained from all participants.

Date

Published: 2024-05-29

References

[1] Somasundaram A, Wu M, Reik A, et al. Evaluating Sex-specific Differences in Abdominal Fat Volume and Proton Density Fat Fraction at MRI Using Automated nnU-Net–based Segmentation. Radiology: Artificial Intelligence. 2024;6(4):e230471. doi:10.1148/ryai.230471. PMID: 38809148. PMCID: PMC11294970.

Model

Architecture

3D full-resolution ("3d_fullres") nnU-Net (a self-configuring U-Net–style network) implemented in PyTorch 1.7.0; trained with water, fat, and T2* input channels.
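For orientation, a minimal sketch of what an nnU-Net (v1) dataset.json for this channel and label layout could look like; the task name and label indices here are illustrative assumptions, not taken from the linked repositories.

```python
import json

# Illustrative sketch only: task name and label order are assumptions;
# consult the linked repositories for the actual configuration files.
dataset = {
    "name": "LION_abdominal_MRI",  # hypothetical task name
    "tensorImageSize": "3D",
    "modality": {"0": "water", "1": "fat", "2": "T2star"},
    "labels": {
        "0": "background",
        "1": "subcutaneous_adipose_tissue",
        "2": "visceral_adipose_tissue",
        "3": "liver",
        "4": "psoas_muscles",
        "5": "erector_spinae_muscles",
    },
}

print(json.dumps(dataset, indent=2))
```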

Availability

Code and model results are available at https://github.com/BMRRgroup/lion-abd-seg-nnunet and https://github.com/BMRRgroup/lion-abd-seg-3dunet

Clinical benefit

Automated multiorgan segmentation on quantitative water–fat MRI enabling extraction of organ volumes and mean PDFF for body composition analysis and evaluation of sex-specific differences in obesity and during weight loss interventions.

Degree of automation

Fully automated segmentation (nnU-Net) after training; no manual interaction required for inference.

Indications for use

Quantitative analysis of abdominal and pelvic organ volumes (VAT, SAT, liver, psoas and erector spinae muscles) and mean PDFF from chemical shift–encoded MRI in adults with obesity, including assessment before and after dietary weight loss intervention; research setting.

Input

Water and fat images derived from abdominal/pelvic chemical shift–encoded (CSE) MRI; optionally, background-removed T2* maps as a third input channel.
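nnU-Net identifies input channels by a four-digit filename suffix; under the channel order implied above (water, fat, T2*), case files could be named as in this sketch (the case ID and helper function are made up for illustration):

```python
# Hypothetical sketch of nnU-Net's input channel naming convention:
# each channel of a case gets a zero-padded four-digit suffix.
CHANNELS = {"water": 0, "fat": 1, "t2star": 2}

def nnunet_filenames(case_id: str) -> dict:
    """Map each input channel to the file name nnU-Net expects."""
    return {name: f"{case_id}_{idx:04d}.nii.gz"
            for name, idx in CHANNELS.items()}

print(nnunet_filenames("LION_001"))
```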

Limitations

Development and testing were conducted on single-institution data acquired on a 3-T Philips system with six-echo CSE MRI; external generalizability was not assessed. Performance may be limited on images with different MR contrast (e.g., T1-weighted two-point Dixon), as shown in the supplemental materials. Training data mainly covered a BMI range of 26–40 kg/m²; segmentation at and beyond the extremes of this range is not fully characterized. Ground truth label variability and partial volume effects, especially at VAT borders and in fat-infiltrated erector spinae muscle, affect PDFF estimates.

Output

CDEs: RDE1220, RDE1644, RDE1955, RDE483, RDE1222
Description: Segmentation masks for visceral adipose tissue (VAT), subcutaneous adipose tissue (SAT), liver, psoas muscles, and erector spinae muscles; derived organ volumes and mean PDFF per segmented organ.
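The derived quantities are simple reductions over each segmentation mask: voxel count times voxel volume for organ volume, and the average PDFF over in-mask voxels. A pure-Python sketch over flat voxel lists (function names are ours, not the repository's):

```python
def organ_volume_ml(mask, voxel_volume_mm3):
    """Organ volume in millilitres from a flat binary mask (1 = inside organ)."""
    return sum(mask) * voxel_volume_mm3 / 1000.0

def mean_pdff_percent(mask, pdff_map):
    """Mean PDFF (%) over the voxels inside the mask."""
    inside = [p for m, p in zip(mask, pdff_map) if m]
    return sum(inside) / len(inside)

# Toy example: four 2 x 2 x 2 mm voxels (8 mm^3 each), three inside the organ.
mask = [1, 1, 0, 1]
pdff = [10.0, 20.0, 90.0, 30.0]
print(organ_volume_ml(mask, 8.0), mean_pdff_percent(mask, pdff))
```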

Reproducibility

Fivefold cross-validation on 103 MRI datasets from 67 participants (83 training/validation, 20 test) with a participant-wise split across time points; code repositories are provided; performance reported with Dice coefficients and interrater Bland–Altman analyses.
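A participant-wise split assigns folds per participant, not per scan, so that baseline and follow-up scans of the same person never straddle the train/test boundary. A minimal sketch (the round-robin assignment is our simplification, not the study's exact procedure):

```python
def participant_wise_folds(scan_to_participant, n_folds=5):
    """Group scans into folds so all scans of a participant share one fold."""
    # Assign participants (not scans) round-robin to folds.
    participants = sorted(set(scan_to_participant.values()))
    fold_of = {p: i % n_folds for i, p in enumerate(participants)}
    folds = [[] for _ in range(n_folds)]
    for scan, participant in scan_to_participant.items():
        folds[fold_of[participant]].append(scan)
    return folds

# Toy example: three participants, two of them with two time points.
scans = {"P1_t0": "P1", "P1_t1": "P1",
         "P2_t0": "P2", "P2_t1": "P2",
         "P3_t0": "P3"}
print(participant_wise_folds(scans, n_folds=2))
```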

Sustainability

Approximate training time: nnU-Net, ~67 hours per fold for 500 epochs on NVIDIA Quadro P6000/Titan Xp GPUs; 3D U-Net, ~12 hours per fold on the same hardware.

Use

Intended: Image segmentation
Out-of-scope: Decision support, Image processing
Excluded: Other

User

Intended: Radiologist, Researcher
Out-of-scope: Layperson
Excluded: Patient, Layperson