U-Net-based transfer learning for automated tumour segmentation enabling fully automated [18F]F-DOPA PET analysis in paediatric gliomas
Bianconi, Andrea; Morana, Giovanni
2026-01-01
Abstract
Background: PET imaging with [18F]F-DOPA shows great promise for assessing paediatric gliomas, but manual tumour delineation and parameter extraction are time-consuming and prone to inter-operator variability. Methods: We evaluated whether a deep learning model, leveraging transfer learning from adult glioma datasets, could enable a fully automated pipeline for tumour segmentation and PET parameter extraction. Static and dynamic parameters were compared across three approaches: (i) automatic vs. semi-automatic, (ii) automatic vs. manual, and (iii) manual vs. semi-automatic. Data from 103 paediatric patients (median age 11 years; 54 females, 49 males) with static and/or dynamic [18F]F-DOPA PET scans (2011–2024) were retrospectively included for fine-tuning the deep learning model. Statistical and survival analyses were performed on 90 subjects; dynamic analysis included 32 patients. Results: The best model achieved a Dice score of 0.82 ± 0.11 and was integrated into the pipeline for extracting static and dynamic indices. The automatic Tumour-to-Striatum ratio showed high reproducibility across comparisons ((i) p = 0.660, (ii) p = 0.342, (iii) p = 0.639), while the Tumour-to-Background ratio differed significantly when comparing manual delineations (p < 0.01). Dynamic parameters demonstrated good reproducibility with the automatic method (p > 0.05). Importantly, both automated static indices correlated significantly with tumour grade and with overall and progression-free survival (p < 0.05). Conclusions: Transfer learning enabled a fully automatic [18F]F-DOPA PET pipeline for paediatric gliomas, providing reproducible static and dynamic parameter extraction that correlates with clinically relevant outcomes. This approach reduces operator dependence and streamlines analysis, supporting potential integration into routine clinical practice.
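For reference, the Dice score reported for the segmentation model is the standard overlap metric between a predicted and a ground-truth binary mask. A minimal sketch of its computation (not the authors' code; the toy masks below are purely illustrative):

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient: 2*|A ∩ B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement by convention
    return 2.0 * intersection / denom

# Toy 2D masks standing in for tumour segmentations
pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
truth = np.array([[1, 0, 0],
                  [0, 1, 1]])
print(round(dice_score(pred, truth), 3))  # → 0.667
```

A Dice of 1.0 means the automatic and manual delineations coincide exactly; 0.82 ± 0.11, as reported, indicates strong but imperfect voxel-wise overlap.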
| File | Access | Size | Format | |
|---|---|---|---|---|
| s40708-026-00296-z.pdf | Open access | 3.63 MB | Adobe PDF | View/Open |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.



