Research Article

Prostate gland segmentation on prostate magnetic resonance images: An artificial intelligence study using a U-net-based convolutional neural network

Year 2026, Volume: 65 Issue: 1, 107 - 113, 09.03.2026
https://doi.org/10.19161/etd.1755224
https://izlik.org/JA78BZ48FS

Abstract

Aim: The aim of this study is to automatically segment the prostate gland, transition zone (TZ), and peripheral zone (PZ) on prostate Magnetic Resonance Imaging (MRI) using a U-net-based convolutional neural network (CNN).
Materials and Methods: This retrospective study included 100 patients who underwent screening with a 1.5T MRI device between January and December 2020. The acquired images were evaluated by a senior radiology resident and converted to NIfTI format using the MedSeg.ai platform. Prostate and TZ masks were traced manually, while the remaining area (PZ) was segmented automatically by subtracting the TZ mask from the prostate mask. A U-net-based CNN algorithm with 7 depth layers was developed. Data from 80 patients were used for training, with 10 randomly selected for validation; data from the remaining 20 patients were used for testing. Evaluation metrics applied to the test set included accuracy, mean and median Dice Similarity Coefficient (DSC), mean Hausdorff Distance (HSD), Mean Surface Distance (MSD), and mean Relative Absolute Volume (RAV).
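The mask-subtraction step and the DSC metric described above can be sketched as follows. This is a minimal NumPy illustration; the function names and the toy 4x4 masks are our own, not taken from the study's implementation:

```python
import numpy as np

def derive_pz_mask(prostate_mask, tz_mask):
    """Peripheral zone = whole-gland mask minus transition-zone mask."""
    return np.logical_and(prostate_mask.astype(bool), ~tz_mask.astype(bool))

def dice_coefficient(pred, truth):
    """Dice Similarity Coefficient: 2|A intersect B| / (|A| + |B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 4x4 slice: a 12-voxel "prostate" containing a 4-voxel "TZ"
prostate = np.array([[0, 1, 1, 0],
                     [1, 1, 1, 1],
                     [1, 1, 1, 1],
                     [0, 1, 1, 0]])
tz = np.array([[0, 0, 0, 0],
               [0, 1, 1, 0],
               [0, 1, 1, 0],
               [0, 0, 0, 0]])
pz = derive_pz_mask(prostate, tz)  # the remaining 8 ring voxels
```

In practice the same logical subtraction is applied slice by slice (or to the full 3D volume), which is why no separate manual tracing of the PZ is needed.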
Results: Mean DSCs of 0.91 ± 0.03, 0.87 ± 0.06, and 0.70 ± 0.16 and median DSCs of 0.92, 0.90, and 0.75 were obtained for prostate gland, TZ, and PZ segmentation, respectively. For the same structures, mean HSD was 8.58, 9.52, and 18.78; MSD was 0.92, 0.84, and 1.30; and mean RAV was 3.51, 9.87, and 70.57.
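For reference, HSD and RAV can be computed from binary masks as below. This is a pure-NumPy sketch under our own assumptions (the study does not describe its implementation, and distances here are in voxel units rather than millimeters):

```python
import numpy as np

def hausdorff_distance(pred, truth):
    """Symmetric Hausdorff distance between foreground voxels of two masks."""
    p = np.argwhere(pred.astype(bool)).astype(float)
    t = np.argwhere(truth.astype(bool)).astype(float)
    # Pairwise Euclidean distances between all foreground-voxel coordinates
    d = np.sqrt(((p[:, None, :] - t[None, :, :]) ** 2).sum(-1))
    # Max over each direction of the nearest-neighbor distances
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def relative_absolute_volume(pred, truth):
    """Relative absolute volume difference, in percent of the reference volume."""
    return 100.0 * abs(int(pred.sum()) - int(truth.sum())) / truth.sum()

a = np.zeros((5, 5)); a[1:4, 1:4] = 1  # 9-voxel reference square
b = np.zeros((5, 5)); b[1:4, 1:3] = 1  # 6-voxel prediction missing the right column
print(hausdorff_distance(b, a))        # 1.0: farthest missed voxel is one step away
```

The brute-force pairwise matrix is fine for small examples; production pipelines typically use distance transforms or surface meshes for full MRI volumes.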
Conclusion: The developed U-net algorithm segmented the prostate and TZ more successfully than reported in previous studies. The lower success rate in PZ segmentation can be attributed to several factors also noted in state-of-the-art deep learning work. This study highlights AI's promising role in automating prostate segmentation.

Ethical Statement

This study was performed in line with the principles of the Declaration of Helsinki. Approval was granted by the Ethics Committee of University of Health Sciences İzmir Bozyaka Education and Research Hospital (Date: 12.08.2021, No: E-48865165-302.14.01-10126).

Supporting Institution

None

Project Number

None

Thanks

None

References

  • Siegel RL, Miller KD, Fuchs HE, Jemal A. Cancer statistics, 2022. CA Cancer J Clin. 2022 Jan;72(1):7–33.
  • Das CJ, Razik A, Sharma S, Verma S. Prostate biopsy: when and how to perform. Clin Radiol. 2019 Nov 1;74(11):853–64.
  • Yacoub JH, Oto A, Miller FH. MR Imaging of the prostate. Radiol Clin North Am. 2014;52(4):811–37.
  • Hoeks CMA, Barentsz JO, Hambrock T, Yakar D, Somford DM, Heijmink SWTPJ, et al. Prostate cancer: Multiparametric MR imaging for detection, localization, and staging. Radiology. 2011;261(1):46–66.
  • Durmus T, Baur A, Hamm B. Multiparametric magnetic resonance imaging in the detection of prostate cancer. Aktuelle Urol. 2014 Mar;45(2):119–26.
  • Drost FJH, Osses DF, Nieboer D, Steyerberg EW, Bangma CH, Roobol MJ, et al. Prostate MRI, with or without MRI-targeted biopsy, and systematic biopsy for detecting prostate cancer. Cochrane Database of Systematic Reviews. 2019.
  • Schoots IG, Roobol MJ, Nieboer D, Bangma CH, Steyerberg EW, Hunink MGM. Magnetic resonance imaging-targeted biopsy may enhance the diagnostic accuracy of significant prostate cancer detection compared to standard transrectal ultrasound-guided biopsy: a systematic review and meta-analysis. Eur Urol. 2015 Sep 1;68(3):438–50.
  • Kasivisvanathan V, Rannikko AS, Borghi M, Panebianco V, Mynderse LA, Vaarala MH, et al. MRI-Targeted or Standard Biopsy for Prostate-Cancer Diagnosis. N Engl J Med. 2018 Mar 18;378(19).
  • Fiard G, Hohn N, Descotes JL, Rambeaud JJ, Troccaz J, Long JA. Targeted MRI-guided prostate biopsies for the detection of prostate cancer: Initial clinical experience with real-time 3-dimensional transrectal ultrasound guidance and magnetic resonance/transrectal ultrasound image fusion. Urology. 2013 Jun 1;81(6):1372–8.
  • Pasquier D, Lacornerie T, Vermandel M, Rousseau J, Lartigau E, Betrouni N. Automatic segmentation of pelvic structures from magnetic resonance images for prostate cancer radiotherapy. Int J Radiat Oncol Biol Phys. 2007 Jun 1;68(2):592–600.
  • Toth R, Bloch BN, Genega EM, Rofsky NM, Lenkinski RE, Rosen MA, et al. Accurate prostate volume estimation using multifeature active shape models on T2-weighted MRI. Acad Radiol. 2011 Jun;18(6):745–54.
  • Zhu Y, Wei R, Gao G, Ding L, Zhang X, Wang X, et al. Fully automatic segmentation on prostate MR images based on cascaded fully convolution network. J Magn Reson Imaging. 2019 Apr 1;49(4):1149–56.
  • To MNN, Vu DQ, Turkbey B, Choyke PL, Kwak JT. Deep convolutional neural network for prostate MR segmentation. Int J Comput Assist Radiol Surg. 2018 Nov 1;13(11):1687.
  • Ronneberger O, Fischer P, Brox T. U-Net: Convolutional Networks for Biomedical Image Segmentation. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). 2015;9351:234–41.
  • Scott R, Misser SK, Cioni D, Neri E. PI-RADS v2.1: What has changed and how to report. SA J Radiol. 2021;25(1):1–13.
  • Clark T, Zhang J, Baig S, Wong A, Haider MA, Khalvati F. Fully automated segmentation of prostate whole gland and transition zone in diffusion-weighted MRI using convolutional neural networks. J Med Imaging (Bellingham). 2017 Oct 17;4(4):1.


Details

Primary Language English
Subjects Radiology and Organ Imaging, Urology
Journal Section Research Article
Authors

Başak Ünverdi 0000-0002-5875-2964

Mehmet Akif Özdemir 0000-0002-8758-113X

Aytuğ Onan 0000-0002-9434-5880

Elif Aylin Yüce Yörük 0000-0001-9970-9455

Türker Acar 0000-0002-9060-2691

Project Number None
Submission Date August 1, 2025
Acceptance Date December 4, 2025
Publication Date March 9, 2026
DOI https://doi.org/10.19161/etd.1755224
IZ https://izlik.org/JA78BZ48FS
Published in Issue Year 2026 Volume: 65 Issue: 1

Cite

Vancouver 1.Başak Ünverdi, Mehmet Akif Özdemir, Aytuğ Onan, Elif Aylin Yüce Yörük, Türker Acar. Prostate gland segmentation on prostate magnetic resonance images: An artificial intelligence study using a U-net-based convolutional neural network. EJM. 2026 Mar. 1;65(1):107-13. doi:10.19161/etd.1755224

Ege Journal of Medicine enables the sharing of articles according to the Attribution-Non-Commercial-Share Alike 4.0 International (CC BY-NC-SA 4.0) license.