Semi-Supervised Learning Based Cascaded Pocket U-Net for Organ and Pan-Cancer Segmentation in Abdomen CT

  • Conference paper in: Fast, Low-resource, and Accurate Organ and Pan-cancer Segmentation in Abdomen CT (FLARE 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14544)

Abstract

In clinical practice, CT scans are frequently employed as the primary imaging modality for detecting prevalent tumors arising from the abdominal organs. Simultaneous organ and pan-cancer segmentation in abdominal CT scans is therefore of significant value for reducing the workload of clinical practitioners. To make full use of partially labeled and unlabeled data, this work employs an iterative, semi-supervised training strategy based on pseudo labels. Furthermore, to reduce the model's parameter size and improve GPU efficiency, the proposed method is built upon the pocket U-Net architecture. The method is a cascade of two networks: first, a low-resolution pocket U-Net trained on the labeled data performs coarse segmentation, which is used to crop the image and reduce its dimensions; subsequently, a high-resolution pocket U-Net performs fine segmentation to precisely delineate organ and tumor regions. On the FLARE 2023 validation dataset, the proposed method achieves an average Dice similarity coefficient (DSC) of 88.94% for organs and 15.92% for tumors, along with normalized surface Dice (NSD) values of 93.31% for organs and 0.0816% for tumors, with a minimal parameter size. Furthermore, the average inference time is 82.61 s, with an average maximum GPU memory usage of 3560 MB. Code is available at https://github.com/wt812549723/FLARE2023_solution.
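
The cascade described above can be sketched as follows. This is a minimal, illustrative PyTorch-style example of the two-stage inference only (coarse localization at low resolution, then fine segmentation inside the cropped region of interest); the network names, scale factor, and margin are assumptions made for illustration and are not taken from the authors' released code, which should be consulted for the actual implementation.

    # Minimal sketch of the cascaded two-stage inference (illustrative only).
    # "coarse_net" and "fine_net" stand for the low- and high-resolution pocket
    # U-Nets; the scale factor and margin below are assumed values.
    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def cascaded_inference(ct, coarse_net, fine_net, coarse_scale=0.25, margin=8):
        """ct: (1, 1, D, H, W) float tensor; returns a (1, D, H, W) label map."""
        # Stage 1: low-resolution pocket U-Net produces a rough foreground mask.
        low = F.interpolate(ct, scale_factor=coarse_scale, mode="trilinear",
                            align_corners=False)
        coarse = coarse_net(low).argmax(dim=1, keepdim=True).float()
        coarse = F.interpolate(coarse, size=ct.shape[2:], mode="nearest")

        # Bounding box of the coarse prediction, expanded by a safety margin,
        # so that the fine network only processes the region of interest.
        idx = coarse[0, 0].nonzero()
        if idx.numel() == 0:                       # nothing detected: empty output
            return torch.zeros_like(ct[:, 0], dtype=torch.long)
        lo = (idx.min(dim=0).values - margin).clamp(min=0)
        hi = idx.max(dim=0).values + margin + 1
        roi = ct[:, :, lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]

        # Stage 2: high-resolution pocket U-Net delineates organs and tumors.
        fine = fine_net(roi).argmax(dim=1)

        # Paste the fine prediction back into a full-size label map.
        out = torch.zeros_like(ct[:, 0], dtype=torch.long)
        out[:, lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]] = fine
        return out

In the iterative semi-supervised setting described in the abstract, predictions produced in this way on unlabeled scans would serve as pseudo labels for the next training round.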

References

  1. Bilic, P., et al.: The liver tumor segmentation benchmark (LiTS). Med. Image Anal. 84, 102680 (2023)
  2. Celaya, A., et al.: PocketNet: a smaller neural network for medical image analysis. IEEE Trans. Med. Imaging 42(4), 1172–1184 (2023)
  3. Chen, X., Yuan, Y., Zeng, G., Wang, J.: Semi-supervised semantic segmentation with cross pseudo supervision. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2613–2622 (2021)
  4. Clark, K., et al.: The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository. J. Digit. Imaging 26(6), 1045–1057 (2013)
  5. He, R., Yang, J., Qi, X.: Re-distributing biased pseudo labels for semi-supervised semantic segmentation: a baseline investigation. In: Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), pp. 6930–6940 (2021)
  6. Heller, N., et al.: The state of the art in kidney and kidney tumor segmentation in contrast-enhanced CT imaging: results of the KiTS19 challenge. Med. Image Anal. 67, 101821 (2021)
  7. Heller, N., et al.: An international challenge to use artificial intelligence to define the state-of-the-art in kidney and kidney tumor segmentation in CT imaging. Proc. Am. Soc. Clin. Oncol. 38(6), 626–626 (2020)
  8. Huang, Z., et al.: Revisiting nnU-Net for iterative pseudo labeling and efficient sliding window inference. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 178–189. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_16
  9. Isensee, F., Jaeger, P.F., Kohl, S.A., Petersen, J., Maier-Hein, K.H.: nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nat. Methods 18(2), 203–211 (2021)
  10. Li, Y., Chen, J., Xie, X., Ma, K., Zheng, Y.: Self-loop uncertainty: a novel pseudo-label for semi-supervised medical image segmentation. In: Martel, A.L., et al. (eds.) MICCAI 2020, Part I. LNCS, vol. 12261, pp. 614–623. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-59710-8_60
  11. Ma, J., et al.: Loss odyssey in medical image segmentation. Med. Image Anal. 71, 102035 (2021)
  12. Ma, J., He, Y., Li, F., Han, L., You, C., Wang, B.: Segment anything in medical images. Nat. Commun. 15, 654 (2024)
  13. Ma, J., et al.: Fast and low-GPU-memory abdomen CT organ segmentation: the FLARE challenge. Med. Image Anal. 82, 102616 (2022)
  14. Ma, J., et al.: Unleashing the strengths of unlabeled data in pan-cancer abdominal organ quantification: the FLARE22 challenge. arXiv preprint arXiv:2308.05862 (2023)
  15. Ma, J., et al.: AbdomenCT-1K: is abdominal organ segmentation a solved problem? IEEE Trans. Pattern Anal. Mach. Intell. 44(10), 6695–6714 (2022)
  16. Pavao, A., et al.: CodaLab Competitions: an open source platform to organize scientific challenges. J. Mach. Learn. Res. 24(198), 1–6 (2023)
  17. Simpson, A.L., et al.: A large annotated medical image dataset for the development and evaluation of segmentation algorithms. arXiv preprint arXiv:1902.09063 (2019)
  18. Wang, E., Zhao, Y., Wu, Y.: Cascade dual-decoders network for abdominal organs segmentation. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 202–213. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_18
  19. Yang, X., Song, Z., King, I., Xu, Z.: A survey on deep semi-supervised learning. IEEE Trans. Knowl. Data Eng. (2022)
  20. Yushkevich, P.A., Gao, Y., Gerig, G.: ITK-SNAP: an interactive tool for semi-automatic segmentation of multi-modality biomedical images. In: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3342–3345 (2016)

Acknowledgements

The authors of this paper declare that the segmentation method they implemented for participation in the FLARE 2023 challenge has not used any pre-trained models nor additional datasets other than those provided by the organizers. The proposed solution is fully automatic without any manual intervention. We thank all the data owners for making the CT scans publicly available and CodaLab [16] for hosting the challenge platform.

Author information

Corresponding author

Correspondence to Tao Wang.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Wang, T., Zhang, X., Xiong, W., Zhou, S., Zhang, X. (2024). Semi-Supervised Learning Based Cascaded Pocket U-Net for Organ and Pan-Cancer Segmentation in Abdomen CT. In: Ma, J., Wang, B. (eds) Fast, Low-resource, and Accurate Organ and Pan-cancer Segmentation in Abdomen CT. FLARE 2023. Lecture Notes in Computer Science, vol 14544. Springer, Cham. https://doi.org/10.1007/978-3-031-58776-4_13

  • DOI: https://doi.org/10.1007/978-3-031-58776-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-58775-7

  • Online ISBN: 978-3-031-58776-4

  • eBook Packages: Computer Science, Computer Science (R0)
