Data augmentation for patch-based OCT chorio-retinal segmentation using generative adversarial networks

Jason Kugelman, David Alonso-Caneiro, Scott A. Read, Stephen J. Vincent, Fred K. Chen, Michael J. Collins

Research output: Contribution to journal › Article

Abstract

Many clinical and research tasks rely critically upon the segmentation of tissue layers in optical coherence tomography (OCT) images of the posterior eye (the retina and choroid). However, a major limitation of machine learning-based segmentation methods is that their performance depends on the quantity and diversity of the data used to train the models. Given their demonstrated ability to generate high-quality and diverse synthetic images, we propose the application of generative adversarial networks (GANs) to augment data for a patch-based approach to OCT chorio-retinal boundary segmentation. Given the complexity of GAN training, a range of experiments is performed to understand and optimise performance. We show that it is feasible to generate patches that are visually indistinguishable from their real variants and that, in the best case, the segmentation performance of a model trained solely on synthetic data is nearly comparable to that of a model trained on real data. The data augmentation capabilities are demonstrated by segmentation performance improvements realised on a range of sparse datasets. These findings highlight the potential use of GANs for data augmentation in future work with chorio-retinal OCT images. Additionally, this study includes a range of experimental findings and an analysis of techniques which may be useful for developing or improving GAN-based methods, and which are not necessarily limited to chorio-retinal images, the OCT modality, or data augmentation.
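The adversarial objective the abstract refers to, in which a generator learns to produce synthetic samples that a discriminator cannot distinguish from real ones, can be sketched in miniature. The following is an illustrative toy (not the authors' implementation): the "patches" are 1-D Gaussian samples, and both networks are single affine maps trained with manual gradients; all names and hyperparameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Stand-in for real image patches: samples from N(4, 1.25).
def sample_real(n):
    return rng.normal(4.0, 1.25, size=n)

# Generator: affine map from noise z ~ N(0, 1) to a synthetic sample.
w_g, b_g = rng.normal(), rng.normal()
# Discriminator: logistic classifier, real vs. synthetic.
w_d, b_d = rng.normal(), rng.normal()

lr, batch = 0.05, 64
for step in range(2000):
    # --- discriminator update: push D(real) -> 1, D(fake) -> 0 ---
    x_real = sample_real(batch)
    z = rng.normal(size=batch)
    x_fake = w_g * z + b_g
    p_real = sigmoid(w_d * x_real + b_d)
    p_fake = sigmoid(w_d * x_fake + b_d)
    # For sigmoid cross-entropy, d(loss)/d(logit) = p - target.
    g_logit = np.concatenate([p_real - 1.0, p_fake - 0.0])
    x_all = np.concatenate([x_real, x_fake])
    w_d -= lr * np.mean(g_logit * x_all)
    b_d -= lr * np.mean(g_logit)

    # --- generator update (non-saturating): push D(fake) -> 1 ---
    z = rng.normal(size=batch)
    x_fake = w_g * z + b_g
    p_fake = sigmoid(w_d * x_fake + b_d)
    g_fake = (p_fake - 1.0) * w_d  # chain rule back through D's logit
    w_g -= lr * np.mean(g_fake * z)
    b_g -= lr * np.mean(g_fake)

# Draw a batch of synthetic "patches" from the trained generator.
synthetic = w_g * rng.normal(size=1000) + b_g
```

In the paper's setting the same two-player objective is applied to image patches with convolutional networks; once trained, the generator's output stands in for (or augments) real labelled data.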

Original language: English
Journal: Neural Computing and Applications
DOIs
Publication status: E-pub ahead of print - 10 Mar 2021

