Deep Learning Could Benefit Prediction of Geographic Atrophy Growth Rates

A retrospective analysis suggests the feasibility of using baseline FAF images and OCT volumes to predict GA area and growth rates with a multitask deep learning approach.

A recent retrospective analysis indicated the feasibility of using baseline fundus autofluorescence (FAF) images and optical coherence tomography (OCT) volumes to predict individual geographic atrophy (GA) area and growth rates in a multitask deep learning approach.

The analysis investigated deep learning models for annualized GA growth rate prediction using these imaging modalities at baseline visits, which could be used to increase the power of clinical trials, according to Qi Yang, PhD, of Roche Ophthalmology Personalized Healthcare, Genentech, Inc.

“The deep learning–based growth rate predictions could be used for covariate adjustment to increase power of clinical trials,” investigators wrote.
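
As an illustration of that point, covariate adjustment with a model-based prediction can be sketched in a few lines of Python: a simulated trial outcome is analyzed with and without the baseline prediction as a covariate, and the adjusted analysis yields a smaller standard error for the treatment effect. All variable names and values below are hypothetical, not taken from the study.

```python
# A minimal sketch of covariate adjustment with a model-based prediction.
# All names and values here are hypothetical, not from the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
treatment = rng.integers(0, 2, n).astype(float)   # 1 = treated, 0 = control
predicted_rate = rng.normal(1.8, 0.6, n)          # baseline DL prediction (mm^2/yr)
# Simulated observed growth: prognostic signal + treatment effect + noise
observed_growth = predicted_rate - 0.3 * treatment + rng.normal(0, 0.5, n)

# Unadjusted analysis: treatment effect only
unadjusted = sm.OLS(observed_growth, sm.add_constant(treatment)).fit()
# Adjusted analysis: the baseline prediction absorbs between-patient
# variance, shrinking the standard error of the treatment effect
adjusted = sm.OLS(
    observed_growth,
    sm.add_constant(np.column_stack([treatment, predicted_rate])),
).fit()

print("unadjusted SE:", unadjusted.bse[1])
print("adjusted SE:  ", adjusted.bse[1])
```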

The retrospective analysis estimated GA growth rate as the slope of a linear fit on all available measurements of lesion area over a period of 2 years. Investigators developed 3 multitask deep learning models (FAF-only, OCT-only, and multimodal) to predict concurrent GA area and annualized growth rate. The analysis included patients from prospective and observational clinical trials evaluating lampalizumab.
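
That growth-rate definition amounts to an ordinary least-squares slope over visit times, which can be computed in a few lines; the visit times and lesion areas below are hypothetical.

```python
# A minimal sketch of the growth-rate label: the slope of a linear fit
# to lesion-area measurements over 2 years. Values are hypothetical.
import numpy as np

years = np.array([0.0, 0.5, 1.0, 1.5, 2.0])       # visit times (years)
area_mm2 = np.array([7.1, 7.9, 8.8, 9.5, 10.4])   # GA lesion area (mm^2)

slope, intercept = np.polyfit(years, area_mm2, deg=1)  # degree-1 fit
print(f"annualized growth rate: {slope:.2f} mm^2/year")
```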

Each of the 3 models was trained on the development data set, tested on the holdout set, and further evaluated on the independent test sets, according to Yang and colleagues. They noted baseline FAF images and OCT volumes from study eyes of patients with bilateral GA (NCT02247479, NCT02247531, and NCT02479386) were split into development (n = 1279) and holdout (n = 443) sets. Meanwhile, baseline FAF images from study eyes in NCT01229215 (n = 106) and NCT02399072 (n = 169) were used as independent test sets.
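
The study's published architecture is not reproduced here, but a generic multitask setup of this kind, one shared image encoder with separate regression heads for concurrent area and annualized growth rate, can be sketched in PyTorch. Everything below (layer sizes, input shape) is illustrative.

```python
# A generic multitask regression sketch: a shared image encoder with two
# heads, one for concurrent GA area and one for annualized growth rate.
# The architecture is illustrative, not the study's published model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultitaskGA(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                  # shared features
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.area_head = nn.Linear(32, 1)              # lesion area (mm^2)
        self.rate_head = nn.Linear(32, 1)              # growth rate (mm^2/yr)

    def forward(self, x):
        z = self.encoder(x)
        return self.area_head(z), self.rate_head(z)

model = MultitaskGA()
faf = torch.randn(4, 1, 256, 256)                     # a batch of FAF images
area_true, rate_true = torch.randn(4, 1), torch.randn(4, 1)
area_pred, rate_pred = model(faf)
# Multitask training minimizes the sum of the two regression losses
loss = F.mse_loss(area_pred, area_true) + F.mse_loss(rate_pred, rate_true)
loss.backward()
```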

The main outcome measure for the analysis was model performance, evaluated using the squared Pearson correlation coefficient (r2) between observed and predicted lesion areas and growth rates. Investigators calculated 95% confidence intervals (CIs) by bootstrap resampling (B = 10,000).
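
Both pieces of that evaluation are straightforward to sketch; the observed and predicted values below are simulated, since the study data are not public.

```python
# A minimal sketch of the evaluation: squared Pearson correlation (r2)
# between observed and predicted values, with a percentile bootstrap CI
# over B = 10,000 resamples. The data here are simulated.
import numpy as np

def r_squared(obs, pred):
    return np.corrcoef(obs, pred)[0, 1] ** 2

rng = np.random.default_rng(0)
obs = rng.normal(1.5, 0.5, 300)                    # observed growth rates
pred = obs + rng.normal(0, 0.4, 300)               # model predictions

boot = np.empty(10_000)
for b in range(boot.size):
    idx = rng.integers(0, obs.size, obs.size)      # resample with replacement
    boot[b] = r_squared(obs[idx], pred[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"r2 = {r_squared(obs, pred):.2f} (95% CI, {lo:.2f} - {hi:.2f})")
```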

On the holdout data set, the r2 for GA lesion area prediction was 0.96 (95% CI, 0.95 – 0.97) with the FAF-only model and 0.91 (95% CI, 0.87 – 0.95) with the OCT-only model, while the multimodal model reached 0.94 (95% CI, 0.92 – 0.96). For GA growth rate prediction, the r2 was 0.48 (95% CI, 0.41 – 0.55) with the FAF-only model, 0.36 (95% CI, 0.29 – 0.43) with the OCT-only model, and 0.47 (95% CI, 0.40 – 0.54) with the multimodal model.

On the 2 independent test sets, the r2 of the FAF-only model for GA lesion area was 0.98 (95% CI, 0.97 – 0.99) and 0.95 (95% CI, 0.93 – 0.96), and for GA growth rate, the r2 was 0.65 (95% CI, 0.52 – 0.75) and 0.47 (95% CI, 0.34 – 0.60).

“We show the feasibility of using baseline FAF images and OCT volumes to predict individual GA area and growth rates using a multitask deep learning approach,” investigators wrote.

References

Anegondi N, Gao SS, Steffen V, et al. Deep learning to predict geographic atrophy area and growth rate from multimodal imaging. Ophthalmol Retina. Accessed April 11, 2023. https://pubmed.ncbi.nlm.nih.gov/36038116/
