After screening more than 3700 systematic reviews, investigators found that 1 in 3 reviews addressing corneal disease interventions was unreliable.
As the American Academy of Ophthalmology prepares to update its clinical practice guidelines, investigators set out to assess the reliability of systematic reviews addressing interventions for corneal diseases. After screening more than 3700 systematic reviews, they found that 1 in every 3 reviews of corneal disease interventions was unreliable.
Investigators used the Cochrane Eyes and Vision US Satellite (CEV@US) database to find systematic reviews. To identify reviews, CEV@US searched both MEDLINE and Embase from inception to 2007, with periodic updates conducted in 2009, 2012, 2014, 2016, and 2017. The search combined eyes and vision keywords and controlled vocabulary terms with a validated search filter. Authors noted that full-text reports that claimed to be systematic reviews or meta-analyses or that met the definition of a systematic review or a meta-analysis were included in the database.
For the study, investigators included systematic reviews that evaluated the effectiveness of interventions for the management of at least 1 corneal disease. Each systematic review was classified as reliable or unreliable based on 5 factors, including whether the review assessed risk of bias in the included studies, whether its conclusions were supported by its results, and whether it used appropriate methods for quantitative syntheses.
In total, 3777 systematic reviews were identified, of which 3675 were excluded because they were unrelated to corneal diseases. Of the 102 remaining, 4 were excluded because their interventions were unrelated to the management of corneal disease. Of the final 98, investigators classified 33 reviews as unreliable and the remaining 65 as reliable.
Among the 65 reliable reviews, most evaluated medications (68%) or surgery (28%). The most common outcome domains assessed were adverse events (43 reviews), symptom improvement (31), visual acuity (19), quality of life (19), costs (14), symptom resolution (10), and tear production (10). All but 1 of the reliable systematic reviews included experimental studies, such as randomized or nonrandomized clinical trials.
The most common reasons for unreliability were that the review did not conduct a comprehensive literature search (67%), did not assess risk of bias of the individual included studies (39%), or did not use appropriate methods for quantitative syntheses (29% did not conduct a quantitative synthesis at all).
Investigators found notable differences between the reliable and unreliable reviews. Unreliable reviews had a higher median number of included studies (15 versus 9) and study participants (827 versus 556). Dry eye syndrome was the disease most frequently addressed by unreliable reviews (36%), while conjunctivitis (32%) was the most frequently addressed among reliable reviews. Surgery was the most common type of management addressed in unreliable reviews (45%), and medication was the most common in reliable reviews (68%).
The authors noted limitations of their own study. First, they searched only MEDLINE and Embase for systematic reviews, so they may have missed reviews available only in other databases. Second, their assessment of reliability involved judgment and may have reflected their affiliation with Cochrane Eyes and Vision.
In their conclusion, the authors noted that reliable systematic reviews can be useful for clinicians, patients, guideline developers, and other decision makers. They pointed out that careful adherence by systematic reviewers to best practices, and by journal editors to recommendations on reporting and editorial review, could help improve the reliability of future reviews in eyes and vision.
This study, titled “Reliability of the Evidence Addressing Treatment of Corneal Diseases,” is published in JAMA Ophthalmology.