Automation Tool Increases Screening for Pediatric Autism Spectrum Disorder

Although screening improved, many physicians failed to follow up with the children.

Stephen Downs, MD, MS

Computer automation integrated into clinicians' workflow via an electronic health record (EHR) increases screening of children for autism spectrum disorder (ASD), according to a recent study.

Stephen Downs, MD, MS, from the Division of Children’s Health Services Research at Indiana University School of Medicine, and his colleagues sought to learn if computer-automated screening and clinical decision support could improve autism spectrum disorder screening rates in pediatric primary care facilities.

The investigators found that the computer automation did increase screening, but physician follow-up remained inconsistent.

Early screening offers a chance for early therapy and vastly better outcomes, Downs said in an interview with MD Magazine®.

The main outcome of the randomized clinical trial was screening rates among the children. The team also measured rates of positive screening results, clinicians’ response rates to screening results in the computer system, and new cases of ASD identified.

The investigators compared screening rates among a sample of 274 children aged 18 to 24 months in 4 urban pediatric clinics of an inner-city county hospital system, with or without a screening module built into an existing decision support software system. All of the clinics used Child Health Improvement Through Computer Automation (CHICA), a rule-based system used in pediatric clinics at the Eskenazi Health System in Indiana.

CHICA communicates with the EHR: when a patient registers for care, it analyzes the child's record (demographic characteristics, morphometric characteristics, diagnoses, and medications) and selects the most important yes-or-no questions, covering a range of issues, to ask the family. The family is asked to complete the form in the waiting room.

The system then analyzes the answers and picks the 6 most important alerts or reminders for the clinician. The alerts are assembled into an agenda that can be printed or accessed from within the EHR. The clinician can respond to the alerts by checking boxes in the EHR.
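The flow described above, applying rules to patient data and surfacing only the top-ranked alerts, can be sketched as follows. The `Rule` structure, example rules, and priority scheme are illustrative assumptions; the article does not describe CHICA's actual rule engine or data model.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    """A hypothetical decision rule: fires on patient data and carries a priority."""
    name: str
    priority: int          # lower number = more important
    applies: callable      # patient dict -> bool

def build_agenda(patient, rules, max_alerts=6):
    """Pick the most important fired alerts (the article says CHICA surfaces 6)."""
    fired = [r for r in rules if r.applies(patient)]
    fired.sort(key=lambda r: r.priority)
    return [r.name for r in fired[:max_alerts]]

# Illustrative rules only -- not CHICA's real rule base.
rules = [
    Rule("ASD prescreen due", 1, lambda p: 18 <= p["age_months"] <= 24),
    Rule("Lead screening due", 2, lambda p: p["age_months"] >= 12 and not p["lead_tested"]),
    Rule("Flu vaccine reminder", 3, lambda p: not p["flu_vaccinated"]),
]

agenda = build_agenda(
    {"age_months": 20, "lead_tested": False, "flu_vaccinated": True}, rules
)
```

For this example patient, two rules fire and both fit within the 6-alert agenda; in a fuller rule base, the priority sort is what keeps the printed agenda short.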

The American Academy of Pediatrics ASD guidance was encoded in the CHICA ASD module by creating rules that directed surveillance and coding. The system’s prescreening form asked parents whether they were concerned about the development of their child or whether the child had a sibling with ASD.

CHICA alerted the clinician to refer the child immediately for evaluation if 1 or both answers were affirmative and the clinician had expressed developmental concerns in the system at a previous visit. If not, the system produced a Modified Checklist for Autism in Toddlers with Follow-Up (M-CHAT-F) screening form. CHICA alerted the clinician if the scored form had a positive result.
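The branching logic described above can be sketched as a single function. The function name, inputs, and return strings are assumptions for illustration, not the module's actual interface:

```python
from typing import Optional

def asd_module_action(parent_concern: bool, sibling_with_asd: bool,
                      prior_clinician_concern: bool,
                      mchat_positive: Optional[bool]) -> str:
    """Return the alert CHICA would raise, per the workflow the article describes.

    An affirmative prescreen answer combined with a previously recorded
    clinician concern triggers an immediate referral alert; otherwise an
    M-CHAT-F form is generated, and a positive scored result raises its
    own alert for the clinician.
    """
    if (parent_concern or sibling_with_asd) and prior_clinician_concern:
        return "refer for diagnostic evaluation"
    if mchat_positive is None:          # form not yet completed/scored
        return "generate M-CHAT-F screening form"
    return "alert: positive M-CHAT-F" if mchat_positive else "no alert"
```

A child whose parent reports a concern but who has no prior clinician concern on file would thus be routed to the M-CHAT-F form rather than referred directly.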

Downs said there are 2 keys to CHICA.

“First, the system integrates with routine care because it covers the breadth of general pediatrics, it performs risk assessments in the waiting room, and it creates a prioritized agenda for the physician,” he said. “Second, within this workflow, autism screening was completely automated, so nearly every child was screened.”

At the end of the study, 40,820 children aged 21 months or younger had visits using the CHICA system—34% in intervention clinics and 66% in control clinics. The intervention clinics included the enhanced version of CHICA with the ASD module, while the control clinics had CHICA without the module and clinicians cared for children with ASD using standard methods.

During the intervention, 1653 M-CHAT tests were printed, and 59.3% were completed and scanned into the system for CHICA to score. Scored tests showed that 265 children had results possibly indicative of ASD, a 27% positive screening rate.
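The article does not state which scoring rule CHICA applied to the returned forms. As an assumption, the published convention for the original M-CHAT, 23 yes-or-no items, with a positive screen defined as at least 3 failed items overall or at least 2 failed critical items, can be sketched:

```python
# Published M-CHAT convention (an assumption here; the article does not say
# which scoring rule CHICA implemented): 23 yes/no items, screen positive
# if >= 3 items are failed overall or >= 2 of the 6 critical items are failed.
CRITICAL_ITEMS = {2, 7, 9, 13, 14, 15}   # 1-indexed item numbers on the M-CHAT

def score_mchat(failed_items: set) -> bool:
    """Return True when the screen is positive (possible ASD risk)."""
    critical_failed = failed_items & CRITICAL_ITEMS
    return len(failed_items) >= 3 or len(critical_failed) >= 2
```

Under this convention, failing two non-critical items is a negative screen, while failing two critical items alone is positive.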

The primary outcome was the rate at which eligible patients were screened for autism spectrum disorder using a standardized screening instrument such as the M-CHAT. The screening rate increased over time in the intervention group but not in the control group. None of the children in the intervention group were screened at baseline; nearly 11% of children in the control group were.

Over the intervention period, 78.1% of patients in the intervention group were screened. Screening rates increased from 0% (95% CI, 0%–5.5%) at baseline to 68.4% (13 of 19; 95% CI, 43.4%–87.4%) at 6 months and 100% (18 of 18; 95% CI, 81.5%–100%) at 24 months.

In the control group, the screening rate was only 15.3% (11 of 72 children) at 6 to 24 months after the intervention began, peaking at 22.2% (4 of 18 children) at 24 months. The differences between the groups became statistically significant during the intervention period.

Although screening increased, physicians were less effective at following up when patients screened positive. Of the 265 patients with a positive M-CHAT result, physicians indicated a response for only 151 children (57.0%; 95% CI, 51.0%–62.9%).

For 103 of those with responses (68.2%; 95% CI, 60.8%–75.6%), pediatricians indicated the child did not have ASD; 52 children (34.4%; 95% CI, 26.8%–42.0%) were referred for evaluation, 17 (11.3%; 95% CI, 6.2%–16.3%) were suspected of having ASD but not referred, and 5 (3.3%; 95% CI, 0.4%–6.2%) were referred for audiologic evaluation.

Full referral and evaluation for ASD were more likely in the intervention group (OR, 19.88; 95% CI, 3.33–118.65).

Two patients in the intervention group (1.4%) had a new ASD diagnosis recorded during the intervention. Among all screened by CHICA, 15 of 980 (1.5%) received a diagnosis. The positive predictive value of the system was estimated to be 10%.

More research is needed to automate the evaluation of children who screened positive for autism spectrum disorder, the study authors noted.

“I think the lesson is, the more we can automate care routines, the better,” Downs concluded.

The study, “Effect of a Computer-Based Decision Support Intervention on Autism Spectrum Disorder Screening in Pediatric Primary Care Clinics,” was published online in JAMA Network Open.