Study finds that EHRs with enhanced usability can reduce cognitive workload and improve performance among physicians.
A recent study into the usability of electronic health records (EHRs) has found that basic enhancements to an EHR system may be associated with lower cognitive workload and better performance among physicians.
Investigators found that physicians who used a system with enhanced longitudinal tracking had appropriately managed significantly more abnormal test results compared with physicians using baseline EHRs.
Investigators from the University of North Carolina sent invitations to all residents and fellows enrolled in the school of medicine at a large academic institution, stating that experience using the Epic EHR software to review test results was required to participate in the study's simulated scenarios. Investigators offered a $100 gift card as an incentive for participation. A total of 40 individuals were recruited, but 2 were excluded due to repeated cancellations. Between April 1, 2016, and December 23, 2016, the remaining 38 participants were enrolled and blindly allocated to a simulated EHR environment. Of those 38, 20 were assigned to use a baseline EHR and 18 were assigned to an enhanced EHR, which included changes intended to improve longitudinal tracking of abnormal test results in the system.
The baseline EHR, which was in use at the study institution at the time, displayed all new abnormal test results, including critical test results for patients with a no-show status, in a general folder called "Results" and offered only basic sorting capabilities. The enhanced EHR automatically sorted all previously identified critical test results for patients with a no-show status into a dedicated folder called "All Reminders."
Additionally, the enhanced EHR clearly displayed information regarding patient status and policy-based decision support instructions. The interventions were developed based on the classic theory of attention, which holds that cognitive workload varies continuously during the course of a task and that changes in cognitive workload may be attributed to adaptive interaction strategies.
Investigators found no statistically significant difference in perceived workload between the two groups. However, a statistically significantly higher cognitive workload, as indicated by a lower mean blink rate, was found in the baseline EHR group compared with the enhanced EHR group.
Investigators also noted statistically significantly poorer performance in the baseline EHR group compared with the enhanced EHR group, which they attributed to the review of patients with a no-show status for a follow-up appointment. No difference between the groups was noted in the time to complete the simulated scenarios, and no statistically significant difference was noted in fatigue levels between the baseline and enhanced EHR groups.
In an interview with MD Magazine®, Lukasz Mazur, PhD, associate professor at the UNC School of Medicine and study author, said the importance of studies like this cannot be overstated, as improving EHRs is imperative to physicians' performance.
"There is a need to quantify, in a rigorous and scientific manner, the effect of usability on physicians’ mental workload and performance. Such evidence can help us create the ‘burning platform’ for the usability improvements within EHRs," Mazur said. "We can not expect physicians to do their job efficiently and effectively without properly designing their EHR-related workflows. I hope that our manuscript and findings can spearhead some policy changes aimed at making usability a centerpiece of next generation of EHRs."
Investigators acknowledged several limitations to their study. The small sample size meant they could not account for possible confounding factors and were unable to more precisely quantify the association of usability with cognitive workload and performance. The simulated environment, in which participants knew their work was being assessed, may have affected performance; cognitive workload and performance scores were likely influenced by the setting and might not reflect actual cognitive workload and performance in real clinical settings.
Investigators noted that the intervention may have improved both ease of access to information, through the reorganized display, and learning, by providing a guide to action that clearly showed information on patients' status and policy-based decision supports. They suggested that future research could more precisely quantify the association of usability and learning with cognitive workload and performance.
The study, titled "Association of the Usability of Electronic Health Records With Cognitive Workload and Performance Levels Among Physicians," was published in JAMA Network Open.