Human Infants Read Lips When Learning to Talk; Finding Could Help Diagnose Autism Earlier

A recent finding that infants read lips when learning to talk could also help physicians diagnose autism earlier than is currently possible, according to the researchers.

This study, conducted by David J. Lewkowicz, PhD, a professor of psychology within the Charles E. Schmidt College of Science at Florida Atlantic University, is the first to show that infants learn how to talk not just by listening, but by looking too.

“Our research found that infants shift their focus of attention to the mouth of the person who is talking when they enter the babbling stage and that they continue to focus on the mouth for several months thereafter until they master the basic speech forms of their native language,” said Lewkowicz, according to a university press release.

“In other words, infants become lip readers when they first begin producing their first speech-like sounds,” Lewkowicz continued.

Lewkowicz and his fellow researcher Amy Hansen-Tift, a doctoral student, studied the behavior of a group of English-learning infants aged four to twelve months. The researchers observed the babies as they watched and listened to an adult who recited a monologue either in English (their native language) or Spanish (nonnative).

Between four and eight months of age, the infants shifted their attention from the eyes to the mouth regardless of what language was being spoken to them. Around 12 months, however, they began to shift their attention back to the eyes when the adult spoke English, but not when the adult spoke Spanish.

In the study's abstract, the researchers conclude that “the first shift enables infants to gain access to redundant audiovisual speech cues that enable them to learn their native speech forms and…the second shift reflects growing native-language expertise that frees them to shift attention to the eyes to gain access to social cues.”

Following this reasoning, 12-month-old infants do not shift their attention to the eyes when listening to a nonnative language because their growing expertise in one language (in this case, English) makes processing a nonnative language more difficult.

This finding, however, suggests a new way to diagnose autism spectrum disorder earlier than is presently feasible. “Contrary to typically developing children, infants who are as yet undiagnosed but who are at risk for autism may continue to focus on the mouth of a native-language talker at 12 months of age and beyond,” said Lewkowicz in the release.

Presently, 18 months is the earliest age at which a diagnosis of autism spectrum disorder can be made. Identifying autism in children earlier would allow clinicians more time to begin intervention procedures that could diminish or even prevent the “most devastating effects of autism and other communicative disorders,” Lewkowicz concluded.

This study is published in the current issue of the Proceedings of the National Academy of Sciences.
