Infants born with sensorineural hearing loss show early disruptions in how their brains organize and specialize, according to new research using functional near-infrared spectroscopy (fNIRS). The work suggests that limited access to sound or language in the first months of life alters the development of left–right brain asymmetry – an early marker linked to later language and cognitive skills.
The study examined 112 infants aged 3 to 9 months, including 52 with congenital hearing loss. Although both groups showed the “small-world” network organization typical of efficient brain function – a pattern visible in the connectivity maps on page 3 – only typically hearing infants developed stronger left-hemisphere efficiency with age. Infants with hearing loss showed no such shift and, in some cases, a slight decline in left-hemisphere efficiency over time.
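The article does not describe how the study computed its network measures, but the concepts it names – "small-world" organization and nodal efficiency – are standard graph metrics. A minimal sketch using the `networkx` library on a toy network, with a purely illustrative split of nodes into hypothetical "left" and "right" channel groups (none of the values below come from the study):

```python
import networkx as nx

def nodal_efficiency(G, node):
    # Mean of inverse shortest-path lengths from `node` to every other
    # node: high values mean the node reaches the rest of the network
    # in few hops.
    lengths = nx.shortest_path_length(G, source=node)
    n = G.number_of_nodes()
    return sum(1.0 / d for target, d in lengths.items() if target != node) / (n - 1)

# Toy small-world network: a ring lattice with a few rewired shortcuts
# (guaranteed connected). Parameters are arbitrary, for illustration only.
G = nx.connected_watts_strogatz_graph(n=30, k=4, p=0.1, seed=42)

# Small-world signature: high clustering plus short average path length.
clustering = nx.average_clustering(G)
path_len = nx.average_shortest_path_length(G)

# Hypothetical "hemispheric" comparison: mean nodal efficiency over two
# arbitrary node groups standing in for left and right channel sets.
left, right = range(0, 15), range(15, 30)
left_eff = sum(nodal_efficiency(G, i) for i in left) / 15
right_eff = sum(nodal_efficiency(G, i) for i in right) / 15
```

In the study's framing, a leftward developmental shift would show up as `left_eff` growing relative to `right_eff` with infant age – the pattern reported for typically hearing infants but absent in those with hearing loss.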
The divergence was clearest in frontal regions associated with early language processing, where typically hearing infants showed increasing leftward nodal efficiency across the first months of life. Those patterns did not emerge in infants with sensorineural hearing loss. “The first year of life is a critical window for brain organization,” the team noted in the accompanying press release. “If infants miss auditory input or early language exposure during this period, the brain’s left and right hemispheres may not develop their usual balance.”
Severity also mattered. Infants with mild hearing loss maintained some left-hemisphere specialization, while those with moderate, severe, or profound loss showed no hemispheric differences. Higher auditory brainstem response (ABR) thresholds – a clinical measure in which higher values indicate poorer hearing – were associated with reduced frontal asymmetry, suggesting that the degree of auditory deprivation tracks with the extent of neural disruption.
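The article does not say how asymmetry was quantified; a common convention in the neuroimaging literature is a laterality index, which normalizes the left–right difference. A sketch under that assumption, with made-up numbers that are not taken from the study:

```python
def laterality_index(left_value, right_value):
    # Conventional laterality index: positive means leftward asymmetry,
    # negative means rightward, zero means no hemispheric difference.
    # The result always lies in [-1, 1].
    return (left_value - right_value) / (left_value + right_value)

# Illustrative efficiency values only (hypothetical, not from the paper):
li_typical = laterality_index(0.62, 0.55)  # leftward asymmetry
li_no_asym = laterality_index(0.56, 0.56)  # no hemispheric difference
```

On such an index, the reported dose–response pattern would appear as the index shrinking toward zero as ABR thresholds rise.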
While the study is cross-sectional, the findings reinforce long-standing evidence for a sensitive period in which structured auditory or language input supports the emergence of left-hemisphere dominance. “Early exposure – whether through cochlear implants, hearing aids or sign language – is essential,” the researchers said. “The brain needs structured input to build the networks that will later support communication and learning.”
The authors plan to follow the infants longitudinally and advocate combining fNIRS with MRI or EEG to better map how early sensory experience shapes communication-related networks.
