A Canadian study found that people blink less when listening to speech, especially in noisy environments, suggesting that blink rate is tied to cognitive effort and attention and could eventually be used to assess mental workload.
A study shows that walking enhances brain responses to sounds and shifts auditory attention depending on walking direction, suggesting the brain filters routine noises and amplifies unexpected sounds to improve safety and navigation in dynamic environments.
A study reveals that the brain distinguishes music from speech using simple acoustic parameters, with slower, steady sounds perceived as music and faster, irregular sounds as speech. This understanding could enhance therapies for language disorders like aphasia.
University of Florida research suggests that chronic ear infections in early childhood can lead to language and auditory processing issues later in life, emphasizing the need for early and continuous monitoring. The study found that children with a history of chronic ear infections had smaller vocabularies, difficulty distinguishing similar-sounding words, and problems detecting changes in sounds. The researchers stress the importance of taking ear infections seriously and monitoring children long after the last preschool earache fades away, since some language deficits may only reveal themselves in later grades. Prompt treatment of ear infections can help prevent the fluid buildup that harms language development, and future research will include children at risk for delays in auditory development for other reasons.
A study conducted by Georgetown University Medical Center reveals that blind individuals can recognize faces using auditory patterns processed by the fusiform face area in the brain, challenging the belief that facial recognition is solely dependent on visual experience. The researchers used a sensory substitution device to translate images into sound, allowing blind participants to recognize basic facial configurations. Functional MRI scans showed that the fusiform face area is active in both blind and sighted individuals during face recognition tasks, suggesting that this brain region encodes the concept of a face regardless of sensory input. The findings provide insights into the development and functioning of facial recognition in the brain.
A study published in The Journal of Neuropsychiatry and Clinical Neurosciences suggests that individuals with mood disorders, such as bipolar disorder and depression, have impaired speech understanding even when in remission. The research found that both bipolar disorder and unipolar depression were associated with worse speech understanding compared to control subjects, regardless of whether participants were symptomatic or in remission. The findings highlight the potential impact of mood disorders on communication and social functioning, particularly in noisy environments. However, the study has limitations, including that it did not measure general cognition or account for the influence of pharmacological interventions on the results.
Sen. John Fetterman returned to the Senate after a six-week hospital stay during which he was treated for clinical depression. Fetterman suffered a stroke in May 2022 while campaigning for Pennsylvania's open Senate seat, which left him with cognitive issues, including difficulty with auditory processing. He was reportedly using a closed-captioning device that transcribes what is said to him so that he can hold conversations with other members of Congress. Fetterman will chair his first subcommittee hearing days after returning from his weeks-long absence.