Music training protects against aging-related hearing loss

February, 2012

More evidence that music training protects older adults from age-related impairment in understanding speech, adding to the case for music training as a way of reducing dementia risk.

I’ve spoken before about the association between hearing loss in old age and dementia risk. Although we don’t yet understand the basis of that association, it may be that preventing hearing loss also helps prevent cognitive decline and dementia. I have previously reported on how music training in childhood can help older adults’ ability to hear speech in a noisy environment. A new study adds to this evidence.

The study looked at a specific aspect of understanding speech: auditory brainstem timing. Aging disrupts this timing, degrading the ability to precisely encode sound.

In this study, automatic brain responses to speech sounds were measured in 87 younger and older normal-hearing adults as they watched a captioned video. It was found that older adults who had begun musical training before age 9 and engaged consistently in musical activities through their lives (“musicians”) not only significantly outperformed older adults who had no more than three years of musical training (“non-musicians”), but encoded the sounds as quickly and accurately as the younger non-musicians.

The researchers qualify this finding by noting that it shows only that musical experience selectively affects the timing of sound elements that are important in distinguishing one consonant from another, not necessarily all sound elements. However, the benefit probably extends more widely, and in any case the ability to understand speech is crucial to social interaction, which may well underlie at least part of the association between hearing loss and dementia.

The burning question for many will be whether the benefits of music training can be accrued later in life. We will have to wait for more research to answer that, but, as music training and enjoyment fit the definition of ‘mentally stimulating activities’, this certainly adds another reason to pursue such a course.

Deep male voice helps women remember

November, 2011

It seems that what is said by deeper male voices is remembered better by heterosexual women, while memory is impaired for higher male voices. Pitch didn’t affect the memorability of female voices.

I had to report on this quirky little study, because a few years ago I discovered Leonard Cohen’s gravelly voice and then just a few weeks ago had it trumped by Tom Waits — I adore these deep gravelly voices, but couldn’t say why. Now a study shows that women are not only sensitive to male voice pitch, but that this sensitivity affects their memory.

In the first experiment, 45 heterosexual women were shown images of objects while listening to the name of the object spoken either by a man or woman. The pitch of the voice was manipulated to be high or low. After spending five minutes on a Sudoku puzzle, participants were asked to choose which of two similar but not identical versions of the object was the one they had seen earlier. After the memory test, participants were tested on their voice preferences.

Women strongly preferred the low pitch male voice and remembered objects more accurately when they had been introduced by the deeper male voice than the higher male voice (mean score for object recognition was 84.7% vs 77.8%). There was no significant difference in memory relating to pitch for the female voices (83.9% vs 81.7% — note that these are not significantly different from the score for the deeper male voice).

So is it that memory is enhanced for deeper male voices, or that it is impaired for higher male voices (performance on the female voices suggests the latter)? Or are both factors at play? To sort this out, the second experiment, involving a new set of 46 women, included unmanipulated male and female voices.

Once again, women were unaffected by the different variations of female voices. However, male voices produced a clear linear effect, with the unmanipulated male voices squarely in the middle of the deeper and higher versions. It appears, then, that both factors are at play: deepening a male voice enhances its memorability, while raising it impairs its memorability.

It’s thought that deeper voices are associated with more desirable traits for long-term male partners. Having a better memory for specific encounters with desirable men would allow women to compare and evaluate men according to how they might behave in different relationship contexts.

The voices used were supplied by four young adult men and four young adult women. Pitch was altered through software manipulation. Participants were told that the purpose of the experiment was to study sociosexual orientation and object preference. Contraceptive pill usage did not affect the women’s responses.

Perception

Older news items (pre-2010) brought over from the old website

Perception affected by mood

An imaging study has revealed that when people were shown a composite image with a face surrounded by "place" images, such as a house, and asked to identify the gender of the face, those in whom a bad mood had been induced didn’t process the places in the background. However, those in a good mood took in both the focal and background images. These differences in perception were coupled with differences in activity in the parahippocampal place area. Increasing the amount of information is of course not necessarily a good thing, as it may result in more distraction.

Schmitz, T.W., De Rosa, E., & Anderson, A.K. (2009). Opposing influences of affective state valence on visual cortical encoding. Journal of Neuroscience, 29(22), 7199-7207.

http://www.eurekalert.org/pub_releases/2009-06/uot-pww060309.php

What we perceive is not what we sense

Perceiving a simple touch may depend as much on memory, attention, and expectation as on the stimulus itself. A study involving macaque monkeys has found that the monkeys’ perception of a touch (varied in intensity) was more closely correlated with activity in the medial premotor cortex (MPC), a region of the brain's frontal lobe known to be involved in making decisions about sensory information, than activity in the primary somatosensory cortex (which nevertheless accurately recorded the intensity of the sensation). MPC neurons began to fire before the stimulus even touched the monkeys' fingertips — presumably because the monkey was expecting the stimulus.

de Lafuente, V., & Romo, R. (2005). Neuronal correlates of subjective sensory experience. Nature Neuroscience, 8(12), 1698-1703.

http://www.eurekalert.org/pub_releases/2005-11/hhmi-tsi110405.php

Varied sensory experience important in childhood

A new baby has far more connections between neurons than necessary; from birth to about age 12 the brain trims 50% of these unnecessary connections while at the same time building new ones through learning and sensory stimulation — in other words, tailoring the brain to its environment. A mouse study has found that without enough sensory stimulation, infant mice lose fewer connections — indicating that connections need to be lost in order for appropriate ones to grow. The findings support the idea that parents should try to expose their children to a variety of sensory experiences.

Zuo, Y., Yang, G., Kwon, E., & Gan, W-B. (2005). Long-term sensory deprivation prevents dendritic spine loss in primary somatosensory cortex. Nature, 436(7048), 261-265.

http://www.sciencentral.com/articles/view.htm3?article_id=218392607

Brain regions that process reality and illusion identified

Researchers have now identified the regions of the brain involved in processing what’s really going on, and what we think is going on. Macaque monkeys played a virtual reality video game in which the monkeys were tricked into thinking that they were tracing ellipses with their hands, although they actually were moving their hands in a circle. Monitoring of nerve cells revealed that the primary motor cortex represented the actual movement while the signals from cells in a neighboring area, called the ventral premotor cortex, were generating elliptical shapes. Knowing how the brain works to distinguish between action and perception will help efforts to build biomedical devices that can control artificial limbs, some day enabling the disabled to move a prosthetic arm or leg by thinking about it.

Schwartz, A.B., Moran, D.W., & Reina, A.G. (2004). Differential representation of perception and action in the frontal cortex. Science, 303(5656), 380-383.

http://news-info.wustl.edu/tips/page/normal/652.html
http://www.eurekalert.org/pub_releases/2004-02/wuis-rpb020704.php

Memory different depending on whether information received via eyes or ears

Carnegie Mellon scientists using magnetic resonance imaging found quite different brain activity patterns for reading and listening to identical sentences. During reading, the right hemisphere was not as active as expected, suggesting a difference in the nature of comprehension experienced when reading versus listening. When listening, there was greater activation in a part of Broca's area associated with verbal working memory, suggesting that there is more semantic processing and working memory storage in listening comprehension than in reading. This should not be taken as evidence that comprehension is better in one or other of these situations, merely that it is different. "Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper."

Michael, E.B., Keller, T.A., Carpenter, P.A., & Just, M.A. (2001). fMRI investigation of sentence comprehension by eye and by ear: Modality fingerprints on cognitive processes. Human Brain Mapping, 13(4), 239-252.

http://www.eurekalert.org/pub_releases/2001-08/cmu-tma081401.php

The chunking of our lives: the brain "sees" life in segments

We talk about "chunking" all the time in the context of memory. But the process of breaking information down into manageable bits occurs, it seems, right from perception. Magnetic resonance imaging reveals that when people watched movies of common, everyday, goal-directed activities (making the bed, doing the dishes, ironing a shirt), their brains automatically broke these continuous events into smaller segments. The study also identified a network of brain areas that is activated during the perception of boundaries between events. "The fact that changes in brain activity occurred during the passive viewing of movies indicates that this is how we normally perceive continuous events, as a series of segments rather than a dynamic flow of action."

Zacks, J.M., Braver, T.S., Sheridan, M.A., Donaldson, D.I., Snyder, A.Z., Ollinger, J.M., Buckner, R.L., & Raichle, M.E. (2001). Human brain activity time-locked to perceptual event boundaries. Nature Neuroscience, 4(6), 651-655.

http://www.eurekalert.org/pub_releases/2001-07/aaft-bp070201.php

Amygdala may be critical for allowing perception of emotionally significant events despite inattention

We choose what to pay attention to, what to remember. We give more weight to some things than others. Our perceptions and memories of events are influenced by our preconceptions, and by our moods. Researchers at Yale and New York University have recently published research indicating that the part of the brain known as the amygdala is responsible for the influence of emotion on perception. This builds on previous research showing that the amygdala is critically involved in computing the emotional significance of events. The amygdala is connected to those brain regions dealing with sensory experiences, and the theory that these connections allow the amygdala to influence early perceptual processing is supported by this research. Dr. Anderson suggests that “the amygdala appears to be critical for the emotional tuning of perceptual experience, allowing perception of emotionally significant events to occur despite inattention.”

Anderson, A.K., & Phelps, E.A. (2001). Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature, 411(6835), 305-309.

http://www.eurekalert.org/pub_releases/2001-05/NYU-Infr-1605101.php

Childhood musical training helps auditory processing in old age

June, 2011

Another study confirms the cognitive benefits of extensive musical training that begins in childhood, at least for hearing.

A number of studies have demonstrated the cognitive benefits of music training for children. Now research is beginning to explore just how long those benefits last. This is the second study I’ve reported on this month that points to childhood music training protecting older adults from aspects of cognitive decline. In this study, 37 adults aged 45 to 65, of whom 18 were classified as musicians, were tested on their auditory and visual working memory, and their ability to hear speech in noise.

The musicians performed significantly better than the non-musicians at distinguishing speech in noise, and on the auditory temporal acuity and working memory tasks. There was no difference between the groups on the visual working memory task.

Difficulty hearing speech in noise is among the most common complaints of older adults, but age-related hearing loss only partially accounts for the problem.

The musicians had all begun playing an instrument by age 8 and had played consistently throughout their lives. Those classified as non-musicians either had no musical experience (12 of the 19) or had less than three years of it at any point in their lives. The seven with some musical experience rated their proficiency on an instrument at less than 1.5 on a 10-point scale, compared to at least 8 for the musicians.

Physical activity levels were also assessed. There was no significant difference between the groups.

The finding that visual working memory was not affected supports the idea that musical training helps domain-specific skills (such as auditory and language processing) rather than general ones.

How the deaf have better vision; the blind better hearing

November, 2010

Two recent studies point to how those lacking one sense might acquire enhanced other senses, and what limits this ability.

An experiment with congenitally deaf cats has revealed how deaf or blind people might acquire other enhanced senses. The deaf cats showed only two specific enhanced visual abilities: visual localization in the peripheral field and visual motion detection. This was associated with the parts of the auditory cortex that would normally be used to pick up peripheral and moving sound (posterior auditory cortex for localization; dorsal auditory cortex for motion detection) being switched to processing this information for vision.

This suggests that only those abilities that have a counterpart in the unused part of the brain (auditory cortex for the deaf; visual cortex for the blind) can be enhanced. The findings also point to the plasticity of the brain. (As a side-note, did you know that apparently cats are the only animal besides humans that can be born deaf?)

The findings (and their broader implications) receive support from an imaging study involving 12 blind and 12 sighted people, who carried out an auditory localization task and a tactile localization task (reporting which finger was being gently stimulated). While the visual cortex was mostly inactive when the sighted people performed these tasks, parts of the visual cortex were strongly activated in the blind. Moreover, the accuracy of the blind participants directly correlated to the strength of the activation in the spatial-processing region of the visual cortex (right middle occipital gyrus). This region was also activated in the sighted for spatial visual tasks.

Verbal, not visual, cues enhance visual detection

August, 2010

We know language affects what we perceive, but a new study shows it can also improve our ability to perceive, even when an object should be invisible to us.

I’ve talked about the importance of labels for memory, so I was interested to see that a recent series of experiments has found that hearing the name of an object improved people’s ability to see it, even when the object was flashed onscreen in conditions and speeds (50 milliseconds) that would render it invisible. The effect was specific to language; a visual preview didn’t help.

Moreover, those who consider their mental imagery particularly vivid scored higher when given the auditory cue (although this association disappeared when the position of the object was uncertain). The researchers suggest that hearing the image labeled evokes an image of the object, strengthening its visual representation and thus making it visible. They also suggested that because words in different languages pick out different things in the environment, learning different languages might shape perception in subtle ways.
