Cognitive training for older adults can also change a personality trait

February, 2012

A program designed to improve reasoning ability in older adults also increased their openness to new experiences.

Openness to experience – being flexible and creative, embracing new ideas and taking on challenging intellectual or cultural pursuits – is one of the ‘Big 5’ personality traits. Unlike the other four, it shows some correlation with cognitive abilities. And, like cognitive abilities, openness to experience tends to decline with age.

However, while there have been many attempts to improve cognitive function in older adults, to date no one has tried to increase openness to experience. Naturally enough, one might think — it’s a personality trait, and we are not inclined to view personality traits as amenable to ‘training’. However, recently there have been some indications that personality traits can be changed, through cognitive interventions or drug treatments. In this new study, a cognitive training program for older adults also produced increases in their openness to experience.

The study involved 183 older adults (aged 60-94; average age 73), who were randomly assigned to a 16-week training program or a waiting-list control group. The program included training in inductive reasoning, and puzzles that relied in part on inductive reasoning. Most of this activity was carried out at home, but there were two 1-hour classroom sessions: one to introduce the inductive reasoning training, and one to discuss strategies for Sudoku and crosswords.

Participants came to the lab each week to hand in materials and pick up the next set. Initially, they were given crossword and Sudoku puzzles with a wide range of difficulty. Subsequently, puzzle sets were matched to each participant’s skill level (assessed from the previous week’s performance). Over the training period, the puzzles became progressively more difficult, with the steps tailored to each individual.

The inductive reasoning training involved learning to recognize novel patterns and use them to solve problems. In ‘basic series problems’, the problems required inference from a serial pattern of words, letters, or numbers. ‘Everyday serial problems’ included problems such as completing a mail order form and answering questions about a bus schedule. Again, the difficulty of the problems increased steadily over the training period.

Participants were asked to spend at least 10 hours a week on program activities, and according to the daily logs they filled in, they spent an average of 11.4 hours a week. In addition to whatever enjoyment the activities themselves provided, those who logged at least 10 hours in a week were recognized on a bulletin-board tally sheet and entered into a prize raffle.

Cognitive and personality testing took place 4-5 weeks prior to the program starting, and 4-5 weeks after program end. Two smaller assessments also took place during the program, at week 6 and week 12.

At the end of the program, those who had participated had significantly improved their pattern-recognition and problem-solving skills. This improvement went along with a moderate but significant increase in openness. Analysis suggested that this increase in openness occurred independently of improvement in inductive reasoning.

The benefits were specific to inductive reasoning and openness, with no significant effects on divergent thinking, processing speed, verbal ability, or the other Big 5 traits.

The researchers suggest that the carefully stepped nature of the training program was important in increasing openness, because it allowed participants to build a growing confidence in their reasoning abilities. Openness to experience contributes to engagement in, and enjoyment of, stimulating activity, and has also been linked to better health and lower mortality risk. It seems likely, then, that increases in openness can be part of a positive feedback cycle, leading to greater and more sustained engagement in mentally stimulating activities.

The corollary is that decreases in openness may lead to declines in cognitive engagement, and then to poorer cognitive function. Indeed it has been previously suggested that openness to experience plays a role in cognitive aging.

Clearly, more research is needed to tease out how far these findings extend to other activities, and how important the scaffolding was (that is, carefully designing cognitive activities on an individualized basis to support learning), but this work reveals an overlooked aspect of mental stimulation as a means of preventing age-related cognitive decline.

Music training protects against aging-related hearing loss

February, 2012

More evidence that music training protects older adults from age-related impairment in understanding speech, adding to the potential benefits of music training in preventing dementia.

I’ve spoken before about the association between hearing loss in old age and dementia risk. Although we don’t currently understand that association, it may be that preventing hearing loss also helps prevent cognitive decline and dementia. I have previously reported on how music training in childhood can help older adults’ ability to hear speech in a noisy environment. A new study adds to this evidence.

The study looked at a specific aspect of understanding speech: auditory brainstem timing. Aging disrupts this timing, degrading the ability to precisely encode sound.

In this study, automatic brain responses to speech sounds were measured in 87 younger and older normal-hearing adults as they watched a captioned video. It was found that older adults who had begun musical training before age 9 and engaged consistently in musical activities through their lives (“musicians”) not only significantly outperformed older adults who had no more than three years of musical training (“non-musicians”), but encoded the sounds as quickly and accurately as the younger non-musicians.

The researchers qualify this finding by saying that it shows only that musical experience selectively affects the timing of sound elements that are important in distinguishing one consonant from another, not necessarily all sound elements. However, it seems probable that it extends more widely, and in any case the ability to understand speech is crucial to social interaction, which may well underlie at least part of the association between hearing loss and dementia.

The burning question for many will be whether the benefits of music training can be accrued later in life. We will have to wait for more research to answer that, but, as music training and enjoyment fit the definition of ‘mentally stimulating activities’, this certainly adds another reason to pursue such a course.

Nicotine patch shows benefits in mild cognitive impairment

February, 2012

A pilot study suggests that wearing a nicotine patch may help improve memory loss in older adults with mild cognitive impairment.

The study involved 74 non-smokers with amnestic MCI (average age 76), half of whom wore a patch delivering 15 mg of nicotine a day for six months, while the other half received a placebo. Cognitive tests were given at the start of the study and again after three and six months.

After 6 months of treatment, the nicotine-treated group showed significant improvement in attention, memory, speed of processing and consistency of processing. For example, the nicotine-treated group regained 46% of normal performance for age on long-term memory, whereas the placebo group worsened by 26%.
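
The “regained 46% of normal performance” figure is easiest to grasp as the fraction of the gap closed between the group’s starting score and the score expected for their age. That is my reading of the statistic, not something spelled out in this summary, so treat the Python sketch below (with invented scores) as purely illustrative.

    def percent_of_normal_regained(baseline, followup, age_norm):
        # Percentage of the gap between the baseline score and the age-normal
        # score that has been closed at follow-up. Negative values mean the
        # group moved further away from normal performance.
        gap = age_norm - baseline
        return 100 * (followup - baseline) / gap

    # Hypothetical long-term memory scores (not from the study): both groups
    # start 10 points below an assumed age-normal score of 50.
    print(percent_of_normal_regained(baseline=40, followup=44.6, age_norm=50))  # ~46: regained about 46% of the gap
    print(percent_of_normal_regained(baseline=40, followup=37.4, age_norm=50))  # ~-26: worsened by about 26%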

Nicotine is an interesting drug, in that, while predominantly harmful, it can have positive effects if the dose is just right, and if the person’s cognitive state is at a particular level (slipping below their normal state, but not too far below). Too much nicotine will make things worse, so it’s important not to self-medicate.

Nicotine has been shown to improve cognitive performance in smokers who have stopped smoking, and previous short-term studies have shown improvements in attention and memory in people with Alzheimer’s disease. Nicotine receptors in the brain are reduced in number in Alzheimer’s.

Because the dose is so crucial, and the effects so dependent on brain state (including, one assumes, whether the person has been a smoker or not), more research is needed before this can be used as a treatment.

Reference: 

Newhouse, P., Kellar, K., Aisen, P., White, H., Wesnes, K., Coderre, E., et al. (2012). Nicotine treatment of mild cognitive impairment. Neurology, 78(2), 91-101.

Higher risk of mild cognitive impairment among older men

February, 2012

Significant differences in the risk of mild cognitive impairment for men and women, and in the risk of developing the two sub-types, suggest that risk factors should be considered separately by gender and sub-type.

More data from the long-running Mayo Clinic Study of Aging has revealed that, in this one part of the U.S. at least, MCI develops at an overall rate of 6.4% a year among older adults (70+), with a higher rate for men and the less-educated.

The study involved 1,450 older adults (aged 70-89), who underwent memory testing every 15 months for an average of three years. By the end of the study period, 296 people had developed MCI, a rate of 6.4% per year. For men, the rate was 7.2% compared to 5.7% for women.
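
For those who like to see where such figures come from: an incidence rate is simply the number of new cases divided by the time participants spent at risk. Here is a minimal Python sketch of that calculation, using the reported counts and assuming an average follow-up of about three years; the study itself would use each person’s actual time at risk, which is why this crude version comes out a little higher than the reported 6.4%.

    def annual_incidence(new_cases, n_at_risk, mean_followup_years):
        # Crude annual incidence: new cases per person-year of follow-up.
        # Assumes everyone contributes the mean follow-up time (a simplification;
        # the study would count each person's actual time at risk).
        person_years = n_at_risk * mean_followup_years
        return new_cases / person_years

    # Reported counts: 296 of 1,450 participants developed MCI over roughly three years.
    rate = annual_incidence(new_cases=296, n_at_risk=1450, mean_followup_years=3)
    print(f"{rate:.1%} per year")  # ~6.8% with this crude calculation; the study reports 6.4%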

It should be noted that these rates apply to a relatively homogeneous group of people. Participants come from one county in Minnesota, an overwhelmingly white part of the U.S.

MCI comes in two types: amnestic (involving memory loss) and non-amnestic. Amnestic MCI was more than twice as common as non-amnestic MCI. The incidence rate of aMCI was also higher for men (4.4%) than women (3.3%), as was the risk of naMCI (2% vs 1.1%).

Those who had less education also had higher rates of MCI. For aMCI, the rate for those with 12 years or less of education was 4.3%, compared to 3.25% for those with more education. Similarly, for naMCI, the rates were 2% and 1%, respectively.

While the great majority of people diagnosed with MCI continued to have the disorder or progressed to dementia, some 12% were later re-diagnosed as not having it. This, I would presume, reflects temporary ‘dips’ in cognitive performance brought on by physical or emotional problems.

The differences between aMCI and naMCI, and between genders, suggest that risk factors for these should be considered separately.

'Exergames' may provide greater cognitive benefit for older adults

February, 2012

An intriguing pilot study finds that regular exercise on a stationary bike enhanced with a computer game-type environment improves executive function in older adults more than ordinary exercise on a stationary bike.

We know that physical exercise greatly helps you prevent cognitive decline with aging. We know that mental stimulation also helps you prevent age-related cognitive decline. So it was only a matter of time before someone came up with a way of combining the two. A new study found that older adults improved executive function more by participating in virtual reality-enhanced exercise ("exergames") that combine physical exercise with computer-simulated environments and interactive videogame features, compared to the same exercise without the enhancements.

The Cybercycle Study involved 79 older adults (aged 58-99) from independent living facilities with indoor access to a stationary exercise bike. Of the 79, 63 participants completed the three-month study, completion being defined as at least 25 rides over the three months.

Unfortunately, randomization was not as good as it should have been — although the researchers planned to randomize on an individual basis, various technical problems led them to randomize on a site basis (there were eight sites), with the result that the cybercycle group and the control bike group were significantly different in age and education. Although the researchers took this into account in the analysis, that is not the same as having groups that match in these all-important variables. However, at least the variables went in opposite directions: while the cybercycle group was significantly younger (average 75.7 vs 81.6 years), it was significantly less educated (average 12.6 vs 14.8 years).

Perhaps also partly off-setting the age advantage, the cybercycle group was in poorer shape than the control group (higher BMI, glucose levels, lower physical activity level, etc), although these differences weren’t statistically significant. IQ was also lower for the cybercycle group, if not significantly so (but note the high averages for both groups: 117.6 vs 120.6). One of the three tests of executive function, Color Trails, also showed a marked group difference, but the large variability in scores meant that this difference was not statistically significant.

Although participants were screened for disorders such as Alzheimer’s and Parkinson’s, and functional disability, many of both groups were assessed as having MCI — 16 of the 38 in the cybercycle group and 14 of the 41 in the control bike group.

Participants were given cognitive tests at enrolment, one month later (before the intervention began), and after the intervention ended. The stationary bikes were identical for both groups, except the experimental bike was equipped with a virtual reality display. Cybercycle participants experienced 3D tours and raced against a "ghost rider," an avatar based on their last best ride.

The hypothesis was that cybercycling would particularly benefit executive function, and this was borne out. Executive function (measured by the Color Trails, Stroop test, and Digits Backward) improved significantly more in the cybercycle condition, and was the only cognitive measure to do so (the other cognitive tests covered verbal fluency, verbal memory, visuospatial skill, and motor function). In fact, the control group, despite getting the same amount of exercise, got worse at the Digits Backward test, and failed to show any improvement on the Stroop test.

Moreover, significantly fewer cybercyclists progressed to MCI compared to the control group (three vs nine).

There were no differences in exercise quantity or quality between the two groups — which does argue against the idea that cyber-enhanced physical activity would be more motivating. However, the cybercycling group did tend to comment on their enjoyment of the exercise. While the enjoyment may not have translated into increased activity in this situation, it may well do so in a longer, less directed intervention — i.e. real life.

It should also be remembered that the intervention was relatively short, and that other cognitive tasks might take longer to show improvement than the more sensitive executive function. This is supported by the finding that levels of the brain growth factor BDNF, assessed in 30 participants, increased significantly more in cybercyclists.

I should also emphasize that the level of physical exercise really wasn't that great, but nevertheless the size of the cybercycle's effect on executive function was greater than usually produced by aerobic exercise (a medium effect rather than a small one).

The idea that activities that combine physical and mental exercise are of greater cognitive benefit than the sum of benefits from each type of exercise on its own is not inconsistent with previous research, and in keeping with evidence from animal studies that physical exercise and mental stimulation help the brain via different mechanisms. Moreover, I have an idea that enjoyment (in itself, not as a proxy for motivation) may be a factor in the cognitive benefits derived from activities, whether physical or mental. Mere speculation, derived from two quite separate areas of research: the idea of “flow” / “being in the zone”, and the idea that humor has physiological benefits.

Of course, as discussed, this study has a number of methodological issues that limit its findings, but hopefully it will be the beginning of an interesting line of research.  

Reference: 

Anderson-Hanley, C., Arciero, P. J., Brickman, A. M., Nimon, J. P., Okuma, N., Westen, S. C., et al. (2012). Exergaming and older adult cognition. American Journal of Preventive Medicine, 42(2), 109-119.

Cognitive decline begins in middle age

February, 2012

A large ten-year study of middle-aged to older adults (45-70) has found that cognitive decline begins in the 45-55 decade, with reasoning ability the most affected by age.

The age at which cognitive decline begins has been the subject of much debate. The Seattle longitudinal study has provided most of the evidence that it doesn’t begin until age 60. A more recent, much larger study that allows both longitudinal and cross-sectional analysis suggests that, depressingly, mid-to-late forties might be closer to the mark.

A long-term British study known as Whitehall II began in 1985, when all civil servants aged 35-55 in 20 London-based departments were invited to participate. In 1997-9, 5198 male and 2192 female civil servants, aged 45-70 at this point, were given the first of three rounds of cognitive testing. The second round took place in 2002-4, and the third in 2007-9.

Over these ten years, all cognitive scores except vocabulary declined in all five age categories (45-49, 50-54, 55-59, 60-64, and 65-70 at baseline). Unsurprisingly, the decline was greater with increasing age, and greatest for reasoning. Men aged 45-49 at baseline showed a 3.6% decline in reasoning, compared to a 9.6% decline for those aged 65-70. Women were less affected by age: while showing the same degree of decline in the younger groups, the oldest women showed a 7.4% decline.

None of the other cognitive tasks showed the same age-related deterioration as reasoning, which displayed a consistently linear decline with advancing age. The amount of decline over ten years was roughly similar for each age group for short-term memory and phonemic and semantic fluency (although the women displayed more variability in memory, in a somewhat erratic pattern which may perhaps reflect hormonal changes — I’m speculating here). Moreover, the amount of decline in each decade for these functions was only about the same as reasoning’s decline in the younger decades — about -4% in each decade.

Men and women differed significantly in education (33% of men attended university compared to 21% of women; 57% of women never finished secondary school compared to 39% of men). It is therefore unsurprising that men performed significantly better on all cognitive tests except memory (noting that the actual differences in score were mostly quite small: 16.9/35 vs 16.5 for phonemic fluency; 16.7/35 vs 15.8 for semantic fluency; 25.7/33 vs 23.1 for vocabulary; 48.7/65 vs 41.6 for reasoning).

The cognitive tests included a series of 65 verbal and mathematical reasoning items of increasing difficulty (testing inductive reasoning), a 20-word free recall test (short-term verbal memory), recalling as many words as possible beginning with “S” (phonemic fluency) and recalling members of the animal category (semantic fluency), and a multi-choice vocabulary test.

The design of the study allowed both longitudinal and cross-sectional analyses to be carried out. Cross-sectional data, although more easily acquired, has been criticized as conflating age effects with cohort differences. Generations differ on several relevant factors, of which education is the most obvious. The present study semi-confirmed this, finding that cross-sectional data considerably over-estimated cognitive decline in women but not men — reflecting the fact that education changed far more for women than men in the relevant time periods. For example, in the youngest group of men, 30% had less than a secondary school education and 42% had a university degree, and the women showed a similar pattern, with 34% and 40%. However, for those aged 55-59 at baseline, the corresponding figures were 38% and 29% for men compared to 58% and 17% for women.

The principal finding is of course that measurable cognitive decline was evident in the youngest group, meaning that at some point during that 45-55 decade, cognitive faculties begin to decline. Of course, it should be emphasized that this is a group effect — individuals will vary in the extent and timing of any cognitive decline.

(A side-note: During the ten year period, 305 participants died. The probability of dying was higher in those with poorer cognitive scores at baseline.)

Diet linked to brain atrophy in old age

January, 2012

A more rigorous measurement of diet finds that dietary factors account for nearly as much brain shrinkage as age, education, APOE genotype, depression and high blood pressure combined.

The study involved 104 healthy older adults (average age 87) participating in the Oregon Brain Aging Study. Analysis of the nutrient biomarkers in their blood revealed that those with diets high in omega-3 fatty acids and in vitamins C, D, E and the B vitamins scored higher on cognitive tests than people with diets low in those nutrients, while those with diets high in trans fats tended to score more poorly.

These associations were dose-dependent: each standard deviation increase in the vitamin BCDE score was associated with a 0.28 SD increase in global cognitive score, and each SD increase in the trans fat score was associated with a 0.30 SD decrease in global cognitive score.
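
For readers not used to thinking in standard deviation (SD) units, here is a minimal sketch of what a standardized association like this means in practice. Only the two coefficients come from the reported results; the 10-point SD for the cognitive test is an invented number, purely for illustration.

    BETA_BCDE = 0.28        # reported: +0.28 SD in cognition per +1 SD in the vitamin BCDE score
    BETA_TRANS_FAT = -0.30  # reported: -0.30 SD in cognition per +1 SD in the trans fat score

    def predicted_shift(sd_change_in_nutrient, beta, cognitive_sd):
        # Convert a standardized (SD-unit) association into raw test points,
        # given the standard deviation of the cognitive test (assumed here).
        return sd_change_in_nutrient * beta * cognitive_sd

    # Someone 1 SD higher on the BCDE score is predicted to score about 2.8 points
    # higher if the cognitive test has an SD of 10 points (an assumption).
    print(predicted_shift(1, BETA_BCDE, cognitive_sd=10))       # ~2.8
    print(predicted_shift(1, BETA_TRANS_FAT, cognitive_sd=10))  # ~-3.0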

Trans fats are primarily found in packaged, fast, fried and frozen food, baked goods and margarine spreads.

Brain scans of 42 of the participants found that those with diets high in vitamins BCDE and omega 3 fatty acids were also less likely to have the brain shrinkage associated with Alzheimer's, while those with high trans fats were more likely to show such brain atrophy.

Those with higher omega-3 scores also had fewer white matter hyperintensities. However, this association became weaker once depression and hypertension were taken into account.

Overall, the participants had good nutritional status, but 7% were deficient in vitamin B12 (I’m surprised it’s so low, but bear in mind that these are already a select group, being healthy at such an advanced age) and 25% were deficient in vitamin D.

The nutrient biomarkers accounted for 17% of the variation in cognitive performance, while age, education, APOE genotype (presence or absence of the ‘Alzheimer’s gene’), depression and high blood pressure together accounted for 46%. Diet was more important for brain atrophy: here, the nutrient biomarkers accounted for 37% of the variation, while the other factors accounted for 40% (meaning that diet was nearly as important as all these other factors combined!).

The findings add to the growing evidence that diet has a significant role in determining whether or not, and when, you develop Alzheimer’s disease.

Physical evidence bilingualism delays onset of Alzheimer's symptoms

January, 2012

Brain scans reveal that active bilinguals can have nearly twice as much brain atrophy as monolinguals before cognitive performance suffers.

Growing evidence points to greater education and mentally stimulating occupations and activities providing a cognitive reserve that enables people with developing Alzheimer's to function normally for longer. Cognitive reserve means that your brain can take more damage before it has noticeable effects. A 2006 review found that some 30% of older adults found to have Alzheimer’s when autopsied had shown no signs of it when alive.

There are two relevant concepts behind the protection some brains have: cognitive reserve (which I have mentioned on a number of occasions), and brain reserve, which is more structural. ‘Brain reserve’ encapsulates the idea that certain characteristics, such as a greater brain size, help protect the brain from damage. Longitudinal studies have provided evidence, for example, that a larger head size in childhood helps reduce the risk of developing Alzheimer’s.

While cognitive reserve has been most often associated with education, it has also been associated with occupation, bilingualism, and music. A new study provides physical evidence for how effective bilingualism is.

The Toronto study involved 40 patients with a diagnosis of probable Alzheimer’s, of whom half were bilingual (fluent in a second language, and consistent users of both languages throughout their lives). Bilingual and monolingual patients were matched on a test of cognitive function (the Behavioral Neurology Assessment). The two groups were similar in education levels, gender, and performance on the MMSE and the clock drawing test. The groups did differ significantly in occupational status, with the monolinguals having higher job status than the bilinguals.

Notwithstanding this similarity in cognitive performance, brain scans revealed that the bilingual group had substantially greater atrophy in the medial temporal lobe and the temporal lobe. The two groups did not differ in measures of central and frontal atrophy, however — these regions are not associated with Alzheimer’s.

In other words, bilingualism seems to specifically help protect those areas implicated in Alzheimer’s, meaning that the bilinguals could take much greater damage to the brain before it impacted their cognitive performance. It is suggested that the act of constantly switching between languages, or suppressing one language in favor of the other, may help train the brain to be more flexible when it needs to compensate for damaged areas.

The findings are consistent with previous observational studies suggesting that bilingualism delays the onset of Alzheimer's symptoms by up to five years.

Reference: 

Schweizer, T. A., Ware, J., Fischer, C. E., Craik, F. I. M., & Bialystok, E. (2011). Bilingualism as a contributor to cognitive reserve: Evidence from brain atrophy in Alzheimer’s disease. Cortex.

Valenzuela, M. J., & Sachdev, P. (2006). Brain reserve and dementia: A systematic review. Psychological Medicine, 36(4), 441-454.

Brain atrophy may predict risk for early Alzheimer's disease

January, 2012

Shrinking of certain brain regions predicts age-related cognitive decline and dementia, with greater brain tissue loss markedly increasing risk.

A study involving 159 older adults (average age 76) has confirmed that the amount of brain tissue in specific regions is a predictor of Alzheimer’s disease development. Of the 159 people, 19 were classified as high risk on the basis of the reduced size (cortical thinning) of nine regions previously shown to be vulnerable to Alzheimer’s, and 24 as low risk; the remaining 116 were classed as average risk. The regions, in order of importance, are the medial temporal, inferior temporal, temporal pole, angular gyrus, superior parietal, superior frontal, inferior frontal cortex, supramarginal gyrus, and precuneus.

There was no difference between the three risk groups at the beginning of the study on global cognitive measures (MMSE; Alzheimer’s Disease Assessment Scale—cognitive subscale; Clinical Dementia Rating—sum of boxes), or in episodic memory. The high-risk group did perform significantly more slowly on the Trail-making test part B, with similar trends on the Digit Symbol and Verbal Fluency tests.

After three years, 125 participants were re-tested. Nine met the criteria for cognitive decline. Of these, three came from the small high-risk group (21% of that group, 3/14) and six from the much larger average-risk group (7%, 6/90). None came from the low-risk group.

The results were even more marked when less stringent criteria were used. On the basis of an increase on the Clinical Dementia Rating, 28.5% of the high-risk group and 9.7% of the average-risk group showed decline. On the basis of a decline of at least one standard deviation on any one of the three neuropsychological tests, half of the high-risk group, 35% of the average-risk group, and 14% (3/21) of the low-risk group showed decline. (The composite criterion used above required both.)

Analysis estimated that every standard deviation of cortical thinning (reduced brain tissue) was associated with a nearly tripled risk of cognitive decline.
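
Because this kind of risk estimate is multiplicative, the effect compounds: roughly tripling per standard deviation means someone two SDs thinner than average would be at roughly nine times the risk. The sketch below just shows that arithmetic; the baseline risk is invented, and the approximation only makes sense while the resulting risks stay well below 100%.

    RISK_RATIO_PER_SD = 3.0  # "nearly tripled risk" per SD of cortical thinning (approximate)

    def relative_risk(sds_of_thinning, ratio_per_sd=RISK_RATIO_PER_SD):
        # Multiplicative scaling: each SD of thinning multiplies the risk by the ratio.
        return ratio_per_sd ** sds_of_thinning

    baseline_risk = 0.07  # hypothetical three-year risk for someone of average cortical thickness
    for sds in (0, 1, 2):
        print(sds, "SD of thinning ->", f"{baseline_risk * relative_risk(sds):.0%} risk")
    # 0 SD -> 7%, 1 SD -> 21%, 2 SD -> 63% (illustrative numbers only)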

The 84 individuals for whom amyloid-beta levels in the cerebrospinal fluid were available also revealed that 60% of the high-risk group had levels consistent with the presence of Alzheimer's pathology, compared to 36% of those at average risk and 19% of those at low risk.

The findings extend and confirm the evidence that brain atrophy in specific regions is a biomarker for developing Alzheimer’s.

Reference: 

Dickerson, B. C., & Wolk, D. A. (2012). MRI cortical thickness biomarker predicts AD-like CSF and cognitive decline in normal adults. Neurology, 78(2), 84-90.

Dickerson, B. C., Bakkour, A., Salat, D. H., et al. (2009). The cortical signature of Alzheimer’s disease: regionally specific cortical thinning relates to symptom severity in very mild to mild AD dementia and is detectable in asymptomatic amyloid-positive individuals. Cerebral Cortex, 19, 497-510.

Reviving a failing sense of smell through training

January, 2012

A rat study reveals how training can improve or impair smell perception.

The olfactory bulb is in the oldest part of our brain. It connects directly to the amygdala (our ‘emotion center’) and our prefrontal cortex, giving smells a more direct pathway to memory than our other senses. But the olfactory bulb is only part of the system processing smells. It projects to several other regions, all of which are together called the primary olfactory cortex, and of which the most prominent member is the piriform cortex. More recently, however, it has been suggested that it would be more useful to regard the olfactory bulb as the primary olfactory cortex (primary in the sense that it is first), while the piriform cortex should be regarded as association cortex — meaning that it integrates sensory information with ‘higher-order’ (cognitive, contextual, and behavioral) information.

Testing this hypothesis, a new rat study has found that, when rats were given training to distinguish various odors, each smell produced a different pattern of electrical activity in the olfactory bulb. However, only those smells that the rat could distinguish from others were reflected in distinct patterns of brain activity in the anterior piriform cortex, while smells that the rat couldn’t differentiate produced identical brain activity patterns there. Interestingly, the smells that the rats could easily distinguish were ones in which one of the ten components in the target odor had been replaced with a new component. The smells they found difficult to distinguish were those in which a component had simply been deleted.

When a new group of rats was given additional training (8 days vs the 2 days given the original group), they eventually learned to discriminate between the odors the first animals couldn’t distinguish, and this was reflected in distinct patterns of brain activity in the anterior piriform cortex. When a third group were taught to ignore the difference between odors the first rats could readily distinguish, they became unable to tell the odors apart, and similar patterns of brain activity were produced in the piriform cortex.

The effects of training were also quite stable — they were still evident after two weeks.

These findings support the idea of the piriform cortex as association cortex. It is here that experience modified neuronal activity. In the olfactory bulb, where all the various odors were reflected in different patterns of activity right from the beginning (meaning that this part of the brain could discriminate between odors that the rat itself couldn’t distinguish), training made no difference to the patterns of activity.

Having said that, it should be noted that this is not entirely consistent with previous research. Several studies have found that odor training produces changes in the representations in the olfactory bulb. The difference may lie in the method of neural recording.

How far does this generalize to the human brain? Human studies have suggested that odors are represented in the posterior piriform cortex rather than the anterior piriform cortex. They have also suggested that the anterior piriform cortex is involved in expectations relating to the smells, rather than representing the smells themselves. Whether these differences reflect species differences, task differences, or methodological differences, remains to be seen.

But whether or not the same exact regions are involved, there are practical implications we can consider. The findings do suggest that one road to olfactory impairment is through neglect — if you learn to ignore differences between smells, you will become increasingly less able to do so. An impaired sense of smell has been found in Alzheimer’s disease, Parkinson's disease, schizophrenia, and even normal aging. While some of that may well reflect impairment earlier in the perception process, some of it may reflect the consequences of neglect. The burning question is, then, would it be possible to restore smell function through odor training?

I’d really like to see this study replicated with old rats.
