Aging - how cognitive function declines

Older adults commonly need to practice more than younger adults to achieve the same level of performance. Such age deficits are at least partly due to poorer monitoring of their learning.

Failing to immediately retrieve well-known information becomes more common with age, with an increase in "tip-of-the-tongue" experiences evident as early as the mid-thirties. Older people tend to be less likely than younger people to actively pursue a missing word.

Older adults are less likely than younger ones to use the appropriate brain regions when performing a memory task, and more likely to use cortical regions that are not as useful. But this can be at least partly overcome if the seniors are given specific strategy instructions.

Older adults appear to be particularly impaired in context processing — most evident in an inability to remember where they heard (or read, or saw) something. Because context is involved in many memory processes, this may have far-reaching implications. An impaired ability to remember context may reflect frontal-lobe inefficiency rather than aging per se.

Decreased ability to remember past events is linked to an impaired ability to imagine future events.

Older adults may compensate for cognitive decline by using additional brain regions. However, the downside is that these brain regions are then not available when a task requires them specifically. This may explain older adults' poorer performance on complex short-term memory tasks.

An important (perhaps even the most important) reason for cognitive decline in older adults is now seen to be a growing inability to filter out irrelevant or distracting information and to inhibit its processing. There can, however, be a decision-making/problem-solving advantage to this inclusion of apparently irrelevant information.

Older adults’ greater problems with multitasking stem from their impaired ability to disengage from an interrupting task and restore the original task.

There is growing evidence that memory problems (even amnesia) reflect confusion between memories more than loss of memory, and that age-related difficulties reflect increasing difficulty in replacing out-of-date information with new information, or in distinguishing between the two.

There do seem to be some gender differences in how brains change with age, which is consistent with the evidence that general intelligence is reflected in different brain attributes for men and women.

While IQ tends to drop with age, this may simply reflect perceptual deficits, not cognitive ones.

Brain changes especially associated with age include shrinkage of the frontal lobe, especially the prefrontal cortex; of the medial temporal lobe, especially the hippocampus; and (in men only) of the cerebellum. Aging also tends to degrade white matter, leading to brain networks growing less coordinated; the default network is most severely disrupted. Levels of the inhibitory neurotransmitter GABA tend to decline with age, as do levels of dopamine. Both are important for learning and memory.

Latest Research News

An Australian study involving 102 older adults (60-90) has concluded that physical fitness and arterial stiffness account for a great deal of age-related memory decline.

The study found that, while both physical fitness and aortic stiffness were associated with spatial working memory performance, the two factors affected cognition independently. More importantly, and surprisingly, statistical modelling found that, taking BMI and gender into account, fitness and aortic stiffness together explained a third (33%) of the individual differences in spatial working memory — with age no longer predicting any of the differences.

While physical fitness didn’t seem to affect central arterial stiffness, the researchers point out that only current fitness was assessed, and long-term fitness might be a better predictor of central arterial stiffness.

It's also worth noting that only one cognitive measure was used. However, this particular measure should be a good one for assessing cognition untainted by the benefits of experience — a purer measure of the ability to process information, as it were.

It would also be interesting to extend the comparison to younger adults. I hope future research will explore these aspects.

Nevertheless, the idea that age-related cognitive decline might be largely, or even entirely, accounted for by one's physical fitness and the state of one's arteries, is an immensely appealing one.

Fitness was assessed with the Six-Minute Walk test which involved participants walking back and forth between two markers placed 10 metres apart for six minutes. Only participants who completed the full six minutes were included in the analysis.
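
To illustrate the kind of statistical modelling described above, here is a minimal sketch in Python, using synthetic data and hypothetical variable names (the study's actual model and covariates are more involved). The point of interest is whether adding age improves the variance explained once fitness and stiffness are already in the model:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 102  # sample size matching the study

    # Synthetic data for illustration only: fitness declines and arterial
    # stiffness rises with age, and both influence working memory.
    age = rng.uniform(60, 90, n)
    fitness = -0.5 * age + rng.normal(0, 5, n)    # e.g. Six-Minute Walk distance (standardised)
    stiffness = 0.4 * age + rng.normal(0, 5, n)   # e.g. aortic pulse-wave velocity (standardised)
    memory = 0.3 * fitness - 0.3 * stiffness + rng.normal(0, 5, n)

    def r_squared(X, y):
        """Proportion of variance in y explained by a linear model on X."""
        X = np.column_stack([np.ones(len(y)), X])  # add an intercept column
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return 1 - (y - X @ beta).var() / y.var()

    # Variance in memory explained by fitness + stiffness alone...
    print(r_squared(np.column_stack([fitness, stiffness]), memory))
    # ...and with age added: if the number barely moves, age predicts
    # nothing beyond what fitness and stiffness already capture.
    print(r_squared(np.column_stack([fitness, stiffness, age]), memory))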

https://www.eurekalert.org/pub_releases/2018-06/ip-bpf061118.php

Data from over 5,000 individuals found that a measure of belly fat (waist:hip ratio) was associated with reduced cognitive function in older Irish adults (60+). Body mass index (BMI), however, was found to protect cognitive function.

BMI is a crude measure of body fat and cannot differentiate between fat and muscle — the muscle component is likely to be the protective factor.
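
For reference, both measures are trivial to compute. A minimal sketch (the cut-offs clinicians apply vary by sex and population):

    def bmi(weight_kg, height_m):
        """Body mass index: weight divided by height squared (kg/m^2)."""
        return weight_kg / height_m ** 2

    def waist_hip_ratio(waist_cm, hip_cm):
        """Waist:hip ratio, a rough proxy for abdominal (visceral) fat."""
        return waist_cm / hip_cm

    # The same BMI can hide very different fat distributions:
    print(bmi(80, 1.75))              # ~26.1 kg/m^2
    print(waist_hip_ratio(102, 100))  # 1.02: mostly belly fat
    print(waist_hip_ratio(88, 100))   # 0.88: same weight, less belly fat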

Research indicates that as we age, our fat becomes less efficient at producing a hormone that helps support the growth and survival of neurons and helps regulate their activity.

That hormone is adiponectin, which is made by fat cells, circulates in our blood and enters our brain. Inside fat cells, its production is regulated by PPAR-γ (peroxisome proliferator-activated receptor gamma).

Adiponectin is anti-inflammatory and can help regulate neuronal activity, including turning activity of some neurons up and others down. However, adiponectin is reduced in Alzheimer’s patients. Delivering adiponectin to the brain has been shown to improve cognition in mice.

Chronic stress can also decrease fat's production of PPAR-γ and adiponectin.

Fat cells become less efficient at making adiponectin in obesity, and with age. One theory is that fat cells start making inflammation-promoting signals called cytokines and this inflammation then inhibits adiponectin production.

The shift from beneficial subcutaneous fat to unhealthy fat that piles up on our bellies and around the organs inside our abdominal cavity is one that naturally occurs with age, but it is of course worse if you have a lot of excess weight around your abdomen.

Genetic variations in PPAR-γ and adiponectin as well as low blood levels of adiponectin are associated with an increased Alzheimer's risk.

https://www.eurekalert.org/pub_releases/2018-08/tcd-mob080118.php

Dietary choline linked to reduced dementia risk & better cognition

Data from a large, long-running Finnish study, involving some 2,500 men aged 42-60, has found that dietary intake of phosphatidylcholine was associated with a reduced risk of dementia (the risk was 28% lower in men with the highest intake compared to the lowest). Men with the highest intake of dietary phosphatidylcholine also excelled in tests measuring their memory and linguistic abilities.

The key sources of phosphatidylcholine in the study population's diet were eggs (39%) and meat (37%).

Choline is necessary for the formation of the neurotransmitter acetylcholine. Earlier studies have linked choline intake with cognitive processing, and adequate choline intake may play a role in the prevention of cognitive decline and Alzheimer's disease.

There was no interaction with the APOE4 gene.

https://www.eurekalert.org/pub_releases/2019-08/uoef-dca080619.php

Choline helps fight Alzheimer's across generations

A study using genetically engineered mice found that when the mice were given high levels of choline in their diet, their offspring showed improved spatial memory.

Examination of the hippocampus found that the choline had reduced microglial activation (and thus brain inflammation), and had reduced levels of homocysteine by converting it into the more helpful chemical methionine. These effects were passed on to the next generation, apparently through epigenetic modification.

It has long been recognized that choline is particularly important in early brain development.

https://www.eurekalert.org/pub_releases/2019-01/asu-enm010719.php

Ylilauri, M. P. T., et al. (2019). Associations of dietary choline intake with risk of incident dementia and with cognitive performance: the Kuopio Ischaemic Heart Disease Risk Factor Study. American Journal of Clinical Nutrition, published online July 30, 2019. https://doi.org/10.1093/ajcn/nqz148

Velazquez, R., Ferreira, E., Winslow, W., Dave, N., Piras, I. S., Naymik, M., et al. (2019). Maternal choline supplementation ameliorates Alzheimer's disease pathology by reducing brain homocysteine levels across multiple generations. Molecular Psychiatry, 1-10.

Can computer use, crafts and games slow or prevent age-related memory loss?

A study involving 2,000 healthy older adults (average age 78) found that mentally stimulating activities were linked to a lower risk or delay of MCI, and that the timing and number of these activities may also play a role.

During the study, 532 participants developed MCI.

Using a computer in middle-age (50-65) was associated with a 48% lower risk of MCI, while using a computer in later life was associated with a 30% lower risk, and using a computer in both middle-age and later life was associated with a 37% lower risk.

Engaging in social activities (like going to movies or going out with friends) or playing games (like doing crosswords or playing cards) in both middle-age and later life was associated with a 20% lower risk of developing MCI.

Craft activities were associated with a 42% lower risk, but only in later life.

Those who engaged in two activities were 28% less likely to develop MCI than those who took part in no activities, while those who took part in three activities were 45% less likely, those with four activities 56% less likely, and those with five activities 43% less likely.

It should be noted that activities in middle-age were assessed retrospectively, relying on participants’ memory many years later.

https://www.eurekalert.org/pub_releases/2019-07/aaon-ccu071019.php

Regular crosswords & sudoku linked to sharper brain in later life

Data from the PROTECT online platform, involving 19,000 healthy older adults (50-96), found that the more regularly older adults played puzzles such as crosswords and Sudoku, the better they performed on tasks assessing attention, reasoning and memory.

In some areas the improvement was quite dramatic. For example, on measures of problem-solving, people who regularly do these puzzles performed at a level equivalent to being, on average, eight years younger than those who don't.

https://www.eurekalert.org/pub_releases/2019-05/uoe-rca051419.php

Mind-body exercises improve cognitive function in older adults

A meta-analysis of 32 randomized controlled trials with 3,624 older adults, with or without cognitive impairment, has concluded that mind-body exercises, especially tai chi and dance, help improve global cognition, cognitive flexibility, working memory, verbal fluency, and learning in older adults.

https://www.eurekalert.org/pub_releases/2018-12/w-mem121718.php

Krell-Roesch, J., Syrjanen, J. A., Vassilaki, M., Machulda, M. M., Mielke, M. M., Knopman, D. S., … Geda, Y. E. (2019). Quantity and quality of mental activities and the risk of incident mild cognitive impairment. Neurology, 93(6), e548. https://doi.org/10.1212/WNL.0000000000007897

Brooker, H., Wesnes, K. A., Ballard, C., Hampshire, A., Aarsland, D., Khan, Z., … Corbett, A. (2019). The relationship between the frequency of number-puzzle use and baseline cognitive function in a large online sample of adults aged 50 and over. International Journal of Geriatric Psychiatry, 34(7), 932–940. https://doi.org/10.1002/gps.5085

Brooker, H., Wesnes, K. A., Ballard, C., Hampshire, A., Aarsland, D., Khan, Z., … Corbett, A. (2019). An online investigation of the relationship between the frequency of word puzzle use and cognitive function in a large sample of older adults. International Journal of Geriatric Psychiatry, 34(7), 921–931. https://doi.org/10.1002/gps.5033

Wu, C., Yi, Q., Zheng, X., Cui, S., Chen, B., Lu, L., & Tang, C. (2019). Effects of Mind-Body Exercises on Cognitive Function in Older Adults: A Meta-Analysis. Journal of the American Geriatrics Society, 67(4), 749–758. https://doi.org/10.1111/jgs.15714

Americans with a college education live longer without dementia and Alzheimer's

Data from the large, long-running U.S. Health and Retirement Study found that healthy cognition characterized most of the people with at least a college education into their late 80s, while those who didn’t complete high school had good cognition up until their 70s.

The study found that those who had at least a college education lived a much shorter time with dementia than those with less than a high school education: an average of 10 months for men and 19 months for women, compared to 2.57 years (men) and 4.12 years (women).

The data suggests that those who graduated high school can expect to live (on average) at least 70% of their remaining life after 65 with good cognition, compared to more than 80% for those with a college education, and less than 50% for those who didn't finish high school.

The analysis was based on a sample of 10,374 older adults (65+; average age 74) in 2000 and 9,995 in 2010.

https://www.eurekalert.org/pub_releases/2018-04/uosc-awa041618.php

https://academic.oup.com/psychsocgerontology/article/73/suppl_1/S20/4971564 (open access)

More education linked to better cognitive functioning later in life

Data from around 196,000 subscribers to Lumosity online brain-training games found that higher levels of education were strong predictors of better cognitive performance across the 15- to 60-year-old age range of their study participants, and appear to boost performance more in areas such as reasoning than in terms of processing speed.

Differences in performance were small for test subjects with a bachelor's degree compared to those with a high school diploma, and moderate for those with doctorates compared to those with only some high school education.

But people from lower educational backgrounds learned novel tasks nearly as well as those from higher ones.

https://www.eurekalert.org/pub_releases/2017-08/l-mel082117.php

http://www.futurity.org/higher-education-cognitive-peak-1523712/

Youthful cognitive ability strongly predicts mental capacity later in life

Data from more than 1,000 men participating in the Vietnam Era Twin Study of Aging revealed that their cognitive ability at age 20 was a stronger predictor of cognitive function later in life than other factors, such as higher education, occupational complexity or engaging in late-life intellectual activities.

All of the men, now in their mid-50s to mid-60s, took the Armed Forces Qualification Test at an average age of 20. The same test of general cognitive ability (GCA) was given in late midlife, plus assessments in seven cognitive domains.

GCA at age 20 accounted for 40% of the variance in the same measure at age 62, and approximately 10% of the variance in each of the seven cognitive domains. Lifetime education, complexity of job and engagement in intellectual activities each accounted for less than 1% of variance at average age 62.

The findings suggest that the impact of education, occupational complexity and engagement in cognitive activities on later life cognitive function simply reflects earlier cognitive ability.

The researchers speculated that the role of education in increasing GCA takes place primarily during childhood and adolescence when there is still substantial brain development.

https://www.eurekalert.org/pub_releases/2019-01/uoc--yca011819.php

Crimmins, E. M., Saito, Y., Kim, J. K., Zhang, Y. S., Sasson, I., & Hayward, M. D. (2018). Educational Differences in the Prevalence of Dementia and Life Expectancy with Dementia: Changes from 2000 to 2010. The Journals of Gerontology: Series B, 73(suppl_1), S20-S28.

Guerra-Carrillo, B., Katovich, K., & Bunge, S. A. (2017). Does higher education hone cognitive functioning and learning efficacy? Findings from a large and diverse sample. PLOS ONE, 12(8), e0182276. https://doi.org/10.1371/journal.pone.0182276

Kremen, W. S., Beck, A., Elman, J. A., Gustavson, D. E., Reynolds, C. A., Tu, X. M., et al. (2019). Influence of young adult cognitive ability and additional education on later-life cognition. Proceedings of the National Academy of Sciences, 116(6), 2021.

Socially active 60-year-olds face lower dementia risk

Data from the Whitehall II study, tracking 10,228 participants for 30 years, found that increased social contact at age 60 is associated with a significantly lower risk of developing dementia later in life. Someone who saw friends almost daily at age 60 was 12% less likely to develop dementia than someone who only saw one or two friends every few months.

While previous studies have found a link between social contact and dementia risk, the long follow-up in the present study strengthens the evidence that social engagement could protect people from dementia (rather than precursors of dementia bringing about a decline in social engagement).

https://www.eurekalert.org/pub_releases/2019-08/ucl-sa6073119.php

Low social engagement plus high amyloid linked to cognitive decline

A three-year study of 217 healthy older adults (63-89) enrolled in the Harvard Aging Brain Study, has found that higher amyloid-beta levels in combination with lower social engagement was associated with greater cognitive decline over three years. Lower social engagement wasn’t associated with cognitive decline in those with a lower amyloid-beta burden.

https://www.eurekalert.org/pub_releases/2019-06/bawh-scl062819.php

Sommerlad, A., Sabia, S., Singh-Manoux, A., Lewis, G., & Livingston, G. (2019). Association of social contact with dementia and cognition: 28-year follow-up of the Whitehall II cohort study. PLOS Medicine, 16(8), e1002862. https://doi.org/10.1371/journal.pmed.1002862

Biddle, K., et al. (2019). Social Engagement and Amyloid-β-Related Cognitive Decline in Cognitively Normal Older Adults. American Journal of Geriatric Psychiatry. https://doi.org/10.1016/j.jagp.2019.05.005

Chronic insomnia linked to memory problems

Data from 28,485 older Canadians (45+) found that those with chronic insomnia performed significantly worse on cognitive tests than those who had symptoms of insomnia without any noticeable impact on their daytime functioning and those with normal sleep quality. The main type of memory affected was declarative memory (memory of concepts, events and facts).

Chronic insomnia is characterized by trouble falling asleep or staying asleep at least three nights a week for over three months with an impact on daytime functioning (mood, attention, and daytime concentration).

https://www.eurekalert.org/pub_releases/2019-05/cu-cia051519.php

Poor brainwave syncing behind older adults’ failure to consolidate memories

We know that memories are consolidated during sleep, and that for some reason this consolidation becomes more difficult with age. Now a new study shows why.

To consolidate memories (move them into long-term storage), slow and speedy brain waves have to sync up at exactly the right moment during sleep. These brain rhythms synchronize perfectly in young adults, but in old age, it seems, slow waves during non-rapid eye movement (NREM) sleep are not so good at making timely contact with the speedy electrical bursts known as “spindles.”

These difficulties are thought to be due to atrophy of the gray matter in the medial frontal cortex.

The study compared the overnight memory of 20 healthy young adults to that of 32 healthy older adults (mostly in their 70s). Before going to sleep, participants learned and were then tested on 120 word sets. They were tested again in the morning. EEG results from their sleeping brains showed that in older people, the spindles consistently peaked early in the memory-consolidation cycle and missed syncing up with the slow waves.

http://www.futurity.org/memories-sleep-older-adults-1633432/

https://www.eurekalert.org/pub_releases/2017-12/uoc--obd121417.php

Oxidative stress governs sleep

A fruitfly study has shown how oxidative stress leads to sleep. Fruitflies (and, it is believed, humans) have sleep-control neurons that act like an on-off switch: if the neurons are electrically active, the fly is asleep; when they are silent, the fly is awake. The switch is triggered by an electrical current that flows through two ion channels, and this current, it now appears, is regulated by a small molecule called NADPH.

The state of NADPH reflects the degree of oxidative stress. Sleeplessness causes oxidative stress, driving the behavior of NADPH.

I'm wildly speculating here, but is it possible that the increased sleep problems often found with age are linked to a growing inability of this molecule to sensitively monitor the degree of oxidative stress, perhaps because oxidative stress levels are chronically high?

https://www.eurekalert.org/pub_releases/2019-03/uoo-saa032119.php

Increases in brain activity are matched by increases in blood flow. Neurons require a huge amount of energy, but can’t store it themselves, so must rely on blood to deliver the nutrients they need.

Two new studies help explain how blood flow is controlled.

The first study found blood appears to be stored in the blood vessels in the space between the brain and skull.

When the heart pumps blood into the cranium, only a fraction of it flows into the capillaries that infuse the brain. The arteries in the cranium expand to store the excess blood. This expansion pushes cerebrospinal fluid out into the spinal column. When the heart relaxes, the drop in the pressure pushing blood through the arteries causes them to contract, and the blood is pushed into the brain's capillaries. This in turn forces used blood out of the brain into the veins between it and the skull. These cerebral veins expand to store this blood as it leaves the brain.

Crucially, the study shows that the flow of blood in the veins leading out of the cranium is closely linked to the flow of cerebrospinal fluid in and out of the brain's ventricles.

The second study looked at what happens further down the track.

It had been thought that capillaries were passive tubes and the arterioles were the source of action — but the area covered by capillaries vastly surpasses the area covered by arterioles. So the new findings make sense: capillaries actively control blood flow by acting like a series of wires, transmitting electrical signals to direct blood to the areas that need it most.

To do this, capillaries rely on a protein (an ion channel) that detects increases in potassium during neuronal activity. Increased activity of this channel facilitates the flow of ions across the capillary membrane, thereby creating a small electrical current that communicates the need for additional blood flow to the arterioles, resulting in increased blood flow to the capillaries.

If the potassium level is too high, however, this mechanism can be disabled. This may be involved in a broad range of brain disorders.

https://www.eurekalert.org/pub_releases/2017-05/lbu-ffi050217.php

https://www.eurekalert.org/pub_releases/2017-03/lcom-ei032417.php

Link found between chronic inflammation and Alzheimer's gene risk

Data from the Framingham Heart Study has found carriers of the ApoE4 gene were much more likely to develop Alzheimer’s if they also had chronic low-grade inflammation. Indeed, the researchers suggest that, in the absence of inflammation, there might be no difference of Alzheimer's risk between ApoE4 and non-ApoE4 carriers.

https://www.eurekalert.org/pub_releases/2018-10/buso-lfb101818.php

Mid- to late-life increases in chronic inflammation age the brain

Data from 1,532 participants in a long-running study, in which participants were tested five times at roughly 3-year intervals, found that those who showed increasing inflammation had greater abnormalities in the brain's white matter structure.

90 people transitioned from low to persistently elevated C-reactive protein during midlife, indicating increasing inflammation. Their brains appeared similar to those of a person 16 years older, researchers estimate.

Common causes of chronic inflammation include cardiovascular disease, heart failure, diabetes, high blood pressure and infections such as hepatitis C or HIV.

61% of participants were women, and 28% were African-American. At the final visit, participants were an average age of 76.

https://www.eurekalert.org/pub_releases/2018-07/jhm-mtl070218.php

Brain scans of 9,772 people aged 44 to 79, who were enrolled in the UK Biobank study, have revealed that smoking, high blood pressure, high pulse pressure, diabetes, and high BMI — but not high cholesterol — were all linked to greater brain shrinkage, less grey matter and less healthy white matter.

Smoking, high blood pressure, and diabetes were the most important factors, but there was also a compound effect, with the number of vascular risk factors being associated with greater damage to the brain. On average, those with the highest vascular risk had nearly 3% less volume of grey matter, and one-and-a-half times the damage to their white matter, compared to people who had the lowest risk.

The brain regions affected were mainly those involved in ‘higher-order’ thinking, and those known to be affected early in the development of dementia.

The associations were as strong for middle-aged adults as for older ones, suggesting the importance of tackling these factors early.

While the effect size was small, the findings emphasize how vulnerable the brain is to vascular factors even in relatively healthy adults. This also suggests the potential of lifestyle changes for fighting cognitive decline.

Although this study didn't itself examine cognitive performance in its participants, other studies have shown links between cognitive impairment and vascular risk factors, particularly diabetes, obesity, hypertension, and smoking.

https://www.eurekalert.org/pub_releases/2019-03/esoc-shb030719.php

Cognitive decline in type 2 diabetes linked to white matter hyperintensities

While type 2 diabetes has been associated with cognitive problems, the mechanism has been unclear. Now a study involving 93 people with type 2 diabetes has found that greater white matter hyperintensities (indicative of cerebral small vessel disease) were associated with decreased processing speed (but not with memory or executive function).

https://www.eurekalert.org/pub_releases/2018-09/w-rem091818.php

Cox, S. R., et al. (2019). Associations between vascular risk factors and brain MRI indices in UK Biobank. European Heart Journal. https://doi.org/10.1093/eurheartj/ehz100

Mankovsky, B., Zherdova, N., van den Berg, E., Biessels, G.-J., & de Bresser, J. (2018). Cognitive functioning and structural brain abnormalities in people with Type 2 diabetes mellitus. Diabetic Medicine, 35(12), 1663-1670.

 

Data from more than 17,000 healthy people aged 50 and over has revealed that the more regularly participants engaged with word puzzles, the better they performed on tasks assessing attention, reasoning and memory.

Study participants took part in online cognitive tests, as well as being asked how frequently they did word puzzles such as crosswords. There was a direct relationship between the frequency of word puzzle use and the speed and accuracy of performance on nine cognitive tasks.

The effect was considerable. For example, on test measures of grammatical reasoning speed and short-term memory accuracy, performing word puzzles was associated with brain function equivalent to ten years younger than participants’ chronological age.

The next question is whether you can improve brain function by engaging in puzzles.

The study used participants in the PROTECT online platform, run by the University of Exeter and King's College London. Currently, more than 22,000 healthy people aged between 50 and 96 are registered in the study. PROTECT is a 10-year study with participants being followed up annually to enable a better understanding of cognitive trajectories in this age range.

https://www.eurekalert.org/pub_releases/2017-07/uoe-dcl071417.php

The Relationship Between the Frequency of Word Puzzle Use and Cognitive Function in a Large Sample of Adults Aged 50 to 96 Years, was presented at the Alzheimer's Association International Conference (AAIC) 2017 on July 17.

A Finnish study involving 338 older adults (average age 66) has found that greater muscle strength is associated with better cognitive function.

Muscle strength was measured using handgrip strength, three lower-body exercises (leg extension, leg flexion, and leg press), and two upper-body exercises (chest press and seated row).

Handgrip strength, easy to measure, has been widely used as a measure of muscle strength, and has been associated with dementia risk among the very old. However, in this study, handgrip strength on its own showed no association with cognitive function. But both upper body strength and lower body strength were independently associated with cognitive function.

It may be that handgrip strength is only useful for older, more cognitively impaired adults.

These associations do not appear to be gender-specific — muscle strength was significantly greater in men, but there was no difference in cognitive performance between men and women.

The finding is supported by previous research that found a link between walking speed and cognition in older adults, and by a 2015 study that found a striking correlation between leg power and cognition.

This 10-year British study involved 324 older female twins (average age 55). Both the degree of cognitive decline over the ten-year period and the amount of gray matter were significantly correlated with muscle fitness (measured by leg extension muscle power). The correlation was greater than for any other lifestyle factor tested.

https://www.eurekalert.org/pub_releases/2017-06/uoef-gms062617.php

Data from over 11,500 participants in the Atherosclerosis Risk in Communities (ARIC) cohort has found evidence that orthostatic hypotension in middle age may increase the risk of cognitive impairment and dementia 20 years later.

Orthostatic hypotension is the name for the experience of dizziness or light-headedness on standing up. Previous research has suggested an association between orthostatic hypotension and cognitive decline in older adults.

In this study, participants aged 45-64 were tested for orthostatic hypotension in 1987. Those with it (703, around 6%) were 40% more likely to develop dementia in the next 20 years. They also had some 15% more cognitive decline.

Orthostatic hypotension was defined as a drop of 20 mmHg or more in systolic blood pressure or 10 mmHg or more in diastolic blood pressure, when the individual stood up after 20 minutes lying down.
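
That criterion is simple enough to express directly. A sketch of the definition above (actual clinical assessment uses repeated, standardised measurements):

    def orthostatic_hypotension(systolic_lying, diastolic_lying,
                                systolic_standing, diastolic_standing):
        """Criterion as defined in the study: a drop of >= 20 mmHg systolic
        or >= 10 mmHg diastolic on standing after 20 minutes lying down."""
        return (systolic_lying - systolic_standing >= 20 or
                diastolic_lying - diastolic_standing >= 10)

    # Example: 135/85 lying down, 112/80 on standing (systolic drop of 23 mmHg)
    print(orthostatic_hypotension(135, 85, 112, 80))  # True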

More work is needed to understand the reason for the association.

https://www.eurekalert.org/pub_releases/2017-03/jhub-rbp030817.php

Rawlings, Andreea. 2017. Orthostatic Hypotension is Associated with 20-year Cognitive Decline and Incident Dementia: The Atherosclerosis Risk in Communities (ARIC) Study. Presented March 10 at the American Heart Association's EPI|LIFESTYLE 2017 Scientific Sessions in Portland, Oregon.

A review of 39 studies investigating the effect of exercise on cognition in older adults (50+) confirms that physical exercise does indeed improve cognitive function in the over-50s, regardless of their cognitive status. Aerobic exercise, resistance training, multicomponent training, and tai chi all had significant effects. However, exercise sessions needed to be at least 45 minutes and of moderate intensity. Because aerobic exercise and resistance training had different effects (aerobic exercise helped overall cognition, while resistance training was particularly beneficial for executive function and working memory), it’s recommended that an exercise program include both.

https://medicalxpress.com/news/2017-04-aerobic-resistance-combo-boost-brain.html

A gene linked to Alzheimer's has been linked to brain changes in childhood. This gene, SORL1, has two connections to Alzheimer’s: it carries the code for the sortilin-like receptor, which is involved in recycling some molecules before they develop into amyloid-beta; it is also involved in lipid metabolism, putting it at the heart of the vascular risk pathway.

Brain imaging of 186 healthy individuals (aged 8-86) found that, even among the youngest, those with a specific variant of SORL1 showed a reduction in white matter connections. Post-mortem brain tissue from 269 individuals (aged 0-92) without Alzheimer's disease found that the same SORL1 variant was linked to a disruption in the process by which the gene translates its code to become the sortilin-like receptor, and this was most prominent during childhood and adolescence. Another set of post-mortem brains from 710 individuals (aged 66-108), of whom the majority had mild cognitive impairment or Alzheimer's, found that the SORL1 risk variant was linked with the presence of amyloid-beta.

It may be that, for those carrying this gene variant, lifestyle interventions may be of greatest value early in life.

http://www.eurekalert.org/pub_releases/2013-12/cfaa-arg120313.php

Felsky, D., Szeszko, P., Yu, L., Honer, W. G., De Jager, P. L., Schneider, J. A., et al. (2013). The SORL1 gene and convergent neural risk for Alzheimer's disease across the human lifespan. Molecular Psychiatry.

Analysis of data from 237 patients with mild cognitive impairment (mean age 79.9) has found that, compared to those carrying the ‘normal’ ApoE3 gene (the most common variant of the ApoE gene), the ApoE4 carriers showed markedly greater rates of shrinkage in 13 of 15 brain regions thought to be key components of the brain networks disrupted in Alzheimer’s.

http://www.eurekalert.org/pub_releases/2014-01/rson-gva010714.php

Hostage, C. A., Choudhury, K. R., Doraiswamy, M. P., & Petrella, J. R. (2013). Mapping the Effect of the Apolipoprotein E Genotype on 4-Year Atrophy Rates in an Alzheimer Disease-related Brain Network. Radiology, 271(1), 211-219.

Analysis of brain scans and cognitive scores of 64 older adults from the NIA's Baltimore Longitudinal Study of Aging (average age 76) has found that, between the most cognitively stable and the most declining (over a 12-year period), there was no significant difference in the total amount of amyloid in the brain, but there was a significant difference in the location of amyloid accumulation. The stable group showed relatively early accumulation in the frontal lobes, while the declining group showed it in the temporal lobes.

http://www.eurekalert.org/pub_releases/2013-07/uops-pop071513.php

Yotter, R. A., Doshi, J., Clark, V., Sojkova, J., Zhou, Y., Wong, D. F., et al. (2013). Memory decline shows stronger associations with estimated spatial patterns of amyloid deposition progression than total amyloid burden. Neurobiology of Aging, 34(12), 2835-2842.

Data from 6257 older adults (aged 55-90) evaluated from 2005-2012 has revealed that concerns about memory should be taken seriously, with subjective complaints associated with a doubled risk of developing mild cognitive impairment or dementia, and subjective complaints supported by a loved one being associated with a fourfold risk. Complaints by a loved one alone were also associated with a doubled risk. Among those with MCI, subjective complaints supported by a loved one were associated with a threefold risk of converting to dementia.

Of the 4414 initially cognitively normal, 14% developed MCI or dementia over the course of the study (around 5 years); of the 1843 with MCI, 41% progressed to dementia.

http://www.futurity.org/worry-about-memory-predicts-alzheimer%E2%80%99s-risk/

Gifford, K. A., Liu, D., Lu, Z., Tripodis, Y., Cantwell, N. G., Palmisano, J., et al. (2014). The source of cognitive complaints predicts diagnostic conversion differentially among nondemented older adults. Alzheimer's & Dementia, 10(3), 319-327.

Data from two longitudinal studies of older adults (a nationally representative sample of older adults, and the Alzheimer’s Disease Neuroimaging Initiative) has found that a brief cognitive test can distinguish memory decline associated with healthy aging from more serious memory disorders, years before obvious symptoms show up.

Moreover, the data challenge the idea that memory continues to decline through old age: after excluding the cognitively impaired, there was no evidence of further memory declines after the age of 69.

The data found that normal aging showed declines in recollective memory (recalling a word or event exactly) but not in reconstructive memory (recalling a word or event by piecing it together from clues about its meaning, e.g., recalling that “dog” was presented in a word list by first remembering that household pets were presented in the list). However, declines in reconstructive memory were reliable predictors of future progression from healthy aging to mild cognitive impairment and Alzheimer’s.

http://www.futurity.org/memory-test-mistakes-can-flag-trouble-sooner/

Brainerd, C. J., Reyna, V. F., Gomes, C. F. A., Kenney, A. E., Gross, C. J., Taub, E. S., et al. (2014). Dual-retrieval models and neurocognitive impairment. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(1), 41-65.

New research supports the classification system for preclinical Alzheimer’s proposed two years ago. The classification system divides preclinical Alzheimer's into three stages:

Stage 1: Levels of amyloid beta begin to decrease in the spinal fluid. This indicates that the substance is beginning to form plaques in the brain.

Stage 2: Levels of tau protein start to increase in the spinal fluid, indicating that brain cells are beginning to die. Amyloid beta levels are still abnormal and may continue to fall.

Stage 3: In the presence of abnormal amyloid and tau biomarker levels, subtle cognitive changes can be detected by neuropsychological testing.
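
The staging scheme amounts to a simple decision rule. Here is my own simplified rendering in code (the real criteria rest on continuous cerebrospinal-fluid biomarker cut-offs and standardised neuropsychological testing, not boolean flags):

    def preclinical_stage(amyloid_abnormal, tau_abnormal, subtle_cognitive_change):
        """Classify preclinical status from (simplified) biomarker flags,
        following the three-stage scheme described above."""
        if amyloid_abnormal and tau_abnormal and subtle_cognitive_change:
            return "Stage 3"
        if amyloid_abnormal and tau_abnormal:
            return "Stage 2"
        if amyloid_abnormal:
            return "Stage 1"
        if tau_abnormal:
            return "SNAP"            # neurodegeneration without amyloid
        if subtle_cognitive_change:
            return "unclassified"
        return "cognitively normal"

    print(preclinical_stage(True, True, False))   # Stage 2
    print(preclinical_stage(False, True, False))  # SNAP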

Long-term evaluation of 311 cognitively healthy older adults (65+) found 31% with preclinical Alzheimer’s, of whom 15% were at stage 1, 12% at stage 2, and 4% at stage 3. This is consistent with autopsy studies, which have shown that around 30% of cognitively normal older adults die with some preclinical Alzheimer's pathology in their brain. Additionally, 23% were diagnosed with suspected non-Alzheimer pathophysiology (SNAP), 41% as cognitively normal, and 5% as unclassified.

Five years later, 2% of the cognitively normal, 5% of those with SNAP, 11% of the stage 1 group, 26% of the stage 2 group, and 56% of the stage 3 group had been diagnosed with symptomatic Alzheimer's.

http://www.eurekalert.org/pub_releases/2013-09/wuso-apt092313.php

Vos, S. J. B., Xiong, C., Visser, P. J., Jasielec, M. S., Hassenstab, J., Grant, E. A., et al. (2013). Preclinical Alzheimer's disease and its outcome: a longitudinal cohort study. The Lancet Neurology, 12(10), 957-965.

A large Danish study comparing two groups of nonagenarians born 10 years apart has found that not only were people born in 1915 nearly a third (32%) more likely to reach the age of 95 than those in the 1905 cohort, but members of the group born in 1915 performed significantly better on tests of cognitive ability and activities of daily living. Additionally, significantly more members of the later cohort scored maximally on the MMSE (23% vs 13% of the earlier cohort). All this even though the later cohort were on average two years older than the first cohort when tested (94-95 vs 92-93 years).

The difference doesn’t appear to be due to education (educational achievement was slightly higher in the 1915 cohort, but only in women, who had overall very low educational attainment in both groups). It’s suggested that factors such as better diet and general living conditions, improved health care, and greater intellectual stimulation have helped the younger cohort improve their cognitive functioning.

http://www.eurekalert.org/pub_releases/2013-07/l-mpo070913.php

http://newoldage.blogs.nytimes.com/2013/07/17/in-europe-dementia-rates-may-be-falling/

http://www.thelancet.com/journals/lancet/article/PIIS0140-6736%2813%2960777-1/fulltext

Christensen, K., Thinggaard, M., Oksuzyan, A., Steenstrup, T., Andersen-Ranberg, K., Jeune, B., et al. (2013). Physical and cognitive functioning of people older than 90 years: a comparison of two Danish cohorts born 10 years apart. The Lancet, 382(9903), 1507-1513.

A large study, involving 3,690 older adults, has found that drugs with strong anticholinergic effects cause memory and cognitive impairment when taken continuously for a mere two months. Moreover, taking multiple drugs with weaker anticholinergic effects, such as many common over-the-counter digestive aids, affected cognition after 90 days’ continuous use. In both these cases, the risk of cognitive impairment doubled (approximately).

More positively, risk of Alzheimer’s did not seem to be affected (however, I do have to wonder how much weight we can put on that, given the apparent length of the study — although this is not a journal to which I have access, so I can’t be sure of that).

Although somewhat unexpected, the new finding is consistent with previous research linking anticholinergics and cognitive impairment.

Anticholinergic drugs block the neurotransmitter acetylcholine. Older adults commonly use over-the-counter drugs with anticholinergic effects as sleep aids and to relieve bladder leakage. Drugs with anticholinergic effects are also frequently prescribed for many chronic diseases including hypertension, cardiovascular disease and chronic obstructive pulmonary disease.

You can download a list detailing the ‘anticholinergic burden’ of medications at: http://www.indydiscoverynetwork.org/AnticholinergicCognitiveBurdenScale.html

http://www.eurekalert.org/pub_releases/2013-05/iu-sua050713.php

Cai, X., Campbell, N., Khan, B., Callahan, C., & Boustani, M. (2013). Long-term anticholinergic use and the aging brain. Alzheimer's & Dementia, 9(4), 377-385.

A new study has found that errors in perceptual decisions occurred only when there was confused sensory input, not because of any ‘noise’ or randomness in the cognitive processing. The finding, if replicated across broader contexts, will change some of our fundamental assumptions about how the brain works.

The study unusually involved both humans and rats — four young adults and 19 rats — who listened to streams of randomly timed clicks coming into both the left ear and the right ear. After listening to a stream, the subjects had to choose the side from which more clicks originated.

The errors made, by both humans and rats, were invariably when two clicks overlapped. In other words, and against previous assumptions, the errors did not occur because of any ‘noise’ in the brain processing, but only when noise occurred in the sensory input.

The researchers supposedly ruled out alternative sources of confusion, such as “noise associated with holding the stimulus in mind, or memory noise, and noise associated with a bias toward one alternative or the other.”

However, before concluding that the noise which is the major source of variability and errors in more conceptual decision-making likewise stems only from noise in the incoming input (in this case external information), I would like to see the research replicated in a broader range of scenarios. Nevertheless, it’s an intriguing finding, and if indeed, as the researchers say, “the internal mental process was perfectly noiseless. All of the imperfections came from noise in the sensory processes”, then the ramifications are quite extensive.

The findings do add weight to recent evidence that a significant cause of age-related cognitive decline is sensory loss.

http://www.futurity.org/science-technology/dont-blame-your-brain-for-that-bad-decision/

Brunton, B. W., Botvinick, M. M., & Brody, C. D. (2013). Rats and Humans Can Optimally Accumulate Evidence for Decision-Making. Science, 340(6128), 95-98.

Evidence is accumulating that age-related cognitive decline is rooted in three related factors: processing speed slows down (because of myelin degradation); the ability to inhibit distractions becomes impaired; working memory capacity is reduced.

A new study adds to this evidence by looking at one particular aspect of age-related cognitive decline: memory search.

The study put 185 adults aged 29-99 (average age 67) through three cognitive tests: a vocabulary test, digit span (a working memory test), and the animal fluency test, in which you name as many animals as you can in one minute.

Typically, in the animal fluency test, people move through semantic categories such as ‘pets’, ‘big cats’, and so on. The best performers are those who move from category to category with optimal timing — i.e., at the point where the category has been sufficiently exhausted that efforts would be better spent on a new one.

Participants recalled on average 17 animal names, with a range from 5 to 33. While there was a decline with age, it wasn’t particularly marked until the 80s (an average of 18.3 for those in their 30s, 17.5 for those in their 60s, 16.5 for the 70s, 12.8 for the 80s, and 10 for the 90s). Digit span did show a decline, but it was not significant (from 17.5 down to 15.3), while vocabulary (consistent with previous research) showed no decline with age.

But all this is by the by — the nub of the experiment was to discover how individuals were searching their memory. This required a quite complicated analysis, which I will not go into, except to mention two important distinctions. The first is between:

  • global context cue: activates each item in the active category according to how strong it is (how frequently it has been recalled in the past);
  • local context cue: activates each item in relation to its semantic similarity to the previous item recalled.

A further distinction was made between static and dynamic processes: in dynamic models, it is assumed the user switches between local and global search. This, it is further assumed, is because memory is ‘patchy’ – that is, information is represented in clusters. Within a cluster, we use local cues, but to move from one cluster to another, we use global cues.

The point of all this was to determine whether age-related decline in memory search has to do with:

  • Reduced processing speed,
  • Persisting too long on categories, or
  • Inability to maintain focus on local cues (this would relate it back to the inhibition deficit).

By modeling the exact recall patterns, the researchers ascertained that the recall process is indeed dynamic, although the points of transition are not clearly understood. The number of transitions from one cluster to another was negatively correlated with age; it was also strongly positively correlated with performance (number of items recalled). Digit span, assumed to measure ‘cognitive control’, was also negatively correlated with number of transitions, but, as I said, was not significantly correlated with age.

In other words, it appears that there is a qualitative change with age, that increasing age is correlated with increased switching, and reduced cognitive control is behind this — although it doesn’t explain it all (perhaps because we’re still not able to fully measure cognitive control).

At a practical level, the message is that memory search may become less efficient because, as people age, they tend to change categories too frequently, before they have exhausted their full potential. While this may well be a consequence of reduced cognitive control, it seems likely (to me at least) that making a deliberate effort to fight the tendency to move on too quickly will pay dividends for older adults who want to improve their memory retrieval abilities.

Nor is this restricted to older adults — since age appears to be primarily affecting performance through its effects on cognitive control, it is likely that this applies to those with reduced working memory capacity, of any age.
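
To make the patchy-search idea concrete, here is a toy simulation (entirely my own illustration, not the researchers' model): items sit in semantic clusters, a local cue retrieves another item from the current cluster, and a global cue switches clusters at the cost of one retrieval opportunity. Switching too early, before a cluster is exhausted, tends to leave recallable items behind:

    import random

    # Toy semantic memory: animal names grouped into clusters ("patches")
    clusters = {
        "pets":     ["dog", "cat", "hamster", "goldfish"],
        "big cats": ["lion", "tiger", "leopard"],
        "farm":     ["cow", "sheep", "pig", "horse", "goat"],
    }

    def fluency_search(switch_early, time_limit=10):
        """Simulate the animal-fluency task: time_limit retrieval
        opportunities stand in for one minute. switch_early=True mimics
        the pattern of leaving clusters before they are exhausted."""
        recalled = []
        remaining = {name: items[:] for name, items in clusters.items()}
        current = random.choice(list(remaining))
        for _ in range(time_limit):
            exhausted = not remaining[current]
            if exhausted or (switch_early and random.random() < 0.5):
                # Global cue: switch clusters, spending this opportunity
                candidates = [c for c in remaining if remaining[c] and c != current]
                if not candidates:
                    break
                current = random.choice(candidates)
            else:
                # Local cue: retrieve another item from the current cluster
                recalled.append(remaining[current].pop())
        return recalled

    random.seed(1)
    print(len(fluency_search(switch_early=False)))  # exhausts patches: recalls more
    print(len(fluency_search(switch_early=True)))   # frequent switching: recalls fewer

With the seed shown, the patient searcher recalls more names than the early switcher; the exact numbers vary with the seed, but the direction of the difference illustrates the practical point above about moving on before a category is exhausted.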

Hills, T. T., Mata, R., Wilke, A., & Samanez-Larkin, G. R. (2013). Mechanisms of Age-Related Decline in Memory Search Across the Adult Life Span. Developmental Psychology.

Analysis of data from 418 older adults (70+) has found that carriers of the ‘Alzheimer’s gene’ variant APOE ε4 were 58% more likely to develop mild cognitive impairment compared to non-carriers. However, ε4 carriers with MCI developed Alzheimer’s at the same rate as non-carriers. The finding turns prevailing thinking on its head: rather than the gene increasing the risk of developing Alzheimer’s, it appears that it increases the risk of MCI — and people with MCI are the main source of new Alzheimer’s diagnoses.

In this regard, it’s worth noting that the cognitive effects of this gene variant have been demonstrated in adults as young as the mid-20s.

The finding points to the benefit of genetic testing for assessing your likelihood of cognitive impairment rather than dementia — and using this knowledge to build habits that fight cognitive impairment.

http://www.futurity.org/health-medicine/genetic-test-fails-to-show-alzhe...

Brainerd, C. J., Reyna, V. F., Petersen, R. C., Smith, G. E., Kenney, A. E., Gross, C. J., et al. (2013). The apolipoprotein E genotype predicts longitudinal transitions to mild cognitive impairment but not to Alzheimer's dementia: Findings from a nationally representative study. Neuropsychology, 27(1), 86-94.

Data from the very large, long-running UK National Child Development Study has revealed that those who exercised at least four times weekly as both a child and an adult performed better on cognitive tests at age 50 than those who exercised two to three times per month or less, and the latter in turn performed better than those who hadn’t regularly exercised at all.

The data was collected through face-to-face interviews of more than 9,000 people at the ages of 11, 16, 33, 42, 46, and 50. Cognitive score was based on an immediate and delayed recall task (ten unrelated words), ability to name as many animals as possible in one minute, and time taken to cross out specified letters in a series.

The findings add a further perspective to the pile of evidence for the value of regular exercise in fighting age-related cognitive decline.

http://www.futurity.org/health-medicine/lifelong-exercise-may-keep-aging-mind-sharp/

http://www.guardian.co.uk/society/2013/mar/12/lifelong-exercise-brain-function-study

Dregan, A., & Gulliford, M. C. (2013). Leisure-time physical activity over the life course and cognitive functioning in late mid-adult years: a cohort-based investigation. Psychological Medicine, FirstView, 1-12.

A small study of “Super Agers” has found a key difference between them and typical older adults: an unusually large anterior cingulate (involved in attention), with four times as many von Economo neurons.

Scientific American article

Preliminary findings from a small study show that older adults (68-91), after learning to use Facebook, performed about 25% better on tasks designed to measure their ability to continuously monitor and to quickly add or delete the contents of their working memory (updating), compared to their baseline performance. Two other groups of 14 showed no change: one was taught to use a private online diary site (Penzu.com), while the control group was told they were on a wait-list for Facebook training.

Wohltmann, Janelle. 2013. Presented at the International Neuropsychological Society’s annual meeting in Hawaii.

Report on Futurity

Recent research has suggested that sleep problems might be a risk factor in developing Alzheimer’s, and in mild cognitive impairment. A new study adds to this gathering evidence by connecting reduced slow-wave sleep in older adults to brain atrophy and poorer learning.

The study involved 18 healthy young adults (mostly in their 20s) and 15 healthy older adults (mostly in their 70s). Participants learned 120 word–nonsense-word pairs and were tested for recognition before going to bed. Their brain activity was recorded while they slept. Brain activity was also measured in the morning, when they were tested again on the word pairs.

As has been found previously, older adults showed markedly less slow-wave activity (both over the whole brain and specifically in the prefrontal cortex) than the younger adults. Again, as in previous studies, the biggest difference between young and older adults in terms of gray matter volume was found in the medial prefrontal cortex (mPFC). Moreover, significant differences were also found in the insula and posterior cingulate cortex. These regions, like the mPFC, have also been associated with the generation of slow waves.

When mPFC volume was taken into account, age no longer significantly predicted the extent of the decline in slow-wave activity — in other words, the decline in slow-wave activity appears to be due to the brain atrophy in the medial prefrontal cortex. Atrophy in other regions of the brain (precuneus, hippocampus, temporal lobe) was not associated with the decline in slow-wave activity when age was considered.

Older adults did significantly worse on the delayed recognition test than young adults. Performance on the immediate test did not predict performance on the delayed test. Moreover, the highest performers on the immediate test among the older adults performed at the same level as the lowest young adult performers — nevertheless, these older adults did worse the following day.

Slow-wave activity during sleep was significantly associated with performance on the next day’s test. Moreover, when slow-wave activity was taken into account, neither age nor mPFC atrophy significantly predicted test performance.

In other words, age relates to shrinkage of the prefrontal cortex, this shrinkage relates to a decline in slow-wave activity during sleep, and this decline in slow-wave sleep relates to poorer cognitive performance.

The findings confirm the importance of slow-wave activity during sleep for memory consolidation.

All of this suggests that poorer sleep quality contributes significantly to age-related cognitive decline, and that efforts should be made to improve quality of sleep rather than just assuming lighter, more disturbed sleep is ‘natural’ in old age!

The issue of the effect of menopause on women’s cognition, and whether hormone therapy helps older women fight cognitive decline and dementia, has been a murky one. Increasing evidence suggests that the timing and type of therapy is critical. A new study makes clear that we also need to distinguish between women who experience early surgical menopause and those who experience natural menopause.

The study involved 1,837 women (aged 53-100), of whom 33% had undergone surgical menopause (removal of both ovaries before natural menopause). For these women, earlier age of the procedure was associated with a faster decline in semantic and episodic memory, as well as overall cognition. The results stayed the same after factors such as age, education and smoking were taken into consideration.

There was also a significant association between age at surgical menopause and the plaques characteristic of Alzheimer's disease. However, there was no significant association with Alzheimer’s itself.

On the positive side, hormone replacement therapy was found to help protect those who had surgical menopause, with duration of therapy linked to a significantly slower decline in overall cognition.

Also positively, age at natural menopause was not found to be associated with rate of cognitive decline.

Bove, R. et al. 2013. Early Surgical Menopause Is Associated with a Spectrum of Cognitive Decline. To be presented at the American Academy of Neurology's 65th Annual Meeting in San Diego, March 21, 2013.

I’ve written before about the gathering evidence that sensory impairment (visual impairment and hearing loss in particular) is a risk factor for age-related cognitive decline and dementia. Now a large long-running study provides more support for the association between hearing loss and age-related cognitive decline.

The study involved 1,984 older adults (aged 75-84) whose hearing and cognition was tested at the start of the study, with cognitive performance again assessed three, five, and six years later.

Those with hearing loss showed significantly faster cognitive decline than those with normal hearing — some 30-40% faster (41% on the MMSE; 32% on the Digit Symbol Substitution Test), with rate directly related to the amount of hearing loss.

On average, older adults with hearing loss developed significant cognitive impairment 3.2 years sooner than those with normal hearing — a very significant difference indeed.

It has been suggested that increasing social isolation and loneliness may underlie some, if not all, of this association. It may also be that difficulties in hearing force the brain to devote too much of its resources to processing sound, leaving less for cognition. A third possibility is that some common factor underlies both hearing loss and cognitive decline — however, the obvious risk factors, such as high blood pressure, diabetes and stroke, were taken account of in the analysis.

The findings emphasize the importance of getting help for hearing difficulties, rather than regarding them as ‘natural’ in old age.

Lin, F.R., Yaffe, K., Xia, J., et al. 2013. Hearing loss and cognitive decline in older adults. JAMA Internal Medicine, 1-7.

Providing some support for the finding I recently reported — that problems with semantic knowledge in those with mild cognitive impairment (MCI) and Alzheimer’s might be rooted in an inability to inhibit immediate perceptual information in favor of conceptual information — a small study has found that executive function (and inhibitory control in particular) is impaired in far more of those with MCI than was previously thought.

The study involved 40 patients with amnestic MCI (single or multiple domain) and 32 healthy older adults. Executive function was tested across multiple sub-domains: divided attention, working memory, inhibitory control, verbal fluency, and planning.

As a group, those with MCI performed significantly more poorly in all 5 sub-domains. All MCI patients showed significant impairment in at least one sub-domain of executive functioning, with almost half performing poorly on all of the tests. The sub-domain most frequently and severely impaired was inhibitory control.

The finding is in sharp contrast with standard screening tests and clinical interviews, which have estimated executive function impairment in only 15% of those with MCI.

Executive function is crucial for many aspects of our behavior, from planning and organization to self-control to (as we saw in the previous news report) basic knowledge. It is increasingly believed that inhibitory control might be a principal cause of age-related cognitive decline, through its effect on working memory.

All this adds weight to the idea that we should be focusing our attention on ways to improve inhibitory control when it declines. Although training to improve working memory capacity has not been very successful, specific training targeted at inhibitory control might have more luck. Something to hope for!

Previous research has pointed to an association between not having teeth and a higher risk of cognitive decline and dementia. One reason might have to do with inflammation — inflammation is a well-established risk factor, and at least one study has linked gum disease to a higher dementia risk. Or it might have to do with the simple mechanical act of chewing: less chewing means less blood flow to the brain. A new study has directly investigated chewing ability in older adults.

The Swedish study, involving 557 older adults (77+), found that those with multiple tooth loss, and those who had difficulty chewing hard food such as apples, had a significantly higher risk of developing cognitive impairments (cognitive status was measured using the MMSE). However, when adjusted for sex, age, and education, tooth loss was no longer significant, but chewing difficulties remained significant.

In other words, what had caused the tooth loss didn’t matter. The important thing was to maintain chewing ability, whether with your own natural teeth or dentures.

This idea that the physical act of chewing might affect your cognitive function (on a regular basis; I don’t think anyone is suggesting that you’re brighter when you chew!) is an intriguing and unexpected one. It does, however, give even more emphasis to the importance of physical exercise, which is a much better way of increasing blood flow to the brain.

The finding also reminds us that there are many things going on in the brain that may deteriorate with age and thus lead to cognitive decline and even dementia.

Problems with myelin — demyelination (seen most dramatically in MS, but also in other forms of neurodegeneration, including normal aging and depression); failure to develop sufficient myelin (in children and adolescents) — are increasingly being implicated in a wide range of disorders. A new animal study adds to that evidence by showing that social isolation brings about both depression and loss of myelin.

In the study, adult mice were isolated for eight weeks (which is of course longer for a mouse than it is for us) to induce a depressive-like state. They were then introduced to a mouse they hadn’t seen before. Although typically very social animals, those who had been socially isolated didn’t show any interest in interacting with the new mouse — a common pattern in human behavior as well.

Analysis of their brains revealed significantly lower levels of gene transcription in oligodendrocytes (the cells that produce myelin) in the prefrontal cortex. This appeared to be caused by lower production of heterochromatin (tightly packed DNA) in the cell nuclei, resulting in fewer mature oligodendrocytes.

Interestingly, even short periods of isolation were sufficient to produce changes in chromatin and myelin, although behavior wasn’t affected.

Happily, however, regardless of length of isolation, myelin production went back to normal after a period of social integration.

The findings add to the evidence that environmental factors can have significant effects on brain development and function, and support the idea that socializing is good for the brain.

Some time ago, I reported on a study showing that older adults could improve their memory for a future task (remembering to regularly test their blood sugar) by picturing themselves going through the process. Imagination has been shown to be a useful strategy in improving memory (and also motor skills). A new study extends and confirms previous findings by testing free recall and comparing self-imagination to more traditional strategies.

The study involved 15 patients with acquired brain injury who had impaired memory and 15 healthy controls. Participants memorized five lists of 24 adjectives that described personality traits, using a different strategy for each list. The five strategies were:

  • think of a word that rhymes with the trait (baseline),
  • think of a definition for the trait (semantic elaboration),
  • think about how the trait describes you (semantic self-referential processing),
  • think of a time when you acted out the trait (episodic self-referential processing), or
  • imagine acting out the trait (self-imagining).

For both groups, self-imagination produced the highest rates of free recall of the list (an average of 9.3 for the memory-impaired, compared to 3.2 using the baseline strategy; 8.1 vs 3.2 for the controls — note that the controls were given all 24 items in one list, while the memory-impaired were given 4 lists of 6 items).

Additionally, those with impaired memory did better using semantic self-referential processing than episodic self-referential processing (7.3 vs 5.7). In contrast, the controls did much the same in both conditions. This adds to the evidence that patients with brain injury often have a particular problem with episodic memory (knowledge about specific events). Episodic memory is also particularly affected in Alzheimer’s, as well as in normal aging and depression.

It’s also worth noting that all the strategies that involved the self were more effective than the two strategies that didn’t, for both groups (also, semantic elaboration was better than the baseline strategy).

The researchers suggest self-imagination (and semantic self-referential processing) might be of particular benefit for memory-impaired patients, by encouraging them to use information they can more easily access (information about their own personality traits, identity roles, and lifetime periods — what is termed the self-schema), and that future research should explore ways in which self-imagination could be used to support everyday memory tasks, such as learning new skills and remembering recent events.

A study using data from the Lothian Birth Cohort (people born in Scotland in 1936) has analyzed brain scans of 638 participants when they were 73 years old. Comparing this data with participants’ earlier reports of their exercise and leisure activities at age 70, it was found that those who reported higher levels of regular physical activity showed significantly less brain atrophy than those who did minimal exercise. Participation in social and mentally stimulating activities, on the other hand, wasn’t associated with differences in brain atrophy.

Regular physical exercise was also associated with fewer white matter lesions. While leisure activity was also associated with healthier white matter, this was not significant after factors such as age, social class, and health status were taken into account.

Unfortunately, this study is reported in a journal to which I don’t have access. I would love to have more details about the leisure activities data and the brain scans. However, although the failure to find a positive effect of stimulating activities is disappointing, it’s worth noting another recent study, which produced two relevant findings. First, men with high levels of cognitive activity showed a significant reduction in white matter lesions, while women did not. Women with high levels of cognitive activity, on the other hand, showed less overall brain atrophy — but men did not.

Secondly, both genders showed less atrophy in a particular region of the prefrontal cortex, but there was no effect on the hippocampus — the natural place to look for effects (and the region where physical exercise is known to have positive effects).

In other words, the positive effects of cognitive activity on the brain might be quite different from the positive effects of physical exercise.

The findings do, of course, add to the now-compelling evidence for the benefits of regular physical activity in fighting cognitive decline.

It’s good news, then, that a small study has found that even frail seniors can derive significant benefits from exercise.

The study involved 83 older adults (61-89), some of whom were considered frail. Forty-three took part in group exercises (3 times a week for 12 weeks), while 40 were wait-listed controls. Participants were assessed for physical capacity, quality of life and cognitive health a week before the program began, and at the end.

Those who took part in the exercise program significantly improved their physical capacity, cognitive performance, and quality of life. These benefits were equivalent among frail and non-frail participants.

Frailty is associated with a higher risk of falls, hospitalizations, cognitive decline and psychological distress, and, of course, increases with age. In the U.S., it’s estimated that 7% of seniors aged 65 to 74, 18% of those aged 75 to 84, and 37% of seniors over the age of 85 are frail.

Caffeine has been associated with a lower risk of developing Alzheimer's disease in some recent studies. A recent human study suggested that the reason lies in its effect on proteins involved in inflammation. A new mouse study provides more support for this idea.

In the study, two groups of mice, one of which had been given caffeine, were exposed to hypoxia, simulating what happens in the brain during an interruption of breathing or blood flow. When re-oxygenated, caffeine-treated mice recovered their ability to form a new memory 33% faster than the other mice, and the caffeine was observed to have the same anti-inflammatory effect as blocking interleukin-1 (IL-1) signaling.

Inflammation is a key player in cognitive impairment, and IL-1 has been shown to play a critical role in the inflammation associated with many neurodegenerative diseases.

It was found that the hypoxic episode triggered the release of adenosine, the main component of ATP (your neurons’ fuel). Adenosine is released when a cell is damaged, and this leakage into the environment outside the cell begins a cascade that leads to inflammation (the adenosine activates an enzyme, caspase-1, which triggers production of the cytokine IL-1β).

But caffeine blocks adenosine receptors, stopping the cascade before it starts.

The finding gives support to the idea that caffeine may help prevent cognitive decline and impairment.

Green tea is thought to have wide-ranging health benefits, especially in the prevention of cardiovascular disease, inflammatory diseases, and diabetes. These are all implicated in the development of age-related cognitive impairment, so it’s no surprise that regular drinking of green tea has been suggested as one way to help protect against age-related cognitive decline and dementia. A new mouse study adds to that evidence by showing how a particular compound in green tea promotes neurogenesis.

The chemical EGCG (epigallocatechin-3-gallate) is a known antioxidant, but this study shows that it also has a specific benefit in increasing the production of neural progenitor cells. Like stem cells, these progenitor cells can become different types of cell.

Mice treated with EGCG displayed better object recognition and spatial memory than control mice, and this improved performance was associated with the number of progenitor cells in the dentate gyrus and increased activity in the sonic hedgehog signaling pathway (confirming the importance of this pathway in adult neurogenesis in the hippocampus).

The findings add to evidence that green tea may help protect against cognitive impairment and dementia.

A small Swedish brain imaging study adds to the evidence for the cognitive benefits of learning a new language by investigating the brain changes in students undergoing a highly intensive language course.

The study involved an unusual group: conscripts in the Swedish Armed Forces Interpreter Academy. These young people, selected for their talent for languages, undergo an intensive course to allow them to learn a completely novel language (Egyptian Arabic, Russian or Dari) fluently within ten months. This requires them to acquire new vocabulary at a rate of 300-500 words every week.

Brain scans were taken of 14 right-handed volunteers from this group (6 women; 8 men), and 17 controls who were matched for age, years of education, intelligence, and emotional stability. The controls were medical and cognitive science students. The scans were taken before the start of the course/semester, and three months later.

The brain scans revealed that the language students showed significantly greater changes in several specific regions. These regions included three areas in the left hemisphere: the dorsal middle frontal gyrus, the inferior frontal gyrus, and the superior temporal gyrus. These regions all grew significantly. There was also some, more selective and smaller, growth in the middle frontal gyrus and inferior frontal gyrus in the right hemisphere. The hippocampus also grew significantly more for the interpreters compared to the controls, and this effect was greater in the right hippocampus.

Among the interpreters, language proficiency was related to increases in the right hippocampus and left superior temporal gyrus. Increases in the left middle frontal gyrus were related to teacher ratings of effort — those who put in the greatest effort (regardless of result) showed the greatest increase in this area.

In other words, both learning, and the effort put into learning, had different effects on brain development.

The main point, however, is that language learning in particular is having this effect. Bear in mind that the medical and cognitive science students were also presumably putting similar levels of effort into their studies, and yet no such significant brain growth was observed.

Of course, there is no denying that the level of intensity with which the interpreters are acquiring a new language is extremely unusual, and it cannot be ruled out that it is this intensity, rather than the particular subject matter, that is crucial for this brain growth.

Neither can it be ruled out that the differences between the groups are rooted in the individuals selected for the interpreter group. The young people chosen for the intensive training at the interpreter academy were chosen on the basis of their talent for languages. Although brain scans showed no differences between the groups at baseline, we cannot rule out the possibility that such intensive training only benefited them because they possessed this potential for growth.

A final caveat is that the soldiers all underwent basic military training before beginning the course — three months of intense physical exercise. Physical exercise is, of course, usually very beneficial for the brain.

Nevertheless, we must give due weight to the fact that the brain scans of the two groups were comparable at baseline, and the changes discussed occurred specifically during this three-month learning period. Moreover, there is growing evidence that learning a new language is indeed ‘special’, if only because it involves such a complex network of processes and brain regions.

Given that people vary in their ‘talent’ for foreign language learning, and that learning a new language does tend to become harder as we get older, it is worth noting the link between growth of the hippocampus and superior temporal gyrus and language proficiency. The STG is involved in acoustic-phonetic processes, while the hippocampus is presumably vital for the encoding of new words into long-term memory.

Interestingly, previous research with children has suggested that the ability to learn new words is greatly affected by working memory span — specifically, by how much information they can hold in that part of working memory called phonological short-term memory. While this is less important for adults learning another language, it remains important for one particular category of new words: words that have no ready association to known words. Given the languages being studied by these Swedish interpreters, it seems likely that much if not all of their new vocabulary would fall into this category.

I wonder if the link with STG is more significant in this study, because the languages are so different from the students’ native language? I also wonder if, and to what extent, you might be able to improve your phonological short-term memory with this sort of intensive practice.

In this regard, it’s worth noting that a previous study found that language proficiency correlated with growth in the left inferior frontal gyrus in a group of English-speaking exchange students learning German in Switzerland. Is this difference because the training was less intensive? because the students had prior knowledge of German? because German and English are closely related in vocabulary? (I’m picking the last.)

The researchers point out that hippocampal plasticity might also be a critical factor in determining an individual’s facility for learning a new language. Such plasticity does, of course, tend to erode with age — but this can be largely counteracted if you keep your hippocampus limber (as it were).

All these are interesting speculations, but the main point is clear: the findings add to the growing evidence that bilingualism and foreign language learning have particular benefits for the brain, and for protecting against cognitive decline.

I’ve reported before on the growing evidence that metabolic syndrome in middle and old age is linked to greater risk of cognitive impairment in old age and faster decline. A new study shows at least part of the reason.

The study involved 71 middle-aged people recruited from the Wisconsin Registry for Alzheimer's Prevention (WRAP), of whom 29 met the criteria for metabolic syndrome (multiple cardiovascular and diabetes risk factors including abdominal obesity, high blood pressure, high blood sugar and high cholesterol).

Those with metabolic syndrome averaged 15% less blood flow to the brain than those without the syndrome.

One tried and true method of increasing blood flow to the brain is of course through exercise.

The study was presented at the Alzheimer's Association International Conference in Vancouver, Canada by Barbara Bendlin.

Here’s an exciting little study, implying as it does that one particular aspect of information processing underlies much of the cognitive decline in older adults, and that this can be improved through training. No, it’s not our usual suspect, working memory, it’s something far less obvious: temporal processing.

In the study, 30 older adults (aged 65-75) were randomly assigned to three groups: one that received ‘temporal training’, one that practiced common computer games (such as Solitaire and Mahjong), and a no-activity control. Temporal training was provided by a trademarked program called Fast ForWord Language® (FFW), which was developed to help children who have trouble reading, writing, and learning.

The training, for both training groups, occupied an hour a day, four days a week, for eight weeks.

Cognitive assessment, carried out at the beginning and end of the study, and for the temporal training group again 18 months later, included tests of sequencing abilities (how quickly two sounds could be presented and still be accurately assessed for pitch or direction), attention (vigilance, divided attention, and alertness), and short-term memory (working memory span, pattern recognition, and pattern matching).

Only in the temporal training group did performance on any of the cognitive tests significantly improve after training — on the sequencing tests, divided attention, matching complex patterns, and working memory span. These positive effects still remained after 18 months (vigilance was also higher at the end of training, but this improvement wasn’t maintained).

This is, of course, only a small pilot study. I hope we will see a larger study, and one that compares this form of training against other computer training programs. It would also be good to see some broader cognitive tests — ones that are less connected to the temporal training. But I imagine that, as I’ve discussed before, an effective training program will include more than one type of training. This may well be an important component of such a program.

Szelag, E., & Skolimowska, J. 2012. Cognitive function in elderly can be ameliorated by training in temporal information processing. Restorative Neurology and Neuroscience, 30(5), 419-434.

HIV-associated dementia occurs in around 30% of untreated HIV-positive patients. Surprisingly, it is also occasionally found in some patients (2-3%) who are being successfully treated for HIV (and show no signs of AIDS).

A new study may have the answer to this mystery, and suggest a solution. Moreover, the answer may have general implications for those experiencing cognitive decline in old age.

The study found that HIV, although it doesn’t directly infect neurons, tries to stop the production of BDNF, a protein long known to be crucial for memory and learning. The reduced production of mature BDNF results in axons and dendrites shortening, meaning connections between neurons are lost. That, in turn, brings about the death of some neurons.

It seems that the virus interferes with the normal maturation of BDNF, whereby one form, called proBDNF, is cut by certain enzymes into a new form called mature BDNF. It is in this mature form that BDNF has its beneficial effect on neuron growth. Unfortunately, in its earlier form it is toxic to neurons.

This imbalance in the proportions of mature BDNF and proBDNF also appears to occur as we age, and in depression. It may also be a risk factor in Parkinson's and Huntington's diseases.

However, these findings suggest a new therapeutic approach.

Compounds in green tea and chocolate may help protect brain cells

In this context, it is interesting to note another new study, which analyzed the effects of 2,000 compounds, both natural and synthetic, on brain cells. Of the 256 that appeared to have protective effects, nine were related to epicatechin, which is found in cocoa and green tea leaves.

While we’ve been aware for some time of these positive qualities, the study specifically identified epicatechin and epigallocatechin gallate (EGCG) as being the most effective at helping protect neurons by inducing production of BDNF.

One of the big advantages these compounds have is in their ability to cross the blood-brain barrier, making them a good candidate for therapy.

While green tea, dark chocolate, and cocoa are particularly good sources, many fruits also have good levels: in particular, black grapes, blackberries, apples, cherries, pears, and raspberries. (See this University of California, Davis document (pdf) for more detail.)

My recent reports on brain training for older adults (see, e.g., Review of working memory training programs finds no broader benefit; Cognitive training shown to help healthy older adults; Video game training benefits cognition in some older adults) converge on the idea that cognitive training can indeed be beneficial for older adults’ cognition, but there’s little wider transfer beyond the skills being practiced. That in itself can be valuable, but it does reinforce the idea that the best cognitive training covers a number of different domains or skill-sets. A new study adds little to this evidence, but does perhaps emphasize the importance of persistence and regularity in training.

The study involved 59 older adults (average age 84), of whom 33 used a brain fitness program 5 days a week for 30 minutes a day for at least 8 weeks, while the other group of 26 were put on a waiting list for the program. After two months, both groups were given access to the program, and both were encouraged to use it as much or as little as they wanted. Cognitive testing occurred before the program started, at two months, and at six months.

The first group used the program for an average of 80 sessions, compared to an average of 44 sessions for the wait-list group.

The higher use group showed significantly higher cognitive scores (delayed memory test; Boston Naming test) at both two and six months, while the lower (and later) use group showed improvement at the end of the six month period, but not as much as the higher use group.

I’m afraid I don’t have any more details (some details of the training program would be nice) because it was a conference presentation, so I only have access to the press release and the abstract. Because we don’t know exactly what the training entailed, we don’t know the extent to which it practiced the same skills that were tested. But we may at least add it to the evidence that you can improve cognitive skills by regular training, and that the length/amount of training (and perhaps regularity, since the average number of sessions for the wait-list group implies an average engagement of some three times a week, while the high-use group seem to have maintained their five-times-a-week habit) matters.

Another interesting presentation at the conference was an investigation into mental stimulating activities and brain activity in older adults.

In this study, 151 older adults (average age 82) from the Rush Memory and Aging Project answered questions about present and past cognitive activities, before undergoing brain scans. The questions concerned how frequently they engaged in mentally stimulating activities (such as reading books, writing letters, visiting a library, playing games) and the availability of cognitive resources (such as books, dictionaries, encyclopedias) in their home, during their lifetime (specifically, at ages 6, 12, 18, 40, and now).

Higher levels of cognitive activity and cognitive resources were also associated with better cognitive performance. Moreover, after controlling for education and total brain size, it was found that frequent cognitive activity in late life was associated with greater functional connectivity between the posterior cingulate cortex and several other regions (right orbital and middle frontal gyrus, left inferior frontal gyrus, hippocampus, right cerebellum, left inferior parietal cortex). More cognitive resources throughout life were associated with greater functional connectivity between the posterior cingulate cortex and several other regions (left superior occipital gyrus, left precuneus, left cuneus, right anterior cingulate, right middle frontal gyrus, and left inferior frontal gyrus).

Previous research has implicated a decline in connectivity with the posterior cingulate cortex in mild cognitive impairment and Alzheimer’s disease.

Cognitive activity earlier in life was not associated with differences in connectivity.

The findings provide further support for the idea “Use it or lose it!”, and suggest that mental activity protects against cognitive decline by maintaining functional connectivity in important neural networks.

Miller, K.J. et al. 2012. Memory Improves With Extended Use of Computerized Brain Fitness Program Among Older Adults. Presented August 3 at the 2012 convention of the American Psychological Association.

Han, S.D. et al. 2012. Cognitive Activity and Resources Are Associated With PCC Functional Connectivity in Older Adults. Presented August 3 at the 2012 convention of the American Psychological Association.

Back in 2009, I reported briefly on a large Norwegian study that found that older adults who consumed chocolate, wine, and tea performed significantly better on cognitive tests. The association was assumed to be linked to the flavanols in these products. A new study confirms this finding, and extends it to older adults with mild cognitive impairment.

The study involved 90 older adults with MCI, who daily consumed a dairy-based cocoa drink containing either 990 mg, 520 mg, or 45 mg of flavanols, for eight weeks. Their diet was restricted to eliminate other sources of flavanols (such as tea, red wine, apples and grapes).

Cognitive assessment at the end of this period revealed that, although scores on the MMSE were similar across all groups, those consuming higher levels of flavanol cocoa took significantly less time to complete Trail Making Tests A and B, and scored significantly higher on the verbal fluency test. Insulin resistance and blood pressure were also lower.

Those with the highest levels of flavanols did better than those on intermediate levels on the cognitive tests. Both did better than those on the lowest levels.

Changes in insulin resistance explained part, but not all, of the cognitive improvement.

One caveat: the group were generally in good health without known cardiovascular disease — thus, not completely representative of all those with MCI.

A Chinese study comparing the cognitive effects of tai chi, walking, and social interaction in older adults provides an interesting twist on the exercise story.

The study involved 120 healthy older adults (60-79) from Shanghai, who were randomly assigned to one of four groups: one that participated in three sessions of tai chi every week for 40 weeks; another that instead had ‘social interaction’ sessions (‘lively discussions’); another in which participants engaged in walking around a track; and a non-intervention group included as a control. Brain scans were taken before and after the 40-week intervention, and cognitive testing took place at 20 weeks as well as these times.

Compared to those who received no intervention, both those who participated in tai chi and those who participated in the social sessions showed significant increases in brain volume and on some cognitive measures. However, the tai chi group improved on more cognitive tests than the social group (the Mattis Dementia Rating Scale, the Trailmaking Tests, delayed recognition on the Auditory Verbal Learning Test, and verbal fluency for animals; the social group improved only on verbal fluency, with positive trends on Trails A and the Auditory test).

Surprisingly, there were no such significant effects from the walking intervention, which involved 30 minutes of brisk walking around a 400m circular track, sandwiched by 10 minutes of warm-up and 10 minutes cool-down exercises. This took place in the same park as the tai chi sessions (which similarly included 20 minutes of warm-up exercises, 20 minutes of tai chi, and 10 minutes of cool-down exercises).

This finding is inconsistent with other research, but the answer seems to lie in individual differences — specifically, speed of walking. Faster walkers showed significantly better performance on the Stroop test, and on delayed recall and recognition on the Auditory Verbal Learning Test. It should be noted that, unlike some studies in which participants were encouraged to reach heart-rate targets, participants in this study were simply told to walk at their own speed. This finding, then, would seem to support the view that brisk walking is needed to reap good health and cognitive benefits (which shouldn’t put anyone off — anything is better than nothing! and speed is likely to come with practice, if that’s your aim).

It should also be noted that this population has generally high rates of walking. It is likely, then, that the additional walking in these sessions did not add a great deal to their existing behavior.

There is a caveat to the strongly positive effects of tai chi: this group showed lower cognitive performance at baseline. This was because the group randomly received more individuals with very low scores (8 compared with 5 in the other groups).

The study is, of course, quite a small one, and a larger study is required to confirm these results.

One final note: the relative differences in enjoyment were not explicitly investigated, but the researchers did note that the social group, who initially were given topics to discuss in their hour-long sessions, then decided to select and organize their own discussions, and have continued to do so for two years following the end of the study. It would have been nice if the researchers had re-tested participants at that point.

Mortimer, J.A. et al. 2012. Changes in Brain Volume and Cognition in a Randomized Trial of Exercise and Social Interaction in a Community-Based Sample of Non-Demented Chinese Elders. Journal of Alzheimer's Disease, 30 (4), 757-766.
Full text available at http://health.usf.edu/nocms/publicaffairs/now/pdfs/JAD_Mortimer_30%28201...

I often talk about the importance of attitudes and beliefs for memory and cognition. A new honey bee study provides support for this in relation to the effects of aging on the brain, and suggests that this principle extends across the animal kingdom.

Previous research has shown that bees that stay in the nest and take care of the young remain mentally competent, but they don’t nurse for ever. When they’re older (after about 2-3 weeks), they become foragers, and foraging bees age very quickly — both physically and mentally. On the face of it, then, bees ‘retire’ to foraging, and their old age is brief (they begin to show cognitive decline after just two weeks).

But it’s not as simple as that, because in artificial hives where worker bees are all the same age, nurse bees of the same age as foragers don’t show the same cognitive and sensory decline. Moreover, nurse bees have been found to maintain their cognitive abilities for more than 100 days, while foragers die within 18 days and show cognitive declines after 13-15 days (although their ability to assess sweetness remains intact).

The researchers accordingly asked a very interesting question: what happens if the foragers return to babysitting?

To achieve this, they removed all of the younger nurse bees from the nest, leaving only the queen and babies. When the older, foraging bees returned to the nest, activity slowed down for several days, and then they re-organized themselves: some of the old bees returned to foraging; others took on the babysitting and housekeeping duties (cleaning, building the comb, and tending to the queen). After 10 days, around half of these latter bees had significantly improved their ability to learn new things.

This cognitive improvement was also associated with a change in two specific proteins in their brains: one (Prx6) that has been linked to protection against the oxidative stress and inflammation associated with Alzheimer's disease and Huntington's disease in humans, and another dubbed a “chaperone” protein because it protects other proteins from damage when brain or other tissues are exposed to cell-level stress.

Precisely what it is about returning to the hive that produces this effect is a matter of speculation, but this finding does show that learning impairment in old bees can be reversed by changes in behavior, and this reversal is correlated with specific changes in brain protein.

Having said this, it shouldn’t be overlooked that only some of the worker bees showed this brain plasticity. This is not, apparently, due to differences in genotype, but may depend on the amount of foraging experience.

The findings add weight to the idea that social interventions can help our brains stay younger, and are consistent with growing evidence that, in humans, social engagement helps protect against dementia and age-related cognitive impairment.

The (probably) experience-dependent individual differences shown by the bees is perhaps mirrored in our idea of cognitive reserve, but with a twist. The concept of cognitive reserve emphasizes that accumulating a wealth of cognitive experience (whether through education or occupation or other activities) protects your brain from the damage that might occur with age. But perhaps (and I’m speculating now) we should also consider the other side of this: repeated engagement in routine or undemanding activities may have a deleterious effect, independent of and additional to the absence of more stimulating activities.

The latest finding from the large, long-running Health, Aging, and Body Composition (Health ABC) Study adds to the evidence that preventing or controlling diabetes helps prevent age-related cognitive decline.

The study involved 3,069 older adults (70+), of whom 717 (23%) had diabetes at the beginning of the study in 1997. Over the course of the study, a further 159 developed diabetes. Those with diabetes at the beginning had lower cognitive scores and showed faster decline. Those who developed diabetes during the study showed a rate of decline between that faster rate and the slower rate of those who never developed diabetes.

Among those with diabetes, those who had higher levels of a blood marker called glycosylated hemoglobin had greater cognitive impairment. Higher levels of this blood marker reflect poorer control of blood sugar.

In other words, both duration and severity of diabetes are important factors in determining rate of cognitive decline in old age.

A review of three high-quality trials comparing the putative benefits of omega-3 fatty acids for preventing age-related cognitive decline has concluded that there is no evidence that taking fish oil supplements helps fight cognitive decline. The trials involved a total of 3,536 healthy older adults (60+). In two studies, participants were randomly assigned to receive gel capsules containing omega-3 PUFA or olive or sunflower oil for six or 24 months. In the third study, participants were randomly assigned to receive tubs of margarine spread for 40 months (regular margarine versus margarine fortified with omega-3 PUFA).

The researchers found no benefit from taking the omega-3 capsules or margarine spread compared to placebo capsules or margarines (sunflower oil, olive oil or regular margarine). Participants given omega-3 did not score better on the MMSE or on other tests of cognitive function such as verbal learning, digit span and verbal fluency.

The researchers nevertheless stress that longer term studies are needed, given that there was very little deterioration in cognitive function in any of the groups.

While the ‘Alzheimer’s gene’ is relatively common — the ApoE4 mutation is present in around 15% of the population — having two copies of the mutation is, thankfully, much rarer, at around 2%. Having two copies is of course a major risk factor for developing Alzheimer’s, and it has been thought that having a single copy is also a significant (though lesser) risk factor. Certainly there is quite a lot of evidence linking ApoE4 carriers to various markers of cognitive impairment.

And yet, the evidence has not been entirely consistent. I have been puzzled by this myself, and now a new finding suggests a reason. It appears there are gender differences in responses to this gene variant.

The study involved 131 healthy older adults (median age 70), whose brains were scanned. The scans revealed that in older women with the E4 variant, brain activity showed the loss of synchronization that is typically seen in Alzheimer’s patients, with the precuneus (a major hub in the default mode network) out of sync with other brain regions. This was not observed in male carriers.

The finding was confirmed by a separate set of data, taken from the Alzheimer's Disease Neuroimaging Initiative database. Cerebrospinal fluid from 91 older adults (average age 75) revealed that female carriers had substantially higher levels of tau protein (a key Alzheimer’s biomarker) than male carriers or non-carriers.

It’s worth emphasizing that the participants in the first study were all cognitively normal — the loss of synchronization was starting to happen before visible Alzheimer’s symptoms appeared.

The findings suggest that men have less to worry about than women, as far as the presence of this gene is concerned. The study may also explain why more women than men get the disease (3 women to 2 men); it is not (although of course this is a factor) simply a consequence of women tending to live longer.

Whether or not these gender differences extend to carriers of two copies of the gene is another story.

A study designed to compare the relative benefits of exercise and diet control on Alzheimer’s pathology and cognitive performance has revealed that while both are beneficial, exercise is of greater benefit in reducing Alzheimer’s pathology and cognitive impairment.

The study involved mice genetically engineered with a mutation in the APP gene (a familial risk factor for Alzheimer’s), who were given either a standard diet or a high-fat diet (60% fat, 20% carbohydrate, 20% protein vs 10% fat, 70% carbohydrate, 20% protein) for 20 weeks (from 2-3 to 7-8 months of age). Some of the mice on the high-fat diet spent the second half of that 20 weeks in an environmentally enriched cage (more than twice as large as the standard cage, and supplied with a running wheel and other objects). Others on the high-fat diet were put back on a standard diet in the second 10 weeks. Yet another group were put on a standard diet and given an enriched cage in the second 10 weeks.

Unsurprisingly, those on the high-fat diet gained significantly more weight than those on the standard diet, and exercise reduced that gain — but not as much as diet control (i.e., returning to a standard diet) did. Interestingly, this was not the result of changes in food intake, which either stayed the same or slightly increased.

More importantly, exercise and diet control were roughly equal in reversing glucose intolerance, but exercise was more effective than diet control in ameliorating cognitive impairment. Similarly, while amyloid-beta pathology was significantly reduced in both exercise and diet-control conditions, exercise produced the greater reduction in amyloid-beta deposits and level of amyloid-beta oligomers.

It seems that diet control improves metabolic disorders induced by a high-fat diet — conditions such as obesity, hyperinsulinemia and hypercholesterolemia — which affect the production of amyloid-beta. However, exercise is more effective in tackling brain pathology directly implicated in dementia and cognitive decline, because it strengthens the activity of an enzyme that decreases the level of amyloid-beta.

Interestingly, and somewhat surprisingly, the combination of exercise and diet control did not have a significantly better effect than exercise alone.

The finding adds to the growing pile of evidence for the value of exercise in maintaining a healthy brain in later life, and helps explain why. Of course, as I’ve discussed on several occasions, we already know other mechanisms by which exercise improves cognition, such as boosting neurogenesis.

I have said before that there is little evidence that working memory training has any wider benefits than to the skills being practiced. Occasionally a study arises that gets everyone all excited, but by and large training only benefits the skill being practiced — despite the fact that working memory underlies so many cognitive tasks, and limited working memory capacity is thought to negatively affect performance on so many tasks. However, one area that does seem to have had some success is working memory training for those with ADHD, and researchers have certainly not given up hope of finding evidence for wider transfer among other groups (such as older adults).

A recent review of the research to date has, sadly, concluded that the benefits of working memory training programs are limited. But this is not to say there are no benefits.

For a start, the meta-analysis (analyzing data across studies) found that working memory training produced large immediate benefits for verbal working memory. These benefits were greatest for children below the age of 10.

These benefits, however, were not maintained long-term (at an average of 9 months after training, there were no significant benefits) — although benefits were found in one study in which the verbal working memory task was very similar to the training task (indicating that the specific skill practiced did maintain some improvement long-term).

Visuospatial working memory also showed immediate benefits, and these did not vary across age groups. One factor that did make a difference was type of training: the CogMed training program produced greater improvement than the researcher-developed programs (the studies included 7 that used CogMed, 2 that used Jungle Memory, 2 Cognifit, 4 n-back, 1 Memory Booster, and 7 researcher-developed programs).

Interestingly, visuospatial working memory did show some long-term benefits, although it should be noted that the average follow-up was distinctly shorter than that for verbal working memory tasks (an average of 5 months post-training).
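
Incidentally, for those unfamiliar with what “analyzing data across studies” involves: a meta-analysis essentially pools each study’s effect size using inverse-variance weights, so larger, more precise studies count for more. A toy sketch in Python, with invented numbers (not the review’s actual data):

    import numpy as np

    # Hypothetical standardized mean differences (training vs control) and
    # their sampling variances, one pair per study.
    effects = np.array([0.9, 0.6, 1.1, 0.4])
    variances = np.array([0.04, 0.09, 0.02, 0.12])

    weights = 1.0 / variances                      # inverse-variance (fixed-effect) weights
    pooled = (weights * effects).sum() / weights.sum()
    se = np.sqrt(1.0 / weights.sum())

    print(f"pooled d = {pooled:.2f}, 95% CI +/- {1.96 * se:.2f}")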

The burning question, of course, is how well this training transferred to dissimilar tasks. Here the evidence seems sadly clear — those using untreated control groups tended to find such transfer; those using treated control groups never did. Similarly, nonrandomized studies tended to find far transfer, but randomized studies did not.

In other words, when studies were properly designed (randomized trials with a control group that is given alternative treatment rather than no treatment), there was no evidence of transfer effects from working memory training to nonverbal ability. Moreover, even when found, these effects were only present immediately and not on follow-up.
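
The reasoning behind that pattern is easy to demonstrate with a toy simulation (Python, invented numbers). If everyone gains a few points simply from taking a test twice, and doing any program at all adds an expectancy gain, then a trained group will beat an untreated control even when training transfers nothing; an active (treated) control cancels those gains out:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100

    RETEST_GAIN = 3.0      # everyone improves simply from taking the test twice
    EXPECTANCY_GAIN = 2.0  # doing *any* program adds motivation/placebo gains
    TRANSFER_GAIN = 0.0    # assume the training itself transfers nothing

    def post(pre, active, trained):
        gain = RETEST_GAIN + active * EXPECTANCY_GAIN + trained * TRANSFER_GAIN
        return pre + gain + rng.normal(0, 2, pre.size)

    pre = rng.normal(100, 10, n)
    trained_grp = post(pre, active=True, trained=True)
    untreated_ctl = post(pre, active=False, trained=False)
    treated_ctl = post(pre, active=True, trained=False)  # alternative activity

    # Against an untreated control, training appears to produce ~2 points of
    # "far transfer" -- an expectancy artifact, since TRANSFER_GAIN is zero.
    print((trained_grp - pre).mean() - (untreated_ctl - pre).mean())

    # Against a treated (active) control, the spurious effect disappears (~0).
    print((trained_grp - pre).mean() - (treated_ctl - pre).mean())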

Neither was there any evidence of transfer effects, either immediate or delayed, on verbal ability, word reading, or arithmetic. There was a small to moderate effect of training on attention (as measured by the Stroop test), but this only occurred immediately, and not on follow-up.

It seems clear from this review that there are few good, methodologically sound studies on this subject. But three very important caveats should be noted in connection with the researchers’ dispiriting conclusion.

First of all, because this is an analysis across all data, important differences between groups or individuals may be concealed. This is a common criticism of meta-analysis, and the researchers do try to answer it. Nevertheless, I think it is still a very real issue, especially in light of evidence that the benefit of training may depend on whether the challenge of the training is at the right level for the individual.

On the other hand, another recent study, comparing young adults who received 20 sessions of training on a dual n-back task or a visual search program with those who received no training at all, did look for an individual-differences effect, and failed to find it. Participants were tested repeatedly on their fluid intelligence, multitasking ability, working memory capacity, crystallized intelligence, and perceptual speed. Although those taking part in the training programs improved their performance on the tasks they practiced, there was no transfer to any of the cognitive measures. When participants were analyzed separately on the basis of their improvement during training, there was still no evidence of transfer to broader cognitive abilities.

The second important challenge comes from the lack of skill consolidation — having a short training program followed by months of not practicing the skill is not something any of us would expect to produce long-term benefits.

The third point concerns a recent finding that multi-domain cognitive training produces longer-lasting benefits than single-domain training (the same study also showed the benefit of booster training). It seems quite likely that working memory training is a valuable part of a training program that also includes practice in real-world tasks that incorporate working memory.

I should emphasize that these results only apply to ‘normal’ children and adults. The question of training benefits for those with attention difficulties or early Alzheimer’s is a completely different issue. But for these healthy individuals, it has to be said that the weight of the evidence is against working memory training producing more general cognitive improvement. Nevertheless, I think it’s probably an important part of a cognitive training program — as long as the emphasis is on part.

Melby-Lervåg, M., & Hulme, C. 2012. Is Working Memory Training Effective? A Meta-Analytic Review. Developmental Psychology. doi:10.1037/a0028228
Full text available at http://www.apa.org/pubs/journals/releases/dev-ofp-melby-lervag.pdf

Redick, T.S., Shipstead, Z., Harrison, T.L., Hicks, K.L., Fried, D.E., Hambrick, D.Z., et al. 2012. No Evidence of Intelligence Improvement After Working Memory Training: A Randomized, Placebo-Controlled Study. Journal of Experimental Psychology: General.
Full text available at http://psychology.gatech.edu/renglelab/publications/2012/RedicketalJEPG.pdf

I have reported previously on research suggesting that rapamycin, a bacterial product first isolated from soil on Easter Island and used to help transplant patients prevent organ rejection, might improve learning and memory. Following on from this research, a new mouse study has extended these findings by adding rapamycin to the diet of healthy mice throughout their life span. Excitingly, it found that cognition was improved in young mice, and abolished normal cognitive decline in older mice.

Anxiety and depressive-like behavior were also reduced, and the mice’s behavior demonstrated that rapamycin was acting like an antidepressant. This effect was found across all ages.

Three "feel-good" neurotransmitters — serotonin, dopamine and norepinephrine — all showed significantly higher levels in the midbrain (but not in the hippocampus). As these neurotransmitters are involved in learning and memory as well as mood, it is suggested that this might be a factor in the improved cognition.

Other recent studies have suggested that rapamycin inhibits a pathway in the brain that interferes with memory formation and facilitates aging.

A number of studies have come out in recent years linking age-related cognitive decline and dementia risk to inflammation and infection (put inflammation into the “Search this site” box at the top of the page and you’ll see what I mean). New research suggests one important mechanism.

In a mouse study, mice engineered to be deficient in the CCR2 receptor — a crucial element in removing beta-amyloid and also important for neurogenesis — developed Alzheimer’s-like pathology more quickly. When these mice had CCR2 expression boosted, accumulation of beta-amyloid decreased and the mice’s memory improved.

In the human study, the expression levels of thousands of genes from 691 older adults (average age 73) in Italy (part of the long-running InCHIANTI study) were analyzed. Both cognitive performance and cognitive decline over 9 years (according to MMSE scores) were significantly associated with the expression of this same gene. That is, greater CCR2 activity was associated with lower cognitive scores and greater decline.

Expression of the CCR2 gene was also positively associated with the Alzheimer’s gene — meaning that those who carry the APOE4 variant are more likely to have higher CCR2 activity.

The finding adds yet more weight to the importance of preventing and treating inflammation and infection.

Harries, L.W., Bradley-Smith, R.M., Llewellyn, D.J., Pilling, L.C., Fellows, A., Henley, W., et al. 2012. Leukocyte CCR2 Expression Is Associated with Mini-Mental State Examination Score in Older Adults. Rejuvenation Research.

Naert, G. & Rivest S. 2012. Hematopoietic CC-chemokine receptor 2-(CCR2) competent cells are protective for the cognitive impairments and amyloid pathology in a transgenic mouse model of Alzheimer's disease. Molecular Medicine, 18(1), 297-313.

El Khoury J, et al. 2007. Ccr2 deficiency impairs microglial accumulation and accelerates progression of Alzheimer-like disease. Nature Medicine, 13, 432–8.

I’ve reported before on the evidence suggesting that carriers of the ‘Alzheimer’s gene’, APOE4, tend to have smaller brain volumes and perform worse on cognitive tests, despite being cognitively ‘normal’. However, the research hasn’t been consistent, and now a new study suggests the reason.

The e4 variant of the apolipoprotein E (APOE) gene not only increases the risk of dementia, but also of cardiovascular disease. These effects are not unrelated: apolipoprotein E is involved in the transportation of cholesterol. In older adults, it has been shown that other vascular risk factors (such as elevated cholesterol, hypertension or diabetes) worsen the cognitive effects of having this gene variant.

This new study extends the finding, by looking at 72 healthy adults from a wide age range (19-77).

Participants were tested on various cognitive abilities known to be sensitive to aging and the effects of the e4 allele. Those abilities include speed of information processing, working memory and episodic memory. Blood pressure measurements, brain scans, and of course genetic tests were also carried out.

There are a number of interesting findings:

  • The relationship between age and hippocampal volume was stronger for those carrying the e4 allele (shrinkage of this brain region occurs with age, and is significantly greater in those with MCI or dementia).
  • Higher systolic blood pressure was significantly associated with greater atrophy (i.e., smaller volumes), slower processing speed, and reduced working memory capacity — but only for those with the e4 variant.
  • Among those with the better and more common e3 variant, working memory was associated with lateral prefrontal cortex volume and with processing speed. Greater age was associated with higher systolic blood pressure, smaller volumes of the prefrontal cortex and prefrontal white matter, and slower processing. However, blood pressure was not itself associated with either brain atrophy or slower cognition.
  • For those with the Alzheimer’s variant (e4), older adults with higher blood pressure had smaller volumes of prefrontal white matter, and this in turn was associated with slower speed, which in turn linked to reduced working memory.

In other words, for those with the Alzheimer’s gene, age differences in working memory (which underpin so much of age-related cognitive impairment) were produced by higher blood pressure, reduced prefrontal white matter, and slower processing. For those without the gene, age differences in working memory were produced by reduced prefrontal cortex and prefrontal white matter.

Most importantly, the increases in blood pressure we are talking about here are well within the normal range (although at the higher end).

The researchers make an interesting point: that these findings are in line with “growing evidence that ‘normal’ should be viewed in the context of [the] individual’s genetic predisposition”.

What it comes down to is this: those with the Alzheimer’s gene variant (and no doubt other genetic variants) have a greater vulnerability to some of the risk factors that commonly increase as we age. Those with a family history of dementia or serious cognitive impairment should therefore pay particular attention to controlling vascular risk factors, such as hypertension and diabetes.

This doesn’t mean that those without such a family history can safely ignore such conditions! When they get to the point of being clinically diagnosed as problems, then they are assuredly problems for your brain regardless of your genetics. What this study tells us is that these vascular issues appear to be problematic for Alzheimer’s gene carriers before they get to that point of clinical diagnosis.

A new study, involving 1,219 dementia-free older adults (65+), has found that the more omega-3 fatty acids the person consumed, the lower the level of beta-amyloid in the blood (a proxy for brain levels). Consuming a gram of omega-3 more than the average per day was associated with 20-30% lower beta-amyloid levels. A gram of omega-3 equates to around half a fillet of salmon per week.

Participants provided information about their diet for an average of 1.2 years before their blood was tested for beta-amyloid. Other nutrients investigated — saturated fatty acids, omega-6 polyunsaturated fatty acids, mono-unsaturated fatty acids, vitamin E, vitamin C, beta-carotene, vitamin B12, folate and vitamin D — were not associated with beta-amyloid levels.

The results remained after adjusting for age, education, gender, ethnicity, amount of calories consumed and APOE gene status.

The findings are consistent with previous research associating higher levels of omega-3 and/or fish intake with lower risk of Alzheimer’s. Additionally, another recent study provides evidence that the brains of those with Alzheimer’s disease, MCI, and the cognitively normal, all have significantly different levels of omega-3 and omega-6 fatty acids. That study concluded that the differences were due to both consumption and metabolic differences.

[2959] Gu, Y., Schupf N., Cosentino S. A., Luchsinger J. A., & Scarmeas N.
(2012).  Nutrient Intake and Plasma β-Amyloid.
Neurology. 78(23), 1832-1840.

Cunnane, S.C. et al. 2012. Plasma and Brain Fatty Acid Profiles in Mild Cognitive Impairment and Alzheimer’s Disease. Journal of Alzheimer’s Disease, 29 (3), 691-697.

More findings from the long-running Mayo Clinic Study of Aging reveal that using a computer plus taking moderate exercise reduces your risk of mild cognitive impairment significantly more than you would expect from simply adding together these two beneficial activities.

The study involved 926 older adults (70-93), of whom 109 (12%) were diagnosed with MCI. Participants completed questionnaires on physical exercise and mental stimulation within the previous year. Computer use was targeted in this analysis because of its popularity as a cognitive activity, and because it was particularly associated with reduced odds of having MCI.

Among the cognitively healthy, only 20.1% neither exercised moderately nor used a computer, compared to 37.6% of those with MCI. On the other hand, 36% of the cognitively healthy both exercised and used a computer, compared to only 18.3% of those with MCI. There was little difference between the two groups as regards exercise but no computer use, or computer use but no exercise.

The analysis took into account calorie intake, as well as education, depression, and other health factors. Daily calorie intake was significantly higher in those with MCI compared to those without (respective group medians of 2100 calories vs 1802) — note that the median BMI was the same for the two groups.

Moderate physical exercise was defined as brisk walking, hiking, aerobics, strength training, golfing without a golf cart, swimming, doubles tennis, yoga, martial arts, using exercise machines and weightlifting. Light exercise included activities such as bowling, leisurely walking, stretching, slow dancing, and golfing with a cart. Mentally stimulating activities included reading, crafts, computer use, playing games, playing music, group and social and artistic activities and watching less television.

It should be noted that the assessment of computer activities was very basic. The researchers suggest that in future studies, both duration and frequency should be assessed. I would add type of activity, although that would be a little more difficult to assess.

Overall, the findings add yet more weight to the evidence for the value of physical exercise and mental stimulation in staving off cognitive impairment in old age, and add the twist that doing both is much better than doing either one alone.

The study involved 4,134 people (average age 59) who worked at the French national gas and electric company, most of whom had worked at the company for their entire career. Their lifetime exposure to chlorinated solvents, petroleum solvents, benzene and non-benzene aromatic solvents was estimated, and they were given the Digit Symbol Substitution Test to assess cognitive performance. Cognitive impairment was defined as scoring below the 25th percentile. Most of the participants (88%) were retired.

For analysis, participants were divided into two groups based on whether they had less than a secondary school education or not. This revealed an interesting finding: higher rates of solvent exposure were associated with cognitive impairment, in a dose-dependent relationship — but only in those with less than a high school education. Recency of solvent exposure also predicted worse cognition among the less-educated (suggesting that at least some of the damage was recoverable).

However, among those with secondary education or higher, there was no significant association between solvent exposure (quantity or recency) and cognition.

Over half the participants (58%) had less than a high school education. Of those, 32% had cognitive impairment — twice the rate in those with more education.

The type of solvent also made a difference, with non-benzene aromatic solvents the most dangerous, followed by benzene solvents, and then chlorinated and petroleum solvents (the rates of cognitive impairment among the highly-exposed, less-educated participants were 36%, 24%, and 14%, respectively).

The findings point to the value of cognitive reserve, but I have several caveats. (Unfortunately, this study appears in a journal to which I don’t have access, so it’s possible that the first of these, at least, is answered in the paper.) The first is that those with less education had higher rates of exposure, which raises the question of a threshold effect. The second is that cognition was assessed at only one point in time, lacking both a baseline (do we know what sort of average score adults of this age, with this little education, would achieve? A quick online search threw up no appropriate normative data) and a time-comparison that would give a rate of decline. The third is that the cognitive assessment is very limited, being based on only one test.

In other words, the failure to find an effect among those with at least a high school education may well reflect the lack of sensitivity in the test (designed to assess brain damage). More sensitive tests, and test comparisons over time, may well give a different answer.

On its own, then, this finding is merely another data-point. But accumulating data-points is how we do science! Hopefully, in due course there’ll be a follow-up that will give us more information.

I’ve mentioned before that, for some people, exercise doesn’t seem to have a benefit, and that the benefits of exercise for fighting age-related cognitive decline may not apply to those carrying the Alzheimer’s gene.

New research suggests there is another gene variant that may impact on exercise’s effects. The new study follows on from earlier research that found that physical exercise during adolescence had more durable effects on object memory and BDNF levels than exercise during adulthood. In this study, 54 healthy but sedentary young adults (aged 18-36) were given an object recognition test before participating in either (a) a 4-week exercise program, with exercise on the final test day, (b) a 4-week exercise program, without exercise on the final test day, (c) a single bout of exercise on the final test day, or (d) remaining sedentary between test days.

Exercise both improved object recognition memory and reduced perceived stress — but only in one group: those who exercised for 4 weeks, including on the final day of testing. In other words, both regular exercise and recent exercise were needed to produce a memory benefit.

But there is one more factor — and this is where it gets really interesting: the benefit didn’t appear for every member of this group. Only those carrying a specific genotype benefited from regular and recent exercise. The genotype has to do with the brain protein BDNF, which is involved in neurogenesis and synaptic plasticity, and which is increased by exercise. The BDNF gene comes in two flavors: Val and Met. Previous research has linked the less common Met variant to poorer memory and greater age-related cognitive decline.

In other words, it seems that the Met allele affects how much BDNF is released as a result of exercise, and this in turn affects cognitive benefits.

The object recognition test involved participants seeing a series of 50 images (previously selected as being highly recognizable and nameable), followed by a 15-minute filler task, before seeing 100 images (the previous 50 plus 50 new images) and indicating which had been seen previously. The filler task involved surveys of state anxiety, perceived stress, and mood. On the first (pre-program) visit, a survey of trait anxiety was also completed.

Of the 54 participants, 31 carried two copies of the Val allele, and 23 had at least one Met allele (19 Val/Met; 4 Met/Met). The population frequency of carrying at least one Met allele is around 50% in Asians, 30% in Caucasians, and 4% in African-Americans.

Although exercise decreased stress and increased positive mood, the cognitive benefits of exercise were not associated with mood or anxiety. Neither was genotype associated with mood or anxiety. However, some studies have found an association between depression and the Met variant, and this study is of course quite small.

A final note: this study is part of research looking at the benefits of exercise for children with ADHD. The findings suggest that genotyping would enable us to predict whether an individual — a child with ADHD or an older adult at risk of cognitive decline or impairment — would benefit from this treatment strategy.

Older adults who sleep poorly react to stress with increased inflammation

A study involving 83 older adults (average age 61) has found that poor sleepers reacted to a stressful situation with a significantly greater inflammatory response than good sleepers. High levels of inflammation increase the risk of several disorders, including cardiovascular disease and diabetes, and have been implicated in Alzheimer’s.

Each participant completed a self-report of sleep quality, perceived stress, loneliness and medication use. Around 27% were categorized as poor sleepers. Participants were given a series of tests of verbal and working memory designed to increase stress, with blood being taken before and after testing, as well as three more times over the next hour. The blood was tested for levels of a protein marker for inflammation (interleukin-6).

Poor sleepers reported more depressive symptoms, more loneliness and more perceived stress compared to good sleepers. Before cognitive testing, levels of IL-6 were the same for poor and good sleepers. However, while both groups showed increases in IL-6 after testing, poor sleepers showed a significantly larger increase — as much as four times larger and at a level found to increase risk for illness and death in older adults.

After accounting for loneliness, depression or perceived stress, this association remained. Surprisingly, there was no evidence that poor sleep led to worse cognitive performance, thus causing more stress. Poor sleepers did just as well on the tests as the good sleepers (although I note that we cannot rule out that poor sleepers were having to put in more effort to achieve the same results). Although there was a tendency for poor sleepers to be in a worse mood after testing (perhaps because they had to put in more effort? My own speculation), this mood change didn’t predict the increased inflammatory response.

The findings add to evidence that poor sleep (unfortunately common as people age) is an independent risk factor for cognitive and physical health, and suggest we should put more effort into dealing with it, rather than just accepting it as a corollary of age.

REM sleep disorder doubles risk of MCI, Parkinson's

A recent Mayo Clinic study has also found that people with rapid eye movement sleep behavior disorder (RBD) have twice the risk of developing mild cognitive impairment or Parkinson’s disease. Some 34% of those diagnosed with probable RBD developed MCI or Parkinson's disease within four years of entering the study, a rate 2.2 times greater than those with normal REM sleep.

Earlier research has found that 45% of those with RBD developed MCI or Parkinson's disease within five years of diagnosis, but these findings were based on clinical patients. The present study involved cognitively healthy older adults (70-89) participating in a population-based study of aging, who were diagnosed for probable RBD on the basis of the Mayo Sleep Questionnaire.

Data from the very large and long-running Cognitive Function and Ageing Study, a U.K. study involving 13,004 older adults (65+) from which 329 brains are now available for analysis, show that cognitive lifestyle score (CLS) had no effect on Alzheimer’s pathology. Characteristics typical of Alzheimer’s, such as plaques, neurofibrillary tangles, and hippocampal atrophy, were similar in all CLS groups.

However, while cognitive lifestyle may have no effect on the development of Alzheimer's pathology, that is not to say it has no effect on the brain. In men, an active cognitive lifestyle was associated with less microvascular disease. In particular, the high CLS group showed an 80% relative reduction in deep white matter lesions. These associations remained after taking into account cardiovascular risk factors and APOE status.

This association was not found in women. However, women in the high CLS group tended to have greater brain weight.

In both genders, high CLS was associated with greater neuronal density and cortical thickness in Brodmann area 9 in the prefrontal lobe (but not, interestingly, in the hippocampus).

Cognitive lifestyle score is produced from years of education, occupational complexity coded according to social class and socioeconomic grouping, and social engagement based on frequency of contact with relatives, neighbors, and social events.
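
The paper will have the precise scoring; as a loose illustration of how composite scores like this are typically constructed, you might standardize each component and sum them. The sketch below (in Python, with made-up numbers and a made-up equal weighting) is purely illustrative, not the study’s actual algorithm.

    import numpy as np

    def zscore(x):
        x = np.asarray(x, dtype=float)
        return (x - x.mean()) / x.std()

    # Hypothetical cohort of five people: years of education,
    # occupational complexity rank, and social-engagement frequency.
    education  = [9, 12, 16, 10, 18]
    occupation = [2, 3, 5, 2, 4]
    engagement = [1, 4, 5, 2, 3]

    # Equal-weighted composite of the three standardized components.
    cls = zscore(education) + zscore(occupation) + zscore(engagement)

    # Split into low/middle/high CLS groups by tertile, as such studies typically do.
    groups = np.digitize(cls, np.quantile(cls, [1/3, 2/3]))
    print(cls.round(2), groups)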

The findings provide more support for the ‘cognitive reserve’ theory, and shed some light on the mechanism, which appears to be rather different than we imagined. It may be that the changes in the prefrontal lobe (that we expected to see in the hippocampus) are a sign that greater cognitive activity helps you develop compensatory networks, rather than building up established ones. This would be consistent with research suggesting that older adults who maintain their cognitive fitness do so by developing new strategies that involve different regions, compensating for failing regions.

Previous research has been equivocal about whether cognitive training helps cognitively healthy older adults. One recent review concluded that cognitive training could help slow age-related decline in a range of cognitive tasks; another found no evidence that such training helps slow or prevent the development of Alzheimer’s in healthy older adults. Most of the studies reviewed looked at single-domain training only: memory, reasoning, processing speed, reading, solving arithmetic problems, or strategy training. As we know from other studies, training in specific tasks is undeniably helpful for improving your performance at those specific tasks. However, there is little evidence for wider transfer. There have been few studies employing multi-domain training, although two such studies have found positive benefits.

In a new Chinese study, 270 healthy older adults (65-75) were randomly assigned to one of three groups. In the two experimental groups, participants were given one-hour training sessions twice a week for 12 weeks. Training took place in small groups of around 15. The first 15 minutes of each hour involved a lecture focusing on diseases common in older adults. The next 30 minutes were spent in instruction in one specific technique and how to use it in real life. The last 15 minutes were used to consolidate the skills by solving real-life problems.

One group was trained using a multi-domain approach, involving memory, reasoning, problem solving, map reading, handicrafts, health education and exercise. The other group trained on reasoning only (involving the Tower of Hanoi, numerical reasoning, Raven’s Progressive Matrices, and verbal reasoning). Homework was assigned. Six months after training, three booster sessions (a month apart) were offered to 60% of the participants. The third group (the control) was put on a waiting list. All three groups attended a lecture on aspects of healthy living every two months.

All participants were given cognitive tests before and after training, and again after six months and after one year. Cognitive function was assessed using the Stroop Test, the Trail Making Test, the Visual Reasoning Test, and the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS, Form A).

Both the multi-domain and single-domain cognitive training produced significant improvement in cognitive scores (the former in RBANS, visual reasoning, and immediate and delayed memory; the latter in RBANS, visual reasoning, word interference, and visuospatial/constructional score), although single-domain training produced less durable benefits (after a year, the multi-domain group still showed the benefit in RBANS, delayed memory and visual reasoning, while the single-domain group only showed benefits in word interference). Booster training also produced benefits, consolidating training in reasoning, visuospatial/constructional abilities and faster processing.

Reasoning ability seemed particularly responsive to training. Although it would be reasonable to assume that single-domain training, which focused on reasoning, would produce greater improvement than multi-domain training in this specific area, there was in fact no difference between the two groups right after training or at six months. And at 12 months, the multi-domain group was clearly superior.

In sum, the study provides evidence that cognitive training helps prevent cognitive decline in healthy older people, that specific training can generalize to other tasks, but that programs that involve several cognitive domains produce more lasting benefits.

A number of studies, principally involving rodents, have established that physical exercise stimulates the creation of new brain cells in the hippocampus. A recent study attempted to uncover more about the mechanism.

Using two drugs that work directly on muscles, producing the physical effects of exercise, the researchers compared the effects on the brain. One drug (Aicar) improves the fitness of even sedentary animals. The other drug increases the effects of exercise on animals that exercise, but has little effect on sedentary animals.

After a week of receiving one of the drugs, sedentary mice performed better on tests of memory and learning, and showed more new brain cells. These effects were significantly greater for those taking Aicar.

Because the drugs have very little ability to cross into the brain, this demonstrates that the neurogenesis results from exercise-type reactions in the muscles, not from brain responses to the drugs. Indeed, previous research has found that direct infusion of Aicar into the brain impaired learning and memory.

Aicar increases the muscles’ output of AMPK, an enzyme that affects cellular energy and metabolism. It’s speculated that some of this enzyme may enter the bloodstream and travel to the brain. Interestingly, as with neurogenesis, AMPK activity in muscles appears to decline with age. It may be that AMPK production could serve as a biomarker for neurogenesis, as well as being a target for improving neurogenesis.

These findings add weight to evidence for the value of aerobic exercise over other types of exercise (given that the mice exercise by running). However, I see that human research has found that resistance training (which is difficult to study in mice!) also increases AMPK activity.

Do note — if you are hopeful that drugs will relieve you of the need to exercise — that the benefits were not only smaller than those achieved from exercise, but also didn’t last. In those mice taking Aicar for a second week, their brains not only stopped deriving any benefit, but actually deteriorated.

Over the years, I have reported on several studies that have found evidence that colorful berries — blueberries in particular (but I think that’s more of an artifact, due to the relative cheapness of these berries in North America) — benefit older brains. Indeed, I myself consume these every day (in my lunch smoothie) for this very reason (of course, the fact that they taste so good doesn’t hurt!).

But to date these studies have involved rodents or only very small numbers of humans. Now a new study analyzes data from the very large and long-running Nurses' Health Study, which has questioned 121,700 female registered nurses about their health and lifestyle since 1976. Since 1980, participants have also been asked about their frequency of food consumption. Between 1995 and 2001, memory was measured at 2-year intervals in 16,010 participants over the age of 70 (average age 74).

The study found that those women who had 2 or more servings of strawberries and blueberries every week had a slower rate of cognitive decline. The effects were equivalent to some 1.5-2.5 years of normal cognitive aging.

While the researchers cannot completely rule out the possibility that higher berry consumption is associated with slower cognitive decline because of its association with some other factor that affects brain aging, they did take into account a large number of potentially confounding factors, including: education, smoking history and status, antidepressant use, BMI, blood pressure, cholesterol, diabetes, physical activity, total calorie intake, fish consumption, alcohol use, overall diet scores, and various indirect measures of socioeconomic status.

Moreover, the findings are consistent both with animal and cell studies, and with what we know about how the brain ages. The ‘magic’ ingredient of these berries is thought to lie in their flavonoids (particularly anthocyanidins), which have powerful antioxidant and anti-inflammatory properties. It’s thought that berries help the brain stay healthy both because they contain high levels of antioxidants, which protect cells from damage by harmful free radicals, and because they change the way neurons in the brain communicate, protecting against inflammation and oxidative stress.

As a rule of thumb, the deeper the color of the berry (or other fruit or vegetable), the more flavonoids it has. Lists of anthocyanin-rich foods are easy to find online (acai isn’t always included in such lists, but it also has a very high rating).

Genetic analysis of 9,232 older adults (average age 67; range 56-84) has implicated four genes in how fast your hippocampus shrinks with age (rs7294919 at 12q24, rs17178006 at 12q14, rs6741949 at 2q24, rs7852872 at 9p33). The first of these (implicated in cell death) showed a particularly strong link to reduced hippocampal volume — with the average consequence being a hippocampus the same size as that of a person 4-5 years older.

Faster atrophy in this crucial brain region would increase people’s risk of Alzheimer’s and cognitive decline, by reducing their cognitive reserve. Reduced hippocampal volume is also associated with schizophrenia, major depression, and some forms of epilepsy.

In addition to cell death, the genes linked to this faster atrophy are involved in oxidative stress, ubiquitination, diabetes, embryonic development and neuronal migration.

A younger cohort of 7,794 normal and cognitively compromised people, with an average age of 40, showed that these suspect gene variants were also linked to smaller hippocampal volume in this age group. A third cohort, comprising 1,563 primarily older people, showed a significant association between the ASTN2 variant (linked to neuronal migration) and faster memory loss.

In another analysis, researchers looked at intracranial volume and brain volume in 8,175 older adults. While they found no genetic associations for brain volume (although there was one suggestive association), they did discover that intracranial volume (the space occupied by the fully developed brain within the skull — this remains unchanged with age, reflecting brain size at full maturity) was significantly associated with two gene variants (at loci rs4273712, on chromosome 6q22, and rs9915547, on 17q21). These associations were replicated in a different sample of 1,752 older adults. One of these genes is already known to play a unique evolutionary role in human development.

A meta-analysis of seven genome-wide association studies, involving 10,768 infants (average age 14.5 months), found two loci robustly associated with head circumference in infancy (rs7980687 on chromosome 12q24 and rs1042725 on chromosome 12q15). These loci have previously been associated with adult height, but these effects on infant head circumference were largely independent of height. A third variant (rs11655470 on chromosome 17q21 — note that this is the same chromosome implicated in the study of older adults) showed suggestive evidence of association with head circumference; this chromosome has also been implicated in Parkinson's disease and other neurodegenerative diseases.

Previous research has found an association between head size in infancy and later development of Alzheimer’s. It has been thought that this may have to do with cognitive reserve.

Interestingly, the analyses also revealed that a variant in a gene called HMGA2 (rs10784502 on 12q14.3) affected intelligence as well as brain size.

Why ‘Alzheimer’s gene’ increases Alzheimer’s risk

Investigation into the so-called ‘Alzheimer’s gene’ ApoE4 (those who carry two copies of this variant have roughly eight to ten times the risk of getting Alzheimer’s disease) has found that ApoE4 causes an increase in cyclophilin A, which in turn causes a breakdown of the cells lining the blood vessels. Blood vessels become leaky, making it more likely that toxic substances will leak into the brain.

The study found that mice carrying the ApoE4 gene had five times as much cyclophilin A as normal, in cells crucial to maintaining the integrity of the blood-brain barrier. Blocking the action of cyclophilin A brought blood flow back to normal and reduced the leakage of toxic substances by 80%.

The finding is in keeping with the idea that vascular problems are at the heart of Alzheimer’s disease — although it should not be assumed from this that other problems (such as amyloid-beta plaques and tau tangles) are not also important. However, one thing that does seem clear now is that there is no single pathway to Alzheimer’s. This research suggests a possible treatment approach for those carrying this risky gene variant.

Note also that this gene variant is not only associated with Alzheimer’s risk, but also Down’s syndrome dementia, poor outcome following TBI, and age-related cognitive decline.

On which note, I’d like to point out recent findings from the long-running Nurses' Health Study, involving 16,514 older women (70-81), suggesting that the cognitive effects of postmenopausal hormone therapy may depend on apolipoprotein E (APOE) status: the fastest rate of decline was observed among HT users who carried the APOE4 variant (in general, HT was associated with poorer cognitive performance).

It’s also interesting to note another recent finding: that intracranial volume modifies the effect of APOE4 and white matter lesions on dementia risk. The study, involving 104 demented and 135 nondemented 85-year-olds, found that smaller intracranial volume increased the risk of dementia, Alzheimer's disease, and vascular dementia in participants with white matter lesions. However, white matter lesions were not associated with increased dementia risk in those with the largest intracranial volumes. But intracranial volume did not modify dementia risk in those with the APOE4 gene.

More genes involved in Alzheimer’s

More genome-wide association studies of Alzheimer's disease have now identified variants in BIN1, CLU, CR1 and PICALM genes that increase Alzheimer’s risk, although it is not yet known how these gene variants affect risk (the present study ruled out effects on the two biomarkers, amyloid-beta 42 and phosphorylated tau).

Same genes linked to early- and late-onset Alzheimer's

Traditionally, we’ve made a distinction between early-onset Alzheimer's disease, which is thought to be inherited, and the more common late-onset Alzheimer’s. New findings, however, suggest we should re-think that distinction. While the genetic case for early-onset might seem to be stronger, sporadic (non-familial) cases do occur, and familial cases occur with late-onset.

New DNA sequencing techniques applied to the APP (amyloid precursor protein) gene, and the PSEN1 and PSEN2 (presenilin) genes (the three genes linked to early-onset Alzheimer's), have found that rare variants in these genes are more common in families where four or more members were affected with late-onset Alzheimer’s, compared to normal individuals. Additionally, mutations in the MAPT (microtubule associated protein tau) gene and GRN (progranulin) gene (both linked to frontotemporal dementia) were also found in some Alzheimer's patients, suggesting they had been incorrectly diagnosed as having Alzheimer's disease when they instead had frontotemporal dementia.

Of the 439 patients from families in which at least four members had been diagnosed with Alzheimer's disease, 60 (13.7%) carried rare variants in the three Alzheimer's-related genes. While not all of these variants are known to be pathogenic, the frequency of mutations in these genes is significantly higher than in the general population.

The researchers estimate that about 5% of those with late-onset Alzheimer's disease have changes in these genes. They suggest that, at least in some cases, the same causes may underlie both early- and late-onset disease, the difference being that those who develop it later have more protective factors.

Another gene identified in early-onset Alzheimer's

A study of the genes from 130 families suffering from early-onset Alzheimer's disease has found that 116 families had mutations in genes already known to be involved (APP, PSEN1, PSEN2 — see below for some older reports on these genes), while five of the remaining 14 families showed mutations in a new gene: SORL1.

I say ‘new gene’ because it hasn’t been implicated in early-onset Alzheimer’s before. However, it has been implicated in the more common late-onset Alzheimer’s, and last year a study reported that the gene was associated with differences in hippocampal volume in young, healthy adults.

The finding, then, provides more support for the idea that some cases of early-onset and late-onset Alzheimer’s have the same causes.

The SORL1 gene codes for a protein involved in the production of the beta-amyloid peptide, and the mutations seen in this study appear to cause an under-expression of SORL1, resulting in increased production of the beta-amyloid peptide. Such mutations were not found in the 1,500 ethnicity-matched controls.

 

Older news reports on these other early-onset genes (brought over from the old website):

New genetic cause of Alzheimer's disease

Amyloid protein originates when it is cut by enzymes from a larger precursor protein. In very rare cases, mutations appear in the amyloid precursor protein (APP), causing it to change shape and be cut differently. The amyloid protein that is formed now has different characteristics, causing it to begin to stick together and precipitate as amyloid plaques. A genetic study of Alzheimer's patients younger than 70 has found genetic variations in the promoter that increase gene expression and thus the formation of the amyloid precursor protein. The higher the expression (up to 150%, as in Down syndrome), the younger the patient (onset starting between 50 and 60 years of age). Thus, the amount of amyloid precursor protein is a genetic risk factor for Alzheimer's disease.

Theuns, J. et al. 2006. Promoter Mutations That Increase Amyloid Precursor-Protein Expression Are Associated with Alzheimer Disease. American Journal of Human Genetics, 78, 936-946.

http://www.eurekalert.org/pub_releases/2006-04/vfii-rda041906.php

Evidence that Alzheimer's protein switches on genes

Amyloid β-protein precursor (APP) is snipped apart by enzymes to produce three protein fragments. Two fragments remain outside the cell and one stays inside. When APP is produced in excessive quantities, one of the cleaved segments that remains outside the cell, the amyloid β-peptide, clumps together to form amyloid plaques that kill brain cells and may lead to the development of Alzheimer’s disease. New research indicates that the short "tail" segment of APP that is trapped inside the cell might also contribute to Alzheimer’s disease, through a process called transcriptional activation: switching on genes within the cell. Researchers speculate that creation of amyloid plaque is a byproduct of a misregulation in normal APP processing.

[2866] Cao, X., & Südhof T. C.
(2001).  A Transcriptively Active Complex of APP with Fe65 and Histone Acetyltransferase Tip60.
Science. 293(5527), 115-120.

http://www.eurekalert.org/pub_releases/2001-07/aaft-eta070201.php

Inactivation of Alzheimer's genes in mice causes dementia and brain degeneration

Mutations in two related genes known as presenilins are the major cause of early onset, inherited forms of Alzheimer's disease, but how these mutations cause the disease has not been clear. Since presenilins are involved in the production of amyloid peptides (the major components of amyloid plaques), it was thought that such mutations might cause Alzheimer’s by increasing brain levels of amyloid peptides. Accordingly, much effort has gone into identifying compounds that could block presenilin function. Now, however, genetic engineering in mice has revealed that deletion of these genes causes memory loss and gradual death of nerve cells in the mouse brain, demonstrating that the protein products of these genes are essential for normal learning, memory and nerve cell survival.

Saura, C.A., Choi, S-Y., Beglopoulos, V., Malkani, S., Zhang, D., Shankaranarayana Rao, B.S., Chattarji, S., Kelleher, R.J.III, Kandel, E.R., Duff, K., Kirkwood, A. & Shen, J. 2004. Loss of Presenilin Function Causes Impairments of Memory and Synaptic Plasticity Followed by Age-Dependent Neurodegeneration. Neuron, 42 (1), 23-36.

http://www.eurekalert.org/pub_releases/2004-04/cp-ioa032904.php

[2858] ENIGMA Consortium, & CHARGE (Cohorts for Heart and Aging Research in Genomic Epidemiology) Consortium
(2012).  Common variants at 12q14 and 12q24 are associated with hippocampal volume.
Nature Genetics. 44(5), 545-551.

[2909] Taal, R. H., St Pourcain B., Thiering E., Das S., Mook-Kanamori D. O., Warrington N. M., et al.
(2012).  Common variants at 12q15 and 12q24 are associated with infant head circumference.
Nature Genetics. 44(5), 532-538.

[2859] CHARGE (Cohorts for Heart and Aging Research in Genomic Epidemiology) Consortium, & EGG (Early Growth Genetics) Consortium
(2012).  Common variants at 6q22 and 17q21 are associated with intracranial volume.
Nature Genetics. 44(5), 539-544.

[2907] Stein, J. L., Medland S. E., Vasquez A A., Hibar D. P., Senstad R. E., Winkler A. M., et al.
(2012).  Identification of common variants associated with human hippocampal and intracranial volumes.
Nature Genetics. 44(5), 552-561.

[2925] Bell, R. D., Winkler E. A., Singh I., Sagare A. P., Deane R., Wu Z., et al.
(2012).  Apolipoprotein E controls cerebrovascular integrity via cyclophilin A.
Nature.

Kang, J. H., & Grodstein F. (2012).  Postmenopausal hormone therapy, timing of initiation, APOE and cognitive decline. Neurobiology of Aging. 33(7), 1129-1137.

Skoog, I., Olesen P. J., Blennow K., Palmertz B., Johnson S. C., & Bigler E. D. (2012).  Head size may modify the impact of white matter lesions on dementia. Neurobiology of Aging. 33(7), 1186-1193.

[2728] Cruchaga, C., Chakraverty S., Mayo K., Vallania F. L. M., Mitra R. D., Faber K., et al.
(2012).  Rare Variants in APP, PSEN1 and PSEN2 Increase Risk for AD in Late-Onset Alzheimer's Disease Families.
PLoS ONE. 7(2), e31039.

Full text available at http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0031039

[2897] Pottier, C., Hannequin D., Coutant S., Rovelet-Lecrux A., Wallon D., Rousseau S., et al.
(2012).  High frequency of potentially pathogenic SORL1 mutations in autosomal dominant early-onset Alzheimer disease.
Molecular Psychiatry.

McCarthy, J. J., Saith S., Linnertz C., Burke J. R., Hulette C. M., Welsh-Bohmer K. A., et al. (2012).  The Alzheimer's associated 5′ region of the SORL1 gene cis regulates SORL1 transcripts expression. Neurobiology of Aging. 33(7), 1485.e1-1485.e8.

A number of studies have found evidence that older adults can benefit from cognitive training. However, neural plasticity is thought to decline with age, and because of this, it’s thought that the younger-old, and/or the higher-functioning, may benefit more than the older-old, or the lower-functioning. On the other hand, because their performance may already be as good as it can be, higher-functioning seniors may be less likely to benefit. You can find evidence for both of these views.

In a new study, 19 of 39 older adults (aged 60-77) were given training in a multiplayer online video game called World of Warcraft (the other 20 formed a control group). This game was chosen because it involves multitasking and switching between various cognitive abilities. It was theorized that the demands of the game would improve both spatial orientation and attentional control, and that the multiple tasks might produce more improvement in those with lower initial ability compared to those with higher ability.

WoW participants were given a 2-hour training session, involving a 1-hour lecture and demonstration and one hour of practice. They were then expected to play the game at home for around 14 hours over the next two weeks. There was no intervention for the control group. All participants were given several cognitive tests at the beginning and end of the two-week period: the Mental Rotation Test; Stroop Test; Object Perspective Test; Progressive Matrices; Shipley Vocabulary Test; Everyday Cognition Battery; and Digit Symbol Substitution Test.

As a group, the WoW group improved significantly more on the Stroop test (a measure of attentional control) compared to the control group. There was no change in the other tests. However, those in the WoW group who had performed more poorly on the Object Perspective Test (measuring spatial orientation) improved significantly. Similarly, on the Mental Rotation Test, ECB, and Progressive Matrices, those who performed more poorly at the beginning tended to improve after two weeks of training. There was no change on the Digit Symbol test.

The finding that only those whose performance was initially poor benefited from cognitive training is consistent with other studies suggesting that training only benefits those who are operating below par. This is not really surprising, but there are a few points that should be made.

First of all, it should be noted that this was a group of relatively high-functioning young-old adults — poorer performance in this case could be (relatively) better performance in another context. What it comes down to is whether you are operating below the level of which you are capable. This applies broadly: for example, experiments show that spatial training benefits females but not males (because males tend to already have practiced enough).

Given that, in expertise research, training has an ongoing, apparently limitless, effect on performance, it seems likely that the limited benefits shown in this and other studies are due to the extremely limited scope of the training. Fourteen hours is not enough to improve people who are already performing adequately — but that doesn’t mean that they wouldn’t improve with more hours. I have yet to see any intervention with older adults that gives them the amount of cognitive training you would expect them to need to achieve some level of mastery.

My third and final point concerns the specific nature of the improvements. This has also been shown in other studies, and sometimes appears quite arbitrary — for example, one 3-D puzzle game apparently improved mental rotation, while a different 3-D puzzle game had no effect. The point is that we still don’t understand the precise attributes needed to improve different skills (although the researchers advocate the use of a tool called cognitive task analysis for revealing the underlying qualities of an activity) — but we do now understand that it is a matter of precise attributes, which is definitely a step in the right direction.

The main thing, then, that you should take away from this is the idea that different activities involve specific cognitive tasks, and these, and only these, will be the ones that benefit from practicing the activities. You therefore need to think about what tasks you want to improve before deciding on the activities to practice.

Previous research has pointed to a typical decline in our sense of control as we get older. Maintaining a sense of control, however, appears to be a key factor in successful aging. Unsurprisingly, in view of the evidence that self-belief and metacognitive understanding are important for cognitive performance, a stronger sense of control is associated with better cognitive performance. (By metacognitive understanding I mean the knowledge that cognitive performance is malleable, not fixed, and strategies and training are effective in improving cognition.)

In an intriguing new study, 36 older adults (aged 61-87, average age 74) had their cognitive performance and their sense of control assessed every 12 hours for 60 days. Participants were asked questions about whether they felt in control of their lives and whether they felt able to achieve goals they set for themselves.

The reason I say this is intriguing is that it’s generally assumed that a person’s sense of control — how much they feel in control of their lives — is reasonably stable. While, as I said, it can change over the course of a lifetime, until recently we didn’t think that it could fluctuate significantly in the course of a single day — which is what this study found.

Moreover, those who normally reported having a low sense of control performed much better on inductive reasoning tests during periods when they reported feeling a higher sense of control. Similarly, those who normally reported feeling a high sense of control scored higher on memory tests when feeling more in control than usual.

Although we can’t be sure (since this wasn’t directly investigated), the analysis suggests that the improved cognitive functioning stems from the feeling of improved control, not vice versa.

The study builds on an earlier study that found weekly variability in older adults’ locus of control and competency beliefs.

Assessment was carried out in the form of a daily workbook, containing a number of measures, which participants completed twice daily. Each assessment took around 30-45 minutes to complete. The measures included three cognitive tests (14 alternate forms of each of these were used, to minimize test familiarity):

  • Letter series test: 30 items in which the next letter in a series had to be identified. [Inductive reasoning]
  • Number comparison: 48 items in which two number strings were presented beside each other, and participants had to identify where there was any mismatch. [Perceptual speed]
  • Rey Auditory Verbal Learning Task: participants have to study a list of 15 unrelated words for one minute, then on another page recall as many of the words as they can. [Memory]

Sense of control over the previous 12 hours was assessed by 8 questions, to which participants indicated their agreement/disagreement on a 6-point scale. Half the questions related to ‘locus of control’ and half to ‘perceived competence’.

While, unsurprisingly, compliance wasn’t perfect (it’s quite an arduous regime), participants completed on average 115 of 120 workbooks. Of the possible 4,320 results (36 x 120), only 166 were missing.

One of the things that often annoys me is the subsuming of all within-individual variability in cognitive scores into averages. Of course averages are vital, but so is variability, and this too often is glossed over. This study is, of course, all about variability, so I was very pleased to see people’s cognitive variability spelled out.

Most of the variance in locus of control was of course between people (86%), but 14% was within-individual. Similarly, the figures for perceived competence were 88% and 12%. (While locus of control and perceived competence are related, only 26% of the variability in within-person locus of control was associated with competence, meaning that they are largely independent.)

By comparison, within-individual variability was much greater for the cognitive measures: for the letter series (inductive reasoning), 32% was within-individual and 68% between-individual; for the number comparison (perceptual speed), 21% was within-individual and 79% between-individual; for the memory test, an astounding 44% was within-individual and 56% between-individual.
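
(For those who like to see how such a within/between split is computed: it’s just a partitioning of total variance. Here’s a minimal sketch in Python, with made-up numbers; the study itself used multilevel modelling, which handles missing sessions and practice effects properly.)

    import numpy as np

    # Hypothetical data: memory scores for 36 people x 120 sessions.
    rng = np.random.default_rng(42)
    stable = rng.normal(8, 2, size=(36, 1))               # stable individual differences
    scores = stable + rng.normal(0, 1.5, size=(36, 120))  # plus session-to-session fluctuation

    person_means = scores.mean(axis=1)
    between = np.var(person_means)                   # how much people differ from each other
    within = np.var(scores - person_means[:, None])  # how much each person fluctuates

    total = between + within
    print(f"between-person: {between/total:.0%}, within-person: {within/total:.0%}")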

Some of this within-individual variability in cognitive performance comes down to practice effects, which were significant for all cognitive measures. For the memory test, time of day was also significant, with performance better in the morning. For the letter series and number comparison tests, previous performance also had a small effect on perceived competence. For the number comparison, the increase in competence following increased performance was greatest for those with lower scores. However, lagged analyses indicated that beliefs preceded performance to a greater extent than performance preceded beliefs.

While it wasn’t an aspect of this study, it should also be noted that a person’s sense of control may well vary according to domain (e.g., cognition, social interaction, health) and context. In this regard, it’s interesting to note the present findings that sense of control affected inductive reasoning for low-control individuals, but memory for high-control individuals, suggesting that the cognitive domain also matters.

Now, this was a small, preliminary study, and there are several limitations that need to be addressed in subsequent research, but I think it’s important for three reasons:

  • as a demonstration that cognitive performance is not a fixed attribute;
  • as a demonstration of the various factors that can affect older adults’ cognitive performance;
  • as a demonstration that your beliefs about yourself are a factor in your cognitive performance.

[2794] Neupert, S. D., & Allaire J. C.
(2012).  I think I can, I think I can: Examining the within-person coupling of control beliefs and cognition in older adults.
Psychology and Aging. Advance online publication.

A study involving 1,575 older adults (aged 58-76) has found that those with DHA levels in the bottom 25% had smaller brain volume (equivalent to about 2 years of aging) and greater amounts of white matter lesions. Those with levels of all omega-3 fatty acids in the bottom quarter also scored lower on tests of visual memory, executive function, and abstract thinking.

The finding adds to the evidence that higher levels of omega-3 fatty acids reduce dementia risk.


Data from 11,926 older twins (aged 65+) has found measurable cognitive impairment in 25% of them and subjective cognitive impairment in a further 39%, meaning that 64% of these older adults were experiencing some sort of cognitive impairment.

Although subjective impairment is not of sufficient magnitude to register on our measurement tools, that doesn’t mean that people’s memory complaints should be dismissed. It is likely, given the relative crudity of standard tests, that people are going to be aware of cognitive problems before they grow large enough to be measurable. Moreover, when individuals are of high intelligence or well-educated, standard tests can be insufficiently demanding. [Basically, subjective impairment can be thought of as a step before objective impairment, which itself is a step before mild cognitive impairment (MCI is a formal diagnosis, not simply a descriptive title), the precursor to Alzheimer’s. Note that I am calling these “steps” as a way of describing a continuum, not an inevitable process. None of these steps means that you will inevitably pass to the next step, but each later step will be preceded by the earlier steps.]

Those with subjective complaints were younger, more educated, more likely to be married, and to have higher socio-economic status, compared to those with objective impairment — supporting the idea that these factors provide some protection against cognitive decline.

The use of twins reveals that environment is more important than genes in determining whether you develop cognitive impairment in old age. For objective cognitive impairment, identical twins had a concordance rate of 52% compared to 50% in non-identical same-sex twins and 29% in non-identical different-gender twins. For subjective impairment, the rates were 63%, 63%, and 42%, respectively.
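
(A rough way to see why these numbers point to environment: a classic heuristic, Falconer’s formula, estimates heritability as roughly twice the difference between identical-twin and fraternal-twin similarity. Treating the concordance rates above as crude stand-ins for twin correlations, which is a simplification (formal analyses use tetrachoric correlations), the back-of-envelope arithmetic looks like this:)

    # Falconer's heuristic: h2 ~ 2 * (identical-twin similarity - fraternal-twin similarity).
    # Concordance rates used as crude stand-ins for correlations; same-sex pairs only.
    h2_objective  = 2 * (0.52 - 0.50)   # ~0.04: genes explain almost nothing
    h2_subjective = 2 * (0.63 - 0.63)   # 0.0: no genetic signal at all
    print(h2_objective, h2_subjective)

Either way, nearly all of the variation is left to environmental (and measurement) factors.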

National variation in MCI prevalence

Another very large study, involving 15,376 older adults (65+), has explored the prevalence of amnestic MCI in low- and middle-income countries: Cuba, Dominican Republic, Peru, Mexico, Venezuela, Puerto Rico, China, and India. Differences between countries were marked, with only 0.6% of older adults in China having MCI compared to 4.6% in India (Cuba 1.5%, Dominican Republic 1.3%, Peru 2.6%, Mexico 2.8%, Venezuela 1%, Puerto Rico 3% — note that I have selected the numbers after they were standardized for age, gender, and education, but the raw numbers are not greatly different).

Studies to date have focused mainly on European and North American populations, and have provided prevalence estimates ranging from 2.1%-11.5%, generally hovering around 3-5% (for example, Finland 5.3%, Italy 4.9%, Japan 4.9%, the US 6% — but note South Korea 9.7% and Malaysia 15.4%).

What is clear is that there is considerable regional variation.

Interestingly, considering their importance in Western countries, the effects of both age and education on prevalence of aMCI were negligible. Granted that age and education norms were used in the diagnosis, this is still curious. It may be that there was less variance in educational level in these populations. Socioeconomic status was, however, a factor.

Participants were also tested on the 12-item WHO disability assessment schedule (WHODAS-12), which assesses five activity-limitation domains: communication, physical mobility, self-care, interpersonal interaction, and life activities/social participation. MCI was found to be significantly associated with disability in Peru, India, and the Dominican Republic (but negatively associated in China). Depression (informant-rated) was also associated with MCI in only some countries.

All of this, I feel, emphasizes the situational variables that determine whether an individual will develop cognitive impairment.

Caracciolo B, Gatz M, Xu W, Pedersen NL, Fratiglioni L. 2012. Differential Distribution of Subjective and Objective Cognitive Impairment in the Population: A Nation-Wide Twin-Study. Journal of Alzheimer's Disease, 29(2), 393-403.

[2801] Sosa, A. L., Albanese E., Stephan B. C. M., Dewey M., Acosta D., Ferri C. P., et al.
(2012).  Prevalence, Distribution, and Impact of Mild Cognitive Impairment in Latin America, China, and India: A 10/66 Population-Based Study.
PLoS Med. 9(2), e1001170.

Full text available at http://www.plosmedicine.org/article/info%3Adoi%2F10.1371%2Fjournal.pmed....

Another study adds to the evidence that changes in the brain that may lead eventually to Alzheimer’s begin many years before Alzheimer’s is diagnosed. The findings also add to the evidence that what we regard as “normal” age-related cognitive decline is really one end of a continuum of which the other end is dementia.

In the study, brain scans were taken of 137 highly educated people aged 30-89 (participants in the Dallas Lifespan Brain Study). The amount of amyloid-beta (characteristic of Alzheimer’s) was found to increase with age, and around a fifth of those over 60 had significantly elevated levels of the protein. These higher amounts were linked with worse performance on tests of working memory, reasoning and processing speed.

More specifically, across the whole sample, amyloid-beta levels affected processing speed and fluid intelligence (in a dose-dependent relationship — that is, as levels increased, these functions became more impaired), but not working memory, episodic memory, or crystallized intelligence. Among the elevated-levels group, increased amyloid-beta was significantly associated with poorer performance for processing speed, working memory, and fluid intelligence, but not episodic memory or crystallized intelligence. Among the group without elevated levels of the protein, increasing amyloid-beta only affected fluid intelligence.

These task differences aren’t surprising: processing speed, working memory, and fluid intelligence are the domains that show the most decline in normal aging.

Those with the Alzheimer’s gene APOE4 were significantly more likely to have elevated levels of amyloid-beta. While 38% of the group with high levels of the protein had the risky gene variant, only 15% of those who didn’t have high levels carried the gene.

Note that, while the prevalence of carriers of the gene variant matched population estimates (24%), the proportion was higher among those in the younger age group — 33% of those under 60, compared to 19.5% of those aged 60 or older. It seems likely that many older carriers have already developed MCI or Alzheimer’s, and thus been ineligible for the study.

The average age of the participants was 64, and the average years of education 16.4.

Amyloid deposits varied as a function of age and region: the precuneus, temporal cortex, anterior cingulate and posterior cingulate showed the greatest increase with age, while the dorsolateral prefrontal cortex, orbitofrontal cortex, parietal and occipital cortices showed smaller increases with age. However, when only those aged 60+ were analyzed, the effect of age was no longer significant. This is consistent with previous research, and adds to evidence that age-related cognitive impairment, including Alzheimer’s, has its roots in damage occurring earlier in life.

In another study, brain scans of 408 participants in the Mayo Clinic Study of Aging also found that higher levels of amyloid-beta were associated with poorer cognitive performance — but that this interacted with APOE status. Specifically, carriers of the Alzheimer’s gene variant were significantly more affected by having higher levels of the protein.

This may explain the inconsistent findings of previous research concerning whether or not amyloid-beta has significant effects on cognition in normal adults.

As the researchers of the first study point out, what’s needed is information on the long-term course of these brain changes, and they are planning to follow these participants.

In the meantime, all in all, the findings do provide more strength to the argument that your lifestyle in mid-life (and perhaps even younger) may have long-term consequences for your brain in old age — particularly for those with a genetic susceptibility to Alzheimer’s.

A ten-year study following 12,412 middle-aged and older adults (50+) has found that those who died after stroke had more severe memory loss in the years before stroke compared to those who survived stroke and those who didn't have a stroke.

Participants were tested every two years, using a standard word-recall list to measure memory loss (or caregiver assessment for those whose memory loss was too severe). During the decade of the study, 1,027 participants (8.3%) survived a stroke, 499 (4%) died after stroke, and 10,886 (87.7%) remained stroke-free over the study period.

Before having a stroke, those who later survived a stroke had worse average memory than similar individuals who never had a stroke; however, their rate of memory decline was similar (0.034 and 0.028 points per year, respectively). Those who later died after a stroke, on the other hand, showed significantly faster memory decline (0.118 points per year).

Whether this is because those who die after stroke have a more compromised brain prior to the stroke, or because greater memory impairment makes people more vulnerable in the wake of a stroke, cannot be told from this data (and indeed, both factors may be involved).

Among survivors, stroke had a significant effect on memory decline, with memory scores dropping an average of 0.157 points at the time of the stroke — an amount equivalent to around 5.6 years of memory decline in similarly-aged stroke-free adults. However, in subsequent years, decline was only a little greater than it had been prior to the stroke (0.038 points per year).
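(That equivalence is simple arithmetic: dividing the one-off stroke-related drop by the stroke-free group's annual decline rate given above, 0.157 ÷ 0.028 ≈ 5.6 years.)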


Wang, Q., Capistrant, B.D., Ehntholt, A. & Glymour, M.M. 2012. Abstract 31: Rate of Change in Memory Functioning Before and After Stroke Onset. Presented at the American Stroke Association's International Stroke Conference 2012. http://stroke.ahajournals.org/cgi/content/meeting_abstract/43/2_MeetingAbstracts/A31?sid=960f2015-06d1-478f-8c03-c00994d35f2c

A small study of the sleep patterns of 100 people aged 45-80 has found a link between sleep disruption and level of amyloid plaques (characteristic of Alzheimer’s disease). The participants were recruited from the Adult Children Study, of whom half have a family history of Alzheimer’s disease.

Sleep was monitored for two weeks. Those who woke frequently (more than five times an hour!) and those who spent less than 85% of their time in bed actually asleep were more likely to have amyloid plaques. A quarter of the participants had evidence of amyloid plaques.

The study doesn’t tell us whether disrupted sleep leads to the production of amyloid plaques, or whether brain changes in early Alzheimer's disease lead to changes in sleep, but evidence from other studies does, I think, give some weight to the first idea. At the least, this adds yet another reason for making an effort to improve your sleep!

Neither the abstract for this not-yet-presented conference paper nor the press release mentions any differences between those with a family history of Alzheimer’s and those without, suggesting there were none — but since the researchers made no mention either way, I wouldn’t take that for granted. Hopefully we’ll one day see a journal paper providing more information.

The main findings are supported by another recent study. A Polish study involving 150 older adults found that those diagnosed with Alzheimer’s after a seven-year observation period were more likely to have experienced sleep disturbances more often and with greater intensity, compared to those who did not develop Alzheimer’s.

Ju, Y., Duntley, S., Fagan, A., Morris, J. & Holtzman, D. 2012. Sleep Disruption and Risk of Preclinical Alzheimer Disease. To be presented April 23 at the American Academy of Neurology's 64th Annual Meeting in New Orleans.

Bidzan L, Grabowski J, Dutczak B, Bidzan M. 2011. [Sleep disorders in the preclinical period of the Alzheimer's disease]. Psychiatria Polska, 45(6), 851-60. http://www.ncbi.nlm.nih.gov/pubmed/22335128

New data from the ongoing validation study of the Alzheimer's Questionnaire (AQ), from 51 cognitively normal individuals (average age 78) and 47 individuals with amnestic mild cognitive impairment (aMCI; average age 74), has found that the AQ is effective in identifying not only those with Alzheimer’s but also older adults with aMCI.

Of particular interest is that four questions were strong indicators of aMCI. These related to:

  • repeating questions and statements,
  • trouble knowing the date or time,
  • difficulties managing finances, and
  • decreased sense of direction.

The AQ consists of 21 yes/no questions designed to be answered by a relative or carer. The questions fall into five categories: memory, orientation, functional ability, visuospatial ability, and language. Six of these questions are known to be predictive of AD and are given extra weighting, resulting in a score out of 27. A score above 15 was indicative of AD, and between 5 and 14 of aMCI. Scores of 4 or lower indicate that the person does not have significant memory problems.
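To make the scoring rule concrete, here is a minimal sketch of how a questionnaire of this shape could be scored. Which six items carry the extra weighting is not reproduced here, so the weighted indices below are placeholders, and the handling of a score of exactly 15 (which the cut-points above leave at the boundary) is an assumption:

```python
# Sketch of AQ-style scoring: 21 yes/no informant-rated items, six of which
# score 2 points instead of 1 (maximum 21 + 6 = 27). The indices in
# WEIGHTED_ITEMS are placeholders, not the actual six predictive AQ items.
WEIGHTED_ITEMS = {1, 4, 8, 12, 16, 19}

def aq_score(answers):
    """answers: list of 21 booleans, True meaning 'yes'."""
    assert len(answers) == 21
    return sum(2 if i in WEIGHTED_ITEMS else 1
               for i, yes in enumerate(answers) if yes)

def aq_interpret(score):
    # Cut-points as described above; a score of exactly 15 sits at the
    # boundary and is grouped with the AD range here (an assumption).
    if score <= 4:
        return "no significant memory problems"
    if score <= 14:
        return "suggestive of aMCI; further testing recommended"
    return "suggestive of AD; further testing recommended"
```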

The questionnaire is not of course definitive, but is intended as an indicator for further testing. Note, too, that all participants in this study were Caucasian.

The value and limitations of brief cognitive screenings

The value of brief cognitive screenings combined with offering further evaluation is demonstrated in a recent large VA study, which found that, of 8,342 Veterans aged 70+ who were offered screening (the three-minute Mini-Cog), 8,063 (97%) accepted, 2,081 (26% of those screened) failed the screen, and 580 (28% of those who failed) agreed to further evaluation. Among those accepting further evaluation, 93% were found to have cognitive impairment, including 75% with dementia.

Among the 1,501 who declined further evaluation, 17% (259) were nevertheless diagnosed with incident cognitive impairment through standard clinical care. In total, the screening program raised the proportion diagnosed with cognitive impairment to 11% (902/8,063), versus 4% (1,242/28,349) in similar clinics without the program.
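For readers who want to trace the arithmetic, here is a small sketch (Python; counts as reported above) that reproduces each percentage from the raw numbers:

```python
# Reproducing the reported percentages from the raw counts above.
offered, accepted = 8342, 8063
failed, agreed = 2081, 580
declined = failed - agreed          # 1,501 declined further evaluation
later_diagnosed = 259               # diagnosed via standard clinical care

print(f"accepted screening:  {accepted / offered:.0%}")          # 97%
print(f"failed screen:       {failed / accepted:.0%}")           # 26%
print(f"agreed to follow-up: {agreed / failed:.0%}")             # 28%
print(f"declined, later dx:  {later_diagnosed / declined:.0%}")  # 17%
print(f"program clinics dx:  {902 / 8063:.0%}")                  # 11%
print(f"comparison clinics:  {1242 / 28349:.0%}")                # 4%
```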

Importantly, the limits of such questionnaires were also demonstrated: 118 patients who passed the initial screen nevertheless requested further evaluation, and 87% were found to have cognitive impairment, including 70% with dementia.

This should not be taken as a reason not to employ such cognitive tests! There are two points that should, I think, be taken from this:

  • Routine screening of older adults is undoubtedly an effective strategy for identifying those with cognitive impairment.
  • Individuals who pass such tests but nevertheless believe they have cognitive problems should be taken seriously.

A new study suggests that negative stereotypes about aging can increase older adults’ susceptibility to false memories. In the study, 64 older adults (60-74; average 70) and 64 college students were compared on a word recognition task. Both groups first took a vocabulary test, on which they performed similarly. They were then presented with 12 lists of 15 semantically related words. For example, one list might contain words associated with "sleep," such as "bed," "rest," "awake," "tired" and "night" — but not the word “sleep” itself. They were not told they would be tested on their memory of these; rather, they were asked to rate each word for pleasantness.

They then engaged in a five-minute filler task (a Sudoku) before a short text was read to them. For some, the text had to do with age-related declines in memory. These participants were told the experiment had to do with memory. For others, the text concerned language-processing research. These were told the experiment had to do with language processing and verbal ability.

They were then given a recognition test containing 36 of the studied words, 48 words unrelated to the studied words, and 12 words related to the studied words (e.g. “sleep”). After recording whether or not they had seen each word before, they also rated their confidence in that answer on an 8-point scale. Finally, they were given a lexical decision task to independently assess stereotype activation.
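This list-learning procedure is a variant of the classic DRM (Deese-Roediger-McDermott) false-memory paradigm. Purely as an illustration (the function and field names are mine, not the authors’), scoring separates hit and false-alarm rates by item type:

```python
# Sketch: scoring a DRM-style recognition test with three item types
# (item counts per the study description; names here are my own).
from collections import Counter

def rates(responses):
    """responses: list of (item_type, said_old) tuples, where item_type is
    'studied' (36 items), 'unrelated' (48 lures), or 'related'
    (12 critical lures such as 'sleep')."""
    old = Counter(t for t, said_old in responses if said_old)
    total = Counter(t for t, _ in responses)
    return {t: old[t] / total[t] for t in total}

# e.g. rates(...) might yield {'studied': 0.80, 'unrelated': 0.10, 'related': 0.71}
# The 'related' false-alarm rate is the false-memory measure of interest.
```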

While young adults showed no effects from the stereotype manipulation, older adults were much more likely to falsely recognize related words that had not been studied if they had heard the text on memory. Those who heard the text on language were no more likely than the young adults to falsely recognize related words.

Note that there is always quite a high level of false recognition of such items: young adults, and older adults in the low-threat (language) condition, falsely recognized around half of the related lures, compared to around 10% of unrelated words. But in the high-threat (memory) condition, older adults falsely recognized 71% of the related words.

Moreover, older adults’ confidence was also affected. While young adults’ confidence in their false memories was unaffected by threat condition, older adults in the high-threat condition were more confident of their false memories than older adults in the low-threat condition.

The idea that older adults were affected by negative stereotypes about aging was supported by the results of the lexical decision task, which found that, in the high-threat condition, older adults responded more quickly to words associated with negative stereotypes than to neutral words (indicating that they were more accessible). Young adults did not show this difference.

Thomas, A. K., & Dubois, S. J. (2011). Reducing the burden of stereotype threat eliminates age differences in memory distortion. Psychological Science, 22(12), 1515-7. doi:10.1177/0956797611425932

I have reported often on studies pointing to obesity as increasing your risk of developing dementia, and on the more limited evidence that calorie restriction may help fight age-related cognitive decline and dementia (and help you live longer). A new mouse study helps explain why eating less might help the brain.

It turns out that a molecule called CREB-1 is activated by calorie restriction (defined in this study as 70% of normal consumption). cAMP Response Element Binding (CREB) protein is an essential component of long-term memory formation, and abnormalities in the expression of CREB have been reported in the brains of Alzheimer’s patients. Restoring CREB in mouse models of Alzheimer’s has been shown to improve learning and memory impairment.

Animal models have also indicated a role for CREB in the improvements in learning and memory brought about by physical exercise. CREB seems to be vital for adult neurogenesis.

The current study found that, when CREB-1 was missing (in mice genetically engineered to lack the molecule), calorie restriction had no cognitive benefits. CREB deficiency in turn drastically reduced the expression of Sirt-1. Sirtuins have been implicated in cardiac function, DNA repair and genomic stability (hence the connection to longevity). More recently, Sirt-1 has also been found to modulate synaptic plasticity and memory formation — an effect mediated by CREB. This role in regulating normal brain function appears to be quite separate from its cell-survival functions.

The findings identify a target for drugs that could produce the same cognitive (and longevity) benefits without the need for such strict food reduction.

Reducing your eating and drinking to 70% of normal intake is a severe reduction. Recently, researchers at the National Institute on Aging in Baltimore have suggested that the best way to cut calories for cognitive benefit is to virtually fast (down to around 500 calories) for two days a week, while eating as much as you want on the other days. Their animal experiments indicate that timing is a crucial element if cognitive benefits are to accrue.

Another preliminary report, this time from the long-running Mayo Clinic study of aging, adds to the evidence that lower consumption reduces the risk of serious cognitive impairment. The first analysis of data has revealed that the risk of developing mild cognitive impairment more than doubled for those in the highest food consumption group (daily calorie consumption between 2,143 and 6,000) compared to those in the lowest (between 600 and 1,526 calories).

Calorie consumption was taken from food questionnaires in which respondents described their diets over the previous year, so must be taken with a grain of salt. Additionally, the analysis didn’t take into account types of food and beverages, or other lifestyle factors, such as exercise. Further analysis will investigate these matters in more depth.

The study involved 1,233 older adults, aged 70 to 89. Of these, 163 were found to have MCI.

None of this should be taken as a recommendation for severely restricting your diet. Certainly such behavior should not be undertaken without the approval of your doctor, but in any case, calorie restriction is only part of a much more complex issue concerning diet. I look forward to hearing more from the Mayo Clinic study regarding types of foods and interacting factors.

[2681] Fusco, S., Ripoli C., Podda M V., Ranieri S C., Leone L., Toietta G., et al.
(2012).  A role for neuronal cAMP responsive-element binding (CREB)-1 in brain responses to calorie restriction.
Proceedings of the National Academy of Sciences. 109(2), 621 - 626.

The findings from the National Institute on Aging were presented at the annual meeting of the American Association for the Advancement of Science in Vancouver.

Geda, Y., Ragossnig, M., Roberts, L.K., Roberts, R., Pankratz, V., Christianson, T., Mielke, M., Boeve, B., Tangalos, E. & Petersen, R. 2012. Caloric Intake, Aging, and Mild Cognitive Impairment: A Population-Based Study. To be presented April 25 at the American Academy of Neurology's 64th Annual Meeting in New Orleans.

Openness to experience – being flexible and creative, embracing new ideas and taking on challenging intellectual or cultural pursuits – is one of the ‘Big 5’ personality traits. Unlike the other four, it shows some correlation with cognitive abilities. And, like them, openness to experience does tend to decline with age.

However, while there have been many attempts to improve cognitive function in older adults, to date no one has tried to increase openness to experience. Naturally enough, one might think — it’s a personality trait, and we are not inclined to view personality traits as amenable to ‘training’. However, recently there have been some indications that personality traits can be changed, through cognitive interventions or drug treatments. In this new study, a cognitive training program for older adults also produced increases in their openness to experience.

The study involved 183 older adults (aged 60-94; average age 73), who were randomly assigned to a 16-week training program or a waiting-list control group. The program included training in inductive reasoning, and puzzles that relied in part on inductive reasoning. Most of this activity was carried out at home, but there were two 1-hour classroom sessions: one to introduce the inductive reasoning training, and one to discuss strategies for Sudoku and crosswords.

Participants came to the lab each week to hand in materials and pick up the next set. Initially, they were given crossword and Sudoku puzzles with a wide range of difficulty. Subsequently, puzzle sets were matched to each participant’s skill level (assessed from the previous week’s performance). Over the training period, the puzzles became progressively more difficult, with the steps tailored to each individual.
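The paper doesn’t publish its matching rule, but the weekly tailoring described above amounts to a simple adaptive-difficulty loop. Here is a minimal sketch, in which the 1-10 difficulty scale, accuracy thresholds, and step sizes are all my own assumptions:

```python
# Hypothetical weekly difficulty-matching loop for the puzzle sets.
# Thresholds, step sizes, and the 1-10 scale are assumptions, not
# taken from the study.

def next_difficulty(current, accuracy, floor=1, ceiling=10):
    """Step difficulty up when last week's puzzles were solved easily,
    down when they were too hard, otherwise hold steady."""
    if accuracy >= 0.85:
        return min(current + 1, ceiling)
    if accuracy <= 0.50:
        return max(current - 1, floor)
    return current

level = 3  # set from the initial wide-difficulty assessment week
for weekly_accuracy in [0.90, 0.88, 0.60, 0.45, 0.80]:
    level = next_difficulty(level, weekly_accuracy)
    # levels: 4, 5, 5, 4, 4 -> puzzles get harder only as skill allows
```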

The inductive reasoning training involved learning to recognize novel patterns and use them to solve problems. In ‘basic series problems’, the problems required inference from a serial pattern of words, letters, or numbers. ‘Everyday serial problems’ included problems such as completing a mail order form and answering questions about a bus schedule. Again, the difficulty of the problems increased steadily over the training period.

Participants were asked to spend at least 10 hours a week on program activities, and according to the daily logs they filled in, they spent an average of 11.4 hours a week. In addition to the (hopefully) inherent enjoyment of the activities, those who recorded at least 10 hours were recognized on a bulletin board tally sheet and entered into a raffle for a prize.

Cognitive and personality testing took place 4-5 weeks prior to the program starting, and 4-5 weeks after program end. Two smaller assessments also took place during the program, at week 6 and week 12.

At the end of the program, those who had participated had significantly improved their pattern-recognition and problem-solving skills. This improvement went along with a moderate but significant increase in openness. Analysis suggested that this increase in openness occurred independently of improvement in inductive reasoning.

The benefits were specific to inductive reasoning and openness, with no significant effects on divergent thinking, processing speed, verbal ability, or the other Big 5 traits.

The researchers suggest that the carefully stepped training program was important in increasing openness, because it allowed participants to build a growing confidence in their reasoning abilities. Openness to experience contributes to engagement and enjoyment in stimulating activity, and has also been linked to better health and decreased mortality risk. It seems likely, then, that increases in openness can be part of a positive feedback cycle, leading to greater and more sustained engagement in mentally stimulating activities.

The corollary is that decreases in openness may lead to declines in cognitive engagement, and then to poorer cognitive function. Indeed it has been previously suggested that openness to experience plays a role in cognitive aging.

Clearly, more research is needed to tease out how far these findings extend to other activities, and the importance of scaffolding (carefully designing cognitive activities on an individualized basis to support learning), but this work reveals an overlooked aspect to the issue of mental stimulation for preventing age-related cognitive decline.

I’ve spoken before about the association between hearing loss in old age and dementia risk. Although we don’t currently understand that association, it may be that preventing hearing loss also helps prevent cognitive decline and dementia. I have previously reported on how music training in childhood can help older adults’ ability to hear speech in a noisy environment. A new study adds to this evidence.

The study looked at a specific aspect of understanding speech: auditory brainstem timing. Aging disrupts this timing, degrading the ability to precisely encode sound.

In this study, automatic brain responses to speech sounds were measured in 87 younger and older normal-hearing adults as they watched a captioned video. It was found that older adults who had begun musical training before age 9 and engaged consistently in musical activities through their lives (“musicians”) not only significantly outperformed older adults who had no more than three years of musical training (“non-musicians”), but encoded the sounds as quickly and accurately as the younger non-musicians.

The researchers qualify this finding by saying that it shows only that musical experience selectively affects the timing of sound elements that are important in distinguishing one consonant from another, not necessarily all sound elements. However, it seems probable that it extends more widely, and in any case the ability to understand speech is crucial to social interaction, which may well underlie at least part of the association between hearing loss and dementia.

The burning question for many will be whether the benefits of music training can be accrued later in life. We will have to wait for more research to answer that, but, as music training and enjoyment fit the definition of ‘mentally stimulating activities’, this certainly adds another reason to pursue such a course.

We know that physical exercise greatly helps protect against age-related cognitive decline. We know that mental stimulation does too. So it was only a matter of time before someone came up with a way of combining the two. A new study found that older adults improved executive function more by participating in virtual reality-enhanced exercise ("exergames"), which combines physical exercise with computer-simulated environments and interactive videogame features, than by doing the same exercise without the enhancements.

The Cybercycle Study involved 79 older adults (aged 58-99) from independent living facilities with indoor access to a stationary exercise bike. Of the 79, 63 completed the three-month study, completion being defined as achieving at least 25 rides over the three months.

Unfortunately, randomization was not as good as it should have been — although the researchers planned to randomize on an individual basis, various technical problems led them to randomize on a site basis (there were eight sites), with the result that the cybercycle group and the control bike group were significantly different in age and education. Although the researchers took this into account in the analysis, that is not the same as having groups that match in these all-important variables. However, at least the variables went in opposite directions: while the cybercycle group was significantly younger (average 75.7 vs 81.6 years), it was significantly less educated (average 12.6 vs 14.8 years).

Perhaps also partly offsetting the age advantage, the cybercycle group was in poorer shape than the control group (higher BMI and glucose levels, lower physical activity, etc.), although these differences weren’t statistically significant. IQ was also lower for the cybercycle group, if not significantly so (but note the high averages for both groups: 117.6 vs 120.6). One of the three tests of executive function, Color Trails, also showed a marked group difference, but the large variability in scores meant that this difference was not statistically significant.

Although participants were screened for disorders such as Alzheimer’s and Parkinson’s, and functional disability, many of both groups were assessed as having MCI — 16 of the 38 in the cybercycle group and 14 of the 41 in the control bike group.

Participants were given cognitive tests at enrolment, one month later (before the intervention began), and after the intervention ended. The stationary bikes were identical for both groups, except the experimental bike was equipped with a virtual reality display. Cybercycle participants experienced 3D tours and raced against a "ghost rider," an avatar based on their last best ride.

The hypothesis was that cybercycling would particularly benefit executive function, and this was borne out. Executive function (measured by the Color Trails, Stroop test, and Digits Backward) improved significantly more in the cybercycle condition, and indeed was the only cognitive task to do so (other cognitive tests included verbal fluency, verbal memory, visuospatial skill, motor function). Indeed, the control group, despite getting the same amount of exercise, got worse at the Digits Backward test, and failed to show any improvement on the Stroop test.

Moreover, significantly fewer cybercyclists progressed to MCI compared to the control group (three vs nine).

There were no differences in exercise quantity or quality between the two groups — which does argue against the idea that cyber-enhanced physical activity would be more motivating. However, the cybercycling group did tend to comment on their enjoyment of the exercise. While the enjoyment may not have translated into increased activity in this situation, it may well do so in a longer, less directed intervention — i.e. real life.

It should also be remembered that the intervention was relatively short, and that other cognitive functions might take longer to show improvement than the more sensitive executive function. This is supported by the fact that levels of the brain growth factor BDNF, assessed in 30 participants, showed a significantly greater increase in cybercyclists.

I should also emphasize that the level of physical exercise really wasn't that great, but nevertheless the size of the cybercycle's effect on executive function was greater than usually produced by aerobic exercise (a medium effect rather than a small one).

The idea that activities combining physical and mental exercise offer greater cognitive benefit than the sum of the benefits from each type of exercise on its own is not inconsistent with previous research, and is in keeping with evidence from animal studies that physical exercise and mental stimulation help the brain via different mechanisms. Moreover, I have an idea that enjoyment (in itself, not as a proxy for motivation) may be a factor in the cognitive benefits derived from activities, whether physical or mental. Mere speculation, derived from two quite separate areas of research: the idea of “flow” / “being in the zone”, and the idea that humor has physiological benefits.

Of course, as discussed, this study has a number of methodological issues that limit its findings, but hopefully it will be the beginning of an interesting line of research.  

[2724] Anderson-Hanley, C., Arciero P. J., Brickman A. M., Nimon J. P., Okuma N., Westen S. C., et al.
(2012).  Exergaming and Older Adult Cognition.
American Journal of Preventive Medicine. 42(2), 109 - 119.

The age at which cognitive decline begins has been the subject of much debate. The Seattle longitudinal study has provided most of the evidence that it doesn’t begin until age 60. A more recent, much larger study that allows both longitudinal and cross-sectional analysis suggests that, depressingly, mid-to-late forties might be closer to the mark.

A long-term British study known as Whitehall II began in 1985, when all civil servants aged 35-55 in 20 London-based departments were invited to participate. In 1997-9, 5,198 male and 2,192 female civil servants, aged 45-70 at this point, were given the first of three rounds of cognitive testing. The second round took place in 2002-4, and the third in 2007-9.

Over these ten years, all cognitive scores except vocabulary declined in all five age categories (45-49, 50-54, 55-59, 60-64, and 65-70 at baseline). Unsurprisingly, the decline was greater with increasing age, and greatest for reasoning. Men aged 45-49 at baseline showed a 3.6% decline in reasoning, compared to a 9.6% decline for those aged 65-70. Women were less affected by age: while showing the same degree of decline when younger, the oldest showed a 7.4% decline.

None of the other cognitive tasks showed the same age-related deterioration as reasoning, which displayed a consistently linear decline with advancing age. The amount of decline over ten years was roughly similar for each age group for short-term memory and phonemic and semantic fluency (although the women displayed more variability in memory, in a somewhat erratic pattern which may perhaps reflect hormonal changes — I’m speculating here). Moreover, the amount of decline in each decade for these functions was only about the same as reasoning’s decline in the younger decades — about -4% in each decade.

Men and women differed significantly in education (33% of men attended university compared to 21% of women; 57% of women never finished secondary school compared to 39% of men). It is therefore unsurprising that men performed significantly better on all cognitive tests except memory (noting that the actual differences in score were mostly quite small: 16.9/35 vs 16.5 for phonemic fluency; 16.7/35 vs 15.8 for semantic fluency; 25.7/33 vs 23.1 for vocabulary; 48.7/65 vs 41.6 for reasoning).

The cognitive tests included a series of 65 verbal and mathematical reasoning items of increasing difficulty (testing inductive reasoning), a 20-word free recall test (short-term verbal memory), recalling as many words as possible beginning with “S” (phonemic fluency) and recalling members of the animal category (semantic fluency), and a multi-choice vocabulary test.

The design of the study allowed both longitudinal and cross-sectional analyses to be carried out. Cross-sectional data, although more easily acquired, has been criticized as conflating age effects with cohort differences. Generations differ on several relevant factors, of which education is the most obvious. The present study partially confirmed this, finding that cross-sectional data considerably over-estimated cognitive decline in women but not men — reflecting the fact that education changed far more for women than men in the relevant time periods. For example, in the youngest group of men, 30% had less than a secondary school education and 42% had a university degree, and the women showed a similar pattern, with 34% and 40%. However, for those aged 55-59 at baseline, the corresponding figures were 38% and 29% for men compared to 58% and 17% for women.

The principal finding is of course that measurable cognitive decline was evident in the youngest group, meaning that at some point during that 45-55 decade, cognitive faculties begin to decline. Of course, it should be emphasized that this is a group effect — individuals will vary in the extent and timing of any cognitive decline.

(A side-note: During the ten year period, 305 participants died. The probability of dying was higher in those with poorer cognitive scores at baseline.)

The study involved 104 healthy older adults (average age 87) participating in the Oregon Brain Aging Study. Analysis of the nutrient biomarkers in their blood revealed that those with diets high in omega 3 fatty acids and in vitamins C, D, E and the B vitamins had higher scores on cognitive tests than people with diets low in those nutrients, while those with diets high in trans fats were more likely to score more poorly on cognitive tests.

These associations were dose-dependent, with each standard deviation increase in the vitamin BCDE score associated with a 0.28 SD increase in global cognitive score, and each SD increase in the trans fat score associated with a 0.30 SD decrease in global cognitive score.
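In concrete terms, and assuming (for illustration) that the relationship is linear across its range, a participant one standard deviation above the sample average on the BCDE marker score would be expected to score 0.28 SD higher on the global cognitive composite, while being one SD higher on blood trans fats predicts a 0.30 SD lower score.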

Trans fats are primarily found in packaged, fast, fried and frozen food, baked goods and margarine spreads.

Brain scans of 42 of the participants found that those with diets high in vitamins BCDE and omega 3 fatty acids were also less likely to have the brain shrinkage associated with Alzheimer's, while those with high trans fats were more likely to show such brain atrophy.

Those with higher omega-3 scores also had fewer white matter hyperintensities. However, this association became weaker once depression and hypertension were taken into account.

Overall, the participants had good nutritional status, but 7% were deficient in vitamin B12 (I’m surprised it’s so low, but bear in mind that these are already a select group, being healthy at such an advanced age) and 25% were deficient in vitamin D.

The nutrient biomarkers accounted for 17% of the variation in cognitive performance, while age, education, APOE genotype (presence or absence of the ‘Alzheimer’s gene’), depression and high blood pressure together accounted for 46%. Diet was more important for brain atrophy: here, the nutrient biomarkers accounted for 37% of the variation, while the other factors accounted for 40% (meaning that diet was nearly as important as all these other factors combined!).

The findings add to the growing evidence that diet has a significant role in determining whether or not, and when, you develop Alzheimer’s disease.

A study involving 159 older adults (average age 76) has confirmed that the amount of brain tissue in specific regions is a predictor of Alzheimer’s disease development. Of the 159 people, 19 were classified as at high risk on the basis of the smaller size of nine small regions previously shown to be vulnerable to Alzheimer's, and 24 as low risk; the remaining 116 were classed as average risk. The regions, in order of importance, are the medial temporal cortex, inferior temporal cortex, temporal pole, angular gyrus, superior parietal cortex, superior frontal cortex, inferior frontal cortex, supramarginal gyrus, and precuneus.

There was no difference between the three risk groups at the beginning of the study on global cognitive measures (MMSE; Alzheimer’s Disease Assessment Scale—cognitive subscale; Clinical Dementia Rating—sum of boxes), or in episodic memory. The high-risk group did perform significantly more slowly on the Trail-making test part B, with similar trends on the Digit Symbol and Verbal Fluency tests.

After three years, 125 participants were re-tested, and nine met the criteria for cognitive decline: 21% of the high-risk group (3/14) declined, compared with 7% of the much larger average-risk group (6/90). None of the low-risk group declined.

The results were even more marked when less stringent criteria were used. On the basis of an increase on the Clinical Dementia Rating, 28.5% of the high-risk group and 9.7% of the average-risk group showed decline. On the basis of declining at least one standard deviation on any one of the three neuropsychological tests, half the high-risk group, 35% of the average-risk group, and 14% (3/21) of the low-risk group showed decline. (The composite criterion for cognitive decline, used above, required both.)

Analysis estimated that every standard deviation of cortical thinning (reduced brain tissue) was associated with a nearly tripled risk of cognitive decline.

Analysis of the 84 individuals for whom cerebrospinal fluid amyloid-beta levels were available revealed that 60% of the high-risk group had levels consistent with the presence of Alzheimer's pathology, compared to 36% of those at average risk and 19% of those at low risk.

The findings extend and confirm the evidence that brain atrophy in specific regions is a biomarker for developing Alzheimer’s.

[2709] Dickerson, B. C., & Wolk D. A.
(2012).  MRI cortical thickness biomarker predicts AD-like CSF and cognitive decline in normal adults.
Neurology. 78(2), 84 - 90.

Dickerson BC, Bakkour A, Salat DH, et al. 2009. The cortical signature of Alzheimer’s disease: regionally specific cortical thinning relates to symptom severity in very mild to mild AD dementia and is detectable in asymptomatic amyloid-positive individuals. Cerebral Cortex, 19, 497-510.

A certain level of mental decline in the senior years is regarded as normal, but some fortunate few don’t suffer from any decline at all. The Northwestern University Super Aging Project has found seniors aged 80+ who match or better the average episodic memory performance of people in their fifties. Comparison of the brains of 12 super-agers, 10 cognitively-normal seniors of similar age, and 14 middle-aged adults (average age 58) now reveals that the brains of super-agers also look like those of the middle-aged. In contrast, brain scans of cognitively average octogenarians show significant thinning of the cortex.

The difference between the brains of super-agers and the others was particularly marked in the anterior cingulate cortex. Indeed, the super-agers appeared to have a much thicker left anterior cingulate cortex than even the middle-aged group. Moreover, the brain of a super-ager who died revealed that, although there were some plaques and tangles (characteristic, in much greater quantities, of Alzheimer’s) in the mediotemporal lobe, there were almost none in the anterior cingulate. (But note an earlier report from the researchers.)

Why this region should be of special importance is somewhat mysterious, but the anterior cingulate is part of the attention network, and perhaps it is this role that underlies the superior abilities of these seniors. The anterior cingulate also plays a role in error detection and motivation; it will be interesting to see if these attributes are also important.

While the precise reason for the anterior cingulate to be critical to retaining cognitive abilities might be mysterious, the lack of cortical atrophy, and the suggestion that super-agers’ brains have much reduced levels of the sort of pathological damage seen in most older brains, adds weight to the growing evidence that cognitive aging reflects clinical problems, which unfortunately are all too common.

Sadly, there are no obvious lifestyle factors involved here. The super agers don’t have a lifestyle any different from their ‘cognitively average’ counterparts. However, while genetics might be behind these people’s good fortune, that doesn’t mean that lifestyle choices don’t make a big difference to those of us not so genetically fortunate. It seems increasingly clear that for most of us, without ‘super-protective genes’, health problems largely resulting from lifestyle choices are behind much of the damage done to our brains.

It should be emphasized that these unpublished results are preliminary only. This conference presentation reported on data from only 12 of 48 subjects studied.

Harrison, T., Geula, C., Shi, J., Samimi, M., Weintraub, S., Mesulam, M. & Rogalski, E. 2011. Neuroanatomic and pathologic features of cognitive SuperAging. Presented at a poster session at the 2011 Society for Neuroscience conference.

Previous research has found that carriers of the so-called KIBRA T allele have been shown to have better episodic memory than those who don’t carry that gene variant (this is a group difference; it doesn’t mean that any carrier will remember events better than any non-carrier). A large new study confirms and extends this finding.

The study involved 2,230 Swedish adults aged 35-95. Of these, 1040 did not have a T allele, 932 had one, and 258 had two.  Those who had at least one T allele performed significantly better on tests of immediate free recall of words (after hearing a list of 12 words, participants had to recall as many of them as they could, in any order; in some tests, there was a concurrent sorting task during presentation or testing).

There was no difference between those with one T allele and those with two. The effect increased with increasing age. There was no effect of gender. There was no significant effect on performance of delayed category cued recall tests or a visuospatial task, although a trend in the appropriate direction was evident.

It should also be noted that the effect on immediate recall, although statistically significant, was not large.

Brain activity was studied in a subset of this group, involving 83 adults aged 55-60, plus another 64 matched on sex, age, and performance on the scanner task. A further group of 113 65-75 year-olds were included for comparison purposes. While in the scanner, participants carried out a face-name association task. Having been presented with face-name pairs, participants were tested on their memory by being shown the faces with three letters, of which one was the initial letter of the name.

Performance on the scanner task was significantly higher for T carriers — but only for the 55-60 age group, not for the 65-75 age group. Activity in the hippocampus was significantly higher for younger T carriers during retrieval, but not encoding. No such difference was seen in the older group.

This finding is in contrast with an earlier, and much smaller, study involving 15 carriers and 15 non-carriers, which found higher activation of the hippocampus in non-T carriers. This was taken at the time to indicate some sort of compensatory activity. The present finding challenges that idea.

Although higher hippocampal activation during retrieval is generally associated with faster retrieval, the higher activity seen in T carriers was not fully accounted for by performance. It may be that such activity also reflects deeper processing.

KIBRA-T carriers were neither more nor less likely to carry other ‘memory genes’ — APOEe4; COMTval158met; BDNFval66met.

The findings, then, fail to support the idea that non-carriers engage compensatory mechanisms, but do indicate that the KIBRA-T gene helps episodic memory by improving hippocampal function.

BDNF gene variation predicts rate of age-related decline in skilled performance

In another study, this time into the effects of the BDNF gene, performance on an airplane simulation task was compared across three annual sessions. The study involved 144 pilots, all healthy Caucasian males aged 40-69, of whom 55 (38%) turned out to have at least one copy of a BDNF gene containing the ‘met’ variant. This variant is less common, occurring in about one in three Asians, one in four Europeans and Americans, and about one in 200 sub-Saharan Africans.

While performance dropped with age for both groups, the rate of decline was much steeper for those with the ‘met’ variant. Moreover, there was a significant inverse relationship between age and hippocampal size in the met carriers — and no significant correlation between age and hippocampal size in the non-met carriers.

Comparison over a longer time-period is now being undertaken.

The finding is more evidence for the value of physical exercise as you age — physical activity is known to increase BDNF levels in your brain. BDNF levels tend to decrease with age.

The met variant has been linked to higher likelihood of depression, stroke, anorexia nervosa, anxiety-related disorders, suicidal behavior and schizophrenia. It differs from the more common ‘val’ variant in having methionine rather than valine at position 66 on this gene. The BDNF gene has been remarkably conserved across evolutionary history (fish and mammalian BDNF have around 90% agreement), suggesting that mutations in this gene are not well tolerated.

In the last five years, three studies have linked lower neighborhood socioeconomic status to lower cognitive function in older adults. Neighborhood has also been linked to self-rated health, cardiovascular disease, and mortality. Such links between health and neighborhood may come about through exposure to pollutants or other environmental stressors, access to alcohol and cigarettes, barriers to physical activity, reduced social support, and reduced access to good health and social services.

Data from the large Women’s Health Initiative Memory Study has now been analyzed to assess whether the relationship between neighborhood socioeconomic status and cognitive function can be explained by various risk and protective factors for poor cognitive function.

Results confirmed that higher neighborhood socioeconomic status was associated with higher cognitive function, even after individual factors such as age, ethnicity, income, education, and marital status have been taken into account. A good deal of this was explained by vascular factors (coronary heart disease, diabetes, stroke, hypertension), health behaviors (amount of alcohol consumed, smoking, physical activity), and psychosocial factors (depression, social support). Nevertheless, the association was still (barely) significant after these factors were taken account of, suggesting some other factors may also be involved. Potential factors include cognitive activity, diet, and access to health services.

In contradiction of earlier research, the association appeared to be stronger among younger women. Consistent with other research, the association was stronger for non-White women.

Data from 7,479 older women (65-81) was included in the analysis. Cognitive function was assessed by the Modified MMSE (3MSE). Neighborhood socioeconomic status was assessed on the basis of: percentage of adults over 25 with less than a high school education, percentage of male unemployment, percentage of households below the poverty line, percentage of households receiving public assistance, percentage of female-headed households with children, and median household income. Around 87% of participants were White, 7% Black, 3% Hispanic, and 3% other. Some 92% had graduated high school, and around 70% had at least some college.
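The study doesn’t spell out how these six indicators were combined into a single neighborhood score. One common construction, shown here purely as an assumption rather than as the WHIMS method, is to z-score each indicator (flipping the sign of median income, where higher means better off) and sum them:

```python
# Assumed construction of a neighborhood disadvantage index from the six
# indicators listed above; the actual study weighting may differ.
from statistics import mean, stdev

def z_scores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def disadvantage_index(tracts):
    """tracts: dict of indicator name -> list of values, one per neighborhood.
    Higher index = more disadvantaged."""
    signs = {"median_income": -1}  # income runs the 'wrong' way, so negate it
    cols = {name: z_scores(vals) for name, vals in tracts.items()}
    n = len(next(iter(cols.values())))
    return [sum(signs.get(name, 1) * cols[name][i] for name in cols)
            for i in range(n)]
```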

[2523] Shih, R. A., Ghosh-Dastidar B., Margolis K. L., Slaughter M. E., Jewell A., Bird C. E., et al.
(2011).  Neighborhood Socioeconomic Status and Cognitive Function in Women.
Am J Public Health. 101(9), 1721 - 1728.

Previous:

Lang IA, Llewellyn DJ, Langa KM, Wallace RB, Huppert FA, Melzer D. 2008. Neighborhood deprivation, individual socioeconomic status, and cognitive function in older people: analyses from the English Longitudinal Study of Ageing. J Am Geriatr Soc., 56(2), 191-198.

Sheffield KM, Peek MK. 2009. Neighborhood context and cognitive decline in older Mexican Americans: results from the Hispanic Established Populations for Epidemiologic Studies of the Elderly. Am J Epidemiol., 169(9), 1092-1101.

Wight RG, Aneshensel CS, Miller-Martinez D, et al. 2006. Urban neighborhood context, educational attainment, and cognitive function among older adults. Am J Epidemiol., 163(12), 1071-1078.

A telephone survey of around 17,000 older women (average age 74), which included questions about memory lapses plus standard cognitive tests, found that getting lost in familiar neighborhoods was highly associated with cognitive impairment that might indicate Alzheimer’s. Having trouble keeping up with a group conversation and difficulty following instructions were also significantly associated with cognitive impairment. But, as most of us will be relieved to know, forgetting things from one moment to the next was not associated with impairment!

Unsurprisingly, the more memory complaints a woman had, the more likely she was to score poorly on the cognitive test.

The 7 memory lapse questions covered:

  • whether they had recently experienced a change in their ability to remember things,
  • whether they had trouble remembering a short list of items (such as a shopping list),
  • whether they had trouble remembering recent events,
  • whether they had trouble remembering things from one second to the next,
  • whether they had difficulty following spoken or written instructions,
  • whether they had more trouble than usual following a group conversation or TV program due to memory problems,
  • whether they had trouble finding their way around familiar streets.

Because this survey was limited to telephone tests, we can’t draw any firm conclusions. But the findings may be helpful for doctors and others, to know which sort of memory complaints should be taken as a flag for further investigation.

The very large and long-running Women's Health Initiative study surprised everyone when it produced its finding that hormone therapy generally increased rather than decreased stroke risk as well as other health problems. But one explanation for that finding might be that many of the women only received hormone replacement therapy years after menopause. There are indications that timing is crucial.

This new rat study involved female rats equivalent to human 60-65 year olds, about a decade past menopause. An enzyme called CHIP (carboxyl terminus of Hsc70 interacting protein) was found to increase binding with estrogen receptors, resulting in about half the receptors being hauled to the cell's proteasome to be chopped up and degraded. When some of the aged rats were later treated with estrogen, mortality increased. When middle-aged rats were treated with estrogen, on the other hand, results were positive.

In other words, putting in extra estrogen after the number of estrogen receptors in the brain has been dramatically decreased is a bad idea.

While this study focused on mortality, other research has produced similar conflicting results as to whether estrogen therapy helps fight age-related cognitive impairment in women (see my report). It’s interesting to note that this effect only occurred in the hippocampus — estrogen receptors in the uterus were unaffected.

In a study involving 115 seniors (average age 81), those who participated in a six-week, 12-session memory training program significantly improved their verbal memory. Each hour-long class had 15-20 participants, and included explanations of how memory works, quick strategies for remembering names, faces and numbers, basic memory strategies such as linking ideas and creating visual images, and information on a healthy lifestyle for protecting and maintaining memory.

Most of the study participants were women, Caucasian and had attained a college degree or higher level of education.

[2491] Miller, K. J., Siddarth P., Gaines J. M., Parrish J. M., Ercoli L. M., Marx K., et al.
(2011).  The Memory Fitness Program.
American Journal of Geriatric Psychiatry.

Following a 1994 study that found that errorless learning was better than trial-and-error learning for amnesic patients and older adults, errorless learning has been widely adopted in the rehabilitation industry. Errorless learning involves being told the answer without repeatedly trying to answer the question and perhaps making mistakes. For example, in the 1994 study, participants in the trial-and-error condition could produce up to three errors in answer to the question “I am thinking of a word that begins with QU”, before being told the answer was QUOTE; in contrast, participants in the errorless condition were simply told “I am thinking of a word that begins with QU and it is ‘QUOTE’.”

In a way, it is surprising that errorless learning should be better, given that trial-and-error produces much deeper and richer encoding, and a number of studies with young adults have indeed found an advantage for making errors. Moreover, it’s well established that retrieving an item leads to better learning than passively studying it, even when you retrieve the wrong item. This testing effect has also been found in older adults.

In another way, the finding is not surprising at all, because clearly the trial-and-error condition offers many opportunities for confusion. You remember that QUEEN was mentioned, for example, but you don’t remember whether it was a right or wrong answer. Source memory, as I’ve often mentioned, is particularly affected by age.

So there are good theoretical reasons for both positions regarding the value of mistakes, and there’s experimental evidence for both. Clearly it’s a matter of circumstance. One possible factor influencing the benefit or otherwise of error concerns the type of processing. Those studies that have found a benefit have generally involved conceptual associations (e.g. What’s Canada’s capital? Toronto? No, Ottawa). It may be that errors are helpful to the extent that they act as retrieval cues, and evoke a network of related concepts. Those studies that have found errors harm learning have generally involved perceptual associations, such as word stems and word fragments (e.g., QU? QUeen? No, QUote). These errors are arbitrary, produce interference, and don’t provide useful retrieval cues.

So this new study tested the idea that producing errors conceptually associated with targets would boost memory for the encoding context in which information was studied, especially for older adults who do not spontaneously elaborate on targets at encoding.

In the first experiment, 33 young (average age 21) and 31 older adults (average age 72) were shown 90 nouns presented in three different, intermixed conditions. In the read condition (designed to provide a baseline), participants read aloud a noun fragment presented without a semantic category (e.g., p_g). In the errorless learning (EL) condition, the semantic category was presented with the target word fragment (e.g., a farm animal: p_g), and the participants read aloud the category and their answer; the category and target were then displayed. In the trial-and-error learning (TEL) condition, the category was presented and participants were encouraged to make two guesses before being shown the target fragment together with the category (the researchers changed the target if it was guessed).

Participants were then tested using a list of 70 words, of which 10 came from each of the three study conditions, 10 were new unrelated words, and 30 were nontarget exemplars from the TEL categories. Exemplars the participant had guessed were labeled learning errors; those that hadn’t come up were labeled related lures. In addition to an overall recognition test (press “yes” to any word you’ve studied and “no” to any new word), there were two tests that required participants to endorse items studied in the TEL condition and reject those studied in the EL condition, and vice versa.

The young adults did better than the older on every test. TEL produced better learning than EL, and both produced better learning than the read condition (as expected). The benefit of TEL was greater for older adults. This is in keeping with the idea that generating exemplars of a semantic category, as occurs in trial-and-error learning, helps produce a richer, more elaborated code, and that this is of greater benefit to older adults, who are less inclined to do this without encouragement.

There was a downside, however. Older adults were also more prone to falsely endorsing prior learning errors or semantically-related lures. It’s worth noting that both groups were more likely to falsely endorse learning errors than related lures.

But the main goal of this first experiment was to disentangle the contributions of recollection and familiarity to the two types of learning. It turns out that there was no difference between young and older adults in terms of familiarity; the difference in performance between the two groups stemmed from recollection. Recollection was a problem for older adults in the errorless condition, but not in the trial-and-error condition (where the recollective component of their performance matched that of young adults). This deficit is clearly closely related to age-related deficits in source memory.

It was also found that familiarity was marginally more important in the errorless condition than the trial-and-error condition. This is consistent with the idea that targets learned without errors acquire greater fluency than those learned with errors (with the downside that they don’t pick up those contextual details that making errors can provide).

In the second experiment, 15 young and 15 older adults carried out much the same procedure, except that during the recognition test they were also required to identify the context in which each word had been learned (that is, whether the word was learned through trial-and-error or not).

Once again, trial-and-error learning was associated with better source memory relative to errorless learning, particularly for the older adults.

These results support the hypothesis that trial-and-error learning is more beneficial than errorless learning for older adults when the trials encourage semantic elaboration. But another factor may also be involved. Unlike other errorless studies, participants were required to attend to errors as well as targets. Explicit attention to errors may help protect against interference.

In a similar way, a recent study involving young adults found that feedback given in increments (thus producing errors) is more effective than feedback given all at once in full. Clearly what we want is to find that balance point, where elaborative benefits are maximized and interference is minimized.

[2496] Cyr, A-A., & Anderson N. D.
(2011).  Trial-and-error learning improves source memory among young and older adults.
Psychology and Aging. Advance online publication.

In the first mouse study, when young and old mice were conjoined, allowing blood to flow between the two, the young mice showed a decrease in neurogenesis while the old mice showed an increase. When blood plasma was then taken from old mice and injected into young mice, there was a similar decrease in neurogenesis, and impairments in memory and learning.

Analysis of the concentrations of blood proteins in the conjoined animals revealed the chemokine (a type of cytokine) whose level in the blood showed the biggest change — CCL11, or eotaxin. When this was injected into young mice, they indeed showed a decrease in neurogenesis, and this was reversed once an antibody for the chemokine was injected. Blood levels of CCL11 were found to increase with age in both mice and humans.

The chemokine was a surprise, because to date the only known role of CCL11 is that of attracting immune cells involved in allergy and asthma. It is thought that most likely it doesn’t have a direct effect on neurogenesis, but has its effect through, perhaps, triggering immune cells to produce inflammation.

Exercise is known to at least partially reverse loss of neurogenesis. Exercise has also been shown to produce chemicals that prevent inflammation. Following research showing that exercise after brain injury can help the brain repair itself, another mouse study has found that mice that exercised regularly produced interleukin-6 (a cytokine involved in immune response) in the hippocampus. When the mice were then exposed to a chemical that destroys the hippocampus, the interleukin-6 dampened the harmful inflammatory response, and prevented the loss of function that is usually observed.

One of the actions of interleukin-6 that brings about a reduction in inflammation is to inhibit tumor necrosis factor. Interestingly, I previously reported on a finding that inhibiting tumor necrosis factor in mice decreased cognitive decline that often follows surgery.

This suggests not only that exercise helps protect the brain from the damage caused by inflammation, but also that it might help protect against other damage, such as that caused by environmental toxins, injury, or post-surgical cognitive decline. The curry spice curcumin, and green tea, are also thought to inhibit tumor necrosis factor.

Comparison of 99 chimpanzee brains ranging from 10-51 years of age with 87 human brains ranging from 22-88 years of age has revealed that, unlike the humans, chimpanzee brains showed no sign of shrinkage with age. But the reason may be simple: we live much longer. In the wild, chimps rarely live past 45, and although human brains start shrinking as early as 25 (as soon as they reach maturity, basically!), the shrinkage doesn’t become significant until around 50.

This suggests one reason why humans are uniquely vulnerable to Alzheimer’s disease — it’s all down to our combination of large brain and long life. There are other animals that experience some cognitive impairment and brain atrophy as they age, but nothing as extreme as that found in humans (a 10-15% decline in volume over the life-span). (Elephants and whales have the same two attributes as humans — large brains and long lives — but we lack information on how their brains change with age.)

The problem may lie in the fact that our brains use so much more energy than chimps’ (being more than three times larger than theirs) and thus produce a great deal more damaging oxidation. Over a longer life-span, this accumulates until it significantly damages the brain.

If that’s true, it reinforces the value of a diet high in antioxidants.

[2500] Sherwood, C. C., Gordon A. D., Allen J. S., Phillips K. A., Erwin J. M., Hof P. R., et al.
(2011).  Aging of the cerebral cortex differs between humans and chimpanzees.
Proceedings of the National Academy of Sciences. 108(32), 13029 - 13034.

In my book on remembering what you’re doing and what you intend to do, I briefly discuss the popular strategy of asking someone to remind you (basically, whether it’s an effective strategy depends on several factors, of which the most important is the reliability of the person doing the reminding). So I was interested to see a pilot study investigating the use of this strategy between couples.

The study confirms earlier findings that the extent to which this strategy is effective depends on how reliable the partner's memory is, but expands on that by tying it to age and conversational style.

The study involved 11 married couples, of whom five were middle-aged (average age 52), and six were older adults (average age 73). Participants completed a range of prospective memory tasks by playing the board game "Virtual Week," which encourages verbal interaction among players about completing real life tasks. For each virtual "day" in the game, participants were asked to perform 10 different prospective memory tasks — four that regularly occur (eg, taking medication with breakfast), four that were different each day (eg, purchasing gasoline for the car), and two time-check tasks that were not based on the activities of the board game (eg, check lung capacity at two specified times).

Overall, the middle-aged group benefited more from collaboration than the older group. But it was also found that those couples who performed best were those who were more supportive and encouraging of each other.

Collaboration in memory tasks is an interesting activity, because it can be both helpful and hindering. Think about how memory works — by association. You start from some point, and if you’re on a good track, more and more should be revealed as each memory triggers another. If another person keeps interrupting your train, you can be derailed. On the other hand, they might help fill in gaps that you need, or even point you to the right track, if you’re on the wrong one.

In this small study, it tended to be the middle-aged couples that filled in the gaps more effectively than the older couples. That probably has a lot to do with memory reliability. So it’s not a big surprise (though useful to be aware of). But what I find more interesting (because it’s less obvious, and more importantly, because it’s more under our control) is this idea that our conversational style affects whether memory collaboration is useful or counterproductive. I look forward to results from a larger study.

[2490] Margrett, J. A., Reese-Melancon C., & Rendell P. G.
(2011).  Examining Collaborative Dialogue Among Couples.
Zeitschrift für Psychologie / Journal of Psychology. 219, 100 - 107.

A study comparing activity in the dorsolateral prefrontal cortex in young, middle-aged and aged macaque monkeys as they performed a spatial working memory task has found that while neurons of the young monkeys maintained a high rate of firing during the task, neurons in older animals showed slower firing rates. The decline began in middle age.

Neuron activity was recorded in a particular area of the dorsolateral prefrontal cortex that is most important for visuospatial working memory. Some neurons only fired when the cue was presented (28 CUE cells), but most were active during the delay period as well as the cue and response periods (273 DELAY neurons). Persistent firing during the delay period is of particular interest, as it is required to maintain information in working memory. Many DELAY neurons increased their activity when the preferred spatial location was being remembered.

While the activity of the CUE cells was unaffected by age, that of DELAY cells was significantly reduced. This was true both of spontaneous activity and task-related activity. Moreover, the reduction was greatest during the cue and delay periods for the preferred direction, meaning that the effect of age was to reduce the ability to distinguish preferred and non-preferred directions.

It appeared that the aging prefrontal cortex was accumulating excessive levels of an important signaling molecule called cAMP. When cAMP was inhibited or cAMP-sensitive ion channels were blocked, firing rates rose to more youthful levels. On the other hand, when cAMP was stimulated, aged neurons reduced their activity even more.

The findings are consistent with rat research that has found two of the agents used — guanfacine and Rp-cAMPS — can improve working memory in aged rats. Guanfacine is a medication that is already approved for treating hypertension in adults and prefrontal deficits in children. A clinical trial testing guanfacine's ability to improve working memory and executive functions in elderly subjects who do not have dementia is now taking place.

[2349] Wang, M., Gamo N. J., Yang Y., Jin L. E., Wang X-J., Laubach M., et al.
(2011).  Neuronal basis of age-related working memory decline.
Nature. Advance online publication.

A standard test of how we perceive local vs global features of visual objects uses Navon figures — large letters made up of smaller ones (see below for an example). As in the Stroop test, where color and color word can disagree (the word "RED" printed in green ink, say), the viewer can focus either on the large letter or the smaller ones. When the viewer is faster at seeing the larger letter, they are said to be showing global precedence; when they’re faster at seeing the component letters, they are said to be showing local precedence. Typically, the greater the number of component letters, the easier it is to see the larger letter. This is consistent with the Gestalt principles of proximity and continuity — elements that are close together and form smooth lines will tend to be perceptually grouped together and seen as a unit (the greater the number of component letters, the closer they will be, and the smoother the line).

In previous research, older adults have often demonstrated local precedence rather than global, although the results have been inconsistent. One earlier study found that older adults performed poorly when asked to report in which direction (horizontal or vertical) dots formed smooth lines, suggesting an age-related decline in perceptual grouping. The present study therefore investigated whether this decline was behind the decrease in global precedence.

In the study 20 young men (average age 22) and 20 older men (average age 57) were shown Navon figures and asked whether the target letter formed the large letter or the smaller letters (e.g., “Is the big or the small letter an E?”). The number of component letters was systematically varied across five quantities. Under such circumstances it is expected that at a certain level of letter density everyone will switch to global precedence, but if a person is impaired at perceptual grouping, this will occur at a higher level of density.

The young men were, unsurprisingly, markedly faster than the older men in their responses. They were also significantly faster at responding when the target was the global letter, compared to when it was the local letter (i.e. they showed global precedence). The older adults, on the other hand, had equal reaction times to global and local targets. Moreover, they showed no improvement as the letter-density increased (unlike the young men).

It is noteworthy that the older men, while they failed to show global precedence, also failed to show local precedence (remember that results are based on group averages; this suggests that the group was evenly balanced between those showing local precedence and those showing global precedence). Interestingly, previous research has suggested that women are more likely to show local precedence.

The link between perceptual grouping and global precedence is further supported by individual differences — older men who were insensitive to changes in letter-density were almost exclusively the ones that showed persistent local precedence. Indeed, increases in letter-density were sometimes counter-productive for these men, leading to even slower reaction times for global targets. This may be the result of greater distractor interference, to which older adults are more vulnerable, and to which this sub-group of older men may have been especially susceptible.

Example of a Navon figure (a large E composed of small Fs):

FFFFFF
F
FFFFFF
F
FFFFFF
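
As an aside, such stimuli are trivial to generate. Here’s a minimal sketch in Python (the E-shaped grid and the function name are my own illustrative choices, not the study’s actual stimuli):

```python
# Build a Navon-style stimulus: a large "global" letter drawn out of
# copies of a small "local" letter. The 5-row E pattern below is an
# illustrative choice, not the stimulus set actually used in the study.

E_PATTERN = [
    "######",
    "#",
    "######",
    "#",
    "######",
]

def navon(local_letter, pattern=E_PATTERN):
    """Replace every '#' cell of the global pattern with the local letter."""
    return "\n".join(row.replace("#", local_letter) for row in pattern)

print(navon("F"))  # prints a large E made of small Fs, as in the example above
```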

Most research into the importance of folate and B12 levels has centered on seniors, and it does seem clear now that having adequate levels of these vitamins is important for maintaining cognitive functioning as you get older. Folic acid levels are of course also regarded as crucial when the brain is developing, which is why pregnant women are urged to take supplements, and why some countries fortify their bread with it. There is less research in the extensive ground between these two end-points.

A Swedish study involving 386 15-year-olds has now found that those in the top third of folic acid intake (more than 253 micrograms per day for girls and 335 for boys) achieved significantly better school grades than those in the bottom third (less than 173 micrograms folic acid per day for girls and 227 for boys).

Interestingly, while homocysteine levels in the blood were initially significant, this association disappeared after other significant predictors (gender, smoking, and SES) were controlled for. Neither was a genotype linked to higher homocysteine levels (MTHFR 677 TT homozygosity) significantly related to academic achievement. Low folate and B12 levels are associated with higher homocysteine levels in the blood, and there is evidence that it is this increase in homocysteine that is the reason for the cognitive impairment seen in age-related cognitive decline. This finding, then, suggests that this is only one part of the story.

Sweden does not fortify flour with folic acid as the United States, Canada and Australia do. Folate is a B vitamin found particularly in citrus fruit, green leafy vegetables, whole-wheat bread, and dried beans and peas; however, it is often destroyed by cooking or processing.

The sum of school grades in 10 core subjects, obtained in the final semester of the compulsory 9 years of schooling, was used as the measure of academic achievement.

Following on from research showing that long-term meditation is associated with gray matter increases across the brain, an imaging study involving 27 long-term meditators (average age 52) and 27 controls (matched by age and sex) has revealed pronounced differences in white-matter connectivity between their brains.

The differences reflect white-matter tracts in the meditators’ brains being more numerous, more dense, more myelinated, or more coherent in orientation (unfortunately the technology does not yet allow us to disentangle these) — thus, better able to quickly relay electrical signals.

While the differences were evident among major pathways throughout the brain, the greatest differences were seen in the temporal part of the superior longitudinal fasciculus (bundles of neurons connecting the front and the back of the cerebrum) in the left hemisphere, and, in both hemispheres, in the corticospinal tract (a collection of axons that travel between the cerebral cortex of the brain and the spinal cord) and the uncinate fasciculus (connecting parts of the limbic system, such as the hippocampus and amygdala, with the frontal cortex).

These findings are consistent with the regions in which gray matter increases have been found. For example, the tSLF connects with the caudal area of the temporal lobe, the inferior temporal gyrus, and the superior temporal gyrus; the UNC connects the orbitofrontal cortex with the amygdala and hippocampal gyrus.

It’s possible, of course, that those who are drawn to meditation, or who are likely to engage in it long term, have fundamentally different brains from other people. However, it is more likely (and more consistent with research showing the short-term effects of meditation) that the practice of meditation changes the brain.

The precise mechanism whereby meditation might have these effects can only be speculated on. More broadly, though, we can say that meditation might induce physical changes in the brain, or it might be protecting against age-related reduction. Most likely of all, both processes are going on, perhaps in different regions or networks.

Regardless of the mechanism, the evidence that meditation has cognitive benefits is steadily accumulating.

The number of years the meditators had practiced ranged from 5 to 46. They reported a number of different meditation styles, including Shamatha, Vipassana and Zazen.

Another study showing the value of exercise for preserving your mental faculties in old age. This time it has to do with the development of small brain lesions or infarcts called "silent strokes." Don’t let the words “small” and “silent” fool you — these lesions have been linked to memory problems and even dementia, as well as stroke, an increased risk of falls and impaired mobility.

The study involved 1,238 people taken from the Northern Manhattan Study, a long-running study looking at stroke and vascular problems in a diverse community. Their brains were scanned some six years after completing an exercise questionnaire, when they were an average of 70 years old. The scans found that 16% of the participants had these small brain lesions.

Those who had reported engaging in moderate to intense exercise were 40% less likely to have these infarcts compared to people who did no regular exercise. Depressingly, there was no significant difference between those who engaged in light exercise and those who didn’t exercise (which is not to say that light exercise doesn’t help in other regards! A number of studies have pointed to the value of regular brisk walking for fighting cognitive decline). This is consistent with earlier findings that only the higher levels of activity consistently protect against stroke.

The results remained the same after other vascular risk factors such as high blood pressure, high cholesterol and smoking, were accounted for. Of the participants, 43% reported no regular exercise; 36% engaged in regular light exercise (e.g., golf, walking, bowling or dancing); 21% engaged in regular moderate to intense exercise (e.g., hiking, tennis, swimming, biking, jogging or racquetball).

However, there was no association with white matter lesions, which have also been associated with an increased risk of stroke and dementia.

Moreover, this effect was not seen among those with Medicaid or no health insurance, suggesting that lower socioeconomic status (or perhaps poorer access to health care) is associated with negative factors that counteract the benefits of exercise. Previous research has found that lower SES is associated with higher cardiovascular disease regardless of access to care.

Of the participants, 65% were Hispanic, 17% non-Hispanic black, and 15% non-Hispanic white. Over half (53%) had less than high school education, and 47% were on Medicaid or had no health insurance.

It wasn’t so long ago we believed that only young brains could make neurons, that once a brain was fully matured all it could do was increase its connections. Then we found out adult brains could make new neurons too (but only in a couple of regions, albeit critical ones). Now we know that neurogenesis in the hippocampus is vital for some operations, and that the production of new neurons declines with age (leading to the idea that the reduction in neurogenesis may be one reason for age-related cognitive decline).

What we didn’t know is why this happens. A new study, using mice genetically engineered so that different classes of brain cells light up in different colors, has now revealed the life cycle of stem cells in the brain.

Adult stem cells differentiate into progenitor cells that ultimately give rise to mature neurons. It had been thought that the stem cell population remained stable, but that these stem cells gradually lose their ability to produce neurons. However, the mouse study reveals that during the mouse's life span, the number of brain stem cells decreased 100-fold. Although the rate of this decrease actually slows with age, and the output per cell (the number of progenitor cells each stem cell produces) increases, nevertheless the pool of stem cells is dramatically reduced over time.

The reason this happens (and why it wasn’t what we expected) is explained in a computational model developed from the data. It seems that stem cells in the brain differ from other stem cells. Adult stem cells in the brain wait patiently for a long time until they are activated. They then undergo a series of rapid divisions that give rise to progeny that differentiate into neurons, before ‘retiring’ to become astrocytes. What this means is that, unlike blood or gut stem cells (that renew themselves many times), brain stem cells are only used once.
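
To make the one-shot idea concrete, here’s a toy simulation in Python. It is my own sketch, not the authors’ computational model, and every parameter value is arbitrary:

```python
# Toy model of "one-shot" neural stem cells: dormant cells activate at some
# rate, emit a burst of progenitors, then retire as astrocytes instead of
# returning to the pool. All parameter values are arbitrary illustrations.
import random

def simulate(pool=1000, months=36, p_activate=0.1, burst=4):
    """Return (month, stem cells remaining, progenitors produced) per month."""
    history = []
    for month in range(months):
        activated = sum(random.random() < p_activate for _ in range(pool))
        pool -= activated  # one-shot: an activated cell never returns
        history.append((month, pool, activated * burst))
    return history

for month, pool, progenitors in simulate()[::12]:
    print(f"month {month:2d}: {pool:4d} stem cells left, {progenitors} new progenitors")
```

Note that under a constant activation rate the pool decays exponentially, so the absolute rate of loss slows over time even as the pool heads toward depletion, matching the pattern described above.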

This raises a somewhat worrying question: if we encourage neurogenesis (e.g. by exercise or drugs), are we simply using up stem cells prematurely? The researchers suggest the answer depends on how the neurogenesis has been induced. Parkinson's disease and traumatic brain injury, for example, activate stem cells directly, and so may reduce the stem cell population. However, interventions such as exercise stimulate the progenitor cells, not the stem cells themselves.

The brain tends to shrink with age, with different regions being more affected than others. Atrophy of the hippocampus, so vital for memory and learning, is associated with increased risk of developing Alzheimer’s, and has also been linked to depression.

In a study involving 268 older adults (58+), the hippocampus of those reporting a life-changing religious experience was found to have shrunk significantly more than that of those not reporting such an experience. Significantly greater hippocampal atrophy was also found among born-again Protestants, Catholics, and those with no religious affiliation, compared with Protestants not identifying as born-again.

The participants were not a general sample — they were originally recruited for the NeuroCognitive Outcomes of Depression in the Elderly study. However, some of the participants were from the control group, who had no history of depression. Brain scans were taken at the beginning of the study, and then every two years. The length of time between the baseline scan and the final scan ranged from 2 to 8 years (the average was 4).

Questions about religious experiences were asked in an annual survey, so answers could change over time. Two-thirds of the group were female, and 87% were white. The average age was 68. At baseline, 42% of the group were non-born-again Protestant, 36% born-again Protestant, 8% Catholic, and 6% other religion. Only 7% reported themselves as having no religion. By the end of the study, 44% (119 participants) reported themselves born-again, and 13% (36) reported having had life-changing religious experiences.

These associations persisted after depression status, acute stress, and social support were taken into account. Nor did other religious factors (such as prayer, meditation, or Bible study) account for the changes.

It is still possible that long-term stress might play a part in this association — the study measured acute rather than cumulative stress. The researchers suggest that life-changing religious experiences can be stressful, if they don’t fit in with your existing beliefs or those of your family and friends, or if they lead to new social systems that add to your stress.

Of course, the present results can be interpreted in several ways — is it the life-changing religious experience itself that is the crucial factor? Or the factors leading up to that experience? Or the consequences of that experience? Still, it’s certainly an intriguing finding, and it will be interesting to see more research expanding and confirming (or not!) this result.

More generally, the findings may help clarify the conflicting research about the effects of religion on well-being, by pointing to the fact that religion can’t be considered a single factor, but one subject to different variables, some of which may be positive and others not.

A number of studies have demonstrated the cognitive benefits of music training for children. Now research is beginning to explore just how long those benefits last. This is the second study I’ve reported on this month that points to childhood music training protecting older adults from aspects of cognitive decline. In this study, 37 adults aged 45 to 65, of whom 18 were classified as musicians, were tested on their auditory temporal acuity, their auditory and visual working memory, and their ability to hear speech in noise.

The musicians performed significantly better than the non-musicians at distinguishing speech in noise, and on the auditory temporal acuity and working memory tasks. There was no difference between the groups on the visual working memory task.

Difficulty hearing speech in noise is among the most common complaints of older adults, but age-related hearing loss only partially accounts for the problem.

The musicians had all begun playing an instrument by age 8 and had consistently played an instrument throughout their lives. Those classified as non-musicians had no musical experience (12 of the 19) or less than three years at any point in their lives. The seven with some musical experience rated their proficiency on an instrument at less than 1.5 on a 10-point scale, compared to at least 8 for the musicians.

Physical activity levels were also assessed. There was no significant difference between the groups.

The finding that visual working memory was not affected supports the idea that musical training helps domain-specific skills (such as auditory and language processing) rather than general ones.

As we get older, when we suffer memory problems, we often laughingly talk about our brain being ‘full up’, with no room for more information. A new study suggests that in some sense (but not the direct one!) that’s true.

To make new memories, we need to recognize that they are new memories. That means we need to be able to distinguish between events, or objects, or people. We need to distinguish between them and representations already in our database.

We are all familiar with the experience of wondering if we’ve done something. Is it that we remember ourselves doing it today, or are we remembering a previous occasion? We go looking for the car in the wrong place because the memory of an earlier occasion has taken precedence over today’s event. As we age, we do get much more of this interference from older memories.

In a new study, the brains of 40 college students and older adults (60-80) were scanned while they viewed pictures of everyday objects and classified them as either "indoor" or "outdoor." Some of the pictures were similar but not identical, and others were very different. It was found that while the hippocampus of young students treated all the similar pictures as new, the hippocampus of older adults had more difficulty with this, requiring much more distinctiveness for a picture to be classified as new.

Later, the participants were presented with completely new pictures to classify, and then, only a few minutes later, shown another set of pictures and asked whether each item was "old," "new" or "similar." Older adults tended to have fewer 'similar' responses and more 'old' responses instead, indicating that they could not distinguish between similar items.

The inability to recognize information as "similar" to something seen recently is associated with “representational rigidity” in two areas of the hippocampus: the dentate gyrus and CA3 region. The brain scans from this study confirm this, and find that this rigidity is associated with changes in the dendrites of neurons in the dentate/CA3 areas, and impaired integrity of the perforant pathway — the main input path into the hippocampus, from the entorhinal cortex. The more degraded the pathway, the less likely the hippocampus is to store similar memories as distinct from old memories.

Apart from helping us understand the mechanisms of age-related cognitive decline, the findings also have implications for the treatment of Alzheimer’s. The hippocampus is one of the first brain regions to be affected by the disease. The researchers plan to conduct clinical trials in early Alzheimer's disease patients to investigate the effect of a drug on hippocampal function and pathway integrity.

Growing evidence has pointed to the benefits of social and mental stimulation in preventing dementia, but until now no one has looked at the role of physical environment.

A study involving 1294 healthy older adults found that those whose life-space narrowed to their immediate home were almost twice as likely to develop Alzheimer’s as those with the largest life-space (out-of-town). The homebound also had an increased risk of MCI and a faster rate of global cognitive decline.

By the end of the eight-year study (average follow-up of 4.4 years), 180 people (13.9%) had developed Alzheimer’s. The association remained after physical function, disability, depressive symptoms, social network size, vascular disease burden, and vascular risk factors, were taken into account.

It may be that life-space is an indicator of how engaged we are with the world, with the associated cognitive stimulation that offers.

A study involving 70 older adults (60-83) has found that those with at least ten years of musical training performed the best on cognitive tests, followed by those with one to nine years of musical study, with those with no musical training trailing the field.

All the musicians were amateurs who began playing an instrument at about 10 years of age. Half of the high-level musicians still played an instrument at the time of the study, but they didn't perform better on the cognitive tests than the other advanced musicians who had stopped playing years earlier. Previous research suggests that both years of musical participation and age of acquisition are critical.

All the participants had similar levels of education and fitness. The cognitive tests related to visuospatial memory, naming objects and executive function.

Hanna-Pladdy, B. & MacKay, A. 2011. The relation between instrumental musical activity and cognitive aging. Neuropsychology, 25 (3), 378-86. doi: 10.1037/a0021895

Adding to the growing evidence that social activity helps prevent age-related cognitive decline, a longitudinal study involving 1,138 older adults (mean age 80) has found that those who had the highest levels of social activity (top 10%) experienced only a quarter of the rate of cognitive decline experienced by the least socially active individuals (bottom 10%). The participants were followed for up to 12 years (mean of 5 years).

Social activity was measured using a questionnaire that asked participants whether, and how often, in the previous year they had engaged in activities that involve social interaction—for example, whether they went to restaurants, sporting events or the teletract (off-track betting) or played bingo; went on day trips or overnight trips; did volunteer work; visited relatives or friends; participated in groups such as the Knights of Columbus; or attended religious services.

Analysis adjusted for age, sex, education, race, social network size, depression, chronic conditions, disability, neuroticism, extraversion, cognitive activity, and physical activity.

There has been debate over whether the association between social activity and cognitive decline is because inactivity leads to impairment, or because impairment leads to inactivity. This study attempted to solve this riddle. Participants were evaluated yearly, and analysis indicates that the inactivity precedes decline, rather than the other way around. Of course, it’s still possible that there are factors common to both that affect social engagement before showing up in a cognitive test. But even in such a case, it seems likely that social inactivity increases the rate of cognitive decline.

[2228] James, B. D., Wilson R. S., Barnes L. L., & Bennett D. A.
(2011).  Late-Life Social Activity and Cognitive Decline in Old Age.
Journal of the International Neuropsychological Society. FirstView, 1 - 8.

A study involving 117 older adults (mean age 78) found that those at greater risk of coronary artery disease had a substantially greater risk of decline in verbal fluency and in the ability to ignore irrelevant information. Verbal memory was not affected.

The findings add to a growing body of research linking cardiovascular risk factors and age-related cognitive decline, leading to the mantra: What’s good for the heart is good for the brain.

The study also found that the common classification into high and low risk groups was less useful in predicting cognitive decline than treating risk as a continuous factor. This is consistent with a growing view that no cognitive decline is ‘normal’, but is always underpinned by some preventable damage.

Risk for coronary artery disease was measured with the Framingham Coronary Risk Score, which uses age, cholesterol levels, blood pressure, presence of diabetes, and smoking status to estimate a person's 10-year risk of a coronary event. Thirty-seven participants (31%) had high scores. Age, education, gender, and stroke history were controlled for in the analysis.
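
To illustrate what treating risk as a continuous factor means in practice, here is a hedged sketch of a generic point-based score, in Python. The weights and cut-off are invented for illustration; they are not the real Framingham coefficients:

```python
# A generic point-based risk score in the spirit of (but NOT identical to)
# the Framingham profile: each factor contributes points, the total is a
# continuous score, and a cut-off dichotomizes it into high/low risk.
# All weights and the cut-off are invented purely for illustration.

def toy_risk_points(age, total_chol, hdl, systolic_bp, diabetic, smoker):
    """Sum invented point weights into a single continuous score."""
    points = age // 10             # older -> more points
    points += total_chol // 50     # higher total cholesterol -> more points
    points -= hdl // 20            # higher HDL ("good") cholesterol -> fewer
    points += systolic_bp // 30    # higher systolic pressure -> more points
    points += 4 if diabetic else 0
    points += 3 if smoker else 0
    return points

score = toy_risk_points(age=78, total_chol=230, hdl=40,
                        systolic_bp=150, diabetic=False, smoker=True)
print(score)                              # continuous score: 17
print("high" if score >= 20 else "low")  # dichotomized: within-group detail lost
```

The study’s point is that keeping the continuous score, rather than collapsing it to high/low, preserved information that predicted cognitive decline.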

Gooblar, J., Mack, W.J., Chui, H.C., DeCarli, C., Mungas, D., Reed, B.R. & Kramer, J.H. 2011. Framingham Coronary Risk Profile Predicts Poorer Executive Functioning in Older Nondemented Adults. Presented at the American Academy of Neurology annual meeting on Tuesday, April 12, 2011.

A study involving 200 older adults (70+) experiencing a stay in hospital has found that at discharge nearly a third (31.5%) had previously unrecognized low cognitive function (scoring below 25 on the MMSE if high-school-educated, or below 18 if not). This impairment had disappeared a month later for more than half (58%). The findings are consistent with previous research showing a lack of comprehension of discharge instructions, often resulting in rehospitalization.

The findings demonstrate the effects of hospitalization on seniors, and point to the need for healthcare professionals and family to offer additional support. It’s suggested that patient self-management may be better taught as an outpatient following discharge rather than at the time of hospital discharge.

Sleep disruption and stress are presumed to be significant factors in why this occurs.

From the Whitehall II study, data involving 5431 older participants (45-69 at baseline) has revealed a significant effect of midlife sleep changes on later cognitive function. Sleep duration was assessed at one point between 1997 and 1999, and again between 2002 and 2004. A decrease from 6, 7, or 8 hours of average nightly sleep was significantly associated with poorer scores on tests of reasoning, vocabulary, and the MMSE. An increase from 7 or 8 hours (but not from 6 hours) was associated with lower scores on these, as well as on tests of phonemic and semantic fluency. Short-term verbal memory was not significantly affected. The magnitude of these effects was equivalent to a 4–7 year increase in age.

Around 8% of participants showed an increase from 7-8 hours of sleep over the five-year period (7.4% of women; 8.6% of men), while around a quarter of women and 18% of men decreased their sleep amount from 6-8 hours. About 58% of men and 50% of women reported no change in sleep duration during the study period. Some 27% of the participants were women.

The optimal amount of sleep (in terms of highest cognitive performance) was 7 hours for women, closely followed by 6 hours. For men, results were similar at 6, 7 and 8 hours.

Analysis took into account age, sex, education and occupational status. The Whitehall II study is a large, long-running study involving British civil servants. Sleep duration was assessed simply by responses to the question "How many hours of sleep do you have on an average week night?"

A very large Chinese study, involving 28,670 older adults (50-85), of whom some 72% were women, also supports an inverted U-shaped association between sleep duration and cognitive function, with 7-8 hours sleep associated with the highest scores on a delayed word recall test.

I would speculate that this finding of an effect on short-term verbal memory (in contrast to that of the Whitehall study) may reflect a group distinction in terms of education and occupation. The Whitehall study is the more homogenous (mostly white-collar), with participants probably averaging greater cognitive reserve than those in the community-based Chinese study. The findings suggest that memory is slower to be affected, rather than not affected.

Ferrie JE; Shipley MJ; Akbaraly TN; Marmot MG; Kivimäki M; Singh-Manoux A. Change in sleep duration and cognitive function: findings from the Whitehall II study. SLEEP 2011;34(5):565-573.

Xu L; Jiang CQ; Lam TH; Liu B; Jin YL; Zhu T; Zhang WS; Cheng KK; Thomas GN. Short or long sleep duration is associated with memory impairment in older Chinese: the Guangzhou Biobank Cohort Study. SLEEP 2011;34(5):575-580.

A review of 23 longitudinal studies of older adults (65+) has found that small amounts of alcohol were associated with lower incidence rates of overall dementia and Alzheimer dementia, but not of vascular dementia or age-related cognitive decline. A three-year German study involving 3,327 adults aged 75+ extends the evidence to the older-old.

The study found alcohol consumption was significantly associated with 3 other factors that helped protect against dementia: better education, not living alone, and absence of depression. Nevertheless, the lower risk remained after accounting for these factors.

The ‘magic’ amount of alcohol was between 20-29g, roughly 2-3 drinks a day. As in other studies, a U-shaped effect was found, with higher risk found among both those who consumed less than this amount of alcohol, and those who consumed more.

A study involving 614 middle-aged vineyard workers has found that those who were exposed to pesticides were five times more likely than those not exposed to perform poorly on cognitive tests, and twice as likely to show cognitive decline over a two-year period.

Participants were in their 40s and 50s and had worked for at least 20 years in the agricultural sector. One in five had never been exposed to pesticides as part of their job; over half had been directly exposed, and the remainder had been possibly or certainly indirectly exposed. Educational level, age, sex, alcohol consumption, smoking, psychotropic drug use and depressive symptoms were taken into account.

What makes one person so much better than another in picking up a new motor skill, like playing the piano or driving or typing? Brain imaging research has now revealed that one of the reasons appears to lie in the production of a brain chemical called GABA, which inhibits neurons from responding.

The responsiveness of some brains to a procedure that decreases GABA levels (transcranial direct current stimulation, or tDCS) correlated both with greater brain activity in the motor cortex and with faster learning of a sequence of finger movements. Additionally, those with higher GABA concentrations at the beginning tended to have slower reaction times and less brain activation during learning.

It’s simplistic to say that low GABA is good, however! GABA is a vital chemical. Interestingly, though, low GABA has been associated with stress — and of course, stress is associated with faster reaction times and relaxation with slower ones. The point is, we need it in just the right levels, and what’s ‘right’ depends on context. Which brings us back to ‘responsiveness’ — more important than actual level, is the ability of your brain to alter how much GABA it produces, in particular places, at particular times.

However, baseline levels are important, especially where something has gone wrong. GABA levels can change after brain injury, and also may decline with age. The findings support the idea that treatments designed to influence GABA levels might improve learning. Indeed, tDCS is already in use as a tool for motor rehabilitation in stroke patients — now we have an idea why it works.

[2202] Stagg, C J., Bachtiar V., & Johansen-Berg H.
(2011).  The Role of GABA in Human Motor Learning.
Current Biology. 21(6), 480 - 484.

Comparison of young adults (mean age 24.5) and older adults (mean age 69.1) in a visual memory test involving multitasking has pinpointed the greater problems older adults have with multitasking. The study involved participants viewing a natural scene and maintaining it in mind for 14.4 seconds. In the middle of the maintenance period, an image of a face popped up and participants were asked to determine its sex and age. They were then asked to recall the original scene.

As expected, older people had more difficulty with this. Brain scans revealed that, for both groups, the interruption caused their brains to disengage from the network maintaining the memory and reallocate resources to processing the face. But the younger adults had no trouble disengaging from that task as soon as it was completed and re-establishing connection with the memory maintenance network, while the older adults failed both to disengage from the interruption and to reestablish the network associated with the disrupted memory.

This finding adds to the evidence that an important (perhaps the most important) reason for cognitive decline in older adults is a growing inability to inhibit processing, and extends the processes to which that applies.

A study involving 125 younger (average age 19) and older (average age 69) adults has revealed that while younger adults showed better explicit learning, older adults were better at implicit learning. Implicit memory is our unconscious memory, which influences behavior without our awareness.

In the study, participants pressed buttons in response to the colors of words and random letter strings — only the colors were relevant, not the words themselves. They then completed word fragments. In one condition, they were told to use words from the earlier color task to complete the fragments (a test of explicit memory); in the other, this task wasn’t mentioned (a test of implicit memory).

Older adults showed better implicit than explicit memory, and better implicit memory than the younger adults, for whom the reverse was true. However, on a further test, which required the younger participants to engage in a number task simultaneously with the color task, younger adults behaved like older ones.

The findings indicate that shallower and less focused processing goes on during multitasking, and (but not inevitably!) with age. The fact that younger adults behaved like older ones when distracted points to the problem, for which we now have quite a body of evidence: with age, we tend to become more easily distracted.

Shrinking of the frontal lobe has been associated with age-related cognitive decline for some time. But other brain regions support the work of the frontal lobe. One in particular is the cerebellum. A study involving 228 participants in the Aberdeen Longitudinal Study of Cognitive Ageing (mean age 68.7) has revealed that there is a significant relationship between grey matter volume in the cerebellum and general intelligence in men, but not women.

Additionally, a number of other brain regions showed an association between gray matter and intelligence, in particular Brodmann Area 47, the anterior cingulate, and the superior temporal gyrus. Atrophy in the anterior cingulate has been implicated as an early marker of Alzheimer’s, as has the superior temporal gyrus.

The gender difference was not completely unexpected — previous research has indicated that the cerebellum shrinks proportionally more with age in men than women. More surprising was the fact that there was no significant association between white matter volume and general intelligence. This contrasts with the finding of a study involving older adults aged 79-80. It is speculated that this association may not develop until greater brain atrophy has occurred.

It is also interesting that the study found no significant relationship between frontal lobe volume and general intelligence — although the effect of cerebellar volume is assumed to occur via its role in supporting the frontal lobe.

The cerebellum is thought to play a vital role in three relevant areas: speed of information processing; variability of information processing; development of automaticity through practice.

The new label of ‘metabolic syndrome’ applies to those having three or more of the following risk factors: high blood pressure, excess belly fat, higher than normal triglycerides, high blood sugar and low high-density lipoprotein (HDL) cholesterol (the "good" cholesterol). Metabolic syndrome has been linked to increased risk of heart attack.

A new French study, involving over 7,000 older adults (65+), has found that those with metabolic syndrome were 20% more likely to show cognitive decline on the MMSE over a two- or four-year interval. They were also 13% more likely to show cognitive decline on a visual working memory test. Specifically, higher triglycerides and low HDL cholesterol were linked to poorer memory scores; diabetes (but not higher fasting blood sugar) was linked to poorer visual working memory and word fluency scores.

The findings point to the importance of managing the symptoms of metabolic syndrome.

High cholesterol and blood pressure in middle age tied to early memory problems

Another study, involving some 4800 middle-aged adults (average age 55), has found that those with higher cardiovascular risk were more likely to have lower cognitive function and a faster rate of cognitive decline over a 10-year period. A 10% higher cardiovascular risk was associated not only with increased rate of overall mental decline, but also poorer cognitive test scores in all areas except reasoning for men and fluency for women.

The cardiovascular risk score is based on age, sex, HDL cholesterol, total cholesterol, systolic blood pressure and whether participants smoked or had diabetes.

Memory problems may be sign of stroke risk

A very large study (part of the REGARDS study) tested people age 45 and older (average age 67) who had never had a stroke. Some 14,842 people took a verbal fluency test, and 17,851 people took a word recall memory test. In the next 4.5 years, 123 participants who had taken the verbal fluency test and 129 participants who had taken the memory test experienced a stroke.

Those who had scored in the bottom 20% for verbal fluency were 3.6 times more likely to develop a stroke than those who scored in the top 20%. For the memory test, those who scored in the bottom 20% were 3.5 times more likely to have a stroke than those in the top quintile.

The effect was greatest at the younger ages. At age 50, those who scored in the bottom quintile of the memory test were 9.4 times more likely to later have a stroke than those in the top quintile.

Together, these studies, which are consistent with many previous studies, confirm that cardiovascular problems and diabetes add to the risk of greater cognitive decline (and possibly dementia) in old age, and point to the importance of treating these problems as soon as they appear.

[2147] Raffaitin, C., Féart C., Le Goff M., Amieva H., Helmer C., Akbaraly T. N., et al.
(2011).  Metabolic syndrome and cognitive decline in French elders.
Neurology. 76(6), 518 - 525.

The findings of the second and third studies are to be presented at the American Academy of Neurology's 63rd Annual Meeting in Honolulu, April 9 to April 16, 2011.

Lesions of the brain microvessels include white-matter hyperintensities and the much less common silent infarcts leading to loss of white-matter tissue. White-matter hyperintensities are common in the elderly, and are generally regarded as ‘normal’ (although a recent study suggested we should be less blasé about them — that ‘normal’ age-related cognitive decline reflects the presence of these small lesions). However, the degree of white-matter lesions is related to the severity of decline (including increasing the risk of Alzheimer’s), and those with hypertension or diabetes are more likely to have a high number of them.

A new study has investigated the theory that migraines might also lead to a higher number of white-matter hyperintensities. The ten-year French population study involved 780 older adults (65+; mean age 69). A fifth of the participants (21%) reported a history of severe headaches, of which 71% had migraines.

Those with severe headaches were twice as likely to have a high quantity of white-matter hyperintensities as those without headaches. However, there was no difference in cognitive performance between the groups. Those who suffered from migraines with aura (2% of the total), also showed an increased number of silent cerebral infarcts — a finding consistent with other research showing that people suffering from migraine with aura have an increased risk of cerebral infarction (or strokes). But again, no cognitive decline was observed.

The researchers make much of their failure to find cognitive impairment, but I would note that, nevertheless, the increased number of brain lesions does suggest that, further down the track, there is likely to be an effect on cognitive performance. Still, headache sufferers can take comfort in the findings, which indicate the effect is not so great that it shows up in this decade-long study.

Another study has come out proclaiming the cognitive benefits of walking for older adults. Previously sedentary adults aged 55-80 who walked around a track for 40 minutes on three days a week for a year increased the size of their hippocampus, as well as their level of BDNF. Those assigned to a stretching routine showed no such growth. There were 120 participants in the study.

The growth of around 2% contrasts with the average loss of 1.4% hippocampal tissue in the stretching group — an amount of atrophy considered “normal” with age. Although both groups improved their performance on a computerized spatial memory test, the walkers improved more.

The findings are consistent with a number of animal studies showing aerobic exercise increases neurogenesis and BDNF in the hippocampus, and human studies pointing to a lower risk of cognitive decline and dementia in those who walk regularly.

[2097] Erickson, K. I., Voss M. W., Prakash R S., Basak C., Szabo A., Chaddock L., et al.
(2011).  Exercise training increases size of hippocampus and improves memory.
Proceedings of the National Academy of Sciences. 108(7), 3017 - 3022.

We have thought of memory problems principally in terms of forgetting, but using a new experimental method with amnesic animals has revealed that confusion between memories, rather than loss of memory, may be more important.

While previous research has found that amnesic animals couldn't distinguish between a new and an old object, the new method allows responses to new and old objects to be measured separately. Control animals, shown an object and then shown either the same or another object an hour later, spent more time (as expected) with the new object. However, amnesic animals spent less time with the new object, indicating they had some (false) memory of it.

The researchers concluded that the memory problems were the result of the brain's inability to register complete memories of the objects, and that the remaining, less detailed memories were more easily confused. In other words, it’s about poor encoding, not poor retrieval.

Excitingly, when the amnesic animals were put in a dark, quiet space before the memory test, they performed perfectly on the test.

The finding not only points to a new approach for helping those with memory problems (for example, emphasizing differentiating details), but also demonstrates how detrimental interference from other things can be when we are trying to remember something — an issue of particular relevance in modern information-rich environments. The extent to which these findings apply to other memory problems, such as dementia, remains to be seen.

In a study in which 78 healthy elders were given 5 different tests and then tested for cognitive performance 18 months later, two tests combined to correctly predict nearly 80% of those who developed significant cognitive decline. These tests were a blood test to identify presence of the ‘Alzheimer’s gene’ (APOE4), and a 5-minute fMRI imaging scan showing brain activity during mental tasks.

The gene test in itself correctly classified 61.5% of participants (aged 65-88; mean age 73), showing what a strong risk factor this is, but when taken with activity on the fMRI test, the two together correctly classified 78.9% of participants. Age, years of education, gender and family history of dementia were not accurate predictors of future cognitive decline. A smaller hippocampus was also associated with a greater risk of cognitive decline.

These two tests are readily available and not time-consuming, and may be useful in identifying those at risk of MCI and dementia.

Woodard, J.L.  et al. 2010. Prediction of Cognitive Decline in Healthy Older Adults using fMRI. Journal of Alzheimer’s Disease, 21 (3), 871-885.

Following on from previous studies showing that drinking beet juice can lower blood pressure, a study involving 14 older adults (average age 75) has found that after two days of eating a high-nitrate breakfast, which included 16 ounces of beet juice, blood flow to the white matter of the frontal lobes (especially between the dorsolateral prefrontal cortex and anterior cingulate cortex) had increased. This area is critical for executive functioning.

Poor blood flow in the brain is thought to be a factor in age-related cognitive decline and dementia.

High concentrations of nitrates are found in beets, as well as in celery, cabbage and other leafy green vegetables like spinach and some lettuce. When you eat high-nitrate foods, good bacteria in the mouth turn nitrate into nitrite. Research has found that nitrites can help open up the blood vessels in the body, increasing blood flow and oxygen specifically to places that are lacking oxygen.

A six-year study involving over 1200 older women (70+) has found that low amounts of albumin in the urine, at levels not traditionally considered clinically significant, strongly predict faster cognitive decline in older women. Participants with a urinary albumin-to-creatinine ratio of >5 mcg/mg at the start of the study experienced cognitive decline at a rate 2 to 7 times faster in all cognitive measures than that attributed to aging alone over an average 6 years of follow-up. The ability most affected was verbal fluency. Albuminuria may be an early marker of diffuse vascular disease.

Data from 19,399 individuals participating in the Renal Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, of whom 1,184 (6.1%) developed cognitive impairment over an average follow-up of 3.8 years, show that those with albuminuria were 1.31-1.57 times more likely to develop cognitive impairment compared to individuals without albuminuria. This association was strongest for individuals with normal kidney function. Conversely, low kidney function was associated with a higher risk for developing cognitive impairment only among individuals without albuminuria. Surprisingly, individuals with albuminuria and normal kidney function had a higher probability of developing cognitive impairment than individuals with moderate reductions in kidney function in the absence of albuminuria.

Both albuminuria and low kidney function are characteristics of kidney disease.

Lin, J., Grodstein, F., Kang, J.H. & Curhan, G. 2010. A Prospective Study of Albuminuria and Cognitive Decline in Women. Presented at ASN Renal Week 2010 on November 20 in Denver, CO.

Tamura, M.K. et al. 2010. Albuminuria, Kidney Function and the Incidence of Cognitive Impairment in US Adults. Presented at ASN Renal Week 2010 on November 20 in Denver, CO.

More evidence that vascular disease plays a crucial role in age-related cognitive impairment and Alzheimer’s comes from data from participants in the Alzheimer's Disease Neuroimaging Initiative.

The study involved more than 800 older adults (55-90), including around 200 cognitively normal individuals, around 400 people with mild cognitive impairment, and 200 people with Alzheimer's disease. The first two groups were followed for 3 years, and the Alzheimer’s patients for two. The study found that the extent of white matter hyperintensities (areas of damaged brain tissue typically caused by cardiovascular disease) was an important predictor of cognitive decline.

Participants whose white matter hyperintensities were significantly above average at the beginning of the study lost more points each year in cognitive testing than those whose white matter hyperintensities were average at baseline. Those with mild cognitive impairment or Alzheimer's disease at baseline had additional declines on their cognitive testing each year, meaning that the presence of white matter hyperintensities and MCI or Alzheimer's disease together added up to even faster and steeper cognitive decline.

The crucial point is that this was happening in the absence of major cardiovascular events such as heart attacks, indicating that it’s not enough to just reduce your cardiovascular risk factors to a moderate level — every little bit of vascular damage counts.

A simple new cognitive assessment tool with only 16 items appears potentially useful for identifying problems in thinking, learning and memory among older adults. The Sweet 16 scale is scored from zero to 16 (with 16 representing the best score) and includes questions that address orientation (identification of person, place, time and situation), registration, digit spans (tests of verbal memory) and recall. The test requires no props (not even pencil and paper) and is easy to administer with a minimum of training. It only takes an average of 2 minutes to complete.

A score of 14 or less correctly identified 80% of those with cognitive impairment (as identified by the Informant Questionnaire on Cognitive Decline in the Elderly, or IQCODE) and correctly identified 70% of those who did not have cognitive impairment. The standard MMSE, in comparison, correctly identified 64% of those with cognitive impairment and 86% of those who were not impaired.
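To make those trade-offs concrete (a worked illustration using only the percentages above, applied to a round-number sample of 100 impaired and 100 unimpaired people): the Sweet 16 would flag 80 of the impaired, missing 20, while wrongly flagging 30 of the unimpaired; the MMSE would flag only 64 of the impaired, missing 36, but would wrongly flag just 14 of the unimpaired. In screening terms, the Sweet 16 trades some specificity for greater sensitivity.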

Thus, the Sweet 16 seems to be a great ‘first cut’, since its bias is towards over-diagnosing impairment. It should also be remembered that the IQCODE is not the gold standard for cognitive impairment; its role here is to provide a basis for comparison between the new test and the more complex MMSE. In comparison with a clinician’s diagnosis, Sweet 16 scores of 14 or less occurred in 99% of patients diagnosed by a clinician as having cognitive impairment, and in 28% of those without such a diagnosis.

The great benefit of the new test is of course its speed and simplicity, and it seems to offer considerable promise as an initial screening tool. Another benefit is that it is reportedly unaffected by the patient’s education, unlike the MMSE. The tool is open access.

The Sweet 16 was developed using information from 774 patients who completed the MMSE, and then validated using a different group of 709 older adults.

Fong, T.G., Jones, R.N., Rudolph, J.L., Yang, F.M., Tommet, D., Habtemariam, D., et al. (2010). Development and Validation of a Brief Cognitive Assessment Tool: The Sweet 16. Archives of Internal Medicine, published online (doi:10.1001/archinternmed.2010.423).

There have been mixed findings about the benefits of DHA (an omega-3 fatty acid), but in a study involving 485 older adults (55+) with age-related cognitive impairment, those randomly assigned to take DHA for six months improved their scores on a test of visuospatial learning and episodic memory. Higher levels of DHA in the blood correlated with better scores on the paired associate learning task. DHA supplementation was also associated with better verbal recognition, but not with better working memory or executive function.

Other research has found no benefit from DHA to those already with Alzheimer’s, although those with Alzheimer’s tend to have lower levels of DHA in the blood. These findings reinforce the idea that the benefit of many proactive lifestyle strategies, such as diet and exercise, may depend mainly on their use before systems deteriorate.

The daily dose of algal DHA was 900 mg. The study took place at 19 clinical sites in the U.S., and those involved had an MMSE score greater than 26.

A study involving young (average age 22) and older adults (average age 77) showed participants pictures of overlapping faces and places (houses and buildings) and asked them to identify the gender of the person. While the young adults showed activity in the brain region for processing faces (the fusiform face area) but not in the brain region for processing places (the parahippocampal place area), both regions were active in the older adults. Additionally, on a surprise memory test 10 minutes later, older adults who showed greater activation in the place area were more likely to recognize which face was originally paired with which house.

These findings confirm earlier research showing that older adults become less capable of ignoring irrelevant information, and show that this distracting information doesn’t merely interfere with what you’re trying to attend to, but is encoded in memory along with that information.

Do retired people tend to perform more poorly on cognitive tests than working people because those whose mental skills are starting to decline are more likely to retire, or because retirement dulls the brain?

For nearly 20 years the United States has surveyed more than 22,000 Americans over age 50 every two years, and administered memory tests. A similar survey has also been taking place in Europe. A comparison of the 2004 data for the U.S., England, and eleven European countries (Austria, Belgium, Denmark, France, Germany, Greece, Italy, The Netherlands, Spain, Sweden, and Switzerland) has now revealed differences in the level of cognitive performance among older adults between the countries (the 60-64 year age group was used as it represents the greatest retirement-age difference between nations).

These differences show some correlation with differences in the age of retirement. Moreover, the differences also correlate to differences in government policy in terms of pensions — supporting the view that it is retirement that is causing the mental decline, not the decline that brings about early retirement.

Memory was tested through a simple word recall task: recalling a list of 10 nouns immediately and again 10 minutes later. People in the United States did best, with an average score of 11 out of a possible 20. Those in England were very close behind, and Denmark and Sweden were both around 10. Switzerland, Germany, the Netherlands, and Austria were all clustered between 9 and 9½; Belgium and Greece were a little lower. France averaged 8; Italy 7; Spain (the lowest) just over 6.

Now, when the average cognitive score is mapped against the percentage retired among 60-64-year-olds, the points for each country (with one exception) cluster around a line with a slope of -5, indicating a systematic relationship between these two variables: on average, being retired is associated with a memory score about 5 points lower on the 20-point scale. This is a very large effect.
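To read that slope (a back-of-the-envelope sketch using only the figures reported above, with retirement expressed as a proportion between 0 and 1):

predicted memory score ≈ a − 5 × (proportion retired)

So a country in which 90% of 60-64-year-olds are retired would be expected to average about 2.5 points lower (5 × 0.5) than one in which only 40% are retired, while a fully retired population would score about 5 points below a fully working one.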

But, unsurprisingly, the correlation is not exact. Although the top scorers (the U.S., England and Denmark) are among the nations with lower retirement rates at this age, Switzerland has the same level as the U.S., and Sweden has the fewest retired of all (around 40%, compared to around 47% for the U.S. and Switzerland). Most interesting of all, why does Spain, with around 74% retired, show such a low cognitive score, when five other countries have even higher rates of retirement (Austria has over 90% retired)?

There are of course many other differences between the countries. One obvious one to look at would be the degree to which older people who are not working for pay are involved in voluntary work. There’s also the question of the extent to which different countries might have different occupation profiles, assuming that some occupations are more mentally stimulating than others, and the degree to which retired people are engaged in other activities, such as hobbies and clubs.

The paper also raises an important point, namely, that retirement may be preceded by years of ‘winding-down’, during which workers become progressively more reluctant to keep up with changes in their field, and employers become increasingly reluctant to invest in their training.

Rohwedder, S., & Willis, R.J. (2010). Mental Retirement. Journal of Economic Perspectives, 24(1), 119-138.

In a study involving 15 young adults, a very small electrical current delivered to the scalp above the right anterior temporal lobe significantly improved their memory for the names of famous people (by 11%). Memory for famous landmarks was not affected. The findings support the idea that the anterior temporal lobes are critically involved in the retrieval of people's names.

A follow-up study is currently investigating whether transcranial direct current stimulation (tDCS) will likewise improve name memory in older adults — indeed, because their level of recall is likely to be lower, it is hoped that the procedure will have a greater effect. If so, the next question is whether repeating tDCS may lead to longer lasting improvement. The procedure may offer hope for rehabilitation for stroke or other neurological damage.

This idea receives support from another recent study, in which 15 students spent six days learning a series of unfamiliar symbols that corresponded to the numbers zero to nine, and also had daily sessions of tDCS. Five students were given 20 minutes of stimulation above the right parietal lobe; five had 20 minutes of stimulation above the left parietal lobe; and five experienced only 30 seconds of stimulation — too short to induce any permanent changes.

The students were tested on the new number system at the end of each day. After four days, those who had experienced current to the right parietal lobe performed as well as they would be expected to do with normal numbers. However, those who had experienced the stimulation to the left parietal lobe performed significantly worse. The control students performed at a level between the two other groups.

Most excitingly, when the students were tested six months later, they performed at the same level, indicating the stimulation had a durable effect. However, it should be noted that the effects were small and highly variable, and were limited to the new number system. While it may be that one day this sort of approach will be of benefit to those with dyscalculia, more research is needed.

I love cognitive studies on bees. The whole notion that those teeny-tiny brains are capable of the navigation and communication feats bees demonstrate is so wonderful. Now a new study finds that, just like us, aging bees find it hard to remember the location of a new home.

The study builds on earlier lab research demonstrating that old bees find it harder to learn floral odors. In this new study, researchers trained bees to a new nest box while their former nest was closed off. Groups composed of mature and old bees were given several days in which to learn the new home location and to extinguish their memory of the unusable former nest box. The new home was then disassembled, and groups of mixed-age bees were given three alternative nest locations to choose from (including the former nest box). Some old bees (those with symptoms of senescence) preferentially went to the former nest site, despite the experience that should have told them it was unusable.

The findings demonstrate that memory problems and increasing inflexibility with age are not problems confined to mammals.

Following on from earlier research suggesting that simply talking helps keep your mind sharp at all ages, a new study involving 192 undergraduates indicates that the type of talking makes a difference. Engaging in brief (10-minute) conversations in which participants were simply instructed to get to know another person resulted in boosts to their executive function (the processes involved in working memory, planning, decision-making, and so on). However, when participants engaged in conversations that had a competitive edge, their performance showed no improvement. The improvement was limited to executive function; neither processing speed nor general knowledge was affected.

Further experiments indicated that competitive discussion could boost executive function — if the conversations were structured to allow for interpersonal engagement. The crucial factor seems to be the process of getting into another person’s mind and trying to see things from their point of view (something most of us do naturally in conversation).

The findings also provide support for the social brain hypothesis — that we evolved our larger brains to help us deal with large social groups. They also support earlier speculations by the researcher, that parents and teachers could help children improve their intellectual skills by encouraging them to develop their social skills.

A long-running study involving 299 older adults (average age 78) has found that those who walked at least 72 blocks during a week of recorded activity (around six to nine miles) had greater gray matter volume nine years later. Gray matter does shrink as we get older, so this is not about growth so much as counteracting decline. Walking more than 72 blocks didn’t appear to confer any additional benefit (in terms of gray matter volume). Moreover, when assessed four years after that, those who had shown this greater gray matter volume were only half as likely to have developed dementia (40% of the participants had developed dementia by this point).

Beginning in 1971, healthy older adults in Gothenburg, Sweden, have been participating in a longitudinal study of their cognitive health. The first H70 study started in 1971 with 381 residents of Gothenburg who were 70 years old; a new one began in 2000 with 551 residents and is still ongoing. For the first cohort (born in 1901-02), low scores on non-memory tests turned out to be a good predictor of dementia; however, these tests were not predictive for the generation born in 1930. Those from the later cohort also performed better in the intelligence tests at age 70 than their predecessors had.

It’s suggested that the higher intelligence is down to the later cohort’s better pre- and postnatal care, better nutrition, higher-quality education, better treatment of high blood pressure and cholesterol, and possibly the cognitive demands of modern life.

Nevertheless, the researchers reported that the incidence of dementia at age 75 was little different (5% in the first cohort and 4.4% in the later). However, since a substantially greater proportion of the first cohort had died by that age (15.7%, compared to 4.4% of the second cohort), it seems quite probable that there really was a higher incidence of dementia in the earlier cohort.

The fact that low scores on non-memory cognitive tests were predictive in the first cohort of both dementia and death by age 75 supports this argument.

The fact that low scores on non-memory cognitive tests were not predictive of dementia or death in the later cohort is in keeping with the evidence that higher levels of education help delay dementia. We will need to wait for later findings from this study to see whether that is what is happening.

The findings are not inconsistent with those from a very large U.S. national study that found older adults (70+) are now less likely to be cognitively impaired (see below). It was suggested then also that better healthcare and more education were factors behind this decline in the rate of cognitive impairment.

Previous study:

A new nationally representative study involving 11,000 people shows a downward trend in the rate of cognitive impairment among people aged 70 and older, from 12.2% to 8.7% between 1993 and 2002. It’s speculated that factors behind this decline may be that today’s older people are much likelier to have had more formal education, higher economic status, and better care for risk factors such as high blood pressure, high cholesterol and smoking that can jeopardize their brains. In fact the data suggest that about 40% of the decrease in cognitive impairment over the decade was likely due to the increase in education levels and personal wealth between the two groups of seniors studied at the two time points. The trend is consistent with a dramatic decline in chronic disability among older Americans over the past two decades.

Recent rodent studies add to our understanding of how estrogen affects learning and memory. A study found that adult female rats took significantly longer to learn a new association when they were in periods of their estrus cycle with high levels of estrogen, compared to their ability to learn when their estrogen level was low. The effect was not found among pre-pubertal rats. The study follows on from an earlier study using rats with their ovaries removed, whose learning was similarly affected when given high levels of estradiol.

Human females have high estrogen levels while they are ovulating. These high levels have also been shown to interfere with women's ability to pay attention.

On the other hand, it needs to be remembered that estrogen therapy has been found in some studies to help menopausal and post-menopausal women, and in others to be detrimental. Recent research has suggested that timing is important, and it’s been proposed that a critical period exists during which hormone therapy must be administered if it is to improve cognitive function.

This finds some support in another recent rodent study, which found that estrogen replacement increased long-term potentiation (a neural event that underlies memory formation) in young adult rats with their ovaries removed, through its effects on NMDA receptors and dendritic spine density — but only if given within 15 months of the ovariectomy. By 19 months, the same therapy couldn’t induce the changes.

Previous research has indicated that obesity in middle age is linked to higher risk of cognitive decline and dementia in old age. Now a study of 32 middle-aged adults (40-60) has revealed that although obese, overweight and normal-weight participants all performed equally well on a difficult cognitive task (a working memory task called the 2-Back task), obese individuals displayed significantly lower activation in the right inferior parietal cortex. They also had lower insulin sensitivity than their normal-weight and overweight peers (poor insulin sensitivity may ultimately lead to diabetes). Analysis pointed to impaired insulin sensitivity as the mediator of the relationship between BMI and task-related activation in that region.

This suggests that it is insulin sensitivity that is responsible for the higher risk of cognitive impairment later in life. The good news is that insulin sensitivity can be modified through exercise and diet.

A follow-up study to determine if a 12-week exercise intervention can reverse the differences is planned.

Inflammation in the brain appears to be a key contributor to age-related memory problems, and it may be that this has to do with the dysregulation of microglia that, previous research has shown, occurs with age. Microglia, specialized support cells in the brain, normally produce cytokines when there’s an infection; with age, they start to produce them excessively, and some of these cytokines produce the typical behaviors that accompany illness (sleepiness, appetite loss, cognitive deficits and depression).

Now new cell and mouse studies suggest that the flavonoid luteolin, known to have anti-inflammatory properties, apparently has these benefits because it acts directly on the microglial cells to reduce their production of inflammatory cytokines. Although microglia exposed to a bacterial toxin produced inflammatory cytokines that killed neurons, if the microglia were first exposed to luteolin, the neurons lived. Exposing the neurons themselves to luteolin had no effect.

Old mice fed a luteolin-supplemented diet for four weeks did better on a working memory test than old mice on an ordinary diet, and the levels of inflammatory cytokines in their brains were restored to those of younger mice.

Luteolin is found in many plants, including carrots, peppers, celery, olive oil, peppermint, rosemary and chamomile.

A long-running study involving 1,157 healthy older adults (65+), who were scored on a 5-point scale according to how often they participated in mental activities such as listening to the radio, watching television, reading, playing games and going to a museum, has found that this score correlates with the rate of cognitive decline in later years.

Some 5 ½ years after this initial evaluation, 395 (34%) were found to have mild cognitive impairment and 148 (13%) to have Alzheimer’s. Participants were then tested at 3-yearly intervals for the next 6 years. The rate of cognitive decline in those without cognitive impairment was reduced by 52% for each point on the cognitive activity scale, but for those with Alzheimer's disease, the average rate of decline per year increased by 42% for each point on the cognitive activity scale. Rate of decline was unrelated to earlier cognitive activity in those with MCI (presumably they were at the balance point).

This is not terribly surprising when you think about it, if you assume that the benefit of mental stimulation is to improve your brain function so that it can better cope with the damage happening to it. But eventually the brain reaches the point where it can no longer compensate, because the damage is so overwhelming.

Findings from the long-running Religious Orders Study, in which 354 Catholic nuns and priests were given annual cognitive tests for up to 13 years before having their brains examined post-mortem, have revealed that even the very early cognitive impairments we regard as normal in aging are associated with dementia pathology. Although pathology in the form of neurofibrillary tangles, Lewy bodies, and cerebral infarctions was associated with rapid decline, it was also associated with “normal” mild impairment. In the absence of any of these lesions, there was almost no cognitive decline.

Previous research has shown that white matter lesions are very common in older adults, and mild cognitive impairment is more likely in those with quickly growing white matter lesions; importantly, the crucial factor appears to be the rate of growth, not the number of lesions. This new study extends the finding, suggesting that any age-related cognitive impairment reflects the sort of brain pathology that ultimately leads to dementia (if given enough time). It suggests that we should be more proactive in fighting such damage, instead of simply regarding it as normal.

Type 2 diabetes is known to increase the risk of cognitive impairment in old age. Now analysis of data from 41 older diabetics (aged 55-81) and 458 matched controls in the Victoria Longitudinal Study has revealed that several other factors make it more likely that an older diabetic will develop cognitive impairment. These factors are: having higher (though still normal) blood pressure, having gait and balance problems, and/or reporting yourself to be in bad health regardless of actual problems.

Diabetes and hypertension often go together, and both are separately associated with greater cognitive impairment and dementia risk, so it is not surprising that higher blood pressure is one of the significant factors that increases risk. The other factors are less expected, although gait and balance problems have been linked to cognitive impairment in a recent study, and they may be connected to diabetes through diabetes’ effect on nerves. Negativity about one’s health may reflect emotional factors such as anxiety, stress, or depression, although depression and well-being measures were not themselves found to mediate cognitive impairment in diabetics. (Do note that this study is not investigating which factors, in general, are associated with age-related cognitive impairment; it is trying to establish which factors are specifically sensitive to cognitive impairment in older diabetics.)

In the U.S., type 2 diabetes occurs in over 23% of those over 60; in Canada (where this study took place) the rate is 19%. It should be noted that the participants in this study are not representative of the general population, in that they were fairly well-educated older Canadians, most of whom have benefited from a national health care system. Moreover, the study did not have longitudinal data on these various factors, meaning that we don’t know the order of events (which health problems come first? How long between the development of the different problems?). Nevertheless, the findings provide useful markers to alert diabetics and health providers.

Reports on cognitive decline with age have, over the years, come out with two contradictory general findings: older adults do significantly worse than younger adults; older adults are just as good as younger adults. Part of the problem is that there are two different approaches to studying this, each with its own specific bias. You can keep testing the same group of people as they get older — the problem with this is that they get more and more practiced, which mitigates the effects of age. Or you can test different groups of people, comparing older with younger — but cohort differences (e.g., educational background) may disadvantage the older generations. There is also argument about when decline starts: some studies suggest we start declining in our 20s, others in our 60s.

One of my favorite cognitive aging researchers has now tried to find the true story using data from the Virginia Cognitive Aging Project involving nearly 3800 adults aged 18 to 97 tested on reasoning, spatial visualization, episodic memory, perceptual speed and vocabulary, with 1616 tested at least twice. This gave a nice pool for both cross-sectional and longitudinal comparison (retesting ranged from 1 to 8 years and averaged 2.5 years).

From this data, Salthouse has estimated the size of practice effects and found them to be as large as or larger than the annual cross-sectional differences, although they varied depending on the task and the participant’s age. In general the practice effect was greater for younger adults, possibly because younger people learn better.

Once the practice-related "bonus points" were removed, age trends were flattened, with much less positive changes occurring at younger ages, and slightly less negative changes occurring at older ages. This suggests that change in cognitive ability over an adult lifetime (ignoring the effects of experience) is smaller than we thought.

Following on from indications that gum disease might be a risk factor for dementia, analysis of data from 152 subjects in the Danish Glostrup Aging Study has revealed that periodontal inflammation at age 70 was strongly associated with lower cognitive scores (on the Digit Symbol Test). Those with periodontal inflammation were nine times more likely to test in the lower range compared to those with little or no periodontal inflammation. A larger follow-up study, among a more ethnically diverse range of subjects, is planned. I hope they also plan to extend the cognitive testing.

The findings were presented by Dr. Angela Kamer at the 2010 annual meeting of the International Association for Dental Research July 16, in Barcelona, Spain.

A two-year study involving 271 older adults (70+) with mild cognitive impairment has found that the rate of brain atrophy in those taking folic acid (0.8 mg/d), vitamin B12 (0.5 mg/d) and vitamin B6 (20 mg/d) was significantly slower than in those taking a placebo, with those taking the supplements experiencing on average 30% less brain atrophy. Higher rates of atrophy were associated with lower cognitive performance. Moreover, those with the highest levels of homocysteine at the beginning of the trial benefited the most, showing 50% less brain shrinkage. High levels of homocysteine are a risk factor for Alzheimer’s, and folate, B12 and B6 help regulate it.

The finding that atrophy can be slowed in those with MCI offers hope that the treatment could delay the development of Alzheimer’s, since MCI is a major risk factor for Alzheimer’s, and faster brain atrophy is typical of those who go on to develop Alzheimer’s.

Commercial use is a long way off, but research with mice offers hope for a ‘smart drug’ that doesn’t have the sort of nasty side-effects that, for example, amphetamines have. The mice, genetically engineered to produce dramatically less (70% less) kynurenic acid, had markedly better cognitive abilities. The acid, unusually, is produced not in neurons but in glia, and abnormally high levels are produced in the brains of people with disorders such as schizophrenia, Alzheimer's and Huntington's. More acid is also typically produced as we get older.

The acid is produced in our brains after we’ve eaten food containing the amino acid tryptophan, which helps us produce serotonin (turkey is a food well-known for its high tryptophan levels). But serotonin helps us feel good (low serotonin levels are linked to depression), so the trick is to block the production of kynurenic acid without reducing the levels of serotonin. The next step is therefore to find a chemical that blocks production of the acid in the glia, and can safely be used in humans. Although no human tests have yet been performed, several major pharmaceutical companies are believed to be following up on this research.

A number of studies have found evidence that fruits and vegetables help fight age-related cognitive decline, and this has been thought to be due to their antioxidant and anti-inflammatory effects. A new study shows there may be an additional reason why polyphenols benefit the aging brain. One reason why the brain works less effectively as it gets older is that the cells (microglia) that remove and recycle biochemical debris not only fail to do their housekeeping work, but actually begin to damage healthy cells. Polyphenols restore normal housekeeping by inhibiting the action of a protein that shuts down the housekeeping (autophagy) process.

While many fruits and vegetables are good sources of polyphenols, berries and walnuts, and fruit and vegetables with deep red, orange, or blue colors, are particularly good.

Poulose, S. & Joseph, J. 2010. Paper presented at the 240th National Meeting of the American Chemical Society.

I have often spoken of the mantra: What’s good for your heart is good for your brain. The links between cardiovascular risk factors and cognitive decline get more confirmation in this latest finding that people whose hearts pumped less blood had smaller brains than those whose hearts pumped more blood. The study involved 1,504 participants of the decades-long Framingham Offspring Cohort who did not have a history of stroke, transient ischemic attack or dementia. Participants were 34 to 84 years old.

Worryingly, it wasn’t simply those with the least amount of blood pumping from the heart who had significantly more brain atrophy (equivalent to almost two years more brain aging) than the people with the highest cardiac index. Those with levels at the bottom end of normal showed similar levels of brain atrophy. Moreover, although only 7% of the participants had heart disease, 30% had a low cardiac index.

A study involving 65 older adults (59-80), who were very sedentary before the study (reporting less than two episodes of physical activity lasting 30 minutes or more in the previous six months), has found that those who joined a walking group improved their cognitive performance and the connectivity in important brain circuits after a year. However, those who joined a stretching and toning group showed no such improvement. The walking program involved three 40-minute walks at a moderate pace every week. The two affected brain circuits (the default mode network and the fronto-executive network) typically become less connected with age. It is worth emphasizing that the improvement was not evident at the first test, after six months, but only at the second 12-month test.

Interestingly, I noticed in the same journal issue a study into the long-term benefits of dancing for older adults. The study compared physical and cognitive performance of those who had engaged in amateur dancing for many years (average: 16.5 years) and those with no dancing or sporting engagement. The dancing group were overall significantly better than the other group on all tests: posture, balance, reaction time, motor behavior, cognitive performance. However, the best dancers weren’t any better than individuals in the other group; the group difference arose because none of the dancers performed poorly, while many of the other group did.

A very large study of older women has found that although there was a small downward trend in cognitive function (as measured by the MMSE) with increasing obesity, this trend was almost entirely driven by those with a waist-hip ratio below 0.78 — that is, by women who carry excess weight around their hips, known as pear shapes (as opposed to carrying it around the waist, called apple shapes). The study of 8,745 post-menopausal women (aged 65-79) found a drop of around 2 points on the 100-point modified MMSE for those with a BMI over 40 compared to those of normal weight, after controlling for such variables as education, diabetes, heart disease and hypertension, all of which were also significantly associated with BMI and MMSE score. Because 86% of the participants were white, and women belonging to other ethnic groups were not equally distributed between BMI categories, only data from white women were used. Some 70% of the participants were overweight (36%) or obese (34%).

Fat around the middle is thought to make more estrogen, which protects cognitive function. However, although depositing fat around the waist may be better for the brain, it is said to increase the risk of cancer, diabetes and heart disease.

Anticholinergics are widely used for a variety of common medical conditions including insomnia, allergies, and incontinence, and many are sold over the counter. Now a large six-year study of older African-Americans has found that taking one anticholinergic significantly increased an individual's risk of developing mild cognitive impairment, and taking two of these drugs doubled the risk. The risk was greater for those who didn’t have the ‘Alzheimer’s gene’, APOE-e4.

This class of drugs includes Benadryl®, Dramamine®, Excedrin PM®, Nytol®, Sominex®, Tylenol PM®, Unisom®, Paxil®, Detrol®, Demerol® and Elavil® (for a more complete list of medications with anticholinergic effects, go to http://www.indydiscoverynetwork.org/AnticholienrgicCognitiveBurdenScale....).

While brain training programs can certainly improve your ability to do the task you’re practicing, there has been little evidence that this transfers to other tasks. In particular, the holy grail has been very broad transfer, through improvement in working memory. While there has been some evidence of this in pilot programs for children with ADHD, a new study is the first to show such improvement in older adults using a commercial brain training program.

A study involving 30 healthy adults aged 60 to 89 has demonstrated that ten hours of training on a computer game designed to boost visual perception improved perceptual abilities significantly, and also increased the accuracy of their visual working memory to the level of younger adults. There was a direct link between improved performance and changes in brain activity in the visual association cortex.

The computer game was one of those developed by Posit Science. Memory improvement was measured about one week after the end of training. The improvement did not, however, withstand multi-tasking, which is a particular problem for older adults. The participants, half of whom underwent the training, were college educated. The training challenged players to discriminate between two different shapes of sine waves (S-shaped patterns) moving across the screen. The memory test (which was performed before and after training) involved watching dots move across the screen, followed by a short delay and then re-testing for the memory of the exact direction the dots had moved.

A review of the many recent studies into the effects of music training on the nervous system strongly suggests that the neural connections made during musical training also prime the brain for other aspects of human communication, including learning. It’s suggested that actively engaging with musical sounds not only helps the plasticity of the brain, but also helps provide a stable scaffolding of meaningful patterns. Playing an instrument primes the brain to choose what is relevant in a complex situation. Moreover, it trains the brain to make associations between complex sounds and their meaning — something that is also important in language. Music training can provide skills that enable speech to be better heard against background noise — useful not only for those with some hearing impairment (it’s a common difficulty as we get older), but also for children with learning disorders. The review concludes that music training tones the brain for auditory fitness, analogous to the way physical exercise tones the body, and that the evidence justifies serious investment in music training in schools.

Kraus, N., & Chandrasekaran, B. (2010). Music training for the development of auditory skills. Nature Reviews Neuroscience, 11(8), 599-605.

A rat study demonstrates how specialized brain training can reverse many aspects of normal age-related cognitive decline in targeted areas. The month-long study involved daily hour-long sessions of intense auditory training targeted at the primary auditory cortex. The rats were rewarded for picking out the oddball note in a rapid sequence of six notes (five of them of the same pitch). The difference between the oddball note and the others became progressively smaller. After the training, aged rats showed substantial reversal of their previously degraded ability to process sound. Moreover, measures of neuron health in the auditory cortex had returned to nearly youthful levels.

Another study has come out showing that older adults with low levels of vitamin D are more likely to have cognitive problems. The six-year study followed 858 adults who were age 65 or older at the beginning of the study. Those who were severely deficient in vitamin D were 60% more likely to show substantial cognitive decline, and 31% more likely to show specific declines in executive function, although there was no association with attention. Vitamin D deficiency is common in older adults in the United States and Europe (estimates range from 40% to 100%!), and has been implicated in a wide variety of physical diseases.

Subjective cognitive impairment (SCI), in which a person recognizes that, for example, they can’t remember names or where they recently placed important objects as well as they used to, is experienced by between one-quarter and one-half of the population over the age of 65. A seven-year study involving 213 adults (mean age 67) has found that healthy older adults reporting SCI are dramatically more likely to progress to MCI or dementia than those free of SCI (54% vs 15%). Moreover, those who had SCI declined significantly faster.

Reisberg, B. et al. 2010. Outcome over seven years of healthy adults with and without subjective cognitive impairment. Alzheimer's & Dementia, 6 (1), 11-24.

A German study involving nearly 4000 older adults (55+) has found that physical activity significantly reduced the risk of developing mild cognitive impairment over a two-year period. Nearly 14% of those with no physical activity at the start of the study developed cognitive impairment, compared to 6.7% of those with moderate activity, and 5.1% of those with high activity. Moderate activity was defined as less than 3 times a week.

In another report, a study involving 1,324 individuals without dementia found those who reported performing moderate exercise during midlife or late life were significantly less likely to have MCI. Midlife moderate exercise was associated with 39% reduction in the odds of developing MCI, and moderate exercise in late life was associated with a 32% reduction. Light exercise (such as bowling, slow dancing or golfing with a cart) or vigorous exercise (including jogging, skiing and racquetball) were not significantly associated with reduced risk for MCI.

And a clinical trial involving 33 older adults (55-85) with MCI has found that women who exercised at high intensity levels with an aerobics trainer for 45 to 60 minutes per day, four days per week, significantly improved performance on multiple tests of executive function, compared to those who engaged in low-intensity stretching exercises. The results for men were less significant: high-intensity aerobics was associated only with improved performance on one cognitive task, Trail Making Test B, a test of visual attention and task-switching.

A study (“Midlife in the United States”) assessing 3,343 men and women aged 32-84 (mean age 56), of whom almost 40% had at least a 4-year college degree, has found evidence that frequent cognitive activity can counteract the detrimental effect of poor education on age-related cognitive decline. Although, as expected, those with higher education engaged in cognitive activities more often and did better on the memory tests, those with lower education who engaged in reading, writing, attending lectures, or doing word games or puzzles once a week or more had episodic memory scores similar to those of people with more education (although this effect did not occur for executive functioning).

Lachman, M.E., Agrigoroaei, S., Murphy, C., & Tun, P.A. (2010). Frequent cognitive activity compensates for education differences in episodic memory. The American Journal of Geriatric Psychiatry, 18(1), 4-10.

Previous research has shown that older adults are more likely to incorrectly repeat an action in situations where a prospective memory task has become habitual — for example, taking more medication because they’ve forgotten they’ve already taken it. A new study has found that doing something unusual at the same time helps seniors remember having done the task. In the study, older adults told to put a hand on their heads whenever they made a particular response, reduced the level of repetition errors to that of younger adults. It’s suggested that doing something unusual, like knocking on wood or patting yourself on the head, while taking a daily dose of medicine may be an effective strategy to help seniors remember whether they've already taken their daily medications.

It’s now well established that older brains tend to find it harder to filter out irrelevant information. But a new study suggests that that isn’t all bad. The study compared the performance of 24 younger adults (17-29) and 24 older adults (60-73) on two memory tasks separated by a 10-minute break. In the first task, they were shown pictures overlapped by irrelevant words, told to ignore the words and concentrate on the pictures only, and to respond every time the same picture appeared twice in a row. The second task required them to remember how the pictures and words were paired together in the first task. The older adults showed a 30% advantage over younger adults in their memory for the preserved pairs. It’s suggested that older adults encode extraneous co-occurrences in the environment and transfer this knowledge to subsequent tasks, improving their ability to make decisions.

Campbell, K.L., Hasher, L., & Thomas, R.C. (2010). Hyper-binding: a unique age effect. Psychological Science, 21(3), 399-405.

Full text available at http://pss.sagepub.com/content/early/2010/01/15/0956797609359910.full

A study involving 155 women aged 65-75 has found that those who participated in resistance training once or twice weekly for a year significantly improved their selective attention (maintaining mental focus) and conflict resolution (as well as muscular function of course!), compared to those who participated in twice-weekly balance and tone training. Performance on the Stroop test improved by 12.6% and 10.9% in the once-weekly and twice-weekly resistance training groups respectively, while it deteriorated by 0.5% in the balance and tone group. Improved attention and conflict resolution was also significantly associated with increased gait speed.

A number of rodent studies have shown that blueberries can improve aging memory; now for the first time, a human study provides evidence. In the small study, nine older adults (mean age 76) with mild cognitive impairment (MCI) drank the equivalent of 2 to 2½ cups of a commercially available blueberry juice every day. After three months they showed significantly improved paired associate learning and word list recall. The findings will of course have to be confirmed by larger trials, but they are consistent with other research.

A companion study involving 12 older adults (75-80) with MCI found that those who drank a pure variety of Concord grape juice for 12 weeks also saw their performance progressively improve on tests in which they had to learn lists and remember items placed in a certain order.

A study involving over 1000 older men and women (60-75) with type-2 diabetes has found that those with higher levels of the stress hormone cortisol in their blood are more likely to have experienced cognitive decline. Higher fasting cortisol levels were associated with greater estimated cognitive decline in general intelligence, working memory and processing speed. This was independent of mood, education, metabolic variables and cardiovascular disease. Strategies aimed at lowering stress levels may be helpful for older diabetics.

More data from the National Survey of Midlife Development in the United States has revealed that cognitive abilities reflect how old you feel more than how old you actually are. Of course, that may be because cognitive ability contributes to a person’s wellness and energy. But it may also reflect the benefits of trying to maintain a sense of youthfulness by keeping up with new trends and activities that feel invigorating.

Schafer, M.H., & Shippee, T.P. (2009). Age Identity, Gender, and Perceptions of Decline: Does Feeling Older Lead to Pessimistic Dispositions About Cognitive Aging? The Journals of Gerontology Series B: Psychological Sciences and Social Sciences, 65B(1), 91-96.

A long-running study involving 930 70-year-old Swedish men has found that those who were among the bottom 25% on the Trail Making Test B were three times more likely to have a stroke or a brain infarction compared to those in the top 25%. Performance on the Trail Making Test A and the MMSE did not predict brain infarction or stroke. Test B measures the ability to execute and modify a plan, while Test A measures attention and visual-motor abilities, and the MMSE is a standard test of general cognitive decline.

Following on from studies showing that a Mediterranean-like diet may be associated with a lower risk of Alzheimer's disease and may lengthen survival in people with Alzheimer's, a six-year study of 712 New Yorkers has revealed that those who were most closely following a Mediterranean-like diet were 36% less likely to have brain infarcts (small areas of dead tissue linked to thinking problems), compared to those who were least following the diet. Those moderately following the diet were 21% less likely to have brain damage. The association was comparable to the effects of high blood pressure — that is, not eating a Mediterranean-like diet was like having high blood pressure. The Mediterranean diet includes high intake of vegetables, legumes, fruits, cereals, fish and monounsaturated fatty acids such as olive oil; low intake of saturated fatty acids, dairy products, meat and poultry; and mild to moderate amounts of alcohol.

The study will be presented at the American Academy of Neurology's 62nd Annual Meeting in Toronto April 10 to April 17, 2010.

A study involving 54 older adults (66-76) and 58 younger adults (18-35) challenges the idea that age itself causes people to become more risk-averse and to make poorer decisions. Analysis revealed that it is individual differences in processing speed and memory that affect decision quality, not age. The stereotype has arisen no doubt because more older people process slowly and have poorer memory. The finding points to the need to identify ways in which to present information that reduces the demand on memory or the need to process information very quickly, to enable those in need of such help (both young and old) to make the best choices. Self-knowledge also helps — recognizing if you need to take more time to make a decision.

A rhesus monkey study has revealed which dendritic spines are lost with age, providing a new target for therapies to help prevent age-associated cognitive impairment. It appears that it is the thin, dynamic spines in the dorsolateral prefrontal cortex, which are key to learning new things, establishing rules, and planning, that are lost. Learning of a new task was correlated with both synapse density and average spine size, but was most strongly predicted by the head volume of thin spines. There was no correlation with size or density of the large, mushroom-shaped spines, which were very stable across age and probably mediate long-term memories, enabling the retention of expertise and skills learned early in life. There was no correlation with any of these spine characteristics once the task was learned. The findings underscore the importance of building skills and broad expertise when young.

A large longitudinal study, comparing physical activity at teenage, age 30, age 50, and late life against cognition of 9,344 women, has revealed that women who are physically active at any point have a lower risk of cognitive impairment in late-life compared to those who are inactive, but teenage physical activity is the most important. When age, education, marital status, diabetes, hypertension, depressive symptoms, smoking, and BMI were accounted for, only teenage physical activity status remained significantly associated with cognitive performance in old age. Although becoming active later in life didn’t make up for being inactive in adolescence, it did significantly reduce the risk of cognitive impairment compared to those who remained physically inactive. The findings are a strong argument for greater effort in increasing physical activity in today's youth.

A special supplement in the Journal of Alzheimer's Disease focuses on the effects of caffeine on dementia and age-related cognitive decline. Here are the highlights:

A mouse study has found memory restoration and lower levels of amyloid-beta in Alzheimer’s mice following only 1-2 months of caffeine treatment. The researchers talk of “a surprising ability of moderate caffeine intake to protect against or treat AD”, and define moderate intake as around 5 cups of coffee a day(!).

A review of studies into the relation between caffeine intake, diabetes, cognition and dementia concludes that the indications that coffee/caffeine consumption is associated with a decreased risk of type 2 diabetes, and possibly also with a decreased dementia risk, cannot yet be confirmed with any certainty.

A study involving 351 older adults without dementia found the association between caffeine intake and cognitive performance disappeared once socioeconomic status was taken into account.

A study involving 641 older adults found caffeine consumption was significantly associated with less cognitive decline for women only. Supporting this, white matter lesions were significantly fewer in women consuming more than 3 units of caffeine per day (after adjustment for age) than in women consuming less.

A Portuguese study involving 648 older adults found that caffeine intake was associated with a lower risk of cognitive decline in women, but not significantly in men.

A review of published studies examining the relation between caffeine intake and cognitive decline or dementia shows a trend towards a protective effect of caffeine, but because of the limited number of epidemiological studies, and the methodological differences between them, is unable to come up with a definitive conclusion.

A review of published epidemiological studies looking at the association between caffeine intake and Parkinson’s Disease confirms that higher caffeine intake is associated with a lower risk of developing Parkinson’s Disease (though this association may be stronger for men than women). Other studies provide evidence of caffeine’s potential in treatment, improving both the motor deficits and non-motor symptoms of Parkinson’s.

Arendash, G.W. & Cao, C. Caffeine and Coffee as Therapeutics Against Alzheimer’s Disease. Journal of Alzheimer's Disease, 20 (Supp 1), 117-126.
Biessels, G.J. Caffeine, Diabetes, Cognition, and Dementia. Journal of Alzheimer's Disease, 20 (Supp 1), 143-150.
Kyle, J., Fox, H.C. & Whalley, L.J. Caffeine, Cognition, and Socioeconomic Status. Journal of Alzheimer's Disease, 20 (Supp 1), 151-159.
Ritchie, K. et al. Caffeine, Cognitive Functioning, and White Matter Lesions in the Elderly: Establishing Causality from Epidemiological Evidence. Journal of Alzheimer's Disease, 20 (Supp 1), 161-161
Santos, C. et al. Caffeine Intake is Associated with a Lower Risk of Cognitive Decline: A Cohort Study from Portugal. Journal of Alzheimer's Disease, 20 (Supp 1), 175-185.
Santos, C. et al. Caffeine Intake and Dementia: Systematic Review and Meta-Analysis. Journal of Alzheimer's Disease, 20 (Supp 1), 187-204.
Costa, J. et al. Caffeine Exposure and the Risk of Parkinson’s Disease: A Systematic Review and Meta-Analysis of Observational Studies. Journal of Alzheimer's Disease, 20 (Supp 1), 221-238.
Prediger, R.D.S. Effects of Caffeine in Parkinson’s Disease: From Neuroprotection to the Management of Motor and Non-Motor Symptoms. Journal of Alzheimer's Disease, 20 (Supp 1), 205-220.

Studies on the roundworm C. elegans have revealed that the molecules required for learning and memory are the same from C. elegans to mammals, suggesting that the basic mechanisms underlying learning and memory are ancient, and that this animal can serve as a testing ground for treatments for age-related memory loss. Intriguingly, a comparison of two known regulators of longevity — reducing calorie intake and reducing activity in the insulin-signaling pathway (achieved through genetic manipulation) — has found that these two treatments produce very different effects on memory. While dietary restriction impaired memory in early adulthood, it maintained memory with age. On the other hand, reduced insulin signaling improved early adult memory performance but failed to preserve it with age. These different effects appear to be linked to expression of CREB, a protein known to be important for long-term memory. Young roundworms with defective insulin receptors had higher levels of CREB protein, while those worms genetically altered to eat less had low levels, but the level did not diminish with age. These findings add to our understanding of why memory declines with age.

Although research has so far been confined to mouse studies, researchers are optimistic about the promise of histone deacetylase inhibitors in reversing age-related memory loss — both normal decline, and the far more dramatic loss produced by Alzheimer’s. The latest study reveals that memory impairment in the aging mouse is associated with altered hippocampal chromatin plasticity, specifically with the failure of histone H4 lysine 12 acetylation, leading to a failure to initiate the gene expression program associated with memory consolidation. Restoring this acetylation leads to the recovery of cognitive abilities.

An imaging study reveals why older adults are better at remembering positive events. The study, involving young adults (ages 19-31) and older adults (ages 61-80) being shown a series of photographs with positive and negative themes, found that while there was no difference in brain activity patterns between the age groups for the negative photos, there were age differences for the positive photos. In older adult brains, but not the younger, two emotion-processing regions (the ventromedial prefrontal cortex and the amygdala) strongly influenced the memory-encoding hippocampus.

Examination of the brains of nine “super-aged” individuals — people over 80 whose memory performance was at the level of 50-year-olds — has found that some of them had almost no tau tangles. The accumulation of tau tangles has been thought to be a natural part of the aging process; an excess of them is linked to Alzheimer’s disease. The next step is to work out why some people are immune to tangle formation, while others appear immune to the effects. Perhaps the first group is genetically protected, while the others are reaping the benefits of a preventive lifestyle.

The findings were presented March 23 at the 239th National Meeting of the American Chemical Society (ACS).

An imaging study involving 79 volunteers aged 44 to 88 has found lower volumes of gray matter and faster rates of decline in the frontal and medial temporal lobes of those who ranked high in neuroticism, compared with those who ranked high in conscientiousness. These are brain regions particularly affected by aging. The idea that this might occur derived from the well-established effects of chronic stress on the brain. This is the first study to investigate whether the rate and extent of cognitive decline with age is influenced by personality variables. Extraversion, also investigated, had no effect. The study does not, however, rule out the possibility that it is the reduction in brain tissue in these areas that is affecting personality. There is increasing evidence that people tend to become more neurotic and less conscientious in early-stage Alzheimer's.

[174] Jackson, J., Balota D. A., & Head D.
(In Press).  Exploring the relationship between personality and regional brain volume in healthy aging.
Neurobiology of Aging.

A new study provides more support for the idea that cognitive decline in older adults is a product of a growing inability to ignore distractions. Moreover, the study, involving 21 older adults (60-80) shown random sequences of pictures containing faces and scenes and asked to remember only the scene or the face, reveals that being given forewarning about which specific pictures would be relevant (say the second, or the fourth) did not help. The findings suggest that the failure to suppress irrelevant information is not due to a failure in quickly assessing what is relevant, but is related to mechanisms that occur early in the visual processing stream.

The role of the catechol-O-methyltransferase (COMT) gene in cognitive function has been the subject of some debate. The gene, which affects dopamine, comes in two variants: Val and Met. One recent study found no difference between healthy carriers of these two gene variants in terms of cognitive performance, but did find differences in terms of neural activity. Another found that, although the gene did not affect Alzheimer’s risk on its own, it acted synergistically with the Alzheimer’s gene variant to do so. Now an eight-year study of nearly 3000 adults in their 70s has revealed that the Met variant of the COMT gene was linked to a greater decline in cognitive function. This effect was more pronounced for African-Americans. This is interesting because it has been the Val genotype that in other research has been shown to have a detrimental effect. It seems likely that the gene's effect must be considered in context (age, race, gender, and ApoE status have all been implicated in research).

The largest ever trial of fish oil supplements has found no evidence that they offer benefits for cognitive function in older people. The British study enrolled 867 participants aged 70-80 years, and lasted two years. After two years, those receiving fish oil capsules had significantly higher levels of omega-3 fatty acids in their blood than those receiving placebo capsules. However, cognitive function did not decline in either group over the period. The researchers caution that it may be that more time is needed for benefits to show.

It is common for people to feel as they get older that they more frequently experience occasions when they cannot immediately retrieve a word they know perfectly well ("it's on the tip of my tongue").

Tips of the tongue (TOTs) do indeed increase with age, and this increase is evident as early as the mid-thirties. There are other differences, however, in the TOT experience as people age. For example, older adults are much more likely to "go blank" than either young or mid-age (35-45) adults. That is, younger adults are more likely to be able to retrieve some information about the target word.

At all ages, the most common type of word involved in TOTs is proper names. But while forgetting proper names and object names becomes more common as we get older, interestingly, abstract words are forgotten less.

The most common means of resolution at all ages is that the forgotten word simply "pops up", but as we get older, it takes longer before this happens. "Pop-ups" are relatively more common for older adults. It is suggested that this may be because they are less likely to actively attempt to retrieve the information. According to a questionnaire, older adults are more likely to simply relax and think about something else.

Burke, D.M., MacKay, D.G., Worthley, J.S. & Wade, E. (1991). On the tip of the tongue: What causes word finding failures in young and older adults? Journal of Memory and Language, 30, 542-579.

It has been well established that, compared to younger adults, older adults require more practice to achieve the same level of performance [1]. Sometimes, indeed, they may need twice as much [2].

In the present study, two groups of adult subjects were given paired items to learn during multiple study-test trials. During each trial items were presented at the subject's pace. Afterwards the subjects were asked to judge how likely they were to be able to recall each item in a test.

It was found that people were very good at accurately judging the likelihood of their correct recall. Correlations between judgments and the amount of time the subjects studied the items suggested that people were monitoring their learning and using this to allocate study time.

However, older adults (with a mean age of 67) used monitoring to a lesser degree than the younger adults (with a mean age of 22), and the results suggested that part of the recall deficit commonly found in older adults is due to this factor.
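
To make the monitoring analysis concrete: if learners use their judgments to allocate study time, then within each subject the judgments of learning (JOLs) should correlate negatively with self-paced study times. The sketch below computes such a correlation in Python on entirely hypothetical numbers; the item count, slope and noise are assumptions for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_items = 40
jol = rng.uniform(0, 100, n_items)  # judged likelihood of recall, 0-100 (hypothetical)

# A "monitoring" learner gives more study time to items judged harder,
# so study time should correlate negatively with JOLs (plus noise).
study_time = 10 - 0.06 * jol + rng.normal(0, 1, n_items)

rho, p = spearmanr(jol, study_time)
print(f"JOL vs study time: rho = {rho:.2f}, p = {p:.3g}")
# A reliably negative rho is the signature of monitoring-driven allocation;
# the age deficit described above would appear as rho closer to zero.
```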

References

1. For a review, see Kausler, D.H. 1994. Learning and memory in normal aging. New York: Academic Press.

2. Delbecq-Derousné, J. & Beauvois, M. 1989. Memory processes and aging: A defect of automatic rather than controlled processes? Archives of Gerontology & Geriatrics, 1 (Suppl), 121-150.

Salthouse, T.A. & Dunlosky, J. (1995). Analyses of adult age differences in associative learning. Zeitschrift für Psychologie, 203, 351-360.

Dunlosky, J. & Connor, L.T. (1997). Age differences in the allocation of study time account for age differences in memory performance. Memory and Cognition, 25, 691-700.

A study of over 3,100 older men (49-71) from across Europe has found that men with higher levels of vitamin D performed consistently better in an attention and speed of processing task. There was no difference on visual memory tasks. Although previous studies have suggested low vitamin D levels may be associated with poorer cognitive performance, findings have been inconsistent. Vitamin D is primarily synthesised from sun exposure but is also found in certain foods such as oily fish.

A review described as “definitive” has concluded that there is ample biological evidence to suggest an important role for vitamin D in brain development and function, and that supplementation for groups chronically low in vitamin D is warranted. Vitamin D has long been known to promote healthy bones, but more recently has been found to have a much broader role — over 900 different genes are now known to be able to bind the vitamin D receptor. Evidence for vitamin D's involvement in brain function includes the wide distribution of vitamin D receptors throughout the brain, as well as its ability to affect proteins in the brain known to be directly involved in learning and memory and motor control. Because we receive most of our vitamin D from sunlight (UV from the sun converts a biochemical in the skin to vitamin D), those with darker skin living in northern latitudes are particularly at risk of vitamin D deficiency. Nursing infants and the elderly are also particularly vulnerable. The review also argues that current recommendations set the recommended level of vitamin D too low. This review is the fourth in a series critically evaluating the scientific evidence linking deficiencies in micronutrients to brain function; earlier reviews have looked at DHA, choline, and iron.

A study has found that gerbils given a ‘cocktail’ of DHA, uridine and choline performed significantly better on learning and memory tests than untreated gerbils, and their brains had up to 70% more phosphatides (a type of molecule that forms cell membranes) than controls, suggesting that new synapses are forming. Some of the gerbils received all three compounds and some received only two; the improvements were greatest in those given all three. An earlier study had found that the treatment improved function in rats with cognitive impairment. Omega-3 fatty acids are found in fish, eggs, flaxseed and meat from grass-fed animals. Choline is found in meats, nuts and eggs. Uridine cannot be obtained from food sources, but is a component of human breast milk and can be produced in the body.

Older news items (pre-2010) brought over from the old website

Factors helping you maintain cognitive function in old age

An 8-year study of over 2,500 seniors in their 70s has found that 53% showed normal age-related decline, 16% showed major cognitive decline, and an encouraging 30% had no change or improved on the tests over the years. The most important factors in determining whether a person maintained their cognitive health were education and literacy: those with a ninth-grade literacy level or higher were nearly five times as likely to stay sharp as those with lower literacy levels, and those with at least a high school education were nearly three times as likely to stay sharp as those with less education. Lifestyle factors were also significant: non-smokers were nearly twice as likely to stay sharp as smokers; those who exercised moderately to vigorously at least once a week were 30% more likely to maintain their cognitive function than those who exercised less often; and those who were working or volunteering, or living with someone, were 24% more likely to maintain cognitive function.

[909] Ayonayon, H. N., Harris T. B., Yaffe K., Fiocco A. J., Lindquist K., et al., for the Health ABC Study.
(2009).  Predictors of maintaining cognitive function in older adults: The Health ABC Study.
Neurology. 72(23), 2029 - 2035.

http://www.eurekalert.org/pub_releases/2009-06/aaon-ssn060209.php
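
Relative likelihoods like these are the kind of numbers a logistic regression yields. The sketch below is purely illustrative: it simulates a cohort with assumed effects for education, smoking and exercise, then recovers them as odds ratios. Nothing in it reflects the Health ABC data or analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2500
educ = rng.integers(0, 2, n)      # 1 = at least high school (assumed coding)
smoker = rng.integers(0, 2, n)    # 1 = smoker
exercise = rng.integers(0, 2, n)  # 1 = moderate/vigorous exercise weekly

# Simulate "maintained cognitive function" from assumed log-odds effects.
log_odds = -1.0 + 1.0 * educ - 0.6 * smoker + 0.3 * exercise
maintained = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(np.column_stack([educ, smoker, exercise]))
result = sm.Logit(maintained, X).fit(disp=0)
print(np.exp(result.params))  # odds ratios: intercept, education, smoking, exercise
```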

Better cognitive performance from US seniors compared to British

A study involving over 8,000 older Americans and over 5,000 British seniors has found a significant difference in cognitive performance between the two nationalities, with the Americans scoring on average as if they were ten years younger than the British. The U.S. advantage in "brain health" was greatest for the oldest old (those aged 85 and older). Part of the difference can be accounted for by higher levels of education and net worth in the United States, and part by significantly lower levels of depressive symptoms (possibly attributable to the much more widespread medication of depression in the US). It was also found that dramatically more U.S. seniors reported no alcohol use (over 50%, compared to 15.5% of the British). It is also speculated that the earlier retirement age in Britain may be a factor, as may the greater prevalence of untreated hypertension there.

[773] Langa, K. M., Llewellyn D., Lang I., Weir D., Wallace R., Kabeto M., et al.
(2009).  Cognitive health among older adults in the United States and in England.
BMC Geriatrics. 9(1), 23 - 23.

Full text available at http://www.biomedcentral.com/content/pdf/1471-2318-9-23.pdf
http://www.eurekalert.org/pub_releases/2009-06/bc-aet062309.php
http://www.eurekalert.org/pub_releases/2009-06/uom-us062309.php

Memory gets worse with age if you think about it

Confirming earlier research (and what I’ve been saying for ten years), thinking that memory diminishes with age is sufficient for some elderly people to score lower on cognitive tests. Moreover, and confirming other research relating to gender and race, the study also found that a senior's ability to remember something was heavily influenced by the activation or deactivation of negative stereotypes (for example, by being told before the test that older people perform more poorly on that type of memory test). The effects of negative stereotypes were felt more by those in their sixties than by older participants (although those in their seventies performed worse when they felt stigmatized), and more by the very well-educated. There was some indication that these effects operate through motivation.

[1013] Hess, T. M., Hinson J. T., & Hodges E. A.
(2009).  Moderators of and Mechanisms Underlying Stereotype Threat Effects on Older Adults' Memory Performance.
Experimental Aging Research. 35(2), 153.

http://news.softpedia.com/news/Memory-Gets-Worse-With-Age-If-you-Think-About-It-109909.shtml
http://www.physorg.com/news159544866.html
http://www.eurekalert.org/pub_releases/2009-04/ncsu-tmw042109.php

Circadian clock may be critical for remembering what you learn

We know circadian rhythm affects learning and memory in that we find it easier to learn at certain times of day than others, but now a study involving Siberian hamsters has revealed that having a functioning circadian system is in itself critical to being able to remember. The finding has implications for disorders such as Down syndrome and Alzheimer's disease. The critical factor appears to be the amount of the neurotransmitter GABA, which acts to inhibit brain activity. The circadian clock controls the daily cycle of sleep and wakefulness by inhibiting different parts of the brain by releasing GABA. It seems that if it’s not working right, if the hippocampus is overly inhibited by too much GABA, then the circuits responsible for memory storage don't function properly. The effect could be fixed by giving a GABA antagonist, which blocks GABA from binding to synapses. Recent mouse studies have also demonstrated that mice with symptoms of Down syndrome and Alzheimer's also show improved learning and memory when given the same GABA antagonist. The findings may also have implications for general age-related cognitive decline, because age brings about a degradation in the circadian system. It’s also worth noting that the hamsters' circadian systems were put out of commission by manipulating the hamsters' exposure to light, in a technique that was compared to "sending them west three time zones." The effect was independent of sleep duration.

[688] Ruby, N. F., Hwang C. E., Wessells C., Fernandez F., Zhang P., Sapolsky R., et al.
(2008).  Hippocampal-dependent learning requires a functional circadian system.
Proceedings of the National Academy of Sciences. 105(40), 15593 - 15598.

http://www.eurekalert.org/pub_releases/2008-10/su-ccm100808.php

Occasional memory loss tied to lower brain volume

A study of 503 seniors (aged 50-85) with no dementia found that 453 of them (90%) reported having occasional memory problems such as having trouble thinking of the right word or forgetting things that happened in the last day or two, or thinking problems such as having trouble concentrating or thinking more slowly than they used to. Such problems have been attributed to white matter lesions, which are very common in older adults, but all of the participants in the study had white matter lesions in their brains, and the extent of the lesions was not tied to occasional memory problems. However, it was found that those who reported having such problems had a smaller hippocampus than those who had no cognitive problems. This was most noteworthy in subjects with good objective cognitive performance.

[895] van Norden, A. G. W., Fick W. F., de Laat K. F., van Uden I. W. M., van Oudheusden L. J. B., Tendolkar I., et al.
(2008).  Subjective cognitive failures and hippocampal volume in elderly with white matter lesions.
Neurology. 71(15), 1152 - 1159.

http://www.eurekalert.org/pub_releases/2008-10/aaon-oml093008.php

Decline of mental skills in years before death

A long-running study of 288 people without dementia, who were followed from age 70 until death, has found substantial acceleration in cognitive decline many years prior to death. Time of onset and rate of terminal decline varied considerably across cognitive abilities, with verbal ability beginning its terminal decline 6.6 years prior to death, spatial ability 7.8 years before death, and perceptual speed 14.8 years before death. The terminal decline in verbal ability, moreover, appeared to be driven by health issues rather than by age alone.

[212] Thorvaldsson, V., Hofer S. M., Berg S., Skoog I., Sacuiu S., & Johansson B.
(2008).  Onset of terminal decline in cognitive abilities in individuals without dementia.
Neurology. Published online ahead of print. doi:10.1212/01.wnl.0000312379.02302.ba.

http://www.eurekalert.org/pub_releases/2008-08/aaon-ewd081908.php
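
An "onset of terminal decline" is typically estimated with a change-point (broken-stick) model of scores expressed against time before death. Below is a minimal sketch of such a fit on simulated data; the onset, slopes and noise level are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(-15, 0, 200))  # years relative to death (simulated)
true_onset = -7.0
# Gentle age-related slope, plus a steeper decline after the (assumed) onset.
score = 50 + 0.2 * t - 2.0 * np.maximum(t - true_onset, 0) + rng.normal(0, 1, t.size)

def sse_for_onset(onset):
    # Piecewise-linear fit with a hinge term at `onset`.
    X = np.column_stack([np.ones_like(t), t, np.maximum(t - onset, 0)])
    beta, *_ = np.linalg.lstsq(X, score, rcond=None)
    return np.sum((score - X @ beta) ** 2)

grid = np.arange(-14, -1, 0.25)
best = min(grid, key=sse_for_onset)  # grid search for the best-fitting onset
print(f"Estimated onset of terminal decline: {best:.2f} years before death")
```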

Aging impairs the 'replay' of memories during sleep

During sleep, the hippocampus repeatedly "replays" brain activity from recent experiences, in a process believed to be important for memory consolidation. A new rat study has found reduced replay activity during sleep in old compared to young rats, and rats with the least replay activity performed the worst in tests of spatial memory. The best old rats were also the ones that showed the best sleep replay. Indeed, the animals who more faithfully replayed the sequence of neural activity recorded during their earlier learning experience were the ones who performed better on the spatial memory task, regardless of age. The replay activity occurs during slow-wave sleep.

[1319] Gerrard, J. L., Burke S. N., McNaughton B. L., & Barnes C. A.
(2008).  Sequence Reactivation in the Hippocampus Is Impaired in Aged Rats.
J. Neurosci.. 28(31), 7883 - 7890.

http://www.eurekalert.org/pub_releases/2008-07/sfn-ait072408.php
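
Replay fidelity in studies like this one is commonly quantified with a rank-order correlation between the order in which cells fire during behaviour and their order during a sleep event. Here is a minimal sketch with made-up firing orders (not the study's recordings):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
awake_order = np.arange(10)  # firing order of 10 place cells on the track

faithful = np.array([0, 1, 2, 4, 3, 5, 6, 8, 7, 9])  # near-identical order in sleep
degraded = rng.permutation(10)                        # scrambled order in sleep

rho_young, _ = spearmanr(awake_order, faithful)
rho_old, _ = spearmanr(awake_order, degraded)
print(f"young-like replay: rho = {rho_young:.2f}")
print(f"old-like replay:   rho = {rho_old:.2f}")
```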

White-matter changes linked to gait and balance problems

A three-year study involving 639 adults between the ages of 65 and 84 has found that people with severe white matter changes (leukoaraiosis) were twice as likely to score poorly on walking and balance tests as those people with mild white matter changes. The study also found people with severe changes were twice as likely as the mild group to have a history of falls. The moderate group was one-and-a-half times as likely as the mild group to have a history of falls. Further research will explore the effect of exercise.

[1004] Langhorne, P., O'Brien J., Scheltens P., Visser M. C., Wahlund L. O., Waldemar G., et al.
(2008).  Association of gait and balance disorders with age-related white matter changes: The LADIS Study.
Neurology. 70(12), 935 - 942.

http://www.physorg.com/news124990876.html

Lack of imagination in older adults linked to declining memory

In a study in which older and younger adults were asked to think of past and future events, older adults were found to generate fewer details about past events — and this correlated with an impaired ability to imagine future events. The number of details remembered by older adults was also linked to their relational memory abilities. The findings suggest that our ability to imagine future events is based on our ability to remember the details of previously experienced ones, extract relevant details and put them together to create an imaginary event.

[287] Addis, D R., Wong A. T., & Schacter D. L.
(2008).  Age-related changes in the episodic simulation of future events.
Psychological Science: A Journal of the American Psychological Society / APS. 19(1), 33 - 41.

http://www.eurekalert.org/pub_releases/2008-01/afps-loi010708.php

Brain systems become less coordinated with age, even in the absence of disease

An imaging study of the brain function of 93 healthy individuals from 18 to 93 years old has revealed that normal aging disrupts communication between different regions of the brain. The finding is consistent with previous research showing that normal aging slowly degrades white matter. The study focused on the links within two critical networks, one responsible for processing information from the outside world and one, known as the default network, which is more internal and kicks in when we muse to ourselves. “We found that in young adults, the front of the brain was pretty well in sync with the back of the brain [but] in older adults this was not the case. The regions became out of sync and they were less correlated with each other.” However, older adults with normal, high correlations performed better on cognitive tests. Among older individuals whose brain systems did not correlate, all of the systems were not affected in the same way. The default system was most severely disrupted with age. The visual system was very well preserved.

[1052] Andrews-Hanna, J. R., Snyder A. Z., Vincent J. L., Lustig C., Head D., Raichle M E., et al.
(2007).  Disruption of Large-Scale Brain Systems in Advanced Aging.
Neuron. 56(5), 924 - 935.

http://www.eurekalert.org/pub_releases/2007-12/hhmi-tab120307.php
http://www.eurekalert.org/pub_releases/2007-12/hu-bsb120307.php
http://www.eurekalert.org/pub_releases/2007-12/cp-co112907.php
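
The "coordination" at issue is functional connectivity: the correlation between slow activity fluctuations recorded from two regions. The sketch below runs that computation on synthetic signals; the coupling strengths are assumptions chosen to mimic a young-old contrast, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n_vols = 300                       # number of fMRI volumes (assumed)
shared = rng.normal(0, 1, n_vols)  # common network fluctuation

def front_back_correlation(coupling):
    """Correlation of two regional time series sharing a common signal."""
    front = coupling * shared + rng.normal(0, 1, n_vols)
    back = coupling * shared + rng.normal(0, 1, n_vols)
    return np.corrcoef(front, back)[0, 1]

print(f"young-like (strong coupling): r = {front_back_correlation(1.5):.2f}")
print(f"old-like (weak coupling):     r = {front_back_correlation(0.3):.2f}")
```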

Why neurogenesis is so much lower in older brains

A rat study has revealed that the aging brain produces progressively fewer new nerve cells in the hippocampus (neurogenesis) not because there are fewer of the immature cells (neural stem cells) that can give rise to new neurons, but because they divide much less often. In young rats, around a quarter of the neural stem cells were actively dividing, but only 8% of cells in middle-aged rats and 4% in old rats were. This suggests a new approach to improving learning and memory function in the elderly.

[1077] Hattiangady, B., & Shetty A. K.
(2008).  Aging does not alter the number or phenotype of putative stem/progenitor cells in the neurogenic region of the hippocampus.
Neurobiology of Aging. 29(1), 129 - 147.

http://www.eurekalert.org/pub_releases/2006-12/dumc-sca121806.php

Seniors’ memory complaints should be taken seriously

A study involving 120 people over 60 found that those who complained of significant memory problems, but still performed normally on memory tests, had a 3% reduction in gray matter density in their brains. This compares to 4% in those diagnosed with mild cognitive impairment. The finding suggests that significant memory complaints may indicate a very early "pre-MCI" stage of dementia for some people.

[979] Saykin, A. J., Wishart H. A., Rabin L. A., Santulli R. B., Flashman L. A., West J. D., et al.
(2006).  Older adults with cognitive complaints show brain atrophy similar to that of amnestic MCI.
Neurology. 67(5), 834 - 842.

http://www.eurekalert.org/pub_releases/2006-09/aaon-fym090506.php

Alzheimer's pathology related to episodic memory loss in those without dementia

A study of 134 participants from the Religious Orders Study or the Memory and Aging Project has found that, although they didn't have cognitive impairment at the time of their death, more than a third of the participants (50) met criteria for a pathologic diagnosis of Alzheimer's disease. This group also scored significantly lower on tests for episodic memory, such as recalling stories and word lists. The results provide further support for the idea that a ‘cognitive reserve’ can allow people to tolerate a significant amount of Alzheimer's pathology without manifesting obvious dementia. It also raises the question of whether we should accept any minor episodic memory loss in older adults as 'normal'.

[967] Bennett, D. A., Schneider J. A., Arvanitakis Z., Kelly J. F., Aggarwal N. T., Shah R. C., et al.
(2006).  Neuropathology of older persons without cognitive impairment from two community-based studies.
Neurology. 66(12), 1837 - 1844.

http://www.eurekalert.org/pub_releases/2006-06/aaon-apr062006.php

Does IQ drop with age or does something else impact intelligence?

As people grow older, their IQ scores drop. But do they really lose intelligence? A study has found that when college students performed under conditions mimicking the perception deficits many older people have, their IQ scores also dropped.

[234] Gilmore, G. C., Spinks R. A., & Thomas C. W.
(2006).  Age effects in coding tasks: componential analysis and test of the sensory deficit hypothesis.
Psychology and Aging. 21(1), 7 - 18.

http://www.eurekalert.org/pub_releases/2006-05/cwru-did050106.php

Walking in older people is related to cognitive skills

A study of 186 adults aged 70 and older tested gait speed with and without interference (walking while reciting alternate letters of the alphabet). Walking speed was predictable from performance on cognitive tests of executive control and memory, particularly when the participant was required to recite at the same time. The findings suggest that in old age, walking involves higher-order executive-control processes, suggesting that cognitive tests could help doctors assess risk for falls. Conversely, slow gait could alert them to check for cognitive impairment.

[1812] Holtzer, R., Verghese J., Xue X., & Lipton R. B.
(2006).  Cognitive Processes Related to Gait Velocity: Results From the Einstein Aging Study..
Neuropsychology. 20(2), 215 - 223.

http://www.eurekalert.org/pub_releases/2006-03/apa-opw032306.php

Immune function important for cognition

New research overturns previous beliefs that immune cells play no part in — and may indeed constitute a danger to — the brain. Following on from an earlier study that suggested that T cells — immune cells that recognize brain proteins — have the potential to fight off neurodegenerative conditions such as Alzheimer’s, researchers have found that neurogenesis in adult rats kept in stimulating environments requires these immune cells. A further study found that mice with these T cells performed better at some tasks than mice lacking the cells. The researchers suggest that age-related cognitive decline may be related to this, as aging is associated with a decrease in immune system function, suggesting that boosting the immune system may also benefit cognitive function in older adults.

[435] Ziv, Y., Ron N., Butovsky O., Landa G., Sudai E., Greenberg N., et al.
(2006).  Immune cells contribute to the maintenance of neurogenesis and spatial learning abilities in adulthood.
Nat Neurosci. 9(2), 268 - 275.

http://www.eurekalert.org/pub_releases/2006-01/acft-wis011106.php

Early life stress can lead to memory loss and cognitive decline in middle age

Age-related cognitive decline is probably a result of both genetic and environmental factors. A rat study has demonstrated that some of these environmental factors may occur in early life. Among the rats, emotional stress in infancy showed no ill effects by the time the rats reached adulthood, but as the rats reached middle age, cognitive deficits started to appear in those rats who had had stressful infancies, and progressed much more rapidly with age than among those who had had nurturing infancies. Middle-aged rats who had been exposed to early life emotional stress showed deterioration in brain-cell communication in the hippocampus.

[1274] Brunson, K. L., Kramar E., Lin B., Chen Y., Colgin L L., Yanagihara T. K., et al.
(2005).  Mechanisms of Late-Onset Cognitive Decline after Early-Life Stress.
J. Neurosci.. 25(41), 9328 - 9338.

http://www.eurekalert.org/pub_releases/2005-10/uoc--els100605.php

Older people with the 'Alzheimer's gene' find it harder to remember intentions

It has been established that those with a certain allele of a gene called ApoE have a much greater risk of developing Alzheimer’s (those with two copies of the allele have 8 times the risk; those with one copy have 3 times the risk). Recent studies also suggest that such carriers are more likely to show signs of deficits in episodic memory – but that these deficits are quite subtle. In the first study to look at prospective memory in seniors with the “Alzheimer’s gene”, involving 32 healthy, dementia-free adults between the ages of 60 and 87, researchers found a marked difference in performance between those who had the allele and those who did not. The results suggest an exception to the thinking that ApoE status has only a subtle effect on cognition.

[1276] Driscoll, I., McDaniel M. A., & Guynn M. J.
(2005).  Apolipoprotein E and prospective memory in normally aging adults.
Neuropsychology. 19(1), 28 - 34.

http://www.eurekalert.org/pub_releases/2005-01/apa-opw011805.php

Some brains age more rapidly than others

Investigation of the patterns of gene expression in post-mortem brain tissue has revealed two groups of genes with significantly altered expression levels in the brains of older individuals. The most significantly affected are mostly those related to learning and memory. One of the most interesting, and potentially useful, findings is that patterns of gene expression are quite similar in the brains of younger adults. Very old adults also show similar patterns, although the similarity is less. The greatest degree of individual variation occurs in those aged between 40 and 70: some of these adults show gene patterns that look more like the young group, whereas others show gene patterns that look more like the old group. It appears that gene changes start around 40 in some people, but not in others. It also appears that the genes affected by age are unusually vulnerable to damage from agents such as free radicals and environmental toxins, suggesting that lifestyle in young adulthood may play a part in determining the rate and degree of cognitive decline in later years.

[1335] Lu, T., Pan Y., Kao S-Y., Li C., Kohane I., Chan J., et al.
(2004).  Gene regulation and DNA damage in the ageing human brain.
Nature. 429(6994), 883 - 891.

http://www.eurekalert.org/pub_releases/2004-06/chb-dgi060204.php

Drugs to improve memory may worsen memory in some

Drugs that increase the activity of an enzyme called protein kinase A improve long-term memory in aged mice and have been proposed as memory-enhancing drugs for elderly humans. However, the type of memory improved by this activity occurs principally in the hippocampus. A new study suggests that increased activity of this enzyme has a deleterious effect on working memory (which principally involves the prefrontal cortex). In other words, a drug that helps you remember a recent event may worsen your ability to remember what you’re about to do (to take an example).

[1404] Ramos, B. P., Birnbaum S. G., Lindenmayer I., Newton S. S., Duman R. S., & Arnsten A. F. T.
(2003).  Dysregulation of protein kinase a signaling in the aged prefrontal cortex: new strategy for treating age-related cognitive decline.
Neuron. 40(4), 835 - 845.

http://www.eurekalert.org/pub_releases/2003-11/naos-mdf110303.php

Memory-enhancing drugs for elderly may impair working memory and other executive functions

A number of pharmaceutical companies are working on developing memory-enhancing drugs not only for patients with clinical memory impairment, but also for perfectly healthy people. Although some drugs have been found that can improve cognitive function in those suffering from impairment, the side effects preclude their use among healthy people. However, a recent study has found evidence that a well-established drug used for narcolepsy (excessive daytime sleepiness) may improve cognition in normal people, without side effects. The drug seems to particularly affect some tasks requiring planning and working memory (and in a further, as yet unpublished study, appears helpful for adults with ADHD). Whether the drug (modafinil) has anything over caffeine in terms of the cognitive benefits it brings is still debated. More interestingly, and in line with the sometimes conflicting results of these kinds of drugs on different people, the researchers suggest that the effect of drugs on cognitive function depends on the level at which the individual cognitive system is operating: if your system is mildly below par, the right brain chemical could improve performance; if it’s well below par, the same dose will have a much smaller effect; if (and this is the interesting one) it’s already operating at peak, the chemical could in fact degrade performance.

[1360] Turner, D. C., Robbins T. W., Clark L., Aron A. R., Dowson J., & Sahakian B. J.
(2003).  Cognitive enhancing effects of modafinil in healthy volunteers.
Psychopharmacology. 165(3), 260 - 269.

Magnetic resonance imaging may help predict future memory decline

A six-year imaging study of 45 healthy seniors assessed changes in brain scans against cognitive decline. Progressive atrophy in the medial temporal lobe was the most significant predictor of cognitive decline, which occurred in 29% of the subjects.

[490] Rusinek, H., de Santi S., Frid D., Tsui W-H., Tarshish C. Y., Convit A., et al.
(2003).  Regional brain atrophy rate predicts future cognitive decline: 6-year longitudinal MR imaging study of normal aging.
Radiology. 229(3), 691 - 696.

http://www.eurekalert.org/pub_releases/2003-11/rson-mhr111703.php

Mouse study suggests new approach to reducing age-related cognitive decline

Young and old mice learned that a particular tone was associated with a mild electric footshock. When the tone was immediately followed by a shock, both young and aged mice easily remembered the association on the following day. When the tone was separated from the shock by several seconds, the old mice were strongly impaired in comparison to the young mice. The researchers found highly elevated levels of a calcium-activated potassium channel, the so-called SK3 channel, in the hippocampus of old, but not of young mice. When the researchers selectively downregulated SK3 channels in the hippocampus of aged mice, the impairment in learning and memory was prevented. This suggests a new approach to treating age-related memory decline.

Blank, T., Nijholt, I., Kye, M-J., Radulovic, J. & Spiess, J. (2003). Small-conductance, Ca2+-activated K+ channel SK3 generates age-related memory and LTP deficits. Nature Neuroscience, 6(9), 911-912. Published online: 27 July 2003, doi:10.1038/nn1101

http://tinyurl.com/nm3r

Rat study offers more complex model of brain aging

A study of young, middle-aged, and aged rats, trained on two memory tasks, has revealed 146 genes connected with brain aging and cognitive impairment. Importantly, the changes in gene activity had mostly begun in mid-life, suggesting that changes in gene activity in the brain in early adulthood might set off cellular or biological changes that could affect how the brain works later in life. The study provides more information on genes already linked to aging, including some involved in inflammation and oxidative stress. It also points to additional processes in which gene activity might play a role in brain aging: declines in cellular energy metabolism; changes in the activity of neurons and in their ability to make new connections with each other; increases in cellular calcium levels, which could trigger cell death; cholesterol synthesis; iron metabolism; and the breakdown of the insulating myelin sheaths that, when intact, facilitate efficient communication among neurons.

[852] Blalock, E. M., Chen K-C., Sharrow K., Herman J. P., Porter N. M., Foster T. C., et al.
(2003).  Gene Microarrays in Hippocampal Aging: Statistical Profiling Identifies Novel Processes Correlated with Cognitive Impairment.
J. Neurosci.. 23(9), 3807 - 3819.

http://www.eurekalert.org/pub_releases/2003-05/nioa-nsi050203.php

Is a dwindling brain chemical responsible for age-related cognitive decline?

A study of what are probably the world's oldest monkeys may explain age-related mental decline. The study found that neurons in the very old monkeys' visual cortex lose their ability to discriminate between one signal and another, and that this loss was directly related to levels of gamma-aminobutyric acid (GABA), a neurotransmitter that appears to dwindle in old age. If a lack of GABA is indeed responsible for the old neurons' indiscriminate firing, the problem may be simple enough to treat: drugs that increase GABA production already exist, although they have yet to be carefully tested in the elderly.

[660] Leventhal, A. G., Wang Y., Pu M., Zhou Y., & Ma Y.
(2003).  GABA and its agonists improved visual cortical function in senescent monkeys.
Science (New York, N.Y.). 300(5620), 812 - 815.

http://www.eurekalert.org/pub_releases/2003-05/aaft-sow042403.php
http://www.newswise.com/articles/2003/5/OLDBRAIN.UUT.html
http://www.utah.edu/unews/releases/03/may/oldbrain.html
http://news.independent.co.uk/world/science_medical/story.jsp?story=402317

Rat studies provide more evidence on why aging can impair memory

Among aging rats, those that have difficulty navigating water mazes have no more signs of neuron damage or cell death in the hippocampus, a brain region important in memory, than do rats that navigate with little difficulty. Nor does the extent of neurogenesis (birth of new cells in an adult brain) seem to predict poorer performance. Although the researchers have found no differences in a variety of markers for postsynaptic signals between elderly rats with cognitive impairment and those without, decreases in a presynaptic signal are correlated with worse cognitive impairment. That suggests that neurons in the impaired rat brains may not be sending signals correctly.

Gallagher, M. 2002. Markers for memory decline. Paper presented at the Society for Neuroscience annual meeting in Orlando, Florida, 5 November.

http://news.bmn.com/conferences/list/view?rp=2002-SFN-3-S4

An enzyme that helps us to forget

A series of experiments on genetically altered laboratory mice showed that those with low levels of the enzyme protein phosphatase-1 (PP1) were less likely to forget what they had learned. This enzyme appears to be critical in helping us forget unwanted information, but it may also be partly responsible for an increase in forgetting in older adults. It was found that as the mice aged, the level of PP1 increased. When the action of PP1 was blocked, the mice recovered their full learning and memory abilities.

[1357] Genoux, D., Haditsch U., Knobloch M., Michalon A., Storm D., & Mansuy I. M.
(2002).  Protein phosphatase 1 is a molecular constraint on learning and memory.
Nature. 418(6901), 970 - 975.

http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2002/08/29/MN2052.DTL
http://news.bbc.co.uk/1/hi/health/2222871.stm

Age-related changes in brain dopamine may underpin the normal cognitive problems of aging

A new model suggests why and how many cognitive abilities decline with age, and offers hope for prevention. Research in the past few years has clarified and refined our ideas about the ways in which cognitive abilities decline with age, and one of these ways is in a reduced ability to recall the context of memories. Thus, for example, an older person is less likely to be able to remember where she has heard something. According to this new model, context processing is involved in many cognitive functions — including some once thought to be independent — and therefore a reduction in the ability to remember contextual information can have wide-reaching implications for many aspects of cognition. The model suggests that context processing occurs in the prefrontal cortex and requires a certain level of the brain chemical dopamine. It may be that in normal aging, dopamine levels become low or erratic. Changes in dopamine have also been implicated in Alzheimer’s, as well as other brain-based diseases.

[1180] Mumenthaler, M. S., Jagust W. J., Reed B. R., Braver T. S., Barch D. M., Keys B. A., et al.
(2001).  Context processing in older adults: evidence for a theory relating cognitive control to neurobiology in healthy aging.
Journal of Experimental Psychology. General. 130(4), 746 - 763.

http://www.eurekalert.org/pub_releases/2001-12/apa-ocf121701.php
