Mild Cognitive Impairment

Latest Research News

A study involving 30 previously physically inactive older adults (aged 61-88) found that a three-month exercise program reversed some brain atrophy.

Participants included 14 with MCI. The exercise program consisted of moderate-intensity walking on a treadmill four times a week over a twelve-week period. On average, cardiorespiratory fitness improved by about 8% as a result of the training, in both the healthy and the MCI participants. Fitness was assessed using peak oxygen uptake.

Those who showed the greatest improvements in fitness had the most growth in cortical thickness. Those with MCI showed greater improvements than the healthy group in the left insula and superior temporal gyrus, two brain regions that have been shown to exhibit accelerated neurodegeneration in Alzheimer’s disease.

Reiter, K., Nielson, K. A., Smith, T. J., Weiss, L. R., Alfini, A. J., & Smith, J. C. (2015). Improved Cardiorespiratory Fitness Is Associated with Increased Cortical Thickness in Mild Cognitive Impairment. Journal of the International Neuropsychological Society, 21(Special Issue 10), 757–767. https://doi.org/10.1017/S135561771500079X

Optimal levels of cardiovascular health in older age associated with lower dementia risk

A French study involving 6,626 older adults (65+) found that having optimal levels in more measures of cardiovascular health (nonsmoking, weight, diet, physical activity, cholesterol, blood glucose and blood pressure) was associated with lower dementia risk and slower rates of cognitive decline. Dementia risk and rates of cognitive decline decreased with each additional metric at the recommended optimal level.

The measures come from an American Heart Association seven-item checklist aimed at preventing cardiovascular disease.

https://www.eurekalert.org/pub_releases/2018-08/jn-hol081618.php

Dementia risk increased in 50-year-olds with blood pressure below hypertension threshold

New findings from the large, long-running Whitehall II study revealed that 50-year-olds whose blood pressure was higher than normal but still below the usual threshold for treating hypertension were at increased risk of developing dementia in later life.

This increased risk was seen even when they didn’t have other heart or blood vessel-related problems.

The study involved 8,639 people, of whom 32.5% were women. Participants were aged 35-55 in 1985, and had their blood pressure measured in 1985, 1991, 1997 and 2003. By 2017, 385 (4.5%) had developed dementia.

Those who had a systolic blood pressure of 130 mmHg or more at the age of 50 had a 45% greater risk of developing dementia than those with a lower systolic blood pressure at the same age. This association was not seen at the ages of 60 and 70, and diastolic blood pressure was not linked to dementia.

https://www.eurekalert.org/pub_releases/2018-06/esoc-dri061118.php

https://www.theguardian.com/science/2018/jun/13/dementia-risk-to-50-year-olds-with-raised-blood-pressure-study

Intensive blood pressure control reduces risk of MCI

Preliminary results from the Systolic Blood Pressure Intervention Trial (SPRINT) have found that aggressive lowering of systolic blood pressure produced significant reductions in the risk of MCI, and of MCI and dementia combined.

The randomized clinical trial compared an intensive strategy with a systolic blood pressure goal of less than 120 mm Hg and a standard care strategy targeting a systolic blood pressure goal of less than 140 mm Hg. The study involved 9,361 hypertensive older adults (mean age 67.9).

The intensive treatment group had a 19% lower rate of new cases of MCI, and the combined outcome of MCI plus probable all-cause dementia was 15% lower. Serious adverse events of hypotension, syncope, electrolyte abnormalities, and acute kidney injury or acute renal failure occurred more frequently in the intensive-treatment group (4.7% vs 2.5%).

Participants were seen monthly for the first 3 months and every 3 months thereafter. Medications were adjusted on a monthly basis and lifestyle modification was encouraged. 30% of the participants were African American and 10% were Hispanic.

Preliminary results from 673 participants in the trial revealed that total white matter lesion (WML) volume increased in both treatment groups, but the increase was significantly less in the intensive treatment group. There was no significant difference in total brain volume change.

The findings were reported at the Alzheimer's Association International Conference (AAIC) 2018 in Chicago.

https://www.eurekalert.org/pub_releases/2018-07/aa-sib072218.php

Arterial stiffness linked to dementia risk

A long-running study involving 356 older adults (average age 78) found that those with high levels of arterial stiffness were 60% more likely to develop dementia during the next 15 years compared to those with lower levels.

Arterial stiffness is correlated with subclinical brain disease and cardiovascular risk factors, but adjusting for these factors didn't reduce the association between arterial stiffness and dementia — indicating that arterial stiffness and subclinical brain damage markers are independently related to dementia risk.

Arterial stiffening can be reduced by antihypertensive medication and perhaps also healthy lifestyle changes such as exercise. This study found that exercise at an average age of 73 was associated with lower arterial stiffness five years later.

https://www.eurekalert.org/pub_releases/2018-10/uops-lsi101518.php

Hypertension linked to brain atrophy & poorer waste management

A rat study found that hypertensive rats exhibited larger ventricles, decreased brain volume, and impaired fluid transport. It’s suggested that hypertension interferes with the clearance of macromolecules from the brain, such as amyloid-beta.

https://www.eurekalert.org/pub_releases/2019-06/sfn-hb061119.php

Samieri C, Perier M, Gaye B, et al. Association of Cardiovascular Health Level in Older Age With Cognitive Decline and Incident Dementia. JAMA. 2018;320(7):657–664. doi:10.1001/jama.2018.11499

Abell, J. et al. 2018. Association between systolic blood pressure and dementia in the Whitehall II cohort study: role of age, duration and threshold used to define hypertension. European Heart Journal. doi:10.1093/eurheartj/ehy288

Cui, C., Sekikawa, A., Kuller, L. H., Lopez, O. L., Newman, A. B., Kuipers, A. L., et al. (2018). Aortic Stiffness is Associated with Increased Risk of Incident Dementia in Older Adults. Journal of Alzheimer's Disease, 66(1), 297-306.

Mortensen, K. N., Sanggaard, S., Mestre, H., Lee, H., Kostrikov, S., Xavier, A. L. R., et al. (2019). Impaired Glymphatic Transport in Spontaneously Hypertensive Rats. Journal of Neuroscience, 39(32), 6365-6377.

Although first reported in 1816, the fact that the brain is surrounded by lymphatic vessels, which connect the brain and the immune system, was only rediscovered in 2015.

Lymphatic vessels are part of the body's circulatory system. In most of the body they run alongside blood vessels. They transport lymph, a colorless fluid containing immune cells and waste, to the lymph nodes. Blood vessels deliver white blood cells to an organ and the lymphatic system removes the cells and recirculates them through the body. The process helps the immune system detect whether an organ is under attack from bacteria or viruses or has been injured.

Since then, brain scans have indicated that our brains drain some waste out through lymphatic vessels, which could act as a pipeline between the brain and the immune system.

More recent research suggests the vessels are vital to the brain's ability to cleanse itself. When a compound was used to improve the flow of waste from the brain to the lymph nodes in the neck of aged mice, their ability to learn and remember improved dramatically.

Moreover, obstructing the vessels in mice worsened the accumulation of harmful amyloid plaques in the brain.

https://www.eurekalert.org/pub_releases/2018-07/uovh-bdc072518.php

https://www.eurekalert.org/pub_releases/2017-10/nion-nru100317.php

Da Mesquita, S., Louveau, A., Vaccari, A., Smirnov, I., Cornelison, R. C., Kingsmore, K. M., et al. (2018). Functional aspects of meningeal lymphatics in ageing and Alzheimer’s disease. Nature, 560(7717), 185-191.

Absinta, M., Ha, S.-K., et al. (2017). Human and nonhuman primate meninges harbor lymphatic vessels that can be visualized noninvasively by MRI. eLife, 6, e29738. https://doi.org/10.7554/eLife.29738

Can computer use, crafts and games slow or prevent age-related memory loss?

A study involving 2,000 healthy older adults (average age 78) found that mentally stimulating activities were linked to a lower risk or delay of MCI, and that the timing and number of these activities may also play a role.

During the study, 532 participants developed MCI.

Using a computer in middle-age (50-65) was associated with a 48% lower risk of MCI, while using a computer in later life was associated with a 30% lower risk, and using a computer in both middle-age and later life was associated with a 37% lower risk.

Engaging in social activities, like going to movies or going out with friends, or playing games, like doing crosswords or playing cards, in both middle-age and later life were associated with a 20% lower risk of developing MCI.

Craft activities were associated with a 42% lower risk, but only in later life.

Those who engaged in two activities were 28% less likely to develop MCI than those who took part in no activities, while those who took part in three activities were 45% less likely, those with four activities 56% less likely, and those with five activities 43% less likely.

It should be noted that activities in middle-age were assessed retrospectively, relying on participants’ recall many years later.

https://www.eurekalert.org/pub_releases/2019-07/aaon-ccu071019.php

Regular crosswords & sudoku linked to sharper brain in later life

Data from the PROTECT online platform, involving 19,000 healthy older adults (50-96), found that the more regularly older adults played puzzles such as crosswords and Sudoku, the better they performed on tasks assessing attention, reasoning and memory.

In some areas the improvement was quite dramatic: on measures of problem-solving, for example, people who regularly did these puzzles performed at a level equivalent to people an average of eight years younger than those who didn't.

https://www.eurekalert.org/pub_releases/2019-05/uoe-rca051419.php

Mind-body exercises improve cognitive function in older adults

A meta-analysis of 32 randomized controlled trials with 3,624 older adults with or without cognitive impairment has concluded that mind-body exercises, especially tai chi and dance, help improve global cognition, cognitive flexibility, working memory, verbal fluency, and learning in older adults.

https://www.eurekalert.org/pub_releases/2018-12/w-mem121718.php

Krell-Roesch, J., Syrjanen, J. A., Vassilaki, M., Machulda, M. M., Mielke, M. M., Knopman, D. S., … Geda, Y. E. (2019). Quantity and quality of mental activities and the risk of incident mild cognitive impairment. Neurology, 93(6), e548. https://doi.org/10.1212/WNL.0000000000007897

Brooker, H., Wesnes, K. A., Ballard, C., Hampshire, A., Aarsland, D., Khan, Z., … Corbett, A. (2019). The relationship between the frequency of number-puzzle use and baseline cognitive function in a large online sample of adults aged 50 and over. International Journal of Geriatric Psychiatry, 34(7), 932–940. https://doi.org/10.1002/gps.5085

Brooker, H., Wesnes, K. A., Ballard, C., Hampshire, A., Aarsland, D., Khan, Z., … Corbett, A. (2019). An online investigation of the relationship between the frequency of word puzzle use and cognitive function in a large sample of older adults. International Journal of Geriatric Psychiatry, 34(7), 921–931. https://doi.org/10.1002/gps.5033

Wu, C., Yi, Q., Zheng, X., Cui, S., Chen, B., Lu, L., & Tang, C. (2019). Effects of Mind-Body Exercises on Cognitive Function in Older Adults: A Meta-Analysis. Journal of the American Geriatrics Society, 67(4), 749–758. https://doi.org/10.1111/jgs.15714

Americans with a college education live longer without dementia and Alzheimer's

Data from the large, long-running U.S. Health and Retirement Study found that healthy cognition characterized most of the people with at least a college education into their late 80s, while those who didn’t complete high school had good cognition up until their 70s.

The study found that those who had at least a college education lived a much shorter time with dementia than those with less than a high school education: an average of 10 months for men and 19 months for women, compared to 2.57 years (men) and 4.12 years (women).

The data suggest that those who graduated high school can expect to live (on average) at least 70% of their remaining life after 65 with good cognition, compared to more than 80% for those with a college education, and less than 50% for those who didn't finish high school.

The analysis was based on a sample of 10,374 older adults (65+; average age 74) in 2000 and 9,995 in 2010.

https://www.eurekalert.org/pub_releases/2018-04/uosc-awa041618.php

https://academic.oup.com/psychsocgerontology/article/73/suppl_1/S20/4971564 (open access)

More education linked to better cognitive functioning later in life

Data from around 196,000 subscribers to Lumosity online brain-training games found that higher levels of education were strong predictors of better cognitive performance across the 15- to 60-year-old age range of the study participants, and appear to boost performance more in areas such as reasoning than in processing speed.

Differences in performance were small for test subjects with a bachelor's degree compared to those with a high school diploma, and moderate for those with doctorates compared to those with only some high school education.

But people from lower educational backgrounds learned novel tasks nearly as well as those from higher ones.

https://www.eurekalert.org/pub_releases/2017-08/l-mel082117.php

http://www.futurity.org/higher-education-cognitive-peak-1523712/

Youthful cognitive ability strongly predicts mental capacity later in life

Data from more than 1,000 men participating in the Vietnam Era Twin Study of Aging revealed that their cognitive ability at age 20 was a stronger predictor of cognitive function later in life than other factors, such as higher education, occupational complexity or engaging in late-life intellectual activities.

All of the men, now in their mid-50s to mid-60s, took the Armed Forces Qualification Test at an average age of 20. The same test of general cognitive ability (GCA) was given in late midlife, plus assessments in seven cognitive domains.

GCA at age 20 accounted for 40% of the variance in the same measure at age 62, and approximately 10% of the variance in each of the seven cognitive domains. Lifetime education, complexity of job and engagement in intellectual activities each accounted for less than 1% of variance at average age 62.

The findings suggest that the impact of education, occupational complexity and engagement in cognitive activities on later life cognitive function simply reflects earlier cognitive ability.

The researchers speculated that the role of education in increasing GCA takes place primarily during childhood and adolescence when there is still substantial brain development.

https://www.eurekalert.org/pub_releases/2019-01/uoc--yca011819.php

Crimmins, E. M., Saito, Y., Kim, J. K., Zhang, Y. S., Sasson, I., & Hayward, M. D. (2018). Educational Differences in the Prevalence of Dementia and Life Expectancy with Dementia: Changes from 2000 to 2010. The Journals of Gerontology: Series B, 73(suppl_1), S20-S28.

Guerra-Carrillo, B., Katovich, K., & Bunge, S. A. (2017). Does higher education hone cognitive functioning and learning efficacy? Findings from a large and diverse sample. PLOS ONE, 12(8), e0182276. https://doi.org/10.1371/journal.pone.0182276

Kremen, W. S., Beck, A., Elman, J. A., Gustavson, D. E., Reynolds, C. A., Tu, X. M., et al. (2019). Influence of young adult cognitive ability and additional education on later-life cognition. Proceedings of the National Academy of Sciences, 116(6), 2021.

Memory tests predict brain atrophy and Alzheimer's disease

Data from the Alzheimer's Disease Neuroimaging Initiative (ADNI), involving 230 cognitively normal individuals and 394 individuals diagnosed with MCI on the basis of a single episodic memory test, has found that performance on two tests markedly improved the identification of those whose MCI was more serious.

MCI can be a step on the road to Alzheimer's, but it can also be a reversible condition, and it’s obviously helpful to be able to distinguish the two.

The study compared those with MCI whose memory performance was impaired only in one (story recall) or two (story recall and word list recall) tests. Those who performed poorly in both showed Alzheimer's biomarkers in the cerebrospinal fluid that more closely resembled Alzheimer's patients than those who only did poorly in one test. Moreover, they showed faster brain atrophy in the medial temporal lobes.

Alzheimer's disease was diagnosed within the three-year study period in around half of the participants who performed poorly in both tests, but in only 16% of those with a poor performance on one test.

https://www.eurekalert.org/pub_releases/2018-12/uoh-mtp121018.php

Clock drawing test should be done routinely in patients with high blood pressure

An Argentinian study involving 1,414 adults with high blood pressure has concluded that the clock drawing test for detecting cognitive dysfunction should be conducted routinely in patients with high blood pressure.

A higher prevalence of cognitive impairment was found with the clock drawing test (36%) compared to the MMSE (21%). Three out of ten patients who had a normal MMSE score had an abnormal clock drawing result. The disparity in results between the two tests was greatest in middle-aged patients.

The clock drawing test is particularly useful for evaluating executive functions, which are the cognitive functions most likely to be damaged by untreated high blood pressure.

The clock drawing test involves being given a piece of paper with a 10 cm diameter circle on it, and having to write the numbers of the clock in the correct position inside the circle and then draw hands on the clock indicating the time "twenty to four".

The average blood pressure was 144/84 mmHg, average age was 60 years, and 62% were women.

The findings were presented at ESC Congress 2018.

https://www.eurekalert.org/pub_releases/2018-08/esoc-cdc082318.php

Repeated cognitive testing can mask early signs of dementia

Those suspected of cognitive impairment often undergo repeated cognitive testing over time — indeed, it is the change over time that is most diagnostic. However, performance on most cognitive tests improves with practice. A new study involving 995 middle- to late-middle-aged men has found that, indeed, there were significant practice effects in most cognitive domains, and diagnoses of MCI doubled, from 4.5% to 9%, after correcting for practice effects.

https://www.eurekalert.org/pub_releases/2018-07/uoc--pir071118.php

Verb fluency helpful in detecting early cognitive impairment and predicting dementia

A large study involving 1,820 adults (44+), of whom 568 were cognitively healthy, 885 had MCI, and 367 had mild Alzheimer's, found that verb fluency worsened at each stage of cognitive decline, and that worse scores on the verb fluency task were significantly related to the development of MCI and to progression from MCI to dementia. Worsening verb fluency was also associated with a faster decline to MCI, but not with faster progression from MCI to dementia.

Most previous research with word fluency has used category and letter fluency tasks (which demand generating names) rather than verb fluency, but verb fluency is more cognitively demanding than generating names, and may thus be a more sensitive tool.

https://www.eurekalert.org/pub_releases/2018-03/ip-tro031618.php

Effectiveness of brief, simple test to screen for MCI

A brief, simple number naming test has been found to differentiate between cognitively healthy older adults and those with MCI or Alzheimer's.

The King-Devick (K-D) test is a one- to two-minute rapid number naming test that has previously been found useful in the detection of concussion, as well as in detecting level of impairment in other neurological conditions such as Parkinson's disease and multiple sclerosis. The K-D test can be quickly administered by non-professional office staff on either a tablet (iPad) or in a paper version.

The test accurately distinguished the controls from the cognitively impaired individuals more than 90% of the time.

The study involved 206 older adults, including 135 cognitively healthy individuals, 39 people with MCI, and 32 Alzheimer's patients.

The test will need to be validated in larger samples.

http://www.eurekalert.org/pub_releases/2016-07/bumc-sse070516.php

Not being aware of memory problems predicts onset of Alzheimer's

A number of studies have shown that people’s own subjective impressions of memory problems should not be discounted, but they shouldn’t be given too much weight either, since many people are over-anxious nowadays about their prospects of dementia. But there is a further complication to this issue, which is that being unaware of one’s own memory problems is typical of Alzheimer's.

Anosognosia is the name for this condition of not being able to recognize one’s memory problems.

A study involving 450 patients who experienced mild memory deficits, but were still capable of taking care of themselves, assessed this awareness by asking both the patients and their close relatives about the patient’s cognitive abilities. Anosognosia was diagnosed when a patient reported having no cognitive problems but the family member reported significant difficulties.

The study found that those suffering from anosognosia had impaired brain metabolic function and higher rates of amyloid deposition. Two years later, they were more likely to have developed dementia.

https://www.eurekalert.org/pub_releases/2018-02/mu-nba021518.php

A study involving 1,062 older adults (55-90), including 191 people with Alzheimer's disease, 499 with MCI and 372 healthy controls, found that those with anosognosia had reduced glucose uptake in specific brain regions. Glucose uptake is impaired in Alzheimer's disease.

https://www.eurekalert.org/pub_releases/2017-10/cfaa-buo101017.php

Cognitive test differentiates between Alzheimer's and normal aging

The hippocampus, one of the earliest brain regions affected in Alzheimer's, has a number of important memory functions. One of these is relational memory — the hippocampus can bind together pieces of information stored in different parts of the brain, so that, for example, you can remember the name when you see the associated face.

A new cognitive test that assesses relational memory has been found to be effective in distinguishing cognitive impairment that reflects very early mild Alzheimer's from normal aging.

The test involves a circle divided into three parts, each having a unique design. After studying a circle, participants needed to pick its exact match from a series of 10 circles, presented one at a time.

People with very mild Alzheimer's disease did worse overall on the task than those in the healthy aging group, who, in turn, did worse than a group of young adults. Moreover, those with Alzheimer's were particularly susceptible to interference from intervening lure stimuli. Including this in the analysis improved the test’s ability to differentiate between those who did and those who did not have Alzheimer's. It also provides evidence that Alzheimer's is qualitatively different from normal age-related cognitive decline, not simply an extension of it.

The study involved 90 participants, including 30 young adults, 30 cognitively healthy older adults, and 30 with very early Alzheimer's.

http://www.eurekalert.org/pub_releases/2014-05/uoia-ctc052014.php

Vuoksimaa, E., McEvoy, L. K., Holland, D., Franz, C. E., Kremen, W. S., & the Alzheimer's Disease Neuroimaging Initiative. (2018). Modifying the minimum criteria for diagnosing amnestic MCI to improve prediction of brain atrophy and progression to Alzheimer’s disease. Brain Imaging and Behavior.

Elman, J. A., Jak, A. J., Panizzon, M. S., Tu, X. M., Chen, T., Reynolds, C. A., et al. (2018). Underdiagnosis of mild cognitive impairment: A consequence of ignoring practice effects. Alzheimer's & Dementia: Diagnosis, Assessment & Disease Monitoring, 10, 372-381.

Alegret M, Peretó M, Pérez A, Valero S, Espinosa A, Ortega G, Hernández I, Mauleón A, Rosende-Roca M, Vargas L, Rodríguez-Gómez O, Abdelnour C, Berthier ML, Bak TH, Ruiz A, Tárraga L, Boada M. The Role of Verb Fluency in the Detection of Early Cognitive Impairment in Alzheimer's Disease. Journal of Alzheimer's Disease. 2018.

Galetta, K. M., Chapman, K. R., Essis, M. D., Alosco, M. L., Gillard, D., Steinberg, E., et al. (2017). Screening Utility of the King-Devick Test in Mild Cognitive Impairment and Alzheimer Disease Dementia. Alzheimer Disease & Associated Disorders, 31(2), 152.

Therriault, J., Ng, K. P., Pascoal, T. A., Mathotaarachchi, S., Kang, M. S., Struyfs, H., et al. (2018). Anosognosia predicts default mode network hypometabolism and clinical progression to dementia. Neurology, 90(11), e932.

Gerretsen, P., Chung, J. K., Shah, P., Plitman, E., Iwata, Y., Caravaggio, F., et al. (2017). Anosognosia Is an Independent Predictor of Conversion From Mild Cognitive Impairment to Alzheimer’s Disease and Is Associated With Reduced Brain Metabolism. The Journal of Clinical Psychiatry, 78(9), 1187-1196.

Monti, J. M., Balota, D. A., Warren, D. E., & Cohen, N. J. (2014). Very mild Alzheimer's disease is characterized by increased sensitivity to mnemonic interference. Neuropsychologia, 59, 47–56. https://doi.org/10.1016/j.neuropsychologia.2014.04.007

A review of 34 longitudinal studies, involving 71,244 older adults, has concluded that depression is associated with greater cognitive decline.

The study included people who presented with symptoms of depression as well as those who were diagnosed as clinically depressed, but excluded any who were diagnosed with dementia at the start of the study.

Previous research has found that depression is associated with an increased dementia risk.

The researchers recommend that preventative measures such as exercising, practicing mindfulness, and undertaking recommended therapeutic treatments, such as Cognitive Behaviour Therapy, might help protect cognitive health.

While the review included some studies into anxiety, the numbers were insufficient to draw a conclusion.

https://www.eurekalert.org/pub_releases/2018-05/uos-dsu052318.php

A small study has found that a 12-week exercise program significantly improved cognition in both older adults with MCI and those who were cognitively healthy, but that the effect on blood flow in the brain was different in the two groups.

While the exercise increased cerebral blood flow in the frontal cortex of those in the healthy group, those with MCI experienced decreases in cerebral blood flow. It has been speculated that the brain responds to early difficulties by increasing cerebral blood flow. This suggests that exercise may have the potential to reduce this compensatory blood flow and improve cognitive efficiency in those who are in the very early stages of Alzheimer's Disease.

The exercise training program consisted of four 30-minute sessions of moderate-intensity treadmill walking per week.

Both working memory and verbal fluency were tested (using the Rey Auditory Verbal Learning Test, and the Controlled Oral Word Association Test).

Changes in cerebral blood flow were measured in specific brain regions that are known to be involved in the pathogenesis of Alzheimer's disease, including the insula, the anterior cingulate cortex, and the inferior frontal gyrus.

Among those with MCI, decreased blood flow in the left insula and anterior cingulate cortex was strongly associated with improved verbal fluency.

https://www.eurekalert.org/pub_releases/2019-01/uom-usf013119.php

Alfini, A. J. et al. 2019. Resting Cerebral Blood Flow After Exercise Training in Mild Cognitive Impairment. Journal of Alzheimer's Disease, 67 (2), 671-684.

 

A clinical trial involving 9,361 older adults (50+) with hypertension but without diabetes or a history of stroke has found that intensive control of blood pressure significantly reduced the risk of developing mild cognitive impairment.

While there was also a 15% reduction in dementia, this result did not reach statistical significance. This may have been due to the small number of new cases of dementia in the study groups.

Participants were randomly assigned to a systolic blood pressure goal of either less than 120 mm Hg (intensive treatment) or less than 140 mm Hg (standard treatment). They were then classified after five years as having no cognitive impairment, MCI, or probable dementia.

The trial was stopped early due to its success in reducing cardiovascular disease. As a result, participants were on intensive blood pressure lowering treatment for a shorter period than originally planned. This impacted the number of cases of dementia occurring.

Hypertension affects more than half of Americans over age 50 and more than 75% of those older than 65.

https://www.eurekalert.org/pub_releases/2019-01/wfbm-lbp012419.php

The SPRINT MIND Investigators for the SPRINT Research Group. (2019). Effect of Intensive vs Standard Blood Pressure Control on Probable Dementia: A Randomized Clinical Trial. JAMA, 321(6), 553–561.

 

The APOE gene, the strongest genetic risk factor for Alzheimer’s disease, is known to be involved in cholesterol and lipid metabolism. Now the largest ever genetic study of Alzheimer’s disease, using DNA from more than 1.5 million people, has identified 90 points across the genome that were associated with an increased risk of both cardiovascular disease and Alzheimer’s disease.

The study focused on specific risk factors for heart disease (e.g., high BMI, type 2 diabetes, high cholesterol) to see if any were genetically related to Alzheimer’s risk. It was found that only those genes involved in lipid metabolism also related to Alzheimer's risk.

Six of the 90 regions had very strong effects on Alzheimer’s and heightened blood lipid levels, including several points within the CELF1/MTCH2/SPI1 region on chromosome 11 that was previously linked to the immune system.

The same genetic risk factors were also more common in people with a family history of Alzheimer’s, even though they had not themselves developed dementia or MCI.

The findings suggest that cardiovascular and Alzheimer's risk co-occur because of a shared genetic basis.

They also suggest a therapeutic target — namely, pathways involved in lipid metabolism.

https://www.futurity.org/alzheimers-disease-heart-disease-cholesterol-1913312-2/

https://www.eurekalert.org/pub_releases/2018-11/wuso-cda111118.php

Broce I, Karch C, Desikan R, et al. Dissecting the genetic relationship between cardiovascular risk factors and Alzheimer's disease. Acta Neuropathologica, published online Nov. 9, 2018.

 

Data from 1,215 older adults, of whom 173 (14%) were African-American, has found that, although brain scans showed no significant differences between black and white participants, cerebrospinal fluid (CSF) showed significantly lower levels of the brain protein tau in African-Americans.

While both groups showed the same (expected) pattern of higher tau levels being associated with greater chance of cognitive impairment, the absolute amounts of tau protein were consistently lower in African-Americans.

However, when APOE status was taken into account, it was found that those who held the low-risk variants of the “Alzheimer’s gene” had similar levels of tau, regardless of race. It was only African-Americans with the APOE4 gene variant that showed lower levels of tau.

This suggests that the APOE4 risk factor has different effects in African-Americans compared to non-Hispanic white Americans, and points to the need for more investigation into how Alzheimer’s develops in various populations.

Interestingly, another study, using data from 1,798 patients (of whom 1,690 were white), found that there was a strong gender difference in the association between APOE status and tau levels in the CSF.

Previous research has shown that the link between APOE4 and Alzheimer's is stronger in women than men. This study points to a connection with tau levels, as there was no gender difference in the association between APOE and amyloid-beta levels, amyloid plaques, or tau tangles.

https://www.futurity.org/alzheimers-disease-black-patients-1951502/

Morris JC, Schindler SE, McCue LM, et al. Assessment of Racial Disparities in Biomarkers for Alzheimer Disease. JAMA Neurol. Published online January 07, 2019. doi:10.1001/jamaneurol.2018.4249

Hohman TJ, Dumitrescu L, Barnes LL, et al. Sex-Specific Association of Apolipoprotein E With Cerebrospinal Fluid Levels of Tau. JAMA Neurol. 2018;75(8):989–998. doi:10.1001/jamaneurol.2018.0821

 

One important reason for the greater cognitive problems commonly experienced as we age is our increasing difficulty in ignoring distracting and irrelevant information. But it may be that in some circumstances that propensity can be used to help memory.

The study involved 25 younger (17-23) and 32 older adults (60-86), who were shown the faces and names of 24 different people and told to learn them. The names were written in bright blue text and placed on the forehead, and each photo was shown for 3 seconds. After the learning session, participants were immediately tested on their recall of the name for each face. The test was self-paced.

Following a 10-minute interval, during which they were given psychological tests, they were shown more photos of faces, but this time were told to ignore the text — their task was to push a button when they saw the same face appear twice in a row. The text was varied: sometimes names, sometimes words, and sometimes nonwords. Ten of the same faces and names from the first task were repeated in the series of 108 trials; all items were repeated three times (thus, 30 repeated face-name pairs, 30 other face-name pairs, 24 face-word pairs, and 24 face-nonword pairs). The photos were each displayed for 1.5 seconds.

A delayed memory test was given after another 10 minutes of psychological testing: a cued-recall test followed by a forced-choice recognition test.

Unsurprisingly, overall younger adults remembered more names than older adults, and both groups remembered more on the second series, with younger adults improving more. But younger adults showed no benefit for the repeated face-name pairs, while — on the delayed recall task only — older adults did.

Interestingly, there was no sign, in either group, of repeated names being falsely recalled or recognized. Nor did they significantly affect familiarity.

It seems that this sort of inadvertent repetition doesn’t improve memory for items (faces, names), but, specifically, the face-name associations. The study builds on previous research indicating that older adults hyperbind distracting names and attended faces, which produces better learning of these face-name pairs.

It’s suggested that repetition as distraction might act as a sort of covert retrieval practice that relies on a nonconscious process specifically related to the priming of relational associations. Perhaps older adults’ vulnerability to distraction is not simply a sign of degeneration, but reflects a change of strategy to one that increases receptiveness to environmental regularities that have predictive value. Younger adults have narrowed attention that, while it allows them greater focus on the task, also stops them noticing information that is immediately irrelevant but helpful further down the track.

The researchers are working on a training program to help older adults with MCI use this benefit to better remember faces and names.

https://www.eurekalert.org/pub_releases/2018-03/bcfg-oad031618.php

Biss, R. K., Rowe, G., Weeks, J. C., Hasher, L., & Murphy, K. J. (2018). Leveraging older adults’ susceptibility to distraction to improve memory for face-name associations. Psychology and Aging, 33(1), 158-164.

A small Japanese study has found evidence that those with amnestic mild cognitive impairment (aMCI) show a specific decline in their ability to recognize faces, and this is accompanied by changes in the way they scan faces.

The study involved 18 patients with aMCI and 18 age-matched healthy controls. Participants were tested on their ability to perceive and remember images of faces and houses.

Those with aMCI showed poorer memory for faces compared to their memory for houses, while control participants showed no difference between the two. Moreover, compared with controls, those with aMCI spent less time looking at the eyes in the image, while increasing the time they spent looking at the mouths of faces.

In general, people have an excellent memory for faces compared to other visual stimuli, and the eyes are particularly useful in helping us remember the face. The researchers suggest that damage to the brain region known as the fusiform face area (FFA) is responsible for the abnormal processing of faces. It is worth noting that a case study of a patient with acquired prosopagnosia revealed the same pattern of fixating on the mouth rather than the eyes.

The finding is consistent with several other studies showing impaired face processing in those with aMCI, but there is some controversy about that conclusion.

https://www.eurekalert.org/pub_releases/2017-11/ku-pso112117.php

Full text available at https://www.nature.com/articles/s41598-017-14585-5

 

Mild cognitive impairment (MCI) is a precursor of Alzheimer's disease, although having MCI does not mean you are definitely going to progress to Alzheimer's. A new study suggests that one sign of MCI development might be personality changes.

The study involved 277 cognitively healthy residents of a U.S. county who had the apolipoprotein E (APOE) ɛ4 gene (otherwise known as the ‘Alzheimer’s gene’). Over the study period (around 7 years), 25 developed MCI. Their performance on the Neuroticism, Extraversion, and Openness Personality Inventory-Revised (administered at the beginning of the study, as well as at other times during the study) was compared with that of the other 252 participants.

Neuroticism increased significantly more in those developing MCI, and openness decreased more. Those developing MCI also showed significantly greater depression, somatization, irritability, anxiety, and aggressive attitude. (Somatization refers to the tendency to generate physical manifestations in response to psychological distress.)

While such personality changes may be barely noticeable at this stage, it may be that diagnosing such early personality changes could help experts develop earlier, safer, and more effective treatments — or even prevention options — for the more severe types of behavior challenges that affect people with Alzheimer's disease.

https://www.eurekalert.org/pub_releases/2018-01/ags-pcd012318.php

Data from over 11,500 participants in the Atherosclerosis Risk in Communities (ARIC) cohort has found evidence that orthostatic hypotension in middle age may increase the risk of cognitive impairment and dementia 20 years later.

Orthostatic hypotension is the name for the experience of dizziness or light-headedness on standing up. Previous research has suggested an association between orthostatic hypotension and cognitive decline in older adults.

In this study, participants aged 45-64 were tested for orthostatic hypotension in 1987. Those with it (703, around 6%) were 40% more likely to develop dementia in the next 20 years. They also had some 15% more cognitive decline.

Orthostatic hypotension was defined as a drop of 20 mmHg or more in systolic blood pressure or 10 mmHg or more in diastolic blood pressure, when the individual stood up after 20 minutes lying down.
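For readers who want that criterion in concrete terms, here is a minimal sketch in Python of how the definition translates into a simple check; the function name and sample readings are illustrative, not taken from the study.

    def has_orthostatic_hypotension(supine_sys, supine_dia, standing_sys, standing_dia):
        """Apply the study's definition: a drop of 20 mmHg or more in systolic,
        or 10 mmHg or more in diastolic, blood pressure on standing."""
        return (supine_sys - standing_sys) >= 20 or (supine_dia - standing_dia) >= 10

    # Hypothetical readings (mmHg): 140/85 after 20 minutes lying down, 115/80 on standing
    print(has_orthostatic_hypotension(140, 85, 115, 80))  # True -- systolic fell by 25 mmHg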

More work is needed to understand the reason for the association.

https://www.eurekalert.org/pub_releases/2017-03/jhub-rbp030817.php

Rawlings, Andreea. 2017. Orthostatic Hypotension is Associated with 20-year Cognitive Decline and Incident Dementia: The Atherosclerosis Risk in Communities (ARIC) Study. Presented March 10 at the American Heart Association's EPI|LIFESTYLE 2017 Scientific Sessions in Portland, Oregon.

A study involving 35 adults with MCI found that those who exercised four times a week over a six-month period increased their volume of gray matter. But those who participated in aerobic exercise experienced significantly greater gains than those who just stretched, who also showed signs of white matter loss.

Aerobic activity included treadmill, stationary bike or elliptical training.

The study was presented at the annual meeting of the Radiological Society of North America (RSNA) in November, 2016.

https://www.eurekalert.org/pub_releases/2016-11/rson-aep111716.php

In the past few months, several studies have come out showing the value of three different tests of people's sense of smell for improving the accuracy of MCI and Alzheimer's diagnosis, or pointing to increased risk. The studies also add to growing evidence that a decline in sense of smell is an early marker for mild cognitive impairment and Alzheimer’s. Indeed, it appears that this sensory loss is a very early symptom, preceding even the shrinking of the entorhinal cortex (the first brain region to show signs of atrophy).

Smell test improves accuracy of MCI & Alzheimer's diagnosis

A simple, commercially available test known as the Sniffin' Sticks Odor Identification Test, in which subjects must try to identify 16 different odors, was given to 728 older adults, as well as a standard cognitive test (the Montreal Cognitive Assessment).

The participants had already been evaluated by doctors and classified as being healthy (292 subjects), having MCI (174: 150 aMCI, 24 naMCI), or having Alzheimer's (262).

It was found that, while the cognitive test alone correctly classified 75% of people with MCI, the number rose to 87% when the sniff test results were added. Diagnosis of Alzheimer's, and of subtypes within MCI, was also improved.

The smell test normally takes 5 to 8 minutes to administer; the researchers are trying to get it down to 3 minutes, to encourage greater use.

A new smell test

Another recent study validates a new smell test which is rather more complicated. The test was developed because the standard University of Pennsylvania Smell Identification Test doesn’t take into account the great variation in olfactory ability among healthy individuals. The ability of normal individuals to recognize and discriminate between odors can vary by as much as 40 times!

The new test is actually four tests:

  • In the OPID (Odor Percept IDentification)-10 test, participants are presented with 10 odors (menthol, clove, leather, strawberry, lilac, pineapple, smoke, soap, grape, lemon) for two seconds each. They are then asked whether the scent is familiar and given a choice of four of the 10 words, from which they are asked to pick the one that best describes the odor.
  • The Odor Awareness Scale (OAS) assesses their overall attention to environmental odors and how they are affected emotionally and behaviorally by scents.
  • The OPID-20 test includes an additional 10 odors (banana, garlic, cherry, baby powder, grass, fruit punch, peach, chocolate, dirt, orange). Participants are first asked whether a presented odor was included in the OPID-10 test and then asked which word best describes the odor. Their ability to remember odors from the first test determines their POEM (Percepts of Odor Episodic Memory) score.
  • In the Odor Discrimination (OD) test, participants are presented with two consecutive odors and asked whether they were different or the same, a process that is repeated 12 times with different paired scents.

The study involved 183 older adults, of whom 70 were cognitively normal, 74 tested normal but were concerned about their cognitive abilities, 29 had MCI and 10 had been diagnosed with possible or probable Alzheimer's disease.

Results of the OPID-20 test significantly differentiated among the four groups of participants, and those results correlated with the thinning of the hippocampus and the entorhinal cortex. Participants' ability to remember a previously presented aroma, as reflected in the POEM score, was also significant, with participants with Alzheimer's disease performing at no better than chance.

POEM scores of the two cognitively normal groups were compared with what would have been predicted based on their ability to identify and differentiate between odors, as reflected in the OAS and OD tests. Poor POEM performers were more likely to have the ‘Alzheimer's gene’ (APOEe4), showed thinning of the entorhinal cortex, and poorer cognitive performance over time.

Validation of UPSIT

However, two 2016 studies support the use of the University of Pennsylvania Smell Identification Test (UPSIT), and suggest it may offer a practical, low-cost alternative to other tests.

In one study, UPSIT was administered to 397 older adults (average age 80) without dementia, who were also given an MRI scan to measure the thickness of the entorhinal cortex (the first brain region to be affected by Alzheimer's disease). After four years, 50 participants (12.6%) had developed dementia, and nearly 20% had signs of cognitive decline.

Low UPSIT scores, but not entorhinal cortical thickness, were significantly associated with dementia and Alzheimer's disease, and with cognitive impairment. Entorhinal cortical thickness was significantly associated with UPSIT score in those who transitioned from MCI to dementia.

In other words, it looks like impairment in odor identification precedes thinning in the entorhinal cortex.

In another study, UPSIT was administered to 84 older adults, of whom 58 had MCI, as well as either beta amyloid PET scanning or analysis of cerebrospinal fluid. After six months, 67% had signs of memory decline, and this was predicted by amyloid-beta levels (assessed by either method), but not UPSIT score. However, participants with a score of less than 35 were more than three times as likely to have memory decline as those with higher UPSIT scores.

The researchers suggest the association wasn’t as strong in this study because of the younger age of participants (median age 71), their higher education, and the short follow-up.

https://www.eurekalert.org/pub_releases/2016-12/uops-psc122016.php

https://www.eurekalert.org/pub_releases/2016-11/mgh-atr111416.php

http://www.eurekalert.org/pub_releases/2016-07/cumc-stm072516.php

Quarmley, M., Moberg, P. J., Mechanic-Hamilton, D., Kabadi, S., Arnold, S. E., Wolk, D. A., et al. (2017). Odor Identification Screening Improves Diagnostic Classification in Incipient Alzheimer’s Disease. Journal of Alzheimer's Disease, 55(4), 1497-1507.

Dhilla, A. A., Asafu-Adjei, J., Delaney, M. K., Kelly, K. E., Gomez-Isla, T., Blacker, D., et al. (2016). Episodic memory of odors stratifies Alzheimer biomarkers in normal elderly. Annals of Neurology, 80(6), 846-857.

Lee, Seonjoo et al. 2016. Predictive Utility of Entorhinal Cortex Thinning and Odor Identification Test for Transition to Dementia and Cognitive Decline in an Urban Community Population. Presented at the Alzheimer's Association's International Conference in Toronto.

Kreisl, William et al. 2016. Both Odor Identification and Amyloid Status Predict Memory Decline in Older Adults. Presented at the Alzheimer's Association's International Conference in Toronto.

A study comparing the language abilities of 22 healthy young individuals, 24 healthy older individuals and 22 people with MCI, has found that those with MCI:

  • were much less concise in conveying information
  • produced much longer sentences
  • had a hard time staying on point
  • were much more roundabout in getting their point across.

So, for example, when given an exercise in which they had to join up three words (e.g., “pen”, “ink” and “paper”), the healthy volunteers typically joined the three in a simple sentence, while the MCI group gave circuitous accounts such as going to the shop and buying a pen.

Additionally, when asked to repeat phrases read out by the interviewer, those with MCI had trouble when given phrases involving ambiguous pronouns (e.g., “Fred visited Bob after his graduation”), although they had no trouble with more complex sentences.

A caveat: if you're just one of those people who has always talked like this, don't panic! It's a matter of change and deterioration, not a stable personality trait.

https://www.theguardian.com/society/2017/feb/21/long-winded-speech-could-be-early-sign-of-alzheimers-says-study

Janet Sherman presented the findings at the annual meeting of the American Association for the Advancement of Science in Boston, in February 2017.

Following on from a previous study showing that a virtual supermarket game administered by a trained professional can detect MCI, a small study used a modified Virtual SuperMarket Remote Assessment Routine (VSM-RAR) that was self-administered by the patient at home, on their own, for a period of one month.

Using the average score over 20 assessments, the game correctly diagnosed MCI 91.8% of the time, a level of diagnostic accuracy similar to the most accurate standardized neuropsychological tests.

The study involved six patients with MCI and six healthy older adults. The level of diagnostic accuracy was better using the average score than in the previous study, in which only a single score was used.

A tablet PC was provided to the participants, on which to play the game.

https://www.eurekalert.org/pub_releases/2017-02/ip-mci022317.php

Data from the Women's Health Initiative Memory Study, involving 6,467 postmenopausal women (65+) who reported some level of caffeine consumption, has found that those who consumed above average amounts of coffee had a lower risk of developing dementia.

Caffeine intake was estimated from a questionnaire. The median intake was 172 mg per day (an 8-ounce cup of brewed coffee contains 95 mg of caffeine, and 8 ounces of brewed black tea contains 47 mg, so this is slightly less than 2 cups of coffee or a little under 4 cups of tea). The women were cognitively assessed annually.

Over ten years, 388 were diagnosed with probable dementia (209) or MCI (179). Those who consumed above the median amount of caffeine had a 36% reduction in risk. The average intake in this group was 261 mg (3 cups of coffee), while the average intake for those below the median was 64 mg per day (less than one cup).
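To make those milligram figures concrete, here is a small Python sketch of the cup-equivalent arithmetic, using the per-serving caffeine values quoted above (95 mg per 8-ounce cup of brewed coffee, 47 mg per 8 ounces of black tea); the intake figures are the ones reported for the study groups, and the helper function is purely illustrative.

    COFFEE_MG_PER_CUP = 95  # 8-ounce cup of brewed coffee (figure quoted above)
    TEA_MG_PER_CUP = 47     # 8 ounces of brewed black tea (figure quoted above)

    def cup_equivalents(caffeine_mg_per_day):
        """Express a daily caffeine intake as coffee-cup and tea-cup equivalents."""
        return caffeine_mg_per_day / COFFEE_MG_PER_CUP, caffeine_mg_per_day / TEA_MG_PER_CUP

    for label, mg in [("median intake", 172), ("above-median group", 261), ("below-median group", 64)]:
        coffee, tea = cup_equivalents(mg)
        print(f"{label}: {mg} mg/day is about {coffee:.1f} cups of coffee or {tea:.1f} cups of tea")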

Risk factors such as hormone therapy, age, race, education, body mass index, sleep quality, depression, hypertension, prior cardiovascular disease, diabetes, smoking, and alcohol consumption, were taken into account.

The findings are consistent with other research finding a benefit for older women. It should not be assumed that the findings apply to men. It also appears that there may be a difference depending on education level. This sample had a high proportion of college-educated women.

It should also be noted that there was no clear dose-response effect — we could put more weight on the results if there was a clear relationship between amount of caffeine and benefit. Part of the problem here, however, is that it’s difficult to accurately assess the amount of caffeine, given that it’s based on self-report intake of coffee and tea, and the amount of caffeine in different beverages varies significantly.

Moreover, we do have a couple of mechanisms for caffeine to help fight age-related cognitive decline.

A recent study using rats genetically modified to over-activate their adenosine A2A receptors produced animals showing typical characteristics of an aging brain. In humans, too, age-related cognitive decline has been associated with over-activation of these receptors and dysfunction in glucocorticoid receptors.

The rat study shows that over-activation of the adenosine A2A receptors reduces the levels of glucocorticoid receptors in the hippocampus, which in turn impairs synaptic plasticity and cognition. In other words, it is the over-activation of the adenosine receptors that triggers a process that ends with cognitive impairment.

The point of all this is that caffeine inhibits the adenosine A2A receptors, and when the rats were given a caffeine analogue, their memory deficits returned to normal.

Another more recent study has found that caffeine increases the production of an enzyme that helps prevent tau tangles.

Building on previous research finding that an enzyme called NMNAT2 not only protects neurons from stress, but also helps prevent misfolded tau proteins (linked to Alzheimer’s, and other neurodegenerative disorders), the study identified 24 compounds (out of 1,280 tested) as having potential to increase the production of NMNAT2. One of the most effective of these was caffeine.

When caffeine was given to mice modified to produce lower levels of NMNAT2, the mice began to produce the same levels of the enzyme as normal mice.

https://www.eurekalert.org/pub_releases/2016-10/oupu-fwc100316.php

https://www.eurekalert.org/pub_releases/2016-08/ind-cai083016.php

https://www.eurekalert.org/pub_releases/2017-03/iu-cbe030717.php

Data from 876 patients (average age 78) in the 30-year Cardiovascular Health Study show that virtually any type of aerobic physical activity can improve brain volume and reduce Alzheimer's risk.

A higher level of physical activity was associated with larger brain volumes in the frontal, temporal, and parietal lobes including the hippocampus, thalamus and basal ganglia. Among those with MCI or Alzheimer's (25% of the participants), higher levels of physical activity were also associated with less brain atrophy. An increase in physical activity was also associated with larger grey matter volumes in the left inferior orbitofrontal cortex and the left precuneus.

Further analysis of 326 of the participants found that those with the highest energy expenditure were half as likely to have developed Alzheimer's disease five years later.

Physical activity was assessed using the Minnesota Leisure-Time Activities questionnaire, which calculates kilocalories/week using frequency and duration of time spent in 15 different leisure-time activities: swimming, hiking, aerobics, jogging, tennis, racquetball, walking, gardening, mowing, raking, golfing, bicycling, dancing, calisthenics, and riding an exercise cycle.
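The summary doesn't give the questionnaire's own intensity codes, so as a rough illustration of how frequency and duration become kilocalories per week, here is a minimal Python sketch that assumes the common approximation that one MET burns about 1 kcal per kilogram of body weight per hour; the activity names and MET values are hypothetical, not taken from the Minnesota instrument.

    # Hypothetical MET values -- the questionnaire's own intensity codes aren't given here.
    ASSUMED_METS = {"walking": 3.5, "gardening": 3.8, "swimming": 6.0}

    def weekly_kcal(activities, weight_kg):
        """Estimate kilocalories/week from (activity, sessions per week, hours per session),
        assuming 1 MET burns roughly 1 kcal per kg of body weight per hour."""
        return sum(ASSUMED_METS[name] * weight_kg * sessions * hours
                   for name, sessions, hours in activities)

    # e.g. three 1-hour walks plus one 2-hour gardening session a week, for a 70 kg person
    print(weekly_kcal([("walking", 3, 1.0), ("gardening", 1, 2.0)], weight_kg=70))  # ~1267 kcal/week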

The study does not look at whether some types of physical activity are better than others, unfortunately, but its message that overall physical activity, regardless of type, helps in the fight against cognitive impairment is encouraging.

http://www.eurekalert.org/pub_releases/2016-03/ip-dko030916.php

http://www.eurekalert.org/pub_releases/2016-03/uops-bmc031016.php

A German study involving 1,936 older adults (50+) has found that mild cognitive impairment (MCI) occurred twice as often in those diagnosed with type 2 diabetes.

Analysis of 560 participants with MCI (289 with amnestic MCI and 271 with non-amnestic MCI) and 1,376 cognitively normal participants revealed that this was only observed in middle-aged participants (50-65), not in older participants (65-80). Interestingly, there was a gender difference. Middle-aged women showed a stronger association between diabetes and amnestic MCI, while middle-aged men showed a stronger association with non-amnestic MCI.

http://www.eurekalert.org/pub_releases/2014-09/ip-dma090214.php

Winkler, A., Dlugaj, M., Weimar, C., Jöckel, K.-H., Erbel, R., Dragano, N., & Moebus, S. (2014). Association of diabetes mellitus and mild cognitive impairment in middle-aged men and women. Journal of Alzheimer’s Disease: JAD, 42(4), 1269–1277. http://doi.org/10.3233/JAD-140696

A study involving 266 people with mild cognitive impairment (aged 70+) has found that B vitamins are more effective in slowing cognitive decline when people have higher omega 3 levels.

Participants were randomly selected to receive either a B-vitamin supplement (folic acid, vitamins B6 and B12) or a placebo pill for two years. The vitamins had little to no effect for those with low levels of omega-3 fatty acids, but were very effective for those with high baseline omega-3 levels.

Levels of DHA appeared to be more important than levels of EPA, but more research is needed to confirm that.

The finding may help to explain why research looking at the effects of B vitamins, or the effects of omega-3 oils, has produced inconsistent findings.

The study followed research showing that B vitamins can slow or prevent brain atrophy and memory decline in people with MCI, and that they were most effective in those who had above-average blood levels of homocysteine.

http://www.eurekalert.org/pub_releases/2016-01/uoo-ola011916.php

A large, two-year study challenges the evidence that regular exercise helps prevent age-related cognitive decline.

The study involved 1,635 older adults (70-89) who were enrolled in the Lifestyle Interventions and Independence for Elders (LIFE) study. They were sedentary adults who were at risk for mobility disability but able to walk about a quarter mile. Participants had no significant cognitive impairment (as measured by the MMSE) at the beginning of the study. Around 90% (1476) made it to the end of the study, and were included in the analysis.

Half the participants were randomly assigned to a structured, moderate-intensity physical activity program that included walking, resistance training, and flexibility exercises, and the other half to a health education program of educational workshops and upper-extremity stretching.

In the physical activity condition, participants were expected to attend 2 center-based visits per week and perform home-based activity 3 to 4 times per week. The sessions progressed toward a goal of 30 minutes of walking at moderate intensity, 10 minutes of primarily lower-extremity strength training with ankle weights, and 10 minutes of balance training and large muscle group flexibility exercises.

The health education group attended weekly health education workshops during the first 26 weeks of the intervention and at least monthly sessions thereafter. Sessions lasted 60 to 90 minutes and consisted of interactive and didactic presentations, facilitator demonstrations, guest speakers, or field trips. Sessions included approximately 10 minutes of group discussion and interaction and 5 to 10 minutes of upper-extremity stretching and flexibility exercises.

Cognitive assessments were made at the beginning of the study and at 24 months, as well as a computerized assessment at either 18 or 30 months.

At the end of the study, there was no significant difference in cognitive score, or incidence of MCI or dementia, between the two groups. However, those in the exercise group who were 80 years or older (n=307) and those with poorer baseline physical performance (n=328) did show significantly better performance in executive function.

Executive function is not only critical for retaining the ability to live independently; research has also shown that it is the cognitive domain most responsive to physical exercise.

Note also that there was no absolute control group — that is, people who received no intervention. Both groups showed remarkably stable cognitive scores over the two years, suggesting that both interventions were in fact effective in “holding the line”.

While this finding is disappointing and a little surprising, it is not entirely inconsistent with the research. Studies into the benefits of physical exercise for fighting age-related cognitive decline and dementia have produced mixed results. It does seem clear that the relationship is not a simple one, and what's needed is a better understanding of its complexities: for example, which elements of exercise are critical, and which types of people (genes; health; previous social, physical, and cognitive attributes) are most likely to benefit.

http://www.eurekalert.org/pub_releases/2015-08/tjnj-eop082115.php

A large meta-analysis has concluded that having diabetes increases the chance that a person with mild cognitive impairment will progress to dementia by 65%.

There was no consistent evidence that hypertension or cholesterol levels increased the risk of someone with MCI progressing to dementia. Smoking was similarly not associated with increased risk, although the reason for this probably lies in mortality: smokers tend to die before developing dementia.

There was some evidence that having symptoms of psychiatric conditions, including depression, increased the risk of progressing to dementia.

There was some evidence that following a Mediterranean diet decreased the risk of an individual with amnestic MCI progressing to Alzheimer's, and that higher folate levels decrease the risk of progressing from MCI to dementia. The evidence regarding homocysteine levels was inconsistent.

The evidence indicates that level of education does not affect the risk of someone with MCI progressing to dementia.

Do note that all this is solely about progression from MCI to dementia, not about overall risk of developing dementia. Risk factors are complex. For example, cholesterol levels in mid-life are associated with the later development of dementia, but cholesterol levels later in life are not. This is consistent with cholesterol levels not predicting progression from MCI to dementia. Level of education is a known factor in dementia risk, but it acts by masking the damage in the brain, not preventing it. It is not surprising, therefore, that it doesn't affect progression from MCI to dementia: higher education helps delay the onset of impairment; it doesn't slow the rate of decline.

Do note also that a meta-analysis is only as good as the studies it's reviewing! Some factors couldn't be investigated because they haven't been sufficiently studied in this particular population (those with MCI).

The long-running Cache County study has previously found that 46% of those with MCI progressed to dementia within three years; this compared with 3% of those (age-matched) with no cognitive impairment at the beginning of the study.

More recently, data from the long-running, population-based Rotterdam study revealed that those diagnosed with MCI were four times more likely to develop dementia over seven years, compared with those without MCI. Of those with MCI (10% of the 4,198 study participants), 40% had amnestic MCI — the form of MCI that is more closely associated with Alzheimer's disease.

The 2014 study also found that older age, positive APOE-ɛ4 status, low total cholesterol levels, and stroke, were all risk factors for MCI. Having the APOE-ɛ4 genotype and smoking were related only to amnestic MCI. Waist circumference, hypertension, and diabetes were not significantly associated with MCI. This may be related to medical treatment — research has suggested that hypertension and diabetes may be significant risk factors only when untreated or managed poorly.

http://www.theguardian.com/science/occams-corner/2015/feb/24/speeding-up-the-battle-against-slowing-minds

http://www.eurekalert.org/pub_releases/2015-02/ucl-dad022015.php

http://www.eurekalert.org/pub_releases/2014-08/ip-drq080614.php

[3913] Cooper, C., Sommerlad A., Lyketsos C. G., & Livingston G.
(2015).  Modifiable Predictors of Dementia in Mild Cognitive Impairment: A Systematic Review and Meta-Analysis.
American Journal of Psychiatry. 172(4), 323 - 334.

[3914] Tschanz, J. T., Welsh-Bohmer K. A., Lyketsos C. G., Corcoran C., Green R. C., Hayden K., et al.
(2006).  Conversion to dementia from mild cognitive disorder: The Cache County Study.
Neurology. 67(2), 229 - 234.

de Bruijn, R.F.A.G. et al. Determinants, MRI Correlates, and Prognosis of Mild Cognitive Impairment: The Rotterdam Study. Journal of Alzheimer’s Disease, Volume 42/Supplement 3 (August 2014): 2013 International Congress on Vascular Dementia (Guest Editor: Amos D. Korczyn), DOI: 10.3233/JAD-132558.

A pilot study involving 17 older adults with mild cognitive impairment and 18 controls (aged 60-88; average age 78) has found that a 12-week exercise program significantly improved performance on a semantic memory task, and also significantly improved brain efficiency, for both groups.

The program involved treadmill walking at a moderate intensity. The semantic memory tasks involved correctly recognizing names of celebrities well known to adults born in the 1930s and 40s (remembering familiar names is one of the first abilities affected in Alzheimer’s), and recalling words presented in a list. Brain efficiency was demonstrated by a decrease in the activation intensity in the 11 brain regions involved in the memory task. The brain regions with improved efficiency corresponded to those involved in Alzheimer's disease, including the precuneus region, the temporal lobe, and the parahippocampal gyrus.

Participants also improved their cardiovascular fitness, by about 10%.

http://www.eurekalert.org/pub_releases/2013-07/uom-emb073013.php

Smith, J.C. et al. 2013. Semantic Memory Functional MRI and Cognitive Function After Exercise Intervention in Mild Cognitive Impairment. Journal of Alzheimer’s Disease, 37 (1), 197-215.

Data from 1,425 cognitively healthy older adults (70-89) has found that a diagnosis of chronic obstructive pulmonary disease (COPD) was associated with an 83% greater risk of developing non-amnestic mild cognitive impairment. The greatest risk was among patients who had COPD for more than five years.

Over the study period, 230 (16%) developed amnestic MCI, 97 (7%) nonamnestic MCI, 27 (2%) MCI of unknown type, and 16 (1%) dementia.

http://www.eurekalert.org/pub_releases/2014-03/tjnj-caw031414.php

A study involving 97 healthy older adults (65-89) has found that those with the “Alzheimer’s gene” (APOe4) who didn’t engage in much physical activity showed a decrease in hippocampal volume (3%) over 18 months. Those with the gene who did exercise showed no change in the size of their hippocampus, nor did those without the gene, regardless of exercise. Physical activity was classified as low if the participant reported two or fewer days per week of low-intensity activity, such as no activity, slow walking, or light chores. Physical activity was classified as high if the participant reported three or more days per week of moderate to vigorous activity.

The finding suggests that those with the risky gene will benefit most from regular exercise — indeed, this is as yet the only known means to counteract hippocampal shrinkage.

http://www.eurekalert.org/pub_releases/2014-04/uom-pak042214.php

[3605] Smith, J. Carson, Nielson K. A., Woodard J. L., Seidenberg M., Durgerian S., Hazlett K. E., et al.
(2014).  Physical activity reduces hippocampal atrophy in elders at genetic risk for Alzheimer's disease.
Frontiers in Aging Neuroscience. 6.

A study involving 371 patients with mild cognitive impairment has found that those with depressive symptoms had higher levels of amyloid-beta, particularly in the frontal cortex and the anterior and posterior cingulate gyrus (both involved in mood disorders such as depression).

The findings suggest that those with late-life depression may develop Alzheimer's faster than others.

http://www.eurekalert.org/pub_releases/2014-06/sonm-dit060814.php

Brendel, M. et al. 2014. Subsyndromal late life depression is associated with amyloid accumulation in mild cognitive impairment. Presented at the Society of Nuclear Medicine and Molecular Imaging's 2014 Annual Meeting, June 7, 2014, St. Louis, Missouri.

A gene linked to Alzheimer's has been linked to brain changes in childhood. This gene, SORL1, has two connections to Alzheimer’s: it carries the code for the sortilin-like receptor, which is involved in recycling some molecules before they develop into amyloid-beta; it is also involved in lipid metabolism, putting it at the heart of the vascular risk pathway.

Brain imaging of 186 healthy individuals (aged 8-86) found that, even among the youngest, those with a specific variant of SORL1 showed a reduction in white matter connections. Analysis of post-mortem brain tissue from 269 individuals (aged 0-92) without Alzheimer's disease found that the same SORL1 variant was linked to a disruption in the process by which the gene is translated into the sortilin-like receptor, a disruption that was most prominent during childhood and adolescence. Analysis of another set of post-mortem brains, from 710 individuals (aged 66-108), of whom the majority had mild cognitive impairment or Alzheimer's, found that the SORL1 risk variant was linked with the presence of amyloid-beta.

It may be that, for those carrying this gene variant, lifestyle interventions may be of greatest value early in life.

http://www.eurekalert.org/pub_releases/2013-12/cfaa-arg120313.php

[3570] Felsky, D., Szeszko P., Yu L., Honer W. G., De Jager P. L., Schneider J. A., et al.
(2013).  The SORL1 gene and convergent neural risk for Alzheimer’s disease across the human lifespan.
Molecular Psychiatry.

Analysis of data from 237 patients with mild cognitive impairment (mean age 79.9) has found that, compared to those carrying the ‘normal’ ApoE3 gene (the most common variant of the ApoE gene), the ApoE4 carriers showed markedly greater rates of shrinkage in 13 of 15 brain regions thought to be key components of the brain networks disrupted in Alzheimer’s.

http://www.eurekalert.org/pub_releases/2014-01/rson-gva010714.php

[3578] Hostage, C. A., Choudhury K R., Doraiswamy M. P., & Petrella J. R.
(2013).  Mapping the Effect of the Apolipoprotein E Genotype on 4-Year Atrophy Rates in an Alzheimer Disease–related Brain Network.
Radiology. 271(1), 211 - 219.

Analysis of brain scans and cognitive scores of 64 older adults from the NIA's Baltimore Longitudinal Study of Aging (average age 76) has found that, between the most cognitively stable and the most declining (over a 12-year period), there was no significant difference in the total amount of amyloid in the brain, but there was a significant difference in the location of amyloid accumulation. The stable group showed relatively early accumulation in the frontal lobes, while the declining group showed it in the temporal lobes.

http://www.eurekalert.org/pub_releases/2013-07/uops-pop071513.php

[3624] Yotter, R. A., Doshi J., Clark V., Sojkova J., Zhou Y., Wong D. F., et al.
(2013).  Memory decline shows stronger associations with estimated spatial patterns of amyloid deposition progression than total amyloid burden.
Neurobiology of Aging. 34(12), 2835 - 2842.

A pilot study involving 94 older adults, of whom 18 had Alzheimer’s, 24 had MCI, 26 other dementias, and 26 were healthy controls, has found those with Alzheimer’s were significantly less able to detect the smell of peanut butter. Peanut butter was chosen because of its purity and accessibility (not because there's something special about its smell!).

The test was undertaken with the patient’s eyes and mouth closed and one nostril blocked, while the clinician held a ruler next to the open nostril and moved 14g of peanut butter in an open jar up the ruler one centimeter at a time, as the patient breathed out. Those in the early stages of Alzheimer’s disease showed a dramatic difference in detecting odor between the left and right nostril. The average distance at which the peanut butter was detected was 5.1 cm for the left nostril, compared to 17.4 cm for the right. The difference between these (12.4 cm) compares to an average 4.8 cm for other dementias, 1.9 for MCI, and 0 for healthy controls.

Of the 24 patients with MCI, only 10 patients showed a left nostril impairment, suggesting that this may be an indication of who will go on to develop Alzheimer’s.

http://www.futurity.org/can-peanut-butter-smell-test-confirm-alzheimers/

[3609] Stamps, J. J., Bartoshuk L. M., & Heilman K. M.
(2013).  A brief olfactory test for Alzheimer's disease.
Journal of the Neurological Sciences. 333(1), 19 - 24.

Data from 6257 older adults (aged 55-90) evaluated from 2005-2012 has revealed that concerns about memory should be taken seriously, with subjective complaints associated with a doubled risk of developing mild cognitive impairment or dementia, and subjective complaints supported by a loved one being associated with a fourfold risk. Complaints by a loved one alone were also associated with a doubled risk. Among those with MCI, subjective complaints supported by a loved one were associated with a threefold risk of converting to dementia.

Of the 4414 initially cognitively normal, 14% developed MCI or dementia over the course of the study (around 5 years); of the 1843 with MCI, 41% progressed to dementia.

http://www.futurity.org/worry-about-memory-predicts-alzheimer%E2%80%99s-risk/

[3573] Gifford, K. A., Liu D., Lu Z., Tripodis Y., Cantwell N. G., Palmisano J., et al.
(2014).  The source of cognitive complaints predicts diagnostic conversion differentially among nondemented older adults.
Alzheimer's & Dementia. 10(3), 319 - 327.

Analysis of mitochondrial DNA (mtDNA) in the cerebrospinal fluid has found that both symptomatic Alzheimer’s patients and asymptomatic patients at risk of Alzheimer’s showed a significant decrease in levels of circulating cell-free mtDNA in the CSF. Patients with frontotemporal dementia did not display this.

Moreover, this potential biomarker occurred at least a decade before signs of dementia manifested, preceding the appearance of amyloid-beta and tau — suggesting not simply that it might be used as a very early sign of developing Alzheimer’s, but that the pathological process of Alzheimer's disease starts earlier than previously thought.

http://www.eurekalert.org/pub_releases/2013-08/cg-nbc081213.php

[3598] Podlesniy, P., Figueiro-Silva J., Llado A., Antonell A., Sanchez-Valle R., Alcolea D., et al.
(2013).  Low cerebrospinal fluid concentration of mitochondrial DNA in preclinical Alzheimer disease.
Annals of Neurology. 74(5), 655 - 668.

Comparison of the EEGs of 27 healthy older adults, 27 individuals with mild Alzheimer's, and 22 individuals with moderate Alzheimer’s found statistically significant differences across the three groups, using an algorithm that dissects brain waves of varying frequencies.

In particular, delta modulation of the beta frequency band reliably discriminated between healthy controls and mild Alzheimer’s, and disappeared with an increase in disease severity (from mild to moderate). Increase in disease severity was also marked by the appearance of delta modulation of the theta band.

It’s hoped that the algorithm can be used not only to help detect Alzheimer’s disease early, but also to monitor its progression. The algorithm has been shared on the NeuroAccelerator.org online data analysis portal, to enable it to be used by researchers around the world.
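
The published algorithm itself isn't reproduced here, but the general amplitude-modulation idea it builds on can be sketched as follows: isolate a frequency band (e.g. beta), extract its amplitude envelope, and measure how strongly that envelope fluctuates at slower (e.g. delta) rates. This is a minimal illustration assuming a single-channel EEG signal sampled at 256 Hz, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    # zero-phase Butterworth band-pass filter
    b, a = butter(order, [lo, hi], btype="band", fs=fs)
    return filtfilt(b, a, x)

def delta_modulation_of_beta(eeg, fs):
    beta = bandpass(eeg, 12.0, 30.0, fs)            # isolate the beta band
    envelope = np.abs(hilbert(beta))                # beta amplitude envelope
    env = envelope - envelope.mean()
    delta_env = bandpass(env, 0.5, 4.0, fs)         # delta-rate fluctuations of that envelope
    return np.sum(delta_env ** 2) / np.sum(env ** 2)  # share of envelope power modulated at delta rates

fs = 256.0
eeg = np.random.randn(int(60 * fs))                 # stand-in for one minute of EEG
print(delta_modulation_of_beta(eeg, fs))
```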

http://www.eurekalert.org/pub_releases/2013-08/i-tae082913.php

[3572] Fraga, F. J., Falk T. H., Kanda P. A. M., & Anghinah R.
(2013).  Characterizing Alzheimer’s Disease Severity via Resting-Awake EEG Amplitude Modulation Analysis.
PLoS ONE. 8(8).

Data from two longitudinal studies of older adults (a nationally representative sample of older adults, and the Alzheimer’s Disease Neuroimaging Initiative) has found that a brief cognitive test can distinguish memory decline associated with healthy aging from more serious memory disorders, years before obvious symptoms show up.

Moreover, the data challenge the idea that memory continues to decline through old age: after excluding the cognitively impaired, there was no evidence of further memory declines after the age of 69.

The data found that normal aging showed declines in recollective memory (recalling a word or event exactly) but not in reconstructive memory (recalling a word or event by piecing it together from clues about its meaning, e.g., recalling that “dog” was presented in a word list by first remembering that household pets were presented in the list). However, declines in reconstructive memory were reliable predictors of future progression from healthy aging to mild cognitive impairment and Alzheimer’s.

http://www.futurity.org/memory-test-mistakes-can-flag-trouble-sooner/

[3556] Brainerd, C. J., Reyna V. F., Gomes C. F. A., Kenney A. E., Gross C. J., Taub E. S., et al.
(2014).  Dual-retrieval models and neurocognitive impairment.
Journal of Experimental Psychology: Learning, Memory, and Cognition. 40(1), 41 - 65.

Data from 848 adults of all ages has found that brain volume in the default mode network declined in both healthy and pathological aging, but the greatest decline occurred in Alzheimer’s patients and in those who progressed from mild cognitive impairment to Alzheimer’s disease. Reduced brain volumes in these regions were associated with declines in cognitive ability, the presence of Alzheimer’s biomarkers in the cerebrospinal fluid, and with carrying the “Alzheimer’s gene”, the APOE4 allele.

The findings support the idea that neurodegeneration spreads through networks of connected brain regions, in a disease specific manner.

http://www.futurity.org/faster-brain-shrinkage-flag-alzheimers/

[3607] Spreng, R. Nathan, & Turner G. R.
(2013).  Structural Covariance of the Default Network in Healthy and Pathological Aging.
The Journal of Neuroscience. 33(38), 15226 - 15234.

New research supports the classification system for preclinical Alzheimer’s proposed two years ago. The classification system divides preclinical Alzheimer's into three stages:

Stage 1: Levels of amyloid beta begin to decrease in the spinal fluid. This indicates that the substance is beginning to form plaques in the brain.

Stage 2: Levels of tau protein start to increase in the spinal fluid, indicating that brain cells are beginning to die. Amyloid beta levels are still abnormal and may continue to fall.

Stage 3: In the presence of abnormal amyloid and tau biomarker levels, subtle cognitive changes can be detected by neuropsychological testing.

Long-term evaluation of 311 cognitively healthy older adults (65+) found 31% with preclinical Alzheimer’s: 15% of the whole group were at stage 1, 12% at stage 2, and 4% at stage 3. This is consistent with autopsy studies, which have shown that around 30% of cognitively normal older adults die with some preclinical Alzheimer's pathology in their brain. Additionally, 23% were diagnosed with suspected non-Alzheimer pathophysiology (SNAP), 41% as cognitively normal, and 5% as unclassified.

Five years later, 2% of the cognitively normal, 5% of those with SNAP, 11% of the stage 1 group, 26% of the stage 2 group, and 56% of the stage 3 group had been diagnosed with symptomatic Alzheimer's.

http://www.eurekalert.org/pub_releases/2013-09/wuso-apt092313.php

[3614] Vos, S JB., Xiong C., Visser P J., Jasielec M. S., Hassenstab J., Grant E. A., et al.
(2013).  Preclinical Alzheimer's disease and its outcome: a longitudinal cohort study.
The Lancet Neurology. 12(10), 957 - 965.

An initial analysis of cerebrospinal fluid taken between 1995 and 2005 from 265 middle-aged healthy volunteers, of whom 75% had a close family member with Alzheimer’s disease, has found that the ratios of phosphorylated tau and amyloid-beta could predict mild cognitive impairment more than five years before symptom onset: the more tau and the less amyloid-beta, the more likely MCI was to develop. The rate of change in the ratio over time was also predictive: the more rapidly the ratio of tau to amyloid-beta rose, the more likely the eventual development of MCI.

The drop in amyloid-beta is thought to be because it is getting trapped in the plaques characteristic of Alzheimer’s.

http://www.futurity.org/spinal-fluid-test-may-predict-alzheimers/

[3592] Moghekar, A., Li S., Lu Y., Li M., Wang M-C., Albert M., et al.
(2013).  CSF biomarker changes precede symptom onset of mild cognitive impairment.
Neurology.

Cognitive testing for dementia has a problem in that low scores on some tests may simply reflect a person's weakness in some cognitive areas, or the presence of a relatively benign form of mild cognitive impairment (one that is not going to progress to dementia). A 2008 study found that one of every six healthy adults scored poorly on two or more of 10 tests in a brief cognitive battery. Following this up, the same researchers now show that a more holistic view might separate those who are on the path to dementia from those who are not.

Data from 395 clinical patients (aged 60+) and 135 healthy older adults has revealed that, while the cognitively normal produce a pattern of scores on 13 cognitive tests that fits a bell-shaped curve, those experiencing some level of dementia produce a more skewed pattern. Increasingly lower scores and a greater degree of positive skew were also associated with worsening dementia.
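
A minimal sketch of the "shape of the within-person score profile" idea: express one person's scores on a test battery as z-scores against normative means and SDs, then look at the skew of that distribution. All numbers below are made up for illustration; the study's actual battery and norms differ.

```python
import numpy as np
from scipy.stats import skew

norm_mean = np.full(13, 50.0)   # hypothetical normative means for 13 tests
norm_sd = np.full(13, 10.0)     # hypothetical normative SDs

def profile_skew(raw_scores):
    z = (np.asarray(raw_scores, dtype=float) - norm_mean) / norm_sd
    return skew(z)              # near 0 suggests a roughly symmetric (bell-like) profile

healthy = [52, 48, 55, 47, 50, 49, 53, 46, 51, 50, 48, 54, 49]
declining = [22, 25, 20, 23, 24, 21, 26, 22, 23, 21, 45, 50, 48]  # mostly low, a few preserved scores
print(profile_skew(healthy), profile_skew(declining))  # the second profile is positively skewed
```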

http://www.futurity.org/lopsided-cognition-may-predict-early-alzheimers/

http://www.eurekalert.org/pub_releases/2013-11/jhm-jhr111113.php

[3601] Reckess, G. Z., Varvaris M., Gordon B., & Schretlen D. J.
(2014).  Within-person distributions of neuropsychological test scores as a function of dementia severity.
Neuropsychology. 28(2), 254 - 260.

A French study has predicted with 90% accuracy which patients with mild cognitive impairment would receive a clinical diagnosis of Alzheimer's disease within the following two years. The best neurological predictors were cortical thickness in two brain regions (the right anterior cingulate and middle frontal gyri), and the best cognitive predictors were deficits in both free recall and recognition episodic memory. Combining these measures achieved the highest accuracy.
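
As an illustration of how structural and cognitive measures are typically combined into a single predictor, the sketch below fits a logistic regression over the kinds of features the study names (cortical thickness plus free recall and recognition scores). It is not the authors' model, and all values are hypothetical placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: thickness of right anterior cingulate, thickness of middle frontal gyrus,
# free recall score, recognition score (all values hypothetical)
X = np.array([
    [2.6, 2.5, 10, 14],
    [2.2, 2.1,  4,  9],
    [2.7, 2.6, 12, 15],
    [2.1, 2.0,  3,  8],
    [2.5, 2.4,  9, 13],
    [2.0, 2.1,  5, 10],
])
y = np.array([0, 1, 0, 1, 0, 1])   # 1 = clinical Alzheimer's diagnosis within two years

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[2.3, 2.2, 6, 11]])[0, 1])  # predicted probability of progression
```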

http://www.eurekalert.org/pub_releases/2013-12/uom-amt120213.php

Peters, F., Villeneuve, S. & Belleville, S. 2014. Predicting Progression to Dementia in Elderly Subjects with Mild Cognitive Impairment Using Both Cognitive and Neuroimaging Predictors. Journal of Alzheimer's Disease, 38 (2) 307-318.

A five-year study involving 525 older adults (70+) found that 46 had Alzheimer’s or aMCI and that a further 28 went on to develop one of these conditions. The blood levels of 10 specific lipids predicted with more than 90% accuracy whether an individual would go on to develop either Alzheimer’s or aMCI within 2-3 years. The researchers speculate that the lower lipid levels could be an early indication that brain cells are beginning to lose their integrity and break down.

The continual failures in human clinical trials of promising therapies have led to a growing belief that once the cognitive symptoms of Alzheimer’s have emerged, it may be too late to slow or reverse the neurological damage. However, treatments begun early enough may be more effective. This is why early diagnosis of Alzheimer’s risk is so critical.

http://www.futurity.org/blood-test-predicts-alzheimers-risk/

http://www.theguardian.com/science/2014/mar/09/blood-test-could-detect-early-signs-dementia

http://www.eurekalert.org/pub_releases/2014-03/gumc-bti030314.php

[3588] Mapstone, M., Cheema A. K., Fiandaca M. S., Zhong X., Mhyre T. R., MacArthur L. H., et al.
(2014).  Plasma phospholipids identify antecedent memory impairment in older adults.
Nature Medicine. 20(4), 415 - 418.

A three-year study involving 152 adults aged 50 and older, of whom 52 had been recently diagnosed with mild cognitive impairment and 31 were diagnosed with Alzheimer's disease, has found that those with mild or no cognitive impairment who initially had amyloid-beta plaques showed greater cognitive decline than those whose brain scans were negative for plaques. Moreover, 35% of plaque-positive participants who started with MCI progressed to Alzheimer's, compared to 10% without plaque, and they were more than twice as likely to be started on cognitive-enhancing medication.

The fact that 90% of those with MCI but no plaque didn’t progress to Alzheimer's (within the three-year period) points to the value of using PET imaging to identify patients unlikely to decline, who can be reassured accordingly. The finding also points to the importance of plaque buildup in cognitive decline.

http://www.eurekalert.org/pub_releases/2014-03/dumc-pdi030514.php

[3569] Doraiswamy, M. P., Sperling R. A., Johnson K., Reiman E. M., Wong T. Z., Sabbagh M. N., et al.
(2014).  Florbetapir F 18 amyloid PET and 36-month cognitive decline:a prospective multicenter study.
Molecular Psychiatry.

An analysis of the anatomical connectivity in the brains of 15 people with Alzheimer's disease, 68 with mild cognitive impairment, and 28 healthy older individuals found that several network measures showed disease effects:

  • widespread network disruptions
  • decreases in network nodes
  • changes in neural fiber path length
  • decreased signaling efficiency
  • increased asymmetry in the proportions of fibers that connect the left and right cortical regions

http://www.eurekalert.org/pub_releases/2013-08/mali-wgw082213.php

[3565] Daianu, M., Jahanshad N., Nir T. M., Toga A. W., Jack C. R., Weiner M. W., et al.
(2013).  Breakdown of Brain Connectivity Between Normal Aging and Alzheimer's Disease: A Structural k-Core Network Analysis.
Brain Connectivity. 3(4), 407 - 422.

Jugular venous reflux (JVR) occurs when the pressure gradient reverses the direction of blood flow in the veins, causing blood to leak backwards into the brain. A small pilot study has found an association between JVR and white matter changes in the brains of patients with Alzheimer’s disease and those with mild cognitive impairment. This suggests that cerebral venous outflow impairment might play a role in the development of white matter changes in those with Alzheimer’s.

JVR occurs when the internal jugular vein valves don’t open and close properly, which happens more frequently in the elderly. The study involved 12 patients with Alzheimer’s disease, 24 with MCI, and 17 age-matched controls. Those with severe JVR were more likely to have hypertension, had more numerous and more severe white matter changes, and tended to have higher cerebrospinal fluid volumes.

Further research is needed to validate these preliminary findings.

http://www.futurity.org/vascular-changes-neck-may-alzheimers-role/

http://www.eurekalert.org/pub_releases/2013-11/uab-aav112513.php

Chung, C-P. et al. 2013. Jugular Venous Reflux and White Matter Abnormalities in Alzheimer’s Disease: A Pilot Study. Journal of Alzheimer’s Disease, 39 (3), 601-609.

Analyses of cerebrospinal fluid from 15 patients with Alzheimer's disease, 20 patients with mild cognitive impairment, and 21 control subjects, plus brain tissue from some of them, have found that those with Alzheimer’s had lower levels of a particular molecule involved in resolving inflammation. These ‘specialized pro-resolving mediators’ regulate the tidying up of the damage done by inflammation and the release of growth factors that stimulate tissue repair. Lower levels of these molecules also correlated with a lower degree of cognitive function.

The pro-resolving molecules identified so far are derivatives of omega-3 fatty acids, providing support for the idea that dietary supplements of these may provide benefit.

http://www.eurekalert.org/pub_releases/2014-02/ki-irf021414.php

[3616] Wang, X., Zhu M., Hjorth E., Cortés-Toro V., Eyjolfsdottir H., Graff C., et al.
(2014).  Resolution of inflammation is altered in Alzheimer's disease.
Alzheimer's & Dementia.

A large study, involving 3,690 older adults, has found that drugs with strong anticholinergic effects cause memory and cognitive impairment when taken continuously for a mere two months. Moreover, taking multiple drugs with weaker anticholinergic effects, such as many common over-the-counter digestive aids, affected cognition after 90 days’ continuous use. In both these cases, the risk of cognitive impairment doubled (approximately).

More positively, risk of Alzheimer’s did not seem to be affected (however, I do have to wonder how much weight we can put on that, given the apparent length of the study — although this is not a journal to which I have access, so I can’t be sure of that).

Although somewhat unexpected, the finding is consistent with previous research linking anticholinergics and cognitive impairment.

Anticholinergic drugs block the neurotransmitter acetylcholine. Older adults commonly use over-the-counter drugs with anticholinergic effects as sleep aids and to relieve bladder leakage. Drugs with anticholinergic effects are also frequently prescribed for many chronic diseases including hypertension, cardiovascular disease and chronic obstructive pulmonary disease.
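
To make the idea of a cumulative "anticholinergic burden" concrete, the sketch below tallies a per-drug rating (commonly 1 for possible and 2-3 for definite anticholinergic activity) across everything a person takes. The drug names and scores are hypothetical placeholders; the published scale linked below gives the real ratings.

```python
# Hypothetical ratings; real scores come from a published scale such as the
# Anticholinergic Cognitive Burden Scale (see link below).
ACB_SCORE = {
    "sleep_aid_a": 3,
    "bladder_drug_b": 3,
    "antihistamine_c": 2,
    "blood_pressure_drug_d": 1,
}

def total_burden(medications):
    """Sum the anticholinergic rating of every drug on the person's list."""
    return sum(ACB_SCORE.get(drug, 0) for drug in medications)

print(total_burden(["sleep_aid_a", "blood_pressure_drug_d"]))  # -> 4
```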

You can download a list detailing the ‘anticholinergic burden’ of medications at: http://www.indydiscoverynetwork.org/AnticholinergicCognitiveBurdenScale.html

http://www.eurekalert.org/pub_releases/2013-05/iu-sua050713.php

[3449] Cai, X., Campbell N., Khan B., Callahan C., & Boustani M.
(2013).  Long-term anticholinergic use and the aging brain.
Alzheimer's & Dementia: The Journal of the Alzheimer's Association. 9(4), 377 - 385.

Analysis of data from 418 older adults (70+) has found that carriers of the ‘Alzheimer’s gene’, APOEe4, were 58% more likely to develop mild cognitive impairment compared to non-carriers. However, ε4 carriers with MCI developed Alzheimer’s at the same rate as non-carriers. The finding turns prevailing thinking on its head: rather than the gene increasing the risk of developing Alzheimer’s, it appears that it increases the risk of MCI — and people with MCI are the main source of new Alzheimer’s diagnoses.

In this regard, it’s worth noting that the cognitive effects of this gene variant have been demonstrated in adults as young as the mid-20s.

The finding points to the benefit of genetic testing for assessing your likelihood of cognitive impairment rather than dementia — and using this knowledge to build habits that fight cognitive impairment.

http://www.futurity.org/health-medicine/genetic-test-fails-to-show-alzhe...

[3370] Brainerd, C. J., Reyna V. F., Petersen R. C., Smith G. E., Kenney A. E., Gross C. J., et al.
(2013).  The apolipoprotein E genotype predicts longitudinal transitions to mild cognitive impairment but not to Alzheimer's dementia: Findings from a nationally representative study.
Neuropsychology. 27(1), 86 - 94.

Brain scans of 61 older adults (65-90), of whom 30 were cognitively healthy, 24 cognitively impaired and 7 diagnosed with dementia, found that, across all groups, both memory and executive function correlated negatively with brain infarcts, many of which had been clinically silent. The level of amyloid in the brain did not correlate with either changes in memory or executive function, and there was no evidence that amyloid interacted with infarcts to impair thinking.

Bottom line: vascular brain injury was far more important than amyloid burden for memory and executive function. The finding highlights the role of vascular injury in mild cognitive impairment.

Read full report at Futurity

[3320] Marchant, N. L., Reed B. R., et al.
(2013).  The aging brain and cognition: Contribution of vascular injury and aβ to mild cognitive dysfunction.
JAMA Neurology. 1 - 8.

Providing some support for the finding I recently reported — that problems with semantic knowledge in those with mild cognitive impairment (MCI) and Alzheimer’s might be rooted in an inability to inhibit immediate perceptual information in favor of conceptual information — a small study has found that executive function (and inhibitory control in particular) is impaired in far more of those with MCI than was previously thought.

The study involved 40 patients with amnestic MCI (single or multiple domain) and 32 healthy older adults. Executive function was tested across multiple sub-domains: divided attention, working memory, inhibitory control, verbal fluency, and planning.

As a group, those with MCI performed significantly more poorly in all 5 sub-domains. All MCI patients showed significant impairment in at least one sub-domain of executive functioning, with almost half performing poorly on all of the tests. The sub-domain most frequently and severely impaired was inhibitory control.

The finding is in sharp contrast with standard screening tests and clinical interviews, which have estimated executive function impairment in only 15% of those with MCI.

Executive function is crucial for many aspects of our behavior, from planning and organization to self-control to (as we saw in the previous news report) basic knowledge. It is increasingly believed that inhibitory control might be a principal cause of age-related cognitive decline, through its effect on working memory.

All this adds weight to the idea that we should be focusing our attention on ways to improve inhibitory control when it declines. Although training to improve working memory capacity has not been very successful, specific training targeted at inhibitory control might have more luck. Something to hope for!

Previous research has pointed to an association between not having teeth and a higher risk of cognitive decline and dementia. One reason might have to do with inflammation: inflammation is a well-established risk factor, and at least one study has linked gum disease to a higher dementia risk. Or it might have to do with the simple mechanical act of chewing, which increases blood flow to the brain. A new study has directly investigated chewing ability in older adults.

The Swedish study, involving 557 older adults (77+), found that those with multiple tooth loss, and those who had difficulty chewing hard food such as apples, had a significantly higher risk of developing cognitive impairments (cognitive status was measured using the MMSE). However, when adjusted for sex, age, and education, tooth loss was no longer significant, but chewing difficulties remained significant.

In other words, what had caused the tooth loss didn’t matter. The important thing was to maintain chewing ability, whether with your own natural teeth or dentures.

This idea that the physical act of chewing might affect your cognitive function (on a regular basis; I don’t think anyone is suggesting that you’re brighter when you chew!) is an intriguing and unexpected one. It does, however, give even more emphasis to the importance of physical exercise, which is a much better way of increasing blood flow to the brain.

The finding also reminds us that there are many things going on in the brain that may deteriorate with age and thus lead to cognitive decline and even dementia.

A small study shows how those on the road to Alzheimer’s show early semantic problems long before memory problems arise, and that such problems can affect daily life.

The study compared 25 patients with amnestic MCI, 27 patients with mild-to-moderate Alzheimer's and 70 cognitively fit older adults (aged 55-90), on a non-verbal task involving size differences (for example, “What is bigger: a key or a house?”; “What is bigger: a key or an ant?”). The comparisons were presented in three different ways: as words; as images reflecting real-world differences; as incongruent images (e.g., a big ant and a small house).

Both those with MCI and those with AD were significantly less accurate, and significantly slower, in all three conditions compared to healthy controls, and they had disproportionately more difficulty on those comparisons where the size distance was smaller. But MCI and AD patients experienced their biggest problems when the images were incongruent – the ant bigger than the house. Those with MCI performed at a level between that of healthy controls and those with AD.

This suggests that perceptual information is having undue influence in a judgment task that requires conceptual knowledge.

Because semantic memory is organized according to relatedness, and because this sort of basic information has been acquired a long time ago, this simple test is quite a good way to test semantic knowledge. As previous research has indicated, the problem doesn’t seem to be a memory (retrieval) one, but one reflecting an actual loss or corruption of semantic knowledge. But perhaps, rather than a loss of data, it reflects a failure of selective attention/inhibition — an inability to inhibit immediate perceptual information in favor of more relevant conceptual information.

How much does this matter? Poor performance on the semantic distance task correlated with impaired ability to perform everyday tasks, accounting (together with delayed recall) for some 35% of the variance in scores on this task — while other cognitive abilities such as processing speed, executive function, verbal fluency, naming, did not have a significant effect. Everyday functional capacity was assessed using a short form of the UCSD Skills Performance Assessment scale (a tool generally used to identify everyday problems in patients with schizophrenia), which presents scenarios such as planning a trip to the beach, determining a route, dialing a telephone number, and writing a check.

The finding indicates that semantic memory problems are starting to occur early in the deterioration, and may be affecting general cognitive decline. However, if the problems reflect an access difficulty rather than data loss, it may be possible to strengthen these semantic processing connections through training — and thus improve general cognitive processing (and ability to perform everyday tasks).

There's quite a bit of evidence now that socializing — having frequent contact with others — helps protect against cognitive impairment in old age. We also know that depression is a risk factor for cognitive impairment and dementia. There have been hints that loneliness might also be a risk factor. But here’s the question: is it being alone, or feeling lonely, that is the danger?

A large Dutch study, following 2173 older adults for three years, suggests that it is the feeling of loneliness that is the main problem.

At the start of the study, some 46% of the participants were living alone, and some 50% were no longer or never married (presumably the discrepancy is because many older adults have a spouse in a care facility). Some 73% said they had no social support, while 20% reported feelings of loneliness.

Those who lived alone were significantly more likely to develop dementia over the three year study period (9.3% compared with 5.6% of those who lived with others). The unmarried were also significantly more likely to develop dementia (9.2% vs 5.3%).

On the other hand, among those without social support, 5.6% developed dementia compared with 11.4% with social support! This seems to contradict everything we know, not to mention the other results of the study, but the answer presumably lies in what is meant by ‘social support’. Social support was assessed by the question: Do you get help from family, neighbours or home support? It doesn’t ask the question of whether help would be there if they needed it. So this is not a question of social networks, but more one of how much you need help. This interpretation is supported by the finding that those receiving social support had more health problems.

So, although the researchers originally counted this question as part of the measure of social isolation, it is clearly a poor reflection of it. Effectively, then, that leaves cohabitation and marriage as the only indices of social isolation, which is obviously inadequate.

However, we still have the interesting question re loneliness. The study found that 13.4% of those who said they felt lonely developed dementia compared with 5.7% of those who didn’t feel this way. This is a greater difference than that found with the ‘socially isolated’ (as measured!). Moreover, once other risk factors, such as age, education, and other health factors, were accounted for, the association between living alone and dementia disappeared, while the association with feelings of loneliness remained.

Of course, this still doesn’t tell us what the association is! It may be that feelings of loneliness simply reflect cognitive changes that precede Alzheimer’s, but it may be that the feelings themselves are decreasing cognitive and social activity. It may also be that those who are prone to such feelings have personality traits that are in themselves risk factors for cognitive impairment.

I would like to see another large study using better metrics of social isolation, but, still, the study is interesting for its distinction between being alone and feeling lonely, and its suggestion that it is the subjective feeling that is more important.

This is not to say there is no value in having people around! For a start, as discussed, the measures of social isolation are clearly inadequate. Moreover, other people play an important role in helping with health issues, which in turn greatly impact cognitive decline.

Although there was a small effect of depression, the relationship between feeling lonely and dementia remained after this was accounted for, indicating that this is a separate factor (on the other hand feelings of loneliness were a risk factor for depression).

A decrease in cognitive score (MMSE) was also significantly greater for those experiencing feelings of loneliness, suggesting that this is also a factor in age-related cognitive decline.

The point is not so much that loneliness is more detrimental than being alone, but that loneliness in itself is a risk factor for cognitive decline and dementia. This suggests that we should develop a better understanding of loneliness, how to identify the vulnerable, and how to help them.

In a large Mayo Clinic study, self-reported diet was found to be significantly associated with the risk of seniors developing mild cognitive impairment or dementia over a four-year period.

The study involved 1,230 older adults (70-89) who completed a 128-item food-frequency questionnaire about their diet during the previous year. Of these, around three-quarters (937) showed no signs of cognitive impairment at the beginning of the study period, and were asked to return for follow-up cognitive assessments. These assessments took place every 15 months. After about four years, 200 (21%) had developed mild cognitive impairment (MCI) or dementia.

The likelihood of cognitive deterioration was significantly affected by the type of diet. Those with the highest carbohydrate intake were nearly twice as likely to develop cognitive impairment compared to those with the lowest carbohydrate consumption, and when total fat and protein intake were taken into account, they were 3.6 times likelier to develop impairment.

Those with the highest sugar intake were 1.5 times more likely to develop cognitive impairment.

But — a finding that will no doubt surprise many — those with the highest fat consumption were 42% less likely to develop cognitive impairment, compared to those with the lowest level of fats.

Less surprisingly, those with the highest intake of protein had a 21% reduced risk.

In other words, the worst diet you can have, if you want to keep your brain healthy, is one that receives most of its calories from carbohydrates and sugar, and relatively little from fats and protein.
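
As a rough worked example of the kind of breakdown this is based on, the sketch below computes the share of daily calories coming from each macronutrient, using the standard approximations of 4 kcal per gram for carbohydrate and protein and 9 kcal per gram for fat. The gram amounts are invented for illustration.

```python
def calorie_shares(carb_g, fat_g, protein_g):
    # 4 kcal/g for carbohydrate and protein, 9 kcal/g for fat
    kcal = {"carbohydrate": carb_g * 4, "fat": fat_g * 9, "protein": protein_g * 4}
    total = sum(kcal.values())
    return {k: round(100 * v / total, 1) for k, v in kcal.items()}

# e.g. a day with 300 g carbohydrate, 60 g fat, 80 g protein
print(calorie_shares(300, 60, 80))   # roughly 58% carbohydrate, 26% fat, 15% protein
```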

The findings about carbs, sugar, and protein are consistent with other research. The finding regarding fats is somewhat more surprising. The inconsistency may lie in the type of fat. Research implicating high-fat diets as a risk factor in Alzheimer’s has used saturated fats. Diets high in olive oil, on the other hand, have been found to be beneficial.

It seems likely that the danger of carbs and too much sugar lies in the effects on glucose and insulin metabolism. Saturated fats also interfere with glucose metabolism. Alzheimer’s has sometimes been called Type 3 diabetes, because of its association with insulin problems.

Roberts RO, Roberts LA, Geda YE, Cha RH, Pankratz VS, O'Connor HM, Knopman DS, Petersen RC. 2012. Relative intake of macronutrients impacts risk of mild cognitive impairment or dementia. Journal of Alzheimer's Disease, 32(2), 329-39.

Caffeine has been associated with a lower risk of developing Alzheimer's disease in some recent studies. A recent human study suggested that the reason lies in its effect on proteins involved in inflammation. A new mouse study provides more support for this idea.

In the study, two groups of mice, one of which had been given caffeine, were exposed to hypoxia, simulating what happens in the brain during an interruption of breathing or blood flow. When re-oxygenated, caffeine-treated mice recovered their ability to form a new memory 33% faster than the other mice, and the caffeine was observed to have the same anti-inflammatory effect as blocking interleukin-1 (IL-1) signaling.

Inflammation is a key player in cognitive impairment, and IL-1 has been shown to play a critical role in the inflammation associated with many neurodegenerative diseases.

It was found that the hypoxic episode triggered the release of adenosine, the main component of ATP (your neurons’ fuel). Adenosine is released when a cell is damaged, and this leakage into the environment outside the cell begins a cascade that leads to inflammation (the adenosine activates an enzyme, caspase-1, which triggers production of the cytokine IL-1β).

But caffeine blocks adenosine receptors, stopping the cascade before it starts.

The finding gives support to the idea that caffeine may help prevent cognitive decline and impairment.

I’ve reported before on the growing evidence that metabolic syndrome in middle and old age is linked to greater risk of cognitive impairment in old age and faster decline. A new study shows at least part of the reason.

The study involved 71 middle-aged people recruited from the Wisconsin Registry for Alzheimer's Prevention (WRAP), of whom 29 met the criteria for metabolic syndrome (multiple cardiovascular and diabetes risk factors including abdominal obesity, high blood pressure, high blood sugar and high cholesterol).

Those with metabolic syndrome averaged 15% less blood flow to the brain than those without the syndrome.

One tried and true method of increasing blood flow to the brain is of course through exercise.

The study was presented at the Alzheimer's Association International Conference in Vancouver, Canada by Barbara Bendlin.

Memory problems in those with mild cognitive impairment may begin with problems in visual discrimination and vulnerability to interference — a hopeful discovery in that interventions to improve discriminability and reduce interference may have a flow-on effect to cognition.

The study compared the performance on a complex object discrimination task of 7 patients diagnosed with amnestic MCI, 10 older adults considered to be at risk for MCI (because of their scores on a cognitive test), and 19 age-matched controls. The task involved the side-by-side comparison of images of objects, with participants required to say, within 15 seconds, whether the two objects were the same or different.

In the high-interference condition, the objects were blob-like and presented as black and white line-drawings, with some comparison pairs identical, while others only varied slightly in either shape or fill pattern. Objects were rotated to discourage a simple feature-matching strategy. In the low-interference condition, these line-drawings were interspersed with color photos of everyday objects, for which discriminability was dramatically easier. The two conditions were interspersed by a short break, with the low interference condition run in two blocks, before and after the high interference condition.

A control task, in which the participants compared two squares that could vary in size, was run at the end.

The study found that those with MCI, as well as those at risk of MCI, performed significantly worse than the control group in the high-interference condition. There was no difference in performance between those with MCI and those at risk of MCI. Neither group was impaired in the first low-interference condition, although the at-risk group did show significant impairment in the second low-interference condition. It may be that they had trouble recovering from the high-interference experience. However, the degree of impairment was much less than it was in the high-interference condition. It’s also worth noting that the performance on this second low-interference task was, for all groups, notably higher than it was on the first low-interference task.

There was no difference between any of the groups on the control task, indicating that fatigue wasn’t a factor.

The interference task was specifically chosen as one that involved the perirhinal cortex, but not the hippocampus. The task requires the conjunction of features — that is, you need to be able to see the object as a whole (‘feature binding’), not simply match individual features. The control task, which required only the discrimination of a single feature, shows that MCI doesn’t interfere with this ability.

I do note that the amount of individual variability on the interference tasks was noticeably greater in the MCI group than the others. The MCI group was of course smaller than the other groups, but variability wasn’t any greater for this group in the control task. Presumably this variability reflects progression of the impairment, but it would be interesting to test this with a larger sample, and map performance on this task against other cognitive tasks.

Recent research has suggested that the perirhinal cortex may provide protection from visual interference by inhibiting lower-level features. The perirhinal cortex is strongly connected to the hippocampus and entorhinal cortex, two brain regions known to be affected very early in MCI and Alzheimer’s.

The findings are also consistent with other evidence that damage to the medial temporal lobe may impair memory by increasing vulnerability to interference. For example, one study has found that story recall was greatly improved in patients with MCI if they rested quietly in a dark room after hearing the story, rather than being occupied in other tasks.

There may be a working memory component to all this as well. Comparison of two objects does require shifting attention back and forth. This, however, is separate to what the researchers see as primary: a perceptual deficit.

All of this suggests that reducing “visual clutter” could help MCI patients with everyday tasks. For example, buttons on a telephone tend to be the same size and color, with the only difference lying in the numbers themselves. Perhaps those with MCI or early Alzheimer’s would be assisted by a phone with varying sized buttons and different colors.

The finding also raises the question: to what extent is the difficulty Alzheimer’s patients often have in recognizing a loved one’s face a discrimination problem rather than a memory problem?

Finally, the performance of the at-risk group — people who had no subjective concerns about their memory, but who scored below 26 on the MoCA (Montreal Cognitive Assessment — a brief screening tool for MCI) — suggests that vulnerability to visual interference is an early marker of cognitive impairment that may be useful in diagnosis. It’s worth noting that, across all groups, MoCA scores predicted performance on the high-interference task, but not on any of the other tasks.

So how much cognitive impairment rests on problems with interference?

Newsome, R. N., Duarte, A., & Barense, M. D. (2012). Reducing Perceptual Interference Improves Visual Discrimination in Mild Cognitive Impairment: Implications for a Model of Perirhinal Cortex Function. Hippocampus, 22, 1990–1999. doi:10.1002/hipo.22071

Della Sala S, Cowan N, Beschin N, Perini M. 2005. Just lying there, remembering: Improving recall of prose in amnesic patients with mild cognitive impairment by minimising interference. Memory, 13, 435–440.

HIV-associated dementia occurs in around 30% of untreated HIV-positive patients. Surprisingly, it also is occasionally found in some patients (2-3%) who are being successfully treated for HIV (and show no signs of AIDS).

A new study may have the answer for this mystery, and suggest a solution. Moreover, the answer may have general implications for those experiencing cognitive decline in old age.

The study found that HIV, although it doesn’t directly infect neurons, interferes with the production of BDNF, a protein long known to be crucial for memory and learning. The reduced production of mature BDNF causes axons and dendrites to shorten, meaning connections between neurons are lost. That, in turn, brings about the death of some neurons.

It seems that the virus interferes with the normal maturation of BDNF, whereby one form of it, called proBDNF, is cut by certain enzymes into a new form called mature BDNF. It is in this mature form that BDNF has its beneficial effect on neuron growth. Unfortunately, in its earlier form it is toxic to neurons.

This imbalance in the proportions of mature BDNF and proBDNF also appears to occur as we age, and in depression. It may also be a risk factor in Parkinson's and Huntington's diseases.

However, these findings suggest a new therapeutic approach.

Compounds in green tea and chocolate may help protect brain cells

In this context, it is interesting to note another new study, which analyzed the effects on brain cells of 2,000 compounds, both natural and synthetic. Of the 256 that appeared to have protective effects, nine were related to epicatechin, a compound found in cocoa and green tea leaves.

While we’ve been aware for some time of these positive qualities, the study specifically identified epicatechin and epigallocatechin gallate (EGCG) as being the most effective at helping protect neurons by inducing production of BDNF.

One of the big advantages of these compounds is their ability to cross the blood-brain barrier, making them good candidates for therapy.

While green tea, dark chocolate, and cocoa are particularly good sources, many fruits also have good levels, in particular, black grapes, blackberries, apples, cherries, pears, and raspberries. (See this University of California, Davis document (PDF) for more detail.)

Back in 2009, I reported briefly on a large Norwegian study that found that older adults who consumed chocolate, wine, and tea performed significantly better on cognitive tests. The association was assumed to be linked to the flavanols in these products. A new study confirms this finding, and extends it to older adults with mild cognitive impairment.

The study involved 90 older adults with MCI, who consumed a dairy-based cocoa drink containing either 990 mg, 520 mg, or 45 mg of flavanols daily for eight weeks. Their diet was restricted to eliminate other sources of flavanols (such as tea, red wine, apples and grapes).

Cognitive assessment at the end of this period revealed that, although scores on the MMSE were similar across all groups, those consuming higher levels of flavanol cocoa took significantly less time to complete Trail Making Tests A and B, and scored significantly higher on the verbal fluency test. Insulin resistance and blood pressure were also lower.

Those with the highest levels of flavanols did better than those on intermediate levels on the cognitive tests. Both did better than those on the lowest levels.

Changes in insulin resistance explained part, but not all, of the cognitive improvement.

One caveat: the group were generally in good health without known cardiovascular disease — thus, not completely representative of all those with MCI.

 

The latest finding from the large, long-running Health, Aging, and Body Composition (Health ABC) Study adds to the evidence that preventing or controlling diabetes helps prevent age-related cognitive decline.

The study involved 3,069 older adults (70+), of whom 717 (23%) had diabetes at the beginning of the study in 1997. Over the course of the study, a further 159 developed diabetes. Those with diabetes at the beginning had lower cognitive scores, and showed faster decline. Those who developed diabetes showed a rate of decline that was between that faster rate and the slower rate of those who never developed diabetes.

Among those with diabetes, those who had higher levels of a blood marker called glycosylated hemoglobin (HbA1c) had greater cognitive impairment. Higher levels of this blood marker reflect poorer control of blood sugar.

In other words, both duration and severity of diabetes are important factors in determining rate of cognitive decline in old age.

While the ‘Alzheimer’s gene’ is relatively common (the APOE4 variant is present in around 15% of the population), having two copies of the variant is, thankfully, much rarer, at around 2%. Having two copies is of course a major risk factor for developing Alzheimer’s, and it has been thought that having a single copy is also a significant (though lesser) risk factor. Certainly there is quite a lot of evidence linking APOE4 carriers to various markers of cognitive impairment.

And yet, the evidence has not been entirely consistent. I have been puzzled by this myself, and now a new finding suggests a reason. It appears there are gender differences in responses to this gene variant.

The study involved 131 healthy older adults (median age 70), whose brains were scanned. The scans revealed that in older women with the E4 variant, brain activity showed the loss of synchronization that is typically seen in Alzheimer’s patients, with the precuneus (a major hub in the default mode network) out of sync with other brain regions. This was not observed in male carriers.

The finding was confirmed by a separate set of data, taken from the Alzheimer's Disease Neuroimaging Initiative database. Cerebrospinal fluid from 91 older adults (average age 75) revealed that female carriers had substantially higher levels of tau protein (a key Alzheimer’s biomarker) than male carriers or non-carriers.

It’s worth emphasizing that the participants in the first study were all cognitively normal — the loss of synchronization was starting to happen before visible Alzheimer’s symptoms appeared.

The findings suggest that men have less to worry about than women, as far as the presence of this gene is concerned. The study may also explain why more women than men get the disease (3 women to 2 men); it is not (although of course this is a factor) simply a consequence of women tending to live longer.

Whether or not these gender differences extend to carriers of two copies of the gene is another story.

A study designed to compare the relative benefits of exercise and diet control on Alzheimer’s pathology and cognitive performance has revealed that while both are beneficial, exercise is of greater benefit in reducing Alzheimer’s pathology and cognitive impairment.

The study involved mice genetically engineered with a mutation in the APP gene (a familial risk factor for Alzheimer’s), who were given either a standard diet or a high-fat diet (60% fat, 20% carbohydrate, 20% protein vs 10% fat, 70% carbohydrate, 20% protein) for 20 weeks (from 2-3 to 7-8 months of age). Some of the mice on the high-fat diet spent the second half of that 20 weeks in an environmentally enriched cage (more than twice as large as the standard cage, and supplied with a running wheel and other objects). Others on the high-fat diet were put back on a standard diet in the second 10 weeks. Yet another group were put on a standard diet and given an enriched cage in the second 10 weeks.

Unsurprisingly, those on the high-fat diet gained significantly more weight than those on the standard diet, and exercise reduced that gain — but not as much as diet control (i.e., returning to a standard diet) did. Interestingly, this was not the result of changes in food intake, which either stayed the same or slightly increased.

More importantly, exercise and diet control were roughly equal in reversing glucose intolerance, but exercise was more effective than diet control in ameliorating cognitive impairment. Similarly, while amyloid-beta pathology was significantly reduced in both exercise and diet-control conditions, exercise produced the greater reduction in amyloid-beta deposits and level of amyloid-beta oligomers.

It seems that diet control improves metabolic disorders induced by a high-fat diet (conditions such as obesity, hyperinsulinemia and hypercholesterolemia), which affect the production of amyloid-beta. However, exercise is more effective in tackling brain pathology directly implicated in dementia and cognitive decline, because it strengthens the activity of an enzyme that decreases the level of amyloid-beta.

Interestingly, and somewhat surprisingly, the combination of exercise and diet control did not have a significantly better effect than exercise alone.

The finding adds to the growing pile of evidence for the value of exercise in maintaining a healthy brain in later life, and helps explain why. Of course, as I’ve discussed on several occasions, we already know other mechanisms by which exercise improves cognition, such as boosting neurogenesis.

Following on from mouse studies, a human study has investigated whether caffeine can help prevent older adults with mild cognitive impairment from progressing to dementia.

The study involved 124 older adults (65-88) who were thoroughly cognitively assessed, given brain scans, and had a fasting blood sample taken. They were then followed for 2 to 4 years, during which their cognitive status was re-assessed annually. Of the 124 participants, 69 (56%) were initially assessed as cognitively normal (average age 73), 32 (26%) with MCI (average age 76.5), and 23 (19%) with dementia (average age 77). The age differences were significant.

Those with MCI on initial assessment showed significantly lower levels of caffeine in their blood than those cognitively healthy; levels in those with dementia were also lower but not significantly. Those initially healthy who developed MCI over the study period similarly showed lower caffeine levels than those who didn’t develop MCI, but again, due to the wide individual variability (and the relatively small sample size), this wasn’t significant. However, among those with MCI who progressed to dementia (11, i.e. a third of those with MCI), caffeine levels were so much lower that the results were significant.

This finding revealed an apparently critical level of caffeine dividing those who progressed to dementia and those who did not — more specifically, all of those who progressed to dementia were below this level, while around half of those who remained stable were at the level or above. In other words, low caffeine would seem to be necessary but not sufficient.

On the other hand (just to show that this association is not as simple as it appears), those already with dementia had higher caffeine levels than those with MCI who progressed to dementia.

The critical factor may have to do with three specific cytokines — GCSF, IL-10, and IL-6 — which all showed markedly lower levels in those converting from MCI to dementia. Comparison of the three stable-MCI individuals with the highest caffeine levels and the three with the lowest levels, and the three from the MCI-to-dementia group with comparable low levels, revealed that high levels of those cytokines were matched with high caffeine levels, while, in both groups, low caffeine levels were matched to low levels of those cytokines.

These cytokines are associated with inflammation — an established factor in cognitive decline and dementia.

The level of coffee needed to achieve the ‘magic’ caffeine level is estimated at around 3 cups a day. While caffeine can be found in other sources, it is thought that in this study, as in the mouse studies, coffee is the main source. Moreover, mouse research suggests that caffeine is interacting with an as yet unidentified component of coffee to boost levels of these cytokines.

This research has indicated that caffeine has several beneficial effects on the brain, including suppressing levels of enzymes that produce amyloid-beta, as well as these anti-inflammatory effects.

It’s suggested that the reason high levels of caffeine don’t appear to benefit those with dementia is because higher levels of these cytokines have become re-established, but this immune response would appear to come too late to protect the brain. This is consistent with other evidence of the importance of timing.

Do note that in mouse studies, the same benefits were not associated with decaffeinated coffee.

While this study has some limitations, the findings are consistent with previous epidemiologic studies indicating coffee/caffeine helps protect against cognitive impairment and dementia. Additionally, in keeping with the apparent anti-inflammatory action, a long-term study tracking the health and coffee consumption of more than 400,000 older adults recently found that coffee drinkers had reduced risk of dying from heart disease, lung disease, pneumonia, stroke, diabetes, infections, injuries and accidents.

Cao, C., Loewenstein, D. A., Lin, X., Zhang, C., Wang, L., Duara, R., Wu, Y., et al. (2012). High blood caffeine levels in MCI linked to lack of progression to dementia. Journal of Alzheimer’s Disease, 30(3), 559–572. doi:10.3233/JAD-2012-111781

Freedman, N. D., et al. (2012). Association of coffee drinking with total and cause-specific mortality. New England Journal of Medicine, 366, 1891–1904.

A number of studies have come out in recent years linking age-related cognitive decline and dementia risk to inflammation and infection (put inflammation into the “Search this site” box at the top of the page and you’ll see what I mean). New research suggests one important mechanism.

In a mouse study, mice engineered to be deficient in the chemokine receptor CCR2 (which plays a crucial role in removing beta-amyloid and is also important for neurogenesis) developed Alzheimer’s-like pathology more quickly. When CCR2 expression was boosted in these mice, accumulation of beta-amyloid decreased and the mice’s memory improved.

In the human study, the expression levels of thousands of genes from 691 older adults (average age 73) in Italy (part of the long-running InCHIANTI study) were analyzed. Both cognitive performance and cognitive decline over 9 years (according to MMSE scores) were significantly associated with the expression of this same gene. That is, greater CCR2 activity was associated with lower cognitive scores and greater decline.

Expression of the CCR2 gene was also positively associated with the Alzheimer’s gene — meaning that those who carry the APOE4 variant are more likely to have higher CCR2 activity.

The finding adds yet more weight to the importance of preventing and treating inflammation and infection.

Harries, L. W., Bradley-Smith, R. M., Llewellyn, D. J., Pilling, L. C., Fellows, A., Henley, W., et al. (2012). Leukocyte CCR2 expression is associated with Mini-Mental State Examination score in older adults. Rejuvenation Research.

Naert, G. & Rivest S. 2012. Hematopoietic CC-chemokine receptor 2-(CCR2) competent cells are protective for the cognitive impairments and amyloid pathology in a transgenic mouse model of Alzheimer's disease. Molecular Medicine, 18(1), 297-313.

El Khoury J, et al. 2007. Ccr2 deficiency impairs microglial accumulation and accelerates progression of Alzheimer-like disease. Nature Medicine, 13, 432–8.

A new study, involving 1,219 dementia-free older adults (65+), has found that the more omega-3 fatty acids the person consumed, the lower the level of beta-amyloid in the blood (a proxy for brain levels). Consuming a gram of omega-3 more than the average per day was associated with 20-30% lower beta-amyloid levels. A gram of omega-3 equates to around half a fillet of salmon per week.

Participants provided information about their diet for an average of 1.2 years before their blood was tested for beta-amyloid. Other nutrients investigated (saturated fatty acids, omega-6 polyunsaturated fatty acids, monounsaturated fatty acids, vitamin E, vitamin C, beta-carotene, vitamin B12, folate and vitamin D) were not associated with beta-amyloid levels.

The results remained after adjusting for age, education, gender, ethnicity, amount of calories consumed and APOE gene status.

The findings are consistent with previous research associating higher levels of omega-3 and/or fish intake with lower risk of Alzheimer’s. Additionally, another recent study provides evidence that the brains of those with Alzheimer’s disease, MCI, and the cognitively normal, all have significantly different levels of omega-3 and omega-6 fatty acids. That study concluded that the differences were due to both consumption and metabolic differences.

Gu, Y., Schupf, N., Cosentino, S. A., Luchsinger, J. A., & Scarmeas, N. (2012). Nutrient intake and plasma β-amyloid. Neurology, 78(23), 1832–1840.

Cunnane, S.C. et al. 2012. Plasma and Brain Fatty Acid Profiles in Mild Cognitive Impairment and Alzheimer’s Disease. Journal of Alzheimer’s Disease, 29 (3), 691-697.

Here’s a different aspect to cognitive reserve. I have earlier reported on the first tranche of results from this study. Now new results, involving 246 older adults from the Rush Memory and Aging Project, have confirmed earlier findings that having a greater purpose in life may help protect against the brain damage wrought by Alzheimer’s disease.

Participants received an annual clinical evaluation for up to 10 years, which included detailed cognitive testing and neurological exams. They were also interviewed about their purpose in life, that is, the degree to which they derived meaning from life's experiences and were focused and intentional. After death (average age 88), their brains were examined for Alzheimer’s pathology.

Cognitive function, unsurprisingly, declined progressively with increased Alzheimer’s pathology (such as amyloid plaque and tau tangles). But ‘purpose in life’ modified this association, with higher levels of purposiveness reducing the effect of pathology on cognition. The effect was strongest for those with the greatest damage (especially tangles).

The analysis took into account depression, APOE gene status, and other relevant medical factors.

More findings from the long-running Mayo Clinic Study of Aging reveal that using a computer plus taking moderate exercise reduces your risk of mild cognitive impairment significantly more than you would expect from simply adding together these two beneficial activities.

The study involved 926 older adults (70-93), of whom 109 (12%) were diagnosed with MCI. Participants completed questionnaires on physical exercise and mental stimulation within the previous year. Computer use was targeted in this analysis because of its popularity as a cognitive activity, and because it was particularly associated with reduced odds of having MCI.

Among the cognitively healthy, only 20.1% neither exercised moderately nor used a computer, compared to 37.6% of those with MCI. On the other hand, 36% of the cognitively healthy both exercised and used a computer, compared to only 18.3% of those with MCI. There was little difference between the two groups as regards exercise but no computer use, or computer use but no exercise.

The analysis took into account calorie intake, as well as education, depression, and other health factors. Daily calorie intake was significantly higher in those with MCI compared to those without (respective group medians of 2100 calories vs 1802) — note that the median BMI was the same for the two groups.

Moderate physical exercise was defined as brisk walking, hiking, aerobics, strength training, golfing without a golf cart, swimming, doubles tennis, yoga, martial arts, using exercise machines, and weightlifting. Light exercise included activities such as bowling, leisurely walking, stretching, slow dancing, and golfing with a cart. Mentally stimulating activities included reading, crafts, computer use, playing games, playing music, group, social and artistic activities, and watching less television.

It should be noted that the assessment of computer activities was very basic. The researchers suggest that in future studies, both duration and frequency should be assessed. I would add type of activity, although that would be a little more difficult to assess.

Overall, the findings add yet more weight to the evidence for the value of physical exercise and mental stimulation in staving off cognitive impairment in old age, and add the twist that doing both is much better than doing either one alone.

Interpreting brain activity is a very tricky business. Even the most basic difference can be interpreted in two ways — i.e., what does it mean if a region is more active in one group of people compared to another? A new study not only indicates a new therapeutic approach to amnestic mild cognitive impairment, but also demonstrates the folly of assuming that greater activity is good.

Higher activity in the dentate gyrus/CA3 region of the hippocampus is often seen in disorders associated with an increased Alzheimer's risk, such as aMCI. It’s been thought, reasonably enough, that this might reflect compensatory activity, as the brain recruits extra resources in the face of memory loss. But rodent studies have suggested an alternative interpretation: that the increased activity might itself be part of the problem.

Following on from animal studies, this new study has investigated the effects of a drug that reduces hippocampal hyperactivity. The drug, levetiracetam, is used to treat epilepsy. The 17 patients with aMCI (average age 73) were given a placebo in the first two-week treatment phase and a low dose of the epilepsy drug during the second treatment phase, while 17 controls (average age 69) were given a placebo in both treatment phases. The treatments were separated by four weeks, and brain scans were given at the end of each phase. Participants carried out a cognitive task designed to assess memory errors attributable to a dysfunction in the dentate gyrus/CA3 region (note that these neighboring areas are not clearly demarcated from each other, and so are best analyzed as one).

As predicted, those with aMCI showed greater activity in this region, and treatment with the drug significantly reduced that activity. The drug treatment also significantly improved their performance on the three-choice recognition task, with a significant decrease in memory errors. It did not have a significant effect on general cognition or memory (as measured by delayed recall on the Verbal Paired Associates subtest of the Wechsler Memory Scale, the Benton Visual Retention Test, and the Buschke Selective Reminding Test).

These findings make it clear that the excess activity in the hippocampus is not compensatory, and also point to the therapeutic value of targeting this hyperactivity for those with aMCI. It also raises the possibility that other conditions might benefit from this approach. For example, those who carry the Alzheimer’s gene, APOE4, also show increased hippocampal activity.

Older adults who sleep poorly react to stress with increased inflammation

A study involving 83 older adults (average age 61) has found that poor sleepers reacted to a stressful situation with a significantly greater inflammatory response than good sleepers. High levels of inflammation increase the risk of several disorders, including cardiovascular disease and diabetes, and have been implicated in Alzheimer’s.

Each participant completed a self-report of sleep quality, perceived stress, loneliness and medication use. Around 27% were categorized as poor sleepers. Participants were given a series of tests of verbal and working memory designed to increase stress, with blood being taken before and after testing, as well as three more times over the next hour. The blood was tested for levels of a protein marker for inflammation (interleukin-6).

Poor sleepers reported more depressive symptoms, more loneliness and more perceived stress compared to good sleepers. Before cognitive testing, levels of IL-6 were the same for poor and good sleepers. However, while both groups showed increases in IL-6 after testing, poor sleepers showed a significantly larger increase — as much as four times larger and at a level found to increase risk for illness and death in older adults.

After accounting for loneliness, depression or perceived stress, this association remained. Surprisingly, there was no evidence that poor sleep led to worse cognitive performance, thus causing more stress. Poor sleepers did just as well on the tests as the good sleepers (although I note that we cannot rule out that poor sleepers were having to put in more effort to achieve the same results). Although there was a tendency for poor sleepers to be in a worse mood after testing (perhaps because they had to put in more effort? My own speculation), this mood change didn’t predict the increased inflammatory response.

The findings add to evidence that poor sleep (unfortunately common as people age) is an independent risk factor for cognitive and physical health, and suggest we should put more effort into dealing with it, rather than just accepting it as a corollary of age.

REM sleep disorder doubles risk of MCI, Parkinson's

A recent Mayo Clinic study has also found that people with rapid eye movement sleep behavior disorder (RBD) have twice the risk of developing mild cognitive impairment or Parkinson’s disease. Some 34% of those diagnosed with probable RBD developed MCI or Parkinson's disease within four years of entering the study, a rate 2.2 times greater than those with normal REM sleep.

Earlier research has found that 45% of those with RBD developed MCI or Parkinson's disease within five years of diagnosis, but these findings were based on clinical patients. The present study involved cognitively healthy older adults (70-89) participating in a population-based study of aging, who were diagnosed for probable RBD on the basis of the Mayo Sleep Questionnaire.

A study involving 86 older women (aged 70-80) with probable MCI has compared the effectiveness of resistance and aerobic training in improving executive function. The women were randomly allocated either to resistance training, aerobic training, or balance and tone training (control group). The programs all ran twice weekly for six months.

The 60-minute classes involved lifting weights (resistance training), outdoor walking (aerobic training), or stretching, balancing, and relaxation exercises (control).

Executive function was primarily assessed by the Stroop Test (measuring selective attention/conflict resolution), and also by Trail Making Tests (set-shifting) and Verbal Digits Tests (working memory). Associative memory (face-scene pairs) and problem-solving ability (Everyday Problems Test) were also assessed.

The study found that resistance training significantly improved performance on the Stroop Test and also the associative memory task. These improvements were associated with changes in some brain regions. In contrast to previous studies in healthy older adults, aerobic training didn’t produce any significant cognitive improvement, although it did produce significantly better balance and mobility, and cardiovascular capacity, compared to the control.

Interestingly, a previous study from these researchers demonstrated that it took a year of resistance training to achieve such results in cognitively healthy women aged 65-75. This suggests that the benefits may be greater for those at greater risk.

It may be that the greater benefits of resistance training over aerobic training are not solely due to physical differences in the exercise. The researchers point out that resistance training required more cognitive engagement (“If you’re lifting weights you have to monitor your sets, your reps, you use weight machines and you have to adjust the seat, etc.”) compared to walking.

Note that impaired associative memory is one of the earliest cognitive functions affected in Alzheimer’s.

It’s also worth noting that exercise compliance was low (55-60%), suggesting that benefits might have been greater if the participants had been more motivated — or found the programs more enjoyable! The failure of aerobic exercise to improve cognition is somewhat surprising, and perhaps it, too, may be attributed to insufficient engagement — in terms of intensity as well as amount.

The researchers have put up a YouTube video of the resistance training exercises used in the study.

A four-year study involving 716 elderly (average age 82) has revealed that those who were most physically active were significantly less likely to develop Alzheimer’s than those least active. The study is unique in that, in addition to self-reports of physical and social activity, activity was objectively measured (for up to 10 days) through a device worn on the wrist. This device (an actigraph) enabled everyday activity, such as cooking, washing the dishes, playing cards and even moving a wheelchair with a person's arms, to be included in the analysis.

Cognitive performance was assessed annually. Over the study period, 71 participants (10%) developed Alzheimer’s.

The study found that those in the bottom 10% of daily physical activity were more than twice as likely (2.3 times) to develop Alzheimer's disease as those in the top 10%. Those in the bottom 10% of intensity of physical activity were almost three times (2.8 times) as likely to develop Alzheimer's disease as people in the top 10%.

Moreover, the level of activity was associated with the rate of cognitive decline.

The association remained after motor function, depression, chronic health conditions, and APOE gene status were taken into account.

The findings should encourage anyone who feels that physical exercise is beyond them to nevertheless engage in milder forms of daily activity.

 

Addendum:

Another recent study, involving 331 cognitively healthy elderly, has also found that higher levels of physical activity were associated with better cognitive performance (specifically, a shorter time to complete the Trail-making test, and higher levels of verbal fluency) and less brain atrophy. Activity levels were based on the number of self-reported light and hard activities for at least 30 minutes per week. Participants were assessed in terms of MMSE score, verbal fluency, and visuospatial ability.

A study involving 1,575 older adults (aged 58-76) has found that those with DHA levels in the bottom 25% had smaller brain volume (equivalent to about 2 years of aging) and greater amounts of white matter lesions. Those with levels of all omega-3 fatty acids in the bottom quarter also scored lower on tests of visual memory, executive function, and abstract thinking.

The finding adds to the evidence that higher levels of omega-3 fatty acids reduce dementia risk.

For more about omega-3 oils and cognition

A review of 15 randomized controlled trials in which people with mild to moderate dementia were offered mental stimulation has concluded that such stimulation does indeed help slow down cognitive decline.

In total, 718 people with mild to moderate dementia, of whom 407 received cognitive stimulation, were included in the meta-analysis. The studies included in the review were identified from a search of the Cochrane Dementia and Cognitive Improvement Group Specialized Register, and included all randomized controlled trials of cognitive stimulation for dementia which incorporated a measure of cognitive change.

Participants were generally treated in small groups, and activities ranged from discussions and word games to music and baking. Cognitive stimulation was compared with no treatment, with "standard treatments" (such as medicine, day care or visits from community mental health workers), or with alternative activities such as watching TV and physical therapy.

There was a “clear, consistent benefit” on cognitive function for those receiving cognitive stimulation, and these benefits were still seen one to three months after the treatment. Benefits were also seen for social interaction, communication and quality of life and well-being.

While no evidence was found for improvements in the mood of participants, or their ability to care for themselves or function independently, or in problem behaviors, this is not to say that lengthier or more frequent interventions might not be beneficial in these areas (that’s purely my own suggestion).

In one study, family members were trained to deliver cognitive stimulation on a one-to-one basis, and the reviewers suggested that this was an approach deserving of further attention.

The reviewers did note that the quality of the studies was variable, with small sample sizes. It should also be noted that this review builds on an earlier review, involving a subset of these studies, in which the opposite conclusion was drawn — that is, at that time, there was insufficient evidence that such interventions helped people with dementia. There is no doubt that larger and lengthier trials are needed, but these new results are very promising.

Data from 11,926 older twins (aged 65+) has found measurable cognitive impairment in 25% of them and subjective cognitive impairment in a further 39%, meaning that 64% of these older adults were experiencing some sort of cognitive impairment.

Although subjective impairment is not of sufficient magnitude to register on our measurement tools, that doesn’t mean that people’s memory complaints should be dismissed. It is likely, given the relative crudity of standard tests, that people are going to be aware of cognitive problems before they grow large enough to be measurable. Moreover, when individuals are of high intelligence or well-educated, standard tests can be insufficiently demanding. [Basically, subjective impairment can be thought of as a step before objective impairment, which itself is a step before mild cognitive impairment (MCI is a formal diagnosis, not simply a descriptive title), the precursor to Alzheimer’s. Note that I am calling these “steps” as a way of describing a continuum, not an inevitable process. None of these steps means that you will inevitably pass to the next step, but each later step will be preceded by the earlier steps.]

Those with subjective complaints were younger, more educated, more likely to be married, and to have higher socio-economic status, compared to those with objective impairment — supporting the idea that these factors provide some protection against cognitive decline.

The use of twins reveals that environment is more important than genes in determining whether you develop cognitive impairment in old age. For objective cognitive impairment, identical twins had a concordance rate of 52% compared to 50% in non-identical same-sex twins and 29% in non-identical different-gender twins. For subjective impairment, the rates were 63%, 63%, and 42%, respectively.

National variation in MCI prevalence

Another very large study, involving 15,376 older adults (65+), has explored the prevalence of amnestic MCI in low- and middle-income countries: Cuba, Dominican Republic, Peru, Mexico, Venezuela, Puerto Rico, China, and India. Differences between countries were marked, with only 0.6% of older adults in China having MCI compared to 4.6% in India (Cuba 1.5%, Dominican Republic 1.3%, Peru 2.6%, Mexico 2.8%, Venezuela 1%, Puerto Rico 3% — note that I have selected the numbers after they were standardized for age, gender, and education, but the raw numbers are not greatly different).

Studies to date have focused mainly on European and North American populations, and have provided prevalence estimates ranging from 2.1%-11.5%, generally hovering around 3-5% (for example, Finland 5.3%, Italy 4.9%, Japan 4.9%, the US 6% — but note South Korea 9.7% and Malaysia 15.4%).

What is clear is that there is considerable regional variation.

Interestingly, considering their importance in Western countries, the effects of both age and education on prevalence of aMCI were negligible. Granted that age and education norms were used in the diagnosis, this is still curious. It may be that there was less variance in educational level in these populations. Socioeconomic status was, however, a factor.

Participants were also tested on the 12-item WHO disability assessment schedule (WHODAS-12), which assesses five activity-limitation domains (communication, physical mobility, self-care, interpersonal interaction, life activities and social participation). MCI was found to be significantly associated with disability in Peru, India, and the Dominican Republic (but negatively associated in China). Depression (informant-rated) was also only associated with MCI in some countries.

All of this, I feel, emphasizes the situational variables that determine whether an individual will develop cognitive impairment.

Caracciolo B, Gatz M, Xu W, Pedersen NL, Fratiglioni L. 2012. Differential Distribution of Subjective and Objective Cognitive Impairment in the Population: A Nation-Wide Twin-Study. Journal of Alzheimer's Disease, 29(2), 393-403.

Sosa, A. L., Albanese, E., Stephan, B. C. M., Dewey, M., Acosta, D., Ferri, C. P., et al. (2012). Prevalence, distribution, and impact of mild cognitive impairment in Latin America, China, and India: A 10/66 population-based study. PLoS Medicine, 9(2), e1001170.

Full text available at http://www.plosmedicine.org/article/info%3Adoi%2F10.1371%2Fjournal.pmed....

Another study adds to the evidence that changes in the brain that may lead eventually to Alzheimer’s begin many years before Alzheimer’s is diagnosed. The findings also add to the evidence that what we regard as “normal” age-related cognitive decline is really one end of a continuum of which the other end is dementia.

In the study, brain scans were taken of 137 highly educated people aged 30-89 (participants in the Dallas Lifespan Brain Study). The amount of amyloid-beta (characteristic of Alzheimer’s) was found to increase with age, and around a fifth of those over 60 had significantly elevated levels of the protein. These higher amounts were linked with worse performance on tests of working memory, reasoning and processing speed.

More specifically, across the whole sample, amyloid-beta levels affected processing speed and fluid intelligence (in a dose-dependent relationship — that is, as levels increased, these functions became more impaired), but not working memory, episodic memory, or crystallized intelligence. Among the elevated-levels group, increased amyloid-beta was significantly associated with poorer performance for processing speed, working memory, and fluid intelligence, but not episodic memory or crystallized intelligence. Among the group without elevated levels of the protein, increasing amyloid-beta only affected fluid intelligence.

These task differences aren’t surprising: processing speed, working memory, and fluid intelligence are the domains that show the most decline in normal aging.

Those with the Alzheimer’s gene APOE4 were significantly more likely to have elevated levels of amyloid-beta. While 38% of the group with high levels of the protein had the risky gene variant, only 15% of those who didn’t have high levels carried the gene.

Note that, while the prevalence of carriers of the gene variant matched population estimates (24%), the proportion was higher among those in the younger age group — 33% of those under 60, compared to 19.5% of those aged 60 or older. It seems likely that many older carriers have already developed MCI or Alzheimer’s, and thus been ineligible for the study.

The average age of the participants was 64, and the average years of education 16.4.

Amyloid deposits varied as a function of age and region: the precuneus, temporal cortex, anterior cingulate and posterior cingulate showed the greatest increase with age, while the dorsolateral prefrontal cortex, orbitofrontal cortex, parietal and occipital cortices showed smaller increases with age. However, when only those aged 60+ were analyzed, the effect of age was no longer significant. This is consistent with previous research, and adds to evidence that age-related cognitive impairment, including Alzheimer’s, has its roots in damage occurring earlier in life.

In another study, brain scans of 408 participants in the Mayo Clinic Study of Aging also found that higher levels of amyloid-beta were associated with poorer cognitive performance — but that this interacted with APOE status. Specifically, carriers of the Alzheimer’s gene variant were significantly more affected by having higher levels of the protein.

This may explain the inconsistent findings of previous research concerning whether or not amyloid-beta has significant effects on cognition in normal adults.

As the researchers of the first study point out, what’s needed is information on the long-term course of these brain changes, and they are planning to follow these participants.

In the meantime, all in all, the findings do provide more strength to the argument that your lifestyle in mid-life (and perhaps even younger) may have long-term consequences for your brain in old age — particularly for those with a genetic susceptibility to Alzheimer’s.

A small study of the sleep patterns of 100 people aged 45-80 has found a link between sleep disruption and level of amyloid plaques (characteristic of Alzheimer’s disease). The participants were recruited from the Adult Children Study, of whom half have a family history of Alzheimer’s disease.

Sleep was monitored for two weeks. Those who woke frequently (more than five times an hour!) and those who spent less than 85% of their time in bed actually asleep were more likely to have amyloid plaques. A quarter of the participants had evidence of amyloid plaques.

The study doesn’t tell us whether disrupted sleep leads to the production of amyloid plaques, or whether brain changes in early Alzheimer's disease lead to changes in sleep, but evidence from other studies does, I think, give some weight to the first idea. At the least, this adds yet another reason for making an effort to improve your sleep!

Neither the abstract for this not-yet-presented conference paper nor the press release mentions any differences between those with a family history of Alzheimer’s and those without, suggesting there were none; but since the researchers made no mention either way, I wouldn’t take that for granted. Hopefully we’ll one day see a journal paper providing more information.

The main findings are supported by another recent study. A Polish study involving 150 older adults found that those diagnosed with Alzheimer’s after a seven-year observation period were more likely to have experienced sleep disturbances more often and with greater intensity, compared to those who did not develop Alzheimer’s.

Ju, Y., Duntley, S., Fagan, A., Morris, J. & Holtzman, D. 2012. Sleep Disruption and Risk of Preclinical Alzheimer Disease. To be presented April 23 at the American Academy of Neurology's 64th Annual Meeting in New Orleans.

Bidzan L, Grabowski J, Dutczak B, Bidzan M. 2011. [Sleep disorders in the preclinical period of the Alzheimer's disease]. Psychiatria Polska, 45(6), 851-60. http://www.ncbi.nlm.nih.gov/pubmed/22335128

New data from the ongoing validation study of the Alzheimer's Questionnaire (AQ), from 51 cognitively normal individuals (average age 78) and 47 aMCI individuals (average age 74), has found that the AQ is effective in identifying not only those with Alzheimer’s but also those older adults with mild cognitive impairment.

Of particular interest is that four questions were strong indicators of aMCI. These related to:

  • repeating questions and statements,
  • trouble knowing the date or time,
  • difficulties managing finances, and
  • decreased sense of direction.

The AQ consists of 21 yes/no questions designed to be answered by a relative or carer. The questions fall into five categories: memory, orientation, functional ability, visuospatial ability, and language. Six of these questions are known to be predictive of AD and are given extra weighting, resulting in a score out of 27. A score of 15 or higher was indicative of AD, a score between 5 and 14 of aMCI, and a score of 4 or lower indicates that the person does not have significant memory problems.
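
To make the scoring rule concrete, here is a minimal sketch of how such a weighted yes/no score could be computed. The specific items, and which six carry the extra weight, are not listed here, so they are left as parameters; giving weighted items 2 points and the rest 1 point is my assumption, chosen because it reproduces the stated maximum of 27 (15 × 1 + 6 × 2).

    def score_aq(answers, weighted_items):
        """Score a 21-item yes/no informant questionnaire like the AQ.

        answers: dict mapping item number (1-21) to True ('yes') or False ('no').
        weighted_items: set of the six item numbers given extra weight.
        The 1-point vs 2-point weighting is assumed, not taken from the paper.
        """
        total = sum(2 if item in weighted_items else 1
                    for item, is_yes in answers.items() if is_yes)
        if total >= 15:
            band = "suggestive of Alzheimer's disease"
        elif total >= 5:
            band = "suggestive of aMCI"
        else:
            band = "no significant memory problems indicated"
        return total, band

    # Example: 'yes' to items 1-8, with items 1-6 (hypothetical) double-weighted.
    print(score_aq({i: i <= 8 for i in range(1, 22)}, weighted_items={1, 2, 3, 4, 5, 6}))
    # -> (14, 'suggestive of aMCI')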

The questionnaire is not of course definitive, but is intended as an indicator for further testing. Note, too, that all participants in this study were Caucasian.

The value and limitations of brief cognitive screenings

The value of brief cognitive screenings combined with offering further evaluation is demonstrated in a recent large VA study, which found that, of 8,342 Veterans aged 70+ who were offered screening (the three-minute Mini-Cog), 8,063 (97%) accepted, 2,081 (26%) failed the screen, and 580 (28%) agreed to further evaluation. Among those accepting further evaluation, 93% were found to have cognitive impairment, including 75% with dementia.

Among those who declined further evaluation, 17% (259/1,501) were diagnosed with incident cognitive impairment through standard clinical care. In total, the brief cognitive screening program increased the proportion of patients identified with cognitive impairment to 11% (902/8,063), versus 4% (1,242/28,349) in similar clinics without the program.

Importantly, the limits of such questionnaires were also demonstrated: 118 patients who passed the initial screen nevertheless requested further evaluation, and 87% were found to have cognitive impairment, including 70% with dementia.
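
As a back-of-the-envelope check (my own reconstruction; the paper's exact accounting isn't given here), the 902 patients identified with cognitive impairment is roughly what you get by adding up the three pathways described above:

    # Rough reconstruction of the 902 figure from the percentages reported above.
    screened_positive_evaluated = round(580 * 0.93)  # failed screen, accepted evaluation, found impaired
    declined_but_diagnosed = 259                     # failed screen, declined evaluation, diagnosed via standard care
    passed_but_self_referred = round(118 * 0.87)     # passed screen, requested evaluation anyway, found impaired
    total = screened_positive_evaluated + declined_but_diagnosed + passed_but_self_referred
    print(total, f"{total / 8063:.1%}")              # ~901 of 8,063 screened, i.e. about 11%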

This should not be taken as a reason not to employ such cognitive tests! There are two points that should, I think, be taken from this:

  • Routine screening of older adults is undoubtedly an effective strategy for identifying those with cognitive impairment.
  • Individuals who pass such tests but nevertheless believe they have cognitive problems should be taken seriously.

The study involved 74 non-smokers with amnestic MCI (average age 76), of whom half were given a nicotine patch of 15 mg a day for six months and half received a placebo. Cognitive tests were given at the start of the study and again after three and six months.

After 6 months of treatment, the nicotine-treated group showed significant improvement in attention, memory, speed of processing and consistency of processing. For example, the nicotine-treated group regained 46% of normal performance for age on long-term memory, whereas the placebo group worsened by 26%.

Nicotine is an interesting drug, in that, while predominantly harmful, it can have positive effects if the dose is just right, and if the person’s cognitive state is at a particular level (slipping below their normal state, but not too far below). Too much nicotine will make things worse, so it’s important not to self-medicate.

Nicotine has been shown to improve cognitive performance in smokers who have stopped smoking, and previous short-term studies with nicotine have shown attention and memory improvement in people with Alzheimer's disease. Nicotine receptors in the brain are reduced in Alzheimer’s brains.

Because the dose is so crucial, and the effects so dependent on brain state (including, one assumes, whether the person has been a smoker or not), more research is needed before this can be used as a treatment.

Newhouse, P., Kellar, K., Aisen, P., White, H., Wesnes, K., Coderre, E., et al. (2012). Nicotine treatment of mild cognitive impairment. Neurology, 78(2), 91–101.

More data from the long-running Mayo Clinic Study of Aging has revealed that, in this one part of the U.S. at least, MCI develops at an overall rate of 6.4% a year among older adults (70+), with a higher rate for men and the less-educated.

The study involved 1,450 older adults (aged 70-89), who underwent memory testing every 15 months for an average of three years. By the end of the study period, 296 people had developed MCI, a rate of 6.4% per year. For men, the rate was 7.2% compared to 5.7% for women.

It should be noted that these rates apply to a relatively homogeneous group of people. Participants come from one county in Minnesota, an overwhelmingly white part of the U.S.

MCI comes in two types: amnestic (involving memory loss) and non-amnestic. Amnestic MCI was more than twice as common as non-amnestic MCI. The incidence rate of aMCI was also higher for men (4.4%) than women (3.3%), as was the risk of naMCI (2% vs 1.1%).

Those who had less education also had higher rates of MCI. For aMCI, the rate for those with 12 years or less of education was 4.3%, compared to 3.25% for those with more education. Similarly, for naMCI, the rates were 2% and 1%, respectively.

While the great majority of people diagnosed with MCI continued to have the disorder or progressed to dementia, some 12% were later re-diagnosed as not having it. This, I would presume, probably reflects temporary ‘dips’ in cognitive performance as a consequence of physical or emotional problems.

The differences between aMCI and naMCI, and between genders, suggest that risk factors for these should be considered separately.

We know that physical exercise greatly helps you prevent cognitive decline with aging. We know that mental stimulation also helps you prevent age-related cognitive decline. So it was only a matter of time before someone came up with a way of combining the two. A new study found that older adults improved executive function more by participating in virtual reality-enhanced exercise ("exergames") that combine physical exercise with computer-simulated environments and interactive videogame features, compared to the same exercise without the enhancements.

The Cybercycle Study involved 79 older adults (aged 58-99) from independent living facilities with indoor access to a stationary exercise bike. Of the 79, 63 participants completed the three-month study, meaning that they achieved at least 25 rides during the three months.

Unfortunately, randomization was not as good as it should have been — although the researchers planned to randomize on an individual basis, various technical problems led them to randomize on a site basis (there were eight sites), with the result that the cybercycle group and the control bike group were significantly different in age and education. Although the researchers took this into account in the analysis, that is not the same as having groups that match in these all-important variables. However, at least the variables went in opposite directions: while the cybercycle group was significantly younger (average 75.7 vs 81.6 years), it was significantly less educated (average 12.6 vs 14.8 years).

Perhaps also partly off-setting the age advantage, the cybercycle group was in poorer shape than the control group (higher BMI, glucose levels, lower physical activity level, etc), although these differences weren’t statistically significant. IQ was also lower for the cybercycle group, if not significantly so (but note the high averages for both groups: 117.6 vs 120.6). One of the three tests of executive function, Color Trails, also showed a marked group difference, but the large variability in scores meant that this difference was not statistically significant.

Although participants were screened for disorders such as Alzheimer’s and Parkinson’s, and functional disability, many of both groups were assessed as having MCI — 16 of the 38 in the cybercycle group and 14 of the 41 in the control bike group.

Participants were given cognitive tests at enrolment, one month later (before the intervention began), and after the intervention ended. The stationary bikes were identical for both groups, except the experimental bike was equipped with a virtual reality display. Cybercycle participants experienced 3D tours and raced against a "ghost rider," an avatar based on their last best ride.

The hypothesis was that cybercycling would particularly benefit executive function, and this was borne out. Executive function (measured by the Color Trails, Stroop test, and Digits Backward) improved significantly more in the cybercycle condition, and indeed was the only cognitive task to do so (other cognitive tests included verbal fluency, verbal memory, visuospatial skill, motor function). Indeed, the control group, despite getting the same amount of exercise, got worse at the Digits Backward test, and failed to show any improvement on the Stroop test.

Moreover, significantly fewer cybercyclists progressed to MCI compared to the control group (three vs nine).

There were no differences in exercise quantity or quality between the two groups — which does argue against the idea that cyber-enhanced physical activity would be more motivating. However, the cybercycling group did tend to comment on their enjoyment of the exercise. While the enjoyment may not have translated into increased activity in this situation, it may well do so in a longer, less directed intervention — i.e. real life.

It should also be remembered that the intervention was relatively short, and that other cognitive tasks might take longer to show improvement than the more sensitive executive function. This is supported by the fact that levels of the brain growth factor BDNF, assessed in 30 participants, showed a significantly greater increase in cybercyclists.

I should also emphasize that the level of physical exercise really wasn't that great, but nevertheless the size of the cybercycle's effect on executive function was greater than usually produced by aerobic exercise (a medium effect rather than a small one).

The idea that activities that combine physical and mental exercise are of greater cognitive benefit than the sum of benefits from each type of exercise on its own is not inconsistent with previous research, and in keeping with evidence from animal studies that physical exercise and mental stimulation help the brain via different mechanisms. Moreover, I have an idea that enjoyment (in itself, not as a proxy for motivation) may be a factor in the cognitive benefits derived from activities, whether physical or mental. Mere speculation, derived from two quite separate areas of research: the idea of “flow” / “being in the zone”, and the idea that humor has physiological benefits.

Of course, as discussed, this study has a number of methodological issues that limit its findings, but hopefully it will be the beginning of an interesting line of research.  

Anderson-Hanley, C., Arciero, P. J., Brickman, A. M., Nimon, J. P., Okuma, N., Westen, S. C., et al. (2012). Exergaming and older adult cognition. American Journal of Preventive Medicine, 42(2), 109–119.

The study involved 104 healthy older adults (average age 87) participating in the Oregon Brain Aging Study. Analysis of the nutrient biomarkers in their blood revealed that those with diets high in omega 3 fatty acids and in vitamins C, D, E and the B vitamins had higher scores on cognitive tests than people with diets low in those nutrients, while those with diets high in trans fats were more likely to score more poorly on cognitive tests.

These associations were dose-dependent, with each standard deviation increase in the vitamin BCDE score associated with a 0.28 SD increase in global cognitive score, and each SD increase in the trans fat score associated with a 0.30 SD decrease in global cognitive score.
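
Read as a simple linear relationship (my simplification; the 0.28 and 0.30 figures come from the study, but treating them as independent, additive effects is not the authors' model), those standardized coefficients translate as follows:

    # Illustrative only: predicted shift in global cognitive score (in SD units)
    # for given changes (also in SD units) in the two nutrient biomarker scores.
    def predicted_cognitive_shift(delta_bcde_sd, delta_transfat_sd):
        return 0.28 * delta_bcde_sd - 0.30 * delta_transfat_sd

    # e.g. one SD higher on the vitamin BCDE score and one SD lower on trans fats:
    print(predicted_cognitive_shift(1.0, -1.0))  # 0.58 SD higher global cognitive score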

Trans fats are primarily found in packaged, fast, fried and frozen food, baked goods and margarine spreads.

Brain scans of 42 of the participants found that those with diets high in vitamins BCDE and omega 3 fatty acids were also less likely to have the brain shrinkage associated with Alzheimer's, while those with high trans fats were more likely to show such brain atrophy.

Those with higher omega-3 scores also had fewer white matter hyperintensities. However, this association became weaker once depression and hypertension were taken into account.

Overall, the participants had good nutritional status, but 7% were deficient in vitamin B12 (I’m surprised it’s so low, but bear in mind that these are already a select group, being healthy at such an advanced age) and 25% were deficient in vitamin D.

The nutrient biomarkers accounted for 17% of the variation in cognitive performance, while age, education, APOE genotype (presence or absence of the ‘Alzheimer’s gene’), depression and high blood pressure together accounted for 46%. Diet was more important for brain atrophy: here, the nutrient biomarkers accounted for 37% of the variation, while the other factors accounted for 40% (meaning that diet was nearly as important as all these other factors combined!).

The findings add to the growing evidence that diet has a significant role in determining whether or not, and when, you develop Alzheimer’s disease.

A ten-year study involving 7,239 older adults (65+) has found that each common health complaint increased dementia risk by an average of about 3%, and that these individual risks compounded. Thus, while a healthy older adult had about an 18% chance of developing dementia after 10 years, those with a dozen of these health complaints had, on average, closer to a 40% chance.

It’s important to note that these complaints were not for serious disorders that have been implicated in Alzheimer’s. The researchers constructed a ‘frailty’ index, involving 19 different health and wellbeing factors: overall health, eyesight, hearing, denture fit, arthritis/rheumatism, eye trouble, ear trouble, stomach trouble, kidney trouble, bladder control, bowel control, feet/ankle trouble, stuffy nose/sneezing, bone fractures, chest problems, cough, skin problems, dental problems, other problems.

Not all complaints are created equal. The most common complaint, arthritis/rheumatism, was only slightly more common among those with dementia. Two of the largest differences were poor eyesight (3% of the non-demented group vs 9% of those with dementia) and poor hearing (3% and 6%).

At the end of the study, 4,324 (60%) were still alive, and of these, 416 (9.6%) had Alzheimer's disease, 191 (4.4%) had another sort of dementia and 677 (15.7%) had other cognitive problems (but note that 1,023 were of uncertain cognitive ability).

While these results need to be confirmed in other research — the study used data from broader health surveys that weren’t specifically designed for this purpose, and many of those who died during the study probably had dementia — they do suggest the importance of maintaining good general health.

Common irregular heartbeat raises risk of dementia

In another study, which ran from 1994 to 2008 and followed 3,045 older adults (mean age 74 at study start), those with atrial fibrillation were found to have a significantly greater risk of developing Alzheimer’s.

At the beginning of the study, 4.3% of the participants had atrial fibrillation (the most common kind of chronically irregular heartbeat); a further 12.2% developed it during the study. Participants were followed for an average of seven years. Over this time, those with atrial fibrillation had a 40-50% higher risk of developing dementia of any type, including probable Alzheimer's disease. Overall, 18.8% of the participants developed some type of dementia during the course of the study.

While atrial fibrillation is associated with other cardiovascular risk factors and disease, this study shows that atrial fibrillation increases dementia risk more than just through this association. Possible mechanisms for this increased risk include:

  • weakening the heart's pumping ability, leading to less oxygen going to the brain;
  • increasing the chance of tiny blood clots going to the brain, causing small, clinically undetected strokes;
  • a combination of these plus other factors that contribute to dementia such as inflammation.

The next step is to see whether any treatments for atrial fibrillation reduce the risk of developing dementia.

Stress may increase risk for Alzheimer's disease

And a rat study has shown that increased release of stress hormones leads to cognitive impairment and that characteristic of Alzheimer’s disease, tau tangles. The rats were subjected to stress for an hour every day for a month, by such means as overcrowding or being placed on a vibrating platform. These rats developed increased hyperphosphorylation of tau protein in the hippocampus and prefrontal cortex, and these changes were associated with memory deficits and impaired behavioral flexibility.

Previous research has shown that stress leads to that other characteristic of Alzheimer’s disease: the formation of beta-amyloid.

A telephone survey of around 17,000 older women (average age 74), which included questions about memory lapses plus standard cognitive tests, found that getting lost in familiar neighborhoods was highly associated with cognitive impairment that might indicate Alzheimer’s. Having trouble keeping up with a group conversation and difficulty following instructions were also significantly associated with cognitive impairment. But, as most of us will be relieved to know, forgetting things from one moment to the next was not associated with impairment!

Unsurprisingly, the more memory complaints a woman had, the more likely she was to score poorly on the cognitive test.

The 7 memory lapse questions covered:

  • whether they had recently experienced a change in their ability to remember things,
  • whether they had trouble remembering a short list of items (such as a shopping list),
  • whether they had trouble remembering recent events,
  • whether they had trouble remembering things from one second to the next,
  • whether they had difficulty following spoken or written instructions,
  • whether they had more trouble than usual following a group conversation or TV program due to memory problems,
  • whether they had trouble finding their way around familiar streets.

Because this survey was limited to telephone tests, we can’t draw any firm conclusions. But the findings may be helpful for doctors and others, to know which sort of memory complaints should be taken as a flag for further investigation.

In the first mouse study, when young and old mice were conjoined, allowing blood to flow between the two, the young mice showed a decrease in neurogenesis while the old mice showed an increase. When blood plasma was then taken from old mice and injected into young mice, there was a similar decrease in neurogenesis, and impairments in memory and learning.

Analysis of the concentrations of blood proteins in the conjoined animals revealed the chemokine (a type of cytokine) whose level in the blood showed the biggest change — CCL11, or eotaxin. When this was injected into young mice, they indeed showed a decrease in neurogenesis, and this was reversed once an antibody for the chemokine was injected. Blood levels of CCL11 were found to increase with age in both mice and humans.

The chemokine was a surprise, because to date the only known role of CCL11 is that of attracting immune cells involved in allergy and asthma. It is thought that it most likely doesn’t have a direct effect on neurogenesis, but acts indirectly, perhaps by triggering immune cells to produce inflammation.

Exercise is known to at least partially reverse loss of neurogenesis. Exercise has also been shown to produce chemicals that prevent inflammation. Following research showing that exercise after brain injury can help the brain repair itself, another mouse study has found that mice who exercised regularly produced interleukin-6 (a cytokine involved in immune response) in the hippocampus. When the mice were then exposed to a chemical that destroys the hippocampus, the interleukin-6 dampened the harmful inflammatory response, and prevented the loss of function that is usually observed.

One of the actions of interleukin-6 that brings about a reduction in inflammation is to inhibit tumor necrosis factor. Interestingly, I previously reported on a finding that inhibiting tumor necrosis factor in mice decreased cognitive decline that often follows surgery.

This suggests not only that exercise helps protect the brain from the damage caused by inflammation, but also that it might help protect against other damage, such as that caused by environmental toxins, injury, or post-surgical cognitive decline. The curry spice curcumin, and green tea, are also thought to inhibit tumor necrosis factor.

In a small study, 266 older adults with mild cognitive impairment (aged 70+) received a daily dose of 0.8 mg folic acid, 0.5 mg vitamin B12 and 20 mg vitamin B6 or a placebo for two years. Those treated with B vitamins had significantly lower levels of homocysteine at the end of the trial (high homocysteine is a known risk factor for age-related cognitive decline and dementia). Moreover, this was associated with a significantly slower rate of brain shrinkage.

However, while there were significant effects on homocysteine level, brain atrophy, and executive function, it wasn’t until results were separated on the basis of baseline homocysteine levels that the really dramatic results emerged.

It was the group with high homocysteine levels at the start of the study who really benefited from the high doses of B vitamins. For them, brain atrophy was cut by half, and there were clear benefits in episodic memory, semantic memory, and global cognitive function, not just executive function. Among those with high baseline homocysteine who received the placebo, significant cognitive decline occurred.

The level of B vitamins in the supplements was considerably greater than the recommended standard. However, caution must be taken in dosing yourself with supplements, because folic acid can have negative effects. Better to try and get your diet right first.

A longer and larger follow-up study is now planned, and hopefully that will tell us whether such treatment can keep MCI from developing into Alzheimer’s.

Comparison of 99 chimpanzee brains ranging from 10-51 years of age with 87 human brains ranging from 22-88 years of age has revealed that, unlike the humans, chimpanzee brains showed no sign of shrinkage with age. But the answer may be simple: we live much longer. In the wild, chimps rarely live past 45, and although human brains start shrinking as early as 25 (as soon as they reach maturity, basically!), it doesn’t become significant until around 50.

The answer suggests one reason why humans are uniquely vulnerable to Alzheimer’s disease — it’s all down to our combination of large brain and long life. There are other animals that experience some cognitive impairment and brain atrophy as they age, but nothing as extreme as that found in humans (a 10-15% decline in volume over the life-span). (Elephants and whales have the same two attributes as humans — large brains and long lives — but we lack information on how their brains change with age.)

The problem may lie in the fact that our brains use so much more energy than chimps’ (being more than three times larger than theirs) and thus produce a great deal more damaging oxidation. Over a longer life-span, this accumulates until it significantly damages the brain.

If that’s true, it reinforces the value of a diet high in antioxidants.

[2500] Sherwood, C. C., Gordon A. D., Allen J. S., Phillips K. A., Erwin J. M., Hof P. R., et al.
(2011).  Aging of the cerebral cortex differs between humans and chimpanzees.
Proceedings of the National Academy of Sciences. 108(32), 13029 - 13034.

A study involving 105 people with Alzheimer's disease and 125 healthy older adults has compared cognitive function and brain shrinkage in those aged 60-75 and those aged 80+.

It was found that the association between brain atrophy and cognitive impairment typically found in those with Alzheimer’s disease was less evident in the older group. This is partly because of the level of brain atrophy in healthy controls in that age group — there was less difference between the healthy controls and those with Alzheimer’s. Additionally, when compared to their healthy counterparts, executive function, immediate memory and attention/processing speed were less abnormal in the older group than they were in the younger group.

The findings suggest that mild Alzheimer’s in the very old may go undetected, and emphasize the importance of taking age into account when interpreting test performance and brain measures.

Dietary changes affect levels of biomarkers associated with Alzheimer's

In a study involving 20 healthy older adults (mean age 69.3) and 29 older adults who had amnestic mild cognitive impairment (mean age 67.6), half the participants were randomly assigned to a high–saturated fat/high–simple carbohydrate diet (HIGH) and half to a low–saturated fat/low–simple carbohydrate diet (LOW) for four weeks, in order to investigate the effects on biomarkers associated with Alzheimer’s.

For the healthy participants, the LOW diet decreased the level of amyloid-beta 42 in the cerebrospinal fluid, while the HIGH diet increased its level. The HIGH diet also lowered the CSF insulin concentration. For those with aMCI, the LOW diet increased the levels of amyloid-beta 42 and increased the CSF insulin concentration. For both groups, the level of apolipoprotein E in the CSF increased in the LOW diet and decreased in the HIGH diet.

For both groups, the LOW diet improved performance on delayed visual recall tests, but didn’t affect scores on other cognitive measures (bear in mind that the diet was only followed for a month).

The researchers suggest that the different results of the unhealthy diet in participants with aMCI may be due to the diet’s short duration. The fact that diet was bringing about measurable changes in CSF biomarkers so quickly, and that the HIGH diet moved healthy brains in the direction of Alzheimer’s, speaks to the potential of dietary intervention.

Why coffee helps protect against Alzheimer's disease

Support for the value of coffee in decreasing the risk of Alzheimer’s comes from a mouse study, which found that an as yet unidentified ingredient in coffee interacts with caffeine in such a way that blood levels of a growth factor called GCSF (granulocyte colony stimulating factor) increase. GCSF is a substance greatly decreased in patients with Alzheimer's disease and demonstrated to improve memory in Alzheimer's mice.

The finding points to the value of caffeinated coffee, as opposed to decaffeinated coffee or to other sources of caffeine. Moreover, only "drip" coffee was used; the researchers caution that they don’t know whether instant caffeinated coffee would provide the same GCSF response.

There are three ways that GCSF seems to improve memory performance in the Alzheimer's mice: by recruiting stem cells from bone marrow to enter the brain and remove beta-amyloid protein; by increasing the growth of new synapses; by increasing neurogenesis.

The amount of coffee needed to provide this protection, however, is estimated to be about 4 to 5 cups a day. The researchers also believe that this daily coffee intake is best begun at least by middle age (30s – 50s), although starting even in older age does seem to have some protective effect.

Weirdly (I thought), the researchers remarked that "The average American gets most of their daily antioxidants intake through coffee". Perhaps this points more to the defects in their diet than to the wonders of coffee! But the finding is consistent with other research showing an association between moderate consumption of coffee and decreased risk of Parkinson's disease, Type II diabetes and stroke.

A just-completed clinical trial has investigated GCSF treatment to prevent Alzheimer's in patients with mild cognitive impairment, and the results should be known soon.

[2442] Bayer-Carter, J. L., Green P. S., Montine T. J., VanFossen B., Baker L. D., Watson S. G., et al.
(2011).  Diet Intervention and Cerebrospinal Fluid Biomarkers in Amnestic Mild Cognitive Impairment.
Arch Neurol. 68(6), 743 - 752.

Cao, C., Wang, L., Lin, X., Mamcarz, M., Zhang, C., Bai, G., Nong, J., Sussman, S. & Arendash, G. 2011. Caffeine Synergizes with Another Coffee Component to Increase Plasma GCSF: Linkage to Cognitive Benefits in Alzheimer's Mice. Journal of Alzheimer's Disease, 25(2), 323-335.

Sleep apnea linked to later dementia

A study involving 298 older women with sleep problems found that those who had disordered breathing (such as sleep apnea) were significantly more likely to develop dementia or mild cognitive impairment.

Around a third of the women (average age 82) had disordered breathing (slowing down or stopping breathing during sleep and often having to gasp to catch up). None showed signs of cognitive impairment at the time of the sleep testing. When re-tested some five years later, 45% of those who had disordered breathing had developed dementia or MCI, compared with 31% of those with no breathing irregularities.

Those whose sleep irregularities had been particularly severe (15 or more breathing stoppages per hour and more than 7% of sleep time not breathing) during the earlier part of the study were nearly twice as likely as those without breathing problems to develop dementia or MCI. Other measures of sleep quality — waking after sleep onset, sleep fragmentation, sleep duration — were not associated with cognitive impairment.

The finding adds to the evidence for the importance of treating sleep apnea. Previous research has found that CPAP treatment effectively counteracts cognitive impairment caused by sleep apnea.

Brain injury raises dementia risk

Analysis of medical records on 281,540 U.S. military veterans aged at least 55 at the beginning of the study has found that over the next seven years those who had at one time suffered a traumatic brain injury were more than twice as likely to develop dementia as those who had not suffered such an injury. Around 1.7% (4,902) had incurred a traumatic brain injury, in many cases during the Vietnam War, and over 15% of these developed dementia. In contradiction of the prevailing belief that only moderate or severe brain injuries predispose people to dementia, severity of the injury made no difference.

Injuries due to strokes were weeded out of the study.

In another study, following up on nearly 4,000 retired National Football League players surveyed in 2001, 35% appeared to have significant cognitive problems (as assessed by questionnaire). When 41 of them were tested, they were found to have mild cognitive impairment that resembled a comparison group of much older patients from the general population.

The findings are a reminder of the importance of treating even mild head injuries, and of following a regime designed to mitigate damage: exercising, eating a healthy diet, reducing stress, and so on.

[2444] Yaffe, K., Laffan A. M., Harrison S L., Redline S., Spira A. P., Ensrud K. E., et al.
(2011).  Sleep-Disordered Breathing, Hypoxia, and Risk of Mild Cognitive Impairment and Dementia in Older Women.
JAMA: The Journal of the American Medical Association. 306(6), 613 - 619.

The brain injury studies were reported in July at the Alzheimer's Association International Conference in France. http://www.alz.org/aaic/

Functional impairment good indicator of mild cognitive impairment

Evaluation of 816 older adults, of whom 229 had no cognitive problems, 394 had a diagnosis of amnestic mild cognitive impairment, and 193 had a diagnosis of mild Alzheimer’s, has revealed that most of those with aMCI (72%) or AD (97%) had trouble with at least one type of function on the Pfeffer Functional Activities Questionnaire. Only 8% of controls had any difficulty. In both impaired groups, those who had the most difficulty functioning also tended to score worse on cognition tests, have smaller hippocampal volumes, and carry the APOe4 gene.

Two of the ten items in the questionnaire were specific in differentiating the control group from the impaired groups. Those items concerned "remembering appointments, family occasions, holidays, and medications” and "assembling tax records, business affairs, or other papers." Only 34% of those with aMCI and 3.6% of those with AD had no difficulty with these items.

The findings suggest that even mild disruptions in daily functioning may be an important clinical indicator of disease.

Early-onset Alzheimer’s poorly diagnosed when initial symptoms aren’t memory related

Post-mortem analysis of 40 people diagnosed  with early-onset Alzheimer’s has revealed that about 38% experienced initial symptoms other than memory problems, such as behavior, vision or language problems and a decline in executive function, or the ability to carry out tasks. Of these, 53% were incorrectly diagnosed when first seen by a doctor, compared to 4% of those who had memory problems. Of those with unusual initial symptoms, 47% were still incorrectly diagnosed at the time of their death.

The mean age at onset was 54.5 years (range 46-60). The average duration of the disease was 11 years, with an average diagnostic delay of 3 years.

GPs misidentify and fail to identify early dementia and MCI

A review of 30 studies involving 15,277 people seen in primary care for cognitive disorders has found that while GPs managed to identify eight out of ten people with moderate to severe dementia, they only identified 45% of those with early dementia and mild cognitive impairment. Moreover, they were very poor at recording such diagnoses. Thus, though they recognized 45% of the MCI cases, they only recorded 11% of these cases in their medical notes. Although they identified 73% of people with dementia, they made correct annotations in medical records in only 38% of cases.

But the problem is not simply one of failing to diagnose — they were even more likely to misidentify dementia, and this was particularly true for those with depression or hearing problems.

The findings point to the need for more widespread use of simple cognitive screening tests.

Prevalence of dementia & MCI in 'oldest old' women

Data from 1,299 women enrolled in the Women Cognitive Impairment Study of Exceptional Aging suggests that the incidence of dementia almost doubles with every 5 years of age and prevalence rises from approximately 2-3% in those 65 to 75 years to 35% in those 85+.

Among those with mild cognitive impairment, the amnestic multiple-domain subtype was most common (34%), followed by non-amnestic single-domain MCI (29%). Amnestic single-domain MCI (in which memory is the only cognitive function affected) accounted for 22%.

Alzheimer's disease and mixed dementia accounted for nearly 80% of dementia cases, and vascular dementia for 12.1%.

Those with dementia tended to be older, were less likely to have completed high school, and were more likely to have reported depression, to have a history of stroke, and to carry the APOEe4 gene.

The women in the study had an average age of 88.2 years and 27% were older than 90. 41% had clinical cognitive impairment (17.8% with dementia and 23.2% with mild cognitive impairment).

The high prevalence of cognitive impairment in this age group points to the importance of screening for cognitive disorders, particularly among high-risk groups.

As we get older, when we suffer memory problems, we often laughingly talk about our brain being ‘full up’, with no room for more information. A new study suggests that in some sense (but not the direct one!) that’s true.

To make new memories, we need to recognize that they are new memories. That means we need to be able to distinguish between events, or objects, or people. We need to distinguish between them and representations already in our database.

We are all familiar with the experience of wondering if we’ve done something. Is it that we remember ourselves doing it today, or are we remembering a previous occasion? We go looking for the car in the wrong place because the memory of an earlier occasion has taken precedence over today’s event. As we age, we do get much more of this interference from older memories.

In a new study, the brains of 40 college students and older adults (60-80) were scanned while they viewed pictures of everyday objects and classified them as either "indoor" or "outdoor." Some of the pictures were similar but not identical, and others were very different. It was found that while the hippocampus of young students treated all the similar pictures as new, the hippocampus of older adults had more difficulty with this, requiring much more distinctiveness for a picture to be classified as new.

Later, the participants were presented with completely new pictures to classify, and then, only a few minutes later, shown another set of pictures and asked whether each item was "old," "new" or "similar." Older adults tended to have fewer 'similar' responses and more 'old' responses instead, indicating that they could not distinguish between similar items.

The inability to recognize information as "similar" to something seen recently is associated with “representational rigidity” in two areas of the hippocampus: the dentate gyrus and CA3 region. The brain scans from this study confirm this, and find that this rigidity is associated with changes in the dendrites of neurons in the dentate/CA3 areas, and impaired integrity of the perforant pathway — the main input path into the hippocampus, from the entorhinal cortex. The more degraded the pathway, the less likely the hippocampus is to store similar memories as distinct from old memories.

Apart from helping us understand the mechanisms of age-related cognitive decline, the findings also have implications for the treatment of Alzheimer’s. The hippocampus is one of the first brain regions to be affected by the disease. The researchers plan to conduct clinical trials in early Alzheimer's disease patients to investigate the effect of a drug on hippocampal function and pathway integrity.

For the first time in 27 years, clinical diagnostic criteria for Alzheimer's disease dementia have been revised, and research guidelines updated. They mark a major change in how experts think about and study Alzheimer's disease.

The updated guidelines now cover three distinct stages of Alzheimer's disease:

  • Preclinical – is currently relevant only for research. It describes the use of biomarkers that may precede the development of Alzheimer’s.
  • Mild Cognitive Impairment – Current biomarkers include elevated levels of tau or decreased levels of beta-amyloid in the cerebrospinal fluid, reduced glucose uptake in the brain, and atrophy of certain brain regions. Primarily for researchers, these may be used in specialized clinical settings.
  • Alzheimer's Dementia – Criteria outline ways clinicians should approach evaluating causes and progression of cognitive decline, and expand the concept of Alzheimer's dementia beyond memory loss to other aspects of cognition, such as word-finding, vision/spatial issues, and impaired reasoning or judgment.

The criteria are available at http://www.alzheimersanddementia.org/content/ncg

Growing evidence has pointed to the benefits of social and mental stimulation in preventing dementia, but until now no one has looked at the role of physical environment.

A study involving 1,294 healthy older adults found that those whose life-space narrowed to their immediate home were almost twice as likely to develop Alzheimer’s as those with the largest life-space (out-of-town). The homebound also had an increased risk of MCI and a faster rate of global cognitive decline.

By the end of the eight-year study (average follow-up of 4.4 years), 180 people (13.9%) had developed Alzheimer’s. The association remained after physical function, disability, depressive symptoms, social network size, vascular disease burden, and vascular risk factors, were taken into account.

It may be that life-space is an indicator of how engaged we are with the world, with the associated cognitive stimulation that offers.

A study following 837 people with MCI, of whom 414 (49.5%) had at least one vascular risk factor, has found that those with risk factors such as high blood pressure, diabetes, cerebrovascular disease and high cholesterol were twice as likely to develop Alzheimer's disease. Over five years, 52% of those with risk factors developed Alzheimer's, compared to 36% of those with no risk factors. In total, 298 people (35.6%) developed Alzheimer's.

However, of those with vascular risk factors, those receiving full treatment for their vascular problems were 39% less likely to develop Alzheimer's disease than those receiving no treatment, and those receiving some treatments were 26% less likely to develop the disease.

Treatment of risk factors included using high blood pressure medicines, insulin, cholesterol-lowering drugs and diet control. Smoking and drinking were considered treated if the person stopped smoking or drinking at the start of the study.

Adding to the growing evidence that social activity helps prevent age-related cognitive decline, a longitudinal study involving 1,138 older adults (mean age 80) has found that those who had the highest levels of social activity (top 10%) experienced only a quarter of the rate of cognitive decline experienced by the least socially active individuals (bottom 10%). The participants were followed for up to 12 years (mean of 5 years).

Social activity was measured using a questionnaire that asked participants whether, and how often, in the previous year they had engaged in activities that involve social interaction—for example, whether they went to restaurants, sporting events or the teletract (off-track betting) or played bingo; went on day trips or overnight trips; did volunteer work; visited relatives or friends; participated in groups such as the Knights of Columbus; or attended religious services.

Analysis adjusted for age, sex, education, race, social network size, depression, chronic conditions, disability, neuroticism, extraversion, cognitive activity, and physical activity.

There has been debate over whether the association between social activity and cognitive decline is because inactivity leads to impairment, or because impairment leads to inactivity. This study attempted to solve this riddle. Participants were evaluated yearly, and analysis indicates that the inactivity precedes decline, rather than the other way around. Of course, it’s still possible that there are factors common to both that affect social engagement before showing up in a cognitive test. But even in such a case, it seems likely that social inactivity increases the rate of cognitive decline.

[2228] James, B. D., Wilson R. S., Barnes L. L., & Bennett D. A.
(2011).  Late-Life Social Activity and Cognitive Decline in Old Age.
Journal of the International Neuropsychological Society. FirstView, 1 - 8.

A training program designed to help older adults with MCI develop memory strategies has found that their brains were still sufficiently flexible to learn new ways to compensate for impairment in some brain regions. The study involved 30 older adults, of whom 15 had MCI. Participants’ brains were scanned 6 weeks prior to memory training, one week prior to training and one week after training.

Before training, those with MCI showed less activity in brain regions associated with memory. After training they showed increased activation in these areas as well as in areas associated with language processing, spatial and object memory and skill learning. In particular, new activity in the right inferior parietal gyrus was associated with improvement on a memory task.

The findings demonstrate that even once diagnosed with MCI (a precursor to Alzheimer’s disease), brains can still be ‘rewired’ to use undamaged brain regions for tasks customarily done by now-damaged regions.

The new label of ‘metabolic syndrome’ applies to those having three or more of the following risk factors: high blood pressure, excess belly fat, higher than normal triglycerides, high blood sugar and low high-density lipoprotein (HDL) cholesterol (the "good" cholesterol). Metabolic syndrome has been linked to increased risk of heart attack.

A new French study involving over 7,000 older adults (65+) has found that those with metabolic syndrome were 20% more likely to show cognitive decline on the MMSE (a test of global cognitive function) over a two- or four-year interval. They were also 13% more likely to show cognitive decline on a visual working memory test. Specifically, higher triglycerides and low HDL cholesterol were linked to poorer MMSE scores; diabetes (but not higher fasting blood sugar) was linked to poorer visual working memory and word fluency scores.

The findings point to the importance of managing the symptoms of metabolic syndrome.

High cholesterol and blood pressure in middle age tied to early memory problems

Another study, involving some 4,800 middle-aged adults (average age 55), has found that those with higher cardiovascular risk were more likely to have lower cognitive function and a faster rate of cognitive decline over a 10-year period. A 10% higher cardiovascular risk was associated not only with an increased rate of overall mental decline, but also with poorer cognitive test scores in all areas except reasoning for men and fluency for women.

The cardiovascular risk score is based on age, sex, HDL cholesterol, total cholesterol, systolic blood pressure and whether participants smoked or had diabetes.

Memory problems may be sign of stroke risk

A very large study (part of the REGARDS study) tested people age 45 and older (average age 67) who had never had a stroke. Some 14,842 people took a verbal fluency test, and 17,851 people took a word recall memory test. In the next 4.5 years, 123 participants who had taken the verbal fluency test and 129 participants who had taken the memory test experienced a stroke.

Those who had scored in the bottom 20% for verbal fluency were 3.6 times more likely to have a stroke than those who scored in the top 20%. For the memory test, those who scored in the bottom 20% were 3.5 times more likely to have a stroke than those in the top quintile.

The effect was greatest at the younger ages. At age 50, those who scored in the bottom quintile of the memory test were 9.4 times more likely to later have a stroke than those in the top quintile.

 

Together, these studies, which are consistent with many previous studies, confirm that cardiovascular problems and diabetes add to the risk of greater cognitive decline (and possible dementia) in old age, and point to the importance of treating these problems as soon as they appear.

[2147] Raffaitin, C., Féart C., Le Goff M., Amieva H., Helmer C., Akbaraly T. N., et al.
(2011).  Metabolic syndrome and cognitive decline in French elders.
Neurology. 76(6), 518 - 525.

The findings of the second and third studies are to be presented at the American Academy of Neurology's 63rd Annual Meeting in Honolulu, April 9 to April 16, 2011.

In a study in which 78 healthy elders were given 5 different tests and then tested for cognitive performance 18 months later, two tests combined to correctly predict nearly 80% of those who developed significant cognitive decline. These tests were a blood test to identify presence of the ‘Alzheimer’s gene’ (APOE4), and a 5-minute fMRI imaging scan showing brain activity during mental tasks.

The gene test in itself correctly classified 61.5% of participants (aged 65-88; mean age 73), showing what a strong risk factor this is, but when taken with activity on the fMRI test, the two together correctly classified 78.9% of participants. Age, years of education, gender and family history of dementia were not accurate predictors of future cognitive decline. A smaller hippocampus was also associated with a greater risk of cognitive decline.

These two tests are readily available and not time-consuming, and may be useful in identifying those at risk of MCI and dementia.

Woodard, J.L. et al. 2010. Prediction of Cognitive Decline in Healthy Older Adults using fMRI. Journal of Alzheimer’s Disease, 21(3), 871-885.

Clinical records of 211 patients diagnosed with probable Alzheimer's disease have revealed that those who have spoken two or more languages consistently over many years experienced a delay in the onset of their symptoms by as much as five years. It’s thought that lifelong bilingualism may contribute to cognitive reserve in the brain, enabling it to compensate for memory loss, confusion, and difficulties with problem-solving and planning.

Of the 211 patients of the Sam and Ida Ross Memory Clinic at Baycrest, 102 patients were classified as bilingual and 109 as monolingual. Bilingual patients had been diagnosed with Alzheimer's 4.3 years later than the monolingual patients on average, and had reported the onset of symptoms 5.1 years later. The groups were equivalent on measures of cognitive and occupational level, there was no apparent effect of immigration status, and there were no gender differences.

The findings confirm an earlier study from the same researchers, from the clinical records of 184 patients diagnosed with probable Alzheimer's and other forms of dementia.

[2039] Craik, F. I. M., Bialystok E., & Freedman M.
(2010).  Delaying the onset of Alzheimer disease.
Neurology. 75(19), 1726 - 1729.

A study involving 68 healthy older adults (65-85) has compared brain activity among four groups, defined by whether or not they carried the Alzheimer’s gene ApoE4 and whether their physical activity was reported to be high or low. The participants performed a task involving the discrimination of famous people, which engages 15 different functional regions of the brain. Among those carrying the gene, those with higher physical activity showed greater activation in many regions than those who were sedentary. Moreover, physically active people with the gene had greater brain activity than physically active people without the gene.

And adding to the evidence supporting the potential for exercise to lower the risk of dementia, another recent study has found that, over a ten-year period, exercise (measured as the number of different types of exercise performed and the number of exercise sessions lasting at least 20 minutes) was inversely associated with the onset of cognitive impairment. The study used data from the National Long Term Care Survey.

A six-year study involving over 1200 older women (70+) has found that low amounts of albumin in the urine, at levels not traditionally considered clinically significant, strongly predict faster cognitive decline in older women. Participants with a urinary albumin-to-creatinine ratio of >5 mcg/mg at the start of the study experienced cognitive decline at a rate 2 to 7 times faster in all cognitive measures than that attributed to aging alone over an average 6 years of follow-up. The ability most affected was verbal fluency. Albuminuria may be an early marker of diffuse vascular disease.

Data from 19,399 individuals participating in the Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, of whom 1,184 (6.1%) developed cognitive impairment over an average follow-up of 3.8 years, has found that those with albuminuria were 1.31-1.57 times more likely to develop cognitive impairment compared to individuals without albuminuria. This association was strongest for individuals with normal kidney function. Conversely, low kidney function was associated with a higher risk for developing cognitive impairment only among individuals without albuminuria. Surprisingly, individuals with albuminuria and normal kidney function had a higher probability of developing cognitive impairment than individuals with moderate reductions in kidney function in the absence of albuminuria.

Both albuminuria and low kidney function are characteristics of kidney disease.

Lin, J., Grodstein, F., Kang, J.H. & Curhan, G. 2010. A Prospective Study of Albuminuria and Cognitive Decline in Women. Presented at ASN Renal Week 2010 on November 20 in Denver, CO.

Tamura, M.K. et al. 2010. Albuminuria, Kidney Function and the Incidence of Cognitive Impairment in US Adults. Presented at ASN Renal Week 2010 on November 20 in Denver, CO.

More evidence that vascular disease plays a crucial role in age-related cognitive impairment and Alzheimer’s comes from data from participants in the Alzheimer's Disease Neuroimaging Initiative.

The study involved more than 800 older adults (55-90), including around 200 cognitively normal individuals, around 400 people with mild cognitive impairment, and 200 people with Alzheimer's disease. The first two groups were followed for 3 years, and the Alzheimer’s patients for two. The study found that the extent of white matter hyperintensities (areas of damaged brain tissue typically caused by cardiovascular disease) was an important predictor of cognitive decline.

Participants whose white matter hyperintensities were significantly above average at the beginning of the study lost more points each year in cognitive testing than those whose white matter hyperintensities were average at baseline. Those with mild cognitive impairment or Alzheimer's disease at baseline had additional declines on their cognitive testing each year, meaning that the presence of white matter hyperintensities and MCI or Alzheimer's disease together added up to even faster and steeper cognitive decline.

The crucial point is that this was happening in the absence of major cardiovascular events such as heart attacks, indicating that it’s not enough to just reduce your cardiovascular risk factors to a moderate level — every little bit of vascular damage counts.

A simple new cognitive assessment tool with only 16 items appears potentially useful for identifying problems in thinking, learning and memory among older adults. The Sweet 16 scale is scored from zero to 16 (with 16 representing the best score) and includes questions that address orientation (identification of person, place, time and situation), registration, digit spans (tests of verbal memory) and recall. The test requires no props (not even pencil and paper) and is easy to administer with a minimum of training. It only takes an average of 2 minutes to complete.

A score of 14 or less correctly identified 80% of those with cognitive impairment (as identified by the Informant Questionnaire on Cognitive Decline in the Elderly) and correctly identified 70% of those who did not have cognitive impairment. In comparison, the standard MMSE correctly identified 64% of those with cognitive impairment and 86% of those who were not impaired. In other words, the Sweet 16 missed diagnosing 20% of those who were (according to this other questionnaire) impaired and incorrectly diagnosed as impaired 30% of those who were not impaired, while the MMSE missed 36% of those who were impaired but only incorrectly diagnosed as impaired 14% of those not impaired.

Thus, the Sweet 16 seems to be a great ‘first cut’, since its bias is towards over-diagnosing impairment. It should also be remembered that the IQCODE is not the gold standard for cognitive impairment; its role here is to provide a basis for comparison between the new test and the more complex MMSE. In comparison with a clinician’s diagnosis, Sweet 16 scores of 14 or less occurred in 99% of patients diagnosed by a clinician to have cognitive impairment and 28% of those without such a diagnosis.

The great benefit of the new test is of course its speed and simplicity, and it seems to offer great promise as an initial screening tool. Another benefit is that it supposedly is unaffected by the patient’s education, unlike the MMSE. The tool is open access.

The Sweet 16 was developed using information from 774 patients who completed the MMSE, and then validated using a different group of 709 older adults.

[1983] Fong, T. G., Jones R. N., Rudolph J. L., Yang F. M., Tommet D., Habtemariam D., et al.
(2010).  Development and Validation of a Brief Cognitive Assessment Tool: The Sweet 16.
Arch Intern Med. Published online (archinternmed.2010.423).

There have been mixed findings about the benefits of DHA (an omega-3 fatty acid), but in a study involving 485 older adults (55+) with age-related cognitive impairment, those randomly assigned to take DHA for six months improved their scores on a test of visuospatial learning and episodic memory. Higher levels of DHA in the blood correlated with better scores on the paired associate learning task. DHA supplementation was also associated with better verbal recognition, but not better working memory or executive function.

Other research has found no benefit from DHA to those already with Alzheimer’s, although those with Alzheimer’s tend to have lower levels of DHA in the blood. These findings reinforce the idea that the benefit of many proactive lifestyle strategies, such as diet and exercise, may depend mainly on their use before systems deteriorate.

The daily dose of algal DHA was 900 mg. The study took place at 19 clinical sites in the U.S., and those involved had an MMSE score greater than 26.

A Chinese study involving 153 older men (55+; average age 72), of whom 47 had mild cognitive impairment, has found that 10 of those in the MCI group developed probable Alzheimer's disease within a year. These men also had low testosterone, high blood pressure, and elevated levels of the ApoE4 protein.

The findings support earlier indications that low testosterone is associated with increased risk of Alzheimer's in men, but it’s interesting to note the combination with high blood pressure and having the ApoE4 gene. I look forward to a larger study.

Chu, L-W. et al. 2010. Bioavailable Testosterone Predicts a Lower Risk of Alzheimer’s Disease in Older Men. Journal of Alzheimer's Disease, 21 (4), 1335-45.

Data from 21,123 people, surveyed between 1978 and 1985 when in their 50s and tracked for dementia from 1994 to 2008, has revealed that those who smoked more than two packs per day in middle age had more than twice the risk of developing dementia, both Alzheimer's and vascular dementia, compared to non-smokers.

A quarter of the participants (25.4%) were diagnosed with dementia during the 23 years of follow-up, of whom a little over 20% were diagnosed with Alzheimer's disease and nearly 8% with vascular dementia.

Former smokers, or those who smoked less than half a pack per day, did not appear to be at increased risk. Associations between smoking and dementia did not vary by race or sex.

Smoking is a well-established risk factor for stroke, and is also known to contribute to oxidative stress and inflammation.

[1934] Rusanen, M., Kivipelto M., Quesenberry C. P., Zhou J., & Whitmer R. A.
(2010).  Heavy Smoking in Midlife and Long-term Risk of Alzheimer Disease and Vascular Dementia.
Arch Intern Med. Published online (archinternmed.2010.393).

A long-running study involving 299 older adults (average age 78) has found that those who walked at least 72 blocks during a week of recorded activity (around six to nine miles) had greater gray matter volume nine years later. Gray matter does shrink as we get older, so this is not about growth so much as counteracting decline. Walking more than 72 blocks didn’t appear to confer any additional benefit (in terms of gray matter volume). Moreover, when assessed four years after that, those who had retained this greater gray matter volume were only half as likely to have developed dementia (40% of the participants had developed dementia by this point).

Beginning in 1971, healthy older adults in Gothenburg, Sweden, have been participating in a longitudinal study of their cognitive health. The first H70 study started in 1971 with 381 residents of Gothenburg who were 70 years old; a new one began in 2000 with 551 residents and is still ongoing. For the first cohort (born in 1901-02), low scores on non-memory tests turned out to be a good predictor of dementia; however, these tests were not predictive for the generation born in 1930. Those from the later cohort also performed better in the intelligence tests at age 70 than their predecessors had.

It’s suggested that the higher intelligence is down to the later cohort’s better pre- and postnatal care, better nutrition, higher quality education, and better treatment of high blood pressure and cholesterol. And possibly the cognitive demands of modern life.

Nevertheless, the researchers reported that the incidence of dementia at age 75 was little different (5% in the first cohort and 4.4% in the later). However, since a substantially greater proportion of the first cohort were dead by that age (15.7% compared to 4.4% of the 2nd cohort), it seems quite probable that there really was a higher incidence of dementia in the earlier cohort.

The fact that low scores on non-memory cognitive tests were predictive in the first cohort of both dementia and death by age 75 supports this argument.

The fact that low scores on non-memory cognitive tests were not predictive of dementia or death in the later cohort is in keeping with the evidence that higher levels of education help delay dementia. We will need to wait for later findings from this study to see whether that is what is happening.

The findings are not inconsistent with those from a very large U.S. national study that found older adults (70+) are now less likely to be cognitively impaired (see below). It was suggested then also that better healthcare and more education were factors behind this decline in the rate of cognitive impairment.

Previous study:

A new nationally representative study involving 11,000 people shows a downward trend in the rate of cognitive impairment among people aged 70 and older, from 12.2% to 8.7% between 1993 and 2002. It’s speculated that factors behind this decline may be that today’s older people are much likelier to have had more formal education, higher economic status, and better care for risk factors such as high blood pressure, high cholesterol and smoking that can jeopardize their brains. In fact the data suggest that about 40% of the decrease in cognitive impairment over the decade was likely due to the increase in education levels and personal wealth between the two groups of seniors studied at the two time points. The trend is consistent with a dramatic decline in chronic disability among older Americans over the past two decades.

Previous research has indicated that obesity in middle-age is linked to higher risk of cognitive decline and dementia in old age. Now a study of 32 middle-aged adults (40-60) has revealed that although obese, overweight and normal-weight participants all performed equally well on a difficult cognitive task (a working memory task called the 2-Back task), obese individuals displayed significantly lower activation in the right inferior parietal cortex. They also had lower insulin sensitivity than their normal weight and overweight peers (poor insulin sensitivity may ultimately lead to diabetes). Analysis pointed to impaired insulin sensitivity as the mediator of the relationship between BMI and task-related activation in that region.

This suggests that it is insulin sensitivity that is responsible for the higher risk of cognitive impairment later in life. The good news is that insulin sensitivity can be improved through exercise and diet.

A follow-up study to determine if a 12-week exercise intervention can reverse the differences is planned.

Inflammation in the brain appears to be a key contributor to age-related memory problems, and it may be that this has to do with the dysregulation of microglia that previous research has shown occurs with age. With age, microglia start to produce excessive cytokines, as these specialized support cells in the brain normally do when there’s an infection; some of these cytokines produce the typical behaviors that accompany illness (sleepiness, appetite loss, cognitive deficits and depression).

Now new cell and mouse studies suggest that the flavonoid luteolin, known to have anti-inflammatory properties, apparently has these benefits because it acts directly on the microglial cells to reduce their production of inflammatory cytokines. It was found that although microglia exposed to a bacterial toxin produced inflammatory cytokines that killed neurons, if the microglia were first exposed to luteolin, the neurons lived. Exposing the neurons themselves to luteolin had no effect.

Old mice fed a luteolin-supplemented diet for four weeks did better on a working memory test than old mice on an ordinary diet, and levels of inflammatory cytokines in their brains were restored to those of younger mice.

Luteolin is found in many plants, including carrots, peppers, celery, olive oil, peppermint, rosemary and chamomile.

A long-running study involving 1,157 healthy older adults (65+), who were scored on a 5-point scale according to how often they participated in mental activities such as listening to the radio, watching television, reading, playing games and going to a museum, has found that this score is correlated with the rate of cognitive decline in later years.

Some 5 ½ years after this initial evaluation, 395 (34%) were found to have mild cognitive impairment and 148 (13%) to have Alzheimer’s. Participants were then tested at 3-yearly intervals for the next 6 years. The rate of cognitive decline in those without cognitive impairment was reduced by 52% for each point on the cognitive activity scale, but for those with Alzheimer's disease, the average rate of decline per year increased by 42% for each point on the cognitive activity scale. Rate of decline was unrelated to earlier cognitive activity in those with MCI (presumably they were at the balance point).

This is not terribly surprising when you think of it, if you assume that the benefit of mental stimulation is to improve your brain function so that it can better cope with the damage happening to it. But eventually it reaches the point where it can no longer compensate for that damage because it is so overwhelming.

Low levels of DHA, an omega-3 fatty acid, have been found in the brains of those with Alzheimer's disease, but the reason has not been known. A new study has found that lower levels of DHA in the liver (where most brain DHA is manufactured) were correlated with greater cognitive problems in the Alzheimer’s patients. Moreover, comparison of postmortem livers from Alzheimer’s patients and controls found reduced expression of a protein that converts a precursor acid into DHA, meaning the liver was less able to make DHA from food.

The findings may explain why clinical trials in which Alzheimer's patients are given omega-3 fatty acids have had mixed results. They also suggest that it might be possible to identify at-risk persons using specific blood tests, and perhaps delay the development of Alzheimer’s with a chemically enhanced form of DHA.

Findings from the long-running Religious Orders Study, from 354 Catholic nuns and priests who were given annual cognitive tests for up to 13 years before having their brains examined post-mortem, has revealed that even the very early cognitive impairments we regard as normal in aging are associated with dementia pathology. Although pathology in the form of neurofibrillary tangles, Lewy bodies, and cerebral infarctions were all associated with rapid decline, they were also associated with “normal” mild impairment. In the absence of any of these lesions, there was almost no cognitive decline.

Previous research has shown that white matter lesions are very common in older adults, and mild cognitive impairment is more likely in those with quickly growing white matter lesions; importantly, the crucial factor appears to be the rate of growth, not the number of lesions. This new study extends the finding, suggesting that any age-related cognitive impairment reflects the sort of brain pathology that ultimately leads to dementia (if given enough time). It suggests that we should be more proactive in fighting such damage, instead of simply regarding it as normal.

Type 2 diabetes is known to increase the risk of cognitive impairment in old age. Now analysis of data from 41 older diabetics (aged 55-81) and 458 matched controls in the Victoria Longitudinal Study has revealed that several other factors make it more likely that an older diabetic will develop cognitive impairment. These factors are: having higher (though still normal) blood pressure, having gait and balance problems, and/or reporting yourself to be in bad health regardless of actual problems.

Diabetes and hypertension often go together, and both are separately associated with greater cognitive impairment and dementia risk, so it is not surprising that higher blood pressure is one of the significant factors that increases risk. The other factors are less expected, although gait and balance problems have been linked to cognitive impairment in a recent study, and they may be connected to diabetes through diabetes’ effect on nerves. Negativity about one’s health may reflect emotional factors such as anxiety, stress, or depression, although depression and well-being measures were not themselves found to be mediating effects for cognitive impairment in diabetics (Do note that this study is not investigating which factors, in general, are associated with age-related cognitive impairment; it is trying to establish which factors are specifically sensitive to cognitive impairment in older diabetics).

In the U.S., type 2 diabetes occurs in over 23% of those over 60; in Canada (where this study took place) the rate is 19%. It should be noted that the participants in this study are not representative of the general population, in that they were fairly well-educated older Canadians, most of whom have benefited from a national health care system. Moreover, the study did not have longitudinal data on these various factors, meaning that we don’t know the order of events (which health problems come first? How long between the development of the different problems?). Nevertheless, the findings provide useful markers to alert diabetics and health providers.

A study involving 2,050 people aged 70 to 89 has found that mild cognitive impairment was 1.5 times more common in men than women. Among the 1,969 who did not have dementia, over 16% (329) had MCI — around 11% amnestic MCI (MCI-A) and 5% non-amnestic (MCI-MCD). A total of 19% of men had MCI, compared to 14% of women. MCI was also more common among the never-married, those with the APOEe4 (Alzheimer’s risk) gene, and those with less education.

This is the first study conducted among community-dwelling persons to find a higher prevalence of MCI in men. However, I note that some years ago I reported on a Dutch study involving some 600 85-year-olds, which found that significantly more women than men had a good memory (41% vs 29%; good mental speed on word and number recognition tests was also found in more women than men: 33% vs 28%). This was considered particularly surprising, given that significantly more of the women had limited formal education compared to the men.

The researchers suggested biological factors such as the relative absence of cardiovascular disease in the women might account for the difference. I would suggest another factor might be social, given that social stimulation has been shown to help prevent cognitive decline, and women are more likely than men to keep up social links in old age.

A two-year study involving 271 older adults (70+) with mild cognitive impairment has found that the rate of brain atrophy in those taking folic acid (0.8 mg/d), vitamin B12 (0.5 mg/d) and vitamin B6 (20 mg/d) was significantly slower than in those taking a placebo, with those taking the supplements experiencing on average 30% less brain atrophy. Higher rates of atrophy were associated with lower cognitive performance. Moreover, those with the highest levels of homocysteine at the beginning of the trial benefited the most, with 50% less brain shrinkage. High levels of homocysteine are a risk factor for Alzheimer's, and folate, B12 and B6 help regulate it.

The finding that atrophy can be slowed in those with MCI offers hope that the treatment could delay the development of Alzheimer’s, since MCI is a major risk factor for Alzheimer’s, and faster brain atrophy is typical of those who go on to develop Alzheimer’s.

Commercial use is a long way off, but research with mice offers hope for a 'smart drug' that doesn't have the sort of nasty side-effects that, for example, amphetamines have. The mice, genetically engineered to produce dramatically less kynurenic acid (about 70% less), had markedly better cognitive abilities. The acid, unusually, is produced not in neurons but in glia, and abnormally high levels are produced in the brains of people with disorders such as schizophrenia, Alzheimer's and Huntington's. More of the acid is also typically produced as we get older.

The acid is produced in our brains after we’ve eaten food containing the amino acid tryptophan, which helps us produce serotonin (turkey is a food well-known for its high tryptophan levels). But serotonin helps us feel good (low serotonin levels are linked to depression), so the trick is to block the production of kynurenic acid without reducing the levels of serotonin. The next step is therefore to find a chemical that blocks production of the acid in the glia, and can safely be used in humans. Although no human tests have yet been performed, several major pharmaceutical companies are believed to be following up on this research.

I have often spoken of the mantra: What's good for your heart is good for your brain. The link between cardiovascular risk factors and cognitive decline gets further confirmation from this latest finding that people whose hearts pumped less blood had smaller brains than those whose hearts pumped more blood. The study involved 1,504 participants of the decades-long Framingham Offspring Cohort who did not have a history of stroke, transient ischemic attack or dementia. Participants were 34 to 84 years old.

Worryingly, it wasn't only those with the lowest cardiac output who had significantly more brain atrophy (equivalent to almost two years more brain aging) than those with the highest cardiac index; those with output at the bottom end of the normal range showed similar levels of brain atrophy. Moreover, although only 7% of the participants had heart disease, 30% had a low cardiac index.

On the subject of the benefits of walking for seniors, it’s intriguing to note a recent pilot study that found frail seniors who walked slowly (no faster than one meter per second) benefited from a brain fitness program known as Mindfit. After eight weeks of sessions three times weekly (each session 45-60 minutes), all ten participants walked a little faster, and significantly faster while talking. Walking while talking requires considerably more concentration than normal walking. The success of this short intervention (which needs to be replicated in a larger study) offers the hope that frail elderly who may be unable to participate in physical exercise, could improve their mobility through brain fitness programs. Poor gait speed is also correlated with a higher probability of falls.

The connection between gait speed and cognitive function is an interesting one. Previous research has indicated that slow gait should alert doctors to check for cognitive impairment. One study found severe white matter lesions were more likely in those with gait and balance problems. Most recently, a longitudinal study involving over 900 older adults has found that poorer global cognitive function, verbal memory, and executive function were all predictive of greater decline in gait speed.

Data from the long-running Framingham Heart Study has revealed that depression significantly increased the risk of developing dementia. Of the 125 people (13%) who were classified as having depression at the start of the study, 21.6% had developed dementia by the end of the study (17 years later). This compares to around 16.6% of those who weren’t depressed. When age, gender, education, homocysteine, and APOE gene status were taken into account, depressed participants had a more than 50% increased risk of developing dementia. Moreover, for each 10-point increase on the self-report scale used to measure depression (CES-D), there was a significant increase in the dementia risk. These findings, from one of the largest and longest population-based studies, should clarify the inconsistent results from earlier research.

There are several possible ways depression might increase the risk of dementia: for example, through the brain inflammation or the increased levels of certain proteins that occur during depression, or through its effects on lifestyle (reduced exercise and social engagement, poorer diet).
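As a rough illustration of the arithmetic behind the figures above, here is a minimal sketch in Python. It uses only the unadjusted percentages quoted in the summary; the more-than-50% figure reported by the study comes from a model adjusted for age, gender, education, homocysteine and APOE status, which a crude calculation like this cannot reproduce.

```python
# A minimal sketch (not from the study) of the crude, unadjusted comparison
# behind the Framingham depression finding summarised above.

risk_depressed = 0.216      # 21.6% of the 125 depressed participants developed dementia
risk_not_depressed = 0.166  # 16.6% of the non-depressed participants did

crude_risk_ratio = risk_depressed / risk_not_depressed
print(f"Crude risk ratio: {crude_risk_ratio:.2f}")  # ~1.30, i.e. roughly 30% higher risk

# The study's adjusted estimate (>50% increased risk) is larger than this crude
# ratio because it comes from a model that also takes account of age, gender,
# education, homocysteine and APOE status.
```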

Anticholinergics are widely used for a variety of common medical conditions, such as insomnia, allergies and incontinence, and many are sold over the counter. Now a large six-year study of older African-Americans has found that taking one anticholinergic significantly increased an individual's risk of developing mild cognitive impairment, and taking two of these drugs doubled this risk. The risk was greater for those who didn't have the 'Alzheimer's gene', APOE-e4.

This class of drugs includes Benadryl®, Dramamine®, Excedrin PM®, Nytol®, Sominex®, Tylenol PM®, Unisom®, Paxil®, Detrol®, Demerol® and Elavil® (for a more complete list of medications with anticholinergic effects, go to http://www.indydiscoverynetwork.org/AnticholienrgicCognitiveBurdenScale....).

Another study has come out showing that older adults with low levels of vitamin D are more likely to have cognitive problems. The six-year study followed 858 adults who were aged 65 or older at the beginning of the study. Those who were severely deficient in vitamin D were 60% more likely to have substantial cognitive decline, and 31% more likely to have specific declines in executive function, although there was no association with attention. Vitamin D deficiency is common in older adults in the United States and Europe (prevalence estimates range from 40% to 100%!), and has been implicated in a wide variety of physical diseases.

Subjective cognitive impairment (SCI), marked by experiences such as noticing that you can't remember names or where you recently placed important objects as well as you used to, is experienced by between one-quarter and one-half of the population over the age of 65. A seven-year study involving 213 adults (mean age 67) has found that healthy older adults reporting SCI are dramatically more likely to progress to MCI or dementia than those free of SCI (54% vs 15%). Moreover, those who had SCI declined significantly faster.

Reisberg, B. et al. 2010. Outcome over seven years of healthy adults with and without subjective cognitive impairment. Alzheimer's & Dementia, 6 (1), 11-24.

A German study involving nearly 4000 older adults (55+) has found that physical activity significantly reduced the risk of developing mild cognitive impairment over a two-year period. Nearly 14% of those with no physical activity at the start of the study developed cognitive impairment, compared to 6.7% of those with moderate activity, and 5.1% of those with high activity. Moderate activity was defined as less than 3 times a week.

In another report, a study involving 1,324 individuals without dementia found those who reported performing moderate exercise during midlife or late life were significantly less likely to have MCI. Midlife moderate exercise was associated with 39% reduction in the odds of developing MCI, and moderate exercise in late life was associated with a 32% reduction. Light exercise (such as bowling, slow dancing or golfing with a cart) or vigorous exercise (including jogging, skiing and racquetball) were not significantly associated with reduced risk for MCI.

And a clinical trial involving 33 older adults (55-85) with MCI has found that women who exercised at high intensity levels with an aerobics trainer for 45 to 60 minutes per day, four days per week, significantly improved performance on multiple tests of executive function, compared to those who engaged in low-intensity stretching exercises. The results for men were weaker: high-intensity aerobics was associated only with improved performance on one cognitive task, Trail Making Test B, a test of visual attention and task-switching.

A number of rodent studies have shown that blueberries can improve aging memory; now for the first time, a human study provides evidence. In the small study, nine older adults (mean age 76) with mild cognitive impairment (MCI) drank the equivalent of 2 to 2 1/2 cups of a commercially available blueberry juice every day. After three months they showed significantly improved paired-associate learning and word-list recall. The findings will of course have to be confirmed by larger trials, but they are consistent with other research.

A companion study involving 12 older adults (75-80) with MCI found that those who drank a pure variety of Concord grape juice for 12 weeks also saw their performance progressively improve on tests in which they had to learn lists and remember items placed in a certain order.

An analysis technique using artificial neural networks has revealed that the most important factors for predicting whether amnestic mild cognitive impairment (MCI-A) would develop into Alzheimer's within two years were hyperglycemia, female gender and having the APOE4 gene (in that order). These were followed by scores on attentional and short-term memory tests.

Tabaton, M. et al. 2010. Artificial Neural Networks Identify the Predictive Values of Risk Factors on the Conversion of Amnestic Mild Cognitive Impairment. Journal of Alzheimer's Disease, 19 (3), 1035-1040.

Amnestic mild cognitive impairment often leads to Alzheimer's disease, but what predicts aMCI? A study involving 94 older adults has revealed that lower performance on tests measuring learning, in conjunction with either slower visuomotor processing speed or depressive symptoms, predicted the development of aMCI a year later with an accuracy of 80-100%. It is worth emphasizing that poor learning alone was not predictive in that time-frame, although one learning measure was predictive of aMCI two years later. Interestingly, neither gender nor possession of the 'Alzheimer's gene', long believed to be risk factors for mild cognitive impairment, had any substantial influence on later impairment.

Han, D. S., Suzuki, H., Jak, A. J., Chang, Y.-L., Salmon, D. P., & Bondi, M. W. (2010). Hierarchical Cognitive and Psychosocial Predictors of Amnestic Mild Cognitive Impairment. Journal of the International Neuropsychological Society, 16(4), 721-729.

Older news items (pre-2010) brought over from the old website

Specific hippocampal atrophy early sign of MCI & Alzheimer's

A three-year study involving 169 people with MCI has found that those who later developed Alzheimer's disease showed 10-30% greater atrophy in two specific locations within the hippocampus, the cornu ammonis (CA1) and the subiculum. A second study comparing the brains of 10 cognitively normal elderly people and seven who were diagnosed with MCI between two and three years after their initial brain scan and with Alzheimer's some seven years after the initial scan, has confirmed the same pattern of hippocampal atrophy, from the CA1 to the subiculum, and then other regions of the hippocampus.

Apostolova, L. G., Thompson, P. M., Green, A. E., Hwang, K. S., Zoumalan, C., Jack, C. R., Jr, Harvey, D. J., et al. (2010). 3D comparison of low, intermediate, and advanced hippocampal atrophy in MCI. Human Brain Mapping, advance online publication. doi:10.1002/hbm.20905

Apostolova, L.G. et al. In press. Subregional hippocampal atrophy predicts Alzheimer's dementia in the cognitively normal. Neurobiology of Aging, Available online 24 September 2008.

http://www.eurekalert.org/pub_releases/2010-01/uoc--uri012810.php

Characteristics of age-related cognitive decline in semantic memory

A study involving 117 healthy elderly people (aged 60-91) has found that, while increasing age was associated with poorer memory for the names of famous people, age didn't affect memory for biographical details about them. It also found that names served as better cues to those details than faces did. A follow-up study (to be published in Neuropsychologia) found that, in contrast, those with mild cognitive impairment and early Alzheimer's showed not only an increased inability to remember names, but also a decline in memory for biographical details.

Langlois, R. et al. 2009. Manque du nom propre et effet de la modalité sur la capacité à reconnaître des personnes connues au cours du vieillissement normal. Canadian Journal on Aging/ La Revue canadienne du vieillissement, 28 (4), 337-345.

http://www.eurekalert.org/pub_releases/2009-12/uom-whn121809.php

Apathy common in dementia patients with white matter changes

A study involving 176 patients with Alzheimer's, vascular dementia or mixed dementia, or mild cognitive impairment, has found that 82% of the patients with white matter changes (WMCs) were apathetic, compared to an overall rate of 58%. This discovery suggests that there is a common biological reason behind this apathy, irrespective of which type of dementia a patient has. WMCs were also associated with age, gender, blood pressure, hypertension, ischaemic heart disease, mental slowness, disinhibition, gait disturbance and focal neurologic symptoms. Apathy, mental slowness and age were the most consistent predictors of WMCs.

Jonsson, M., Edman, Å., Lind, K., Rolstad, S., Sjögren, M., & Wallin, A. (2009). Apathy is a prominent neuropsychiatric feature of radiological white-matter changes in patients with dementia. International Journal of Geriatric Psychiatry, advance online publication. doi:10.1002/gps.2379

http://www.eurekalert.org/pub_releases/2009-12/uog-aci120209.php

Difficulties with daily activities associated with progression to dementia

A study involving 111 older adults with mild cognitive impairment, of whom 28 progressed from mild cognitive impairment to dementia over the next 2 ½ years, found that only one factor predicted conversion from mild cognitive impairment to dementia: the degree of functional impairment (ability to perform routine activities) at the beginning of the study. Other cognitive and neurological variables were not predictive.

Farias, S.T. et al. 2009. Progression of Mild Cognitive Impairment to Dementia in Clinic- vs Community-Based Cohorts. Archives of Neurology, 66(9), 1151-1157.

http://www.eurekalert.org/pub_releases/2009-09/jaaj-dwd091009.php

Problems managing money may be early sign of Alzheimer's

A study involving 76 older people with no memory problems and 87 older people with amnestic mild cognitive impairment has found that those (25 of the 87) who had developed Alzheimer’s a year later were significantly worse at a money management task. Compared to those with no memory problems, as well as those with MCI who did not develop dementia, those who did develop Alzheimer’s not only dropped 9% on checkbook management abilities and 6% on overall financial knowledge and skills, but also performed more poorly at the beginning of the study. The task included counting coins, making grocery purchases, understanding and using a checkbook, understanding and using a bank statement, preparing bills for mailing, and detecting fraud situations.

Triebel, K.L. et al. 2009. Declining financial capacity in mild cognitive impairment: A 1-year longitudinal study. Neurology, 73, 928-934.

http://www.eurekalert.org/pub_releases/2009-09/aaon-pmm091509.php
http://www.eurekalert.org/pub_releases/2009-09/uoaa-pmm091609.php

Different effects of ministrokes & strokes

A study involving 679 seniors (65+) has found that those with small areas of brain damage called white matter hyperintensities, often referred to as ministrokes, were nearly twice as likely to have mild cognitive impairment that included memory loss (amnestic MCI), while those who had infarcts (areas of dead tissue usually called strokes) were more likely to experience mild cognitive impairment in abilities other than memory loss (non-amnestic MCI). In other words, ministrokes predicted memory problems, while strokes predicted non-memory problems.

Luchsinger, J.A. et al. 2009. Subclinical cerebrovascular disease in mild cognitive impairment. Neurology, 73, 450-456.

http://www.eurekalert.org/pub_releases/2009-08/aaon-bds080409.php

White matter changes may predict dementia risk

In a study in which 49 seniors (65+) were followed for an average of 9.5 years, of whom 24 developed mild cognitive impairment, those with the fastest rate of growth in white matter lesions were more likely to develop mild cognitive impairment than those with a slow rate of growth. The amount of lesions in healthy brains at the start of the study was not a factor; the crucial factor was the rate of progression.

Silbert, L.C. et al. 2009. Cognitive impairment risk: White matter hyperintensity progression matters. Neurology, 73, 120-125.

http://www.eurekalert.org/pub_releases/2009-07/aaon-wmc070709.php

Measuring brain atrophy in patients with mild cognitive impairment

A study involving 269 patients with mild cognitive impairment provides evidence that a fully automated procedure called volumetric MRI (which can be done in a clinical setting) can accurately and quickly measure parts of the medial temporal lobe and compare them to their expected size. It also found that atrophy not only in the hippocampus but also in the amygdala is associated with a greater risk of conversion to Alzheimer's.

Kovacevic, S. et al. 2009. High-throughput, Fully Automated Volumetry for Prediction of MMSE and CDR Decline in Mild Cognitive Impairment. Alzheimer Disease & Associated Disorders, 23 (2), 139-145.

http://www.eurekalert.org/pub_releases/2009-06/uoc--mba061609.php

Cerebrospinal fluid shows Alzheimer's disease deterioration much earlier

A study involving 60 patients with subjective cognitive impairment (SCI), 37 patients with non-amnestic mild cognitive impairment (naMCI), and 71 with amnestic mild cognitive impairment (aMCI), has found that 52% of those with SCI, 68% of those with naMCI, and 79% of those with aMCI showed decreased concentrations of Aβ42 and increased concentrations of tau protein in the cerebrospinal fluid. The findings support the use of these CSF biomarkers for very early diagnosis.

Visser, P.J. et al. 2009. Prevalence and prognostic value of CSF markers of Alzheimer's disease pathology in patients with subjective cognitive impairment and mild cognitive impairment in the DESCRIPA study: a prospective, case-control study. The Lancet Neurology, 8 (7), 619–627.

http://www.eurekalert.org/pub_releases/2009-06/uog-cfs061809.php

Effective new cognitive screening test for detection of Alzheimer's

A new cognitive test for detecting Alzheimer's has been developed, designed to be suitable for non-specialist use. The TYM ("test your memory") involves 10 tasks, including the ability to copy a sentence, semantic knowledge, calculation, verbal fluency and recall. It has been tested on 540 healthy individuals and 139 patients with diagnosed Alzheimer's or mild cognitive impairment. Healthy controls completed the test in an average time of five minutes and gained an average score of 47 out of 50, compared to 45 for those with mild cognitive impairment, 39 for those with non-Alzheimer dementias, and 33 for those with Alzheimer's. Among controls, the average score was not affected by age until after 70, when it showed a small decline. There were no gender or geographical background differences in performance. The TYM detected 93% of patients with Alzheimer's, compared to only 52% detected by the widely used mini-mental state examination.

Brown, J. et al. 2009. Self administered cognitive screening test (TYM) for detection of Alzheimer's disease: cross sectional study. BMJ, 338, b2030. doi:10.1136/bmj.b2030

Eye tracking test detects mild cognitive impairment

A test first developed for use with nonhuman primates is now being used to detect mild cognitive impairment (MCI) in humans. The infrared eye-tracking test involves showing one image and then another after a 2-second delay, and then repeating the test 2 minutes later. Those without cognitive impairment spend most of their time looking at the new image, but it was found that those with MCI spent less time looking at the new picture, presumably because they have less memory of seeing the original image before. Those with Alzheimer's disease look at both images equally. It’s hoped that this test may allow dementia to be spotted much earlier.

Crutcher, M.D. et al. 2009. Eye Tracking During a Visual Paired Comparison Task as a Predictor of Early Dementia. American Journal of Alzheimer's Disease and Other Dementias, Published online February 26 2009.

http://www.eurekalert.org/pub_releases/2009-04/eu-yru041509.php

Biomarker signatures predict conversion from MCI to Alzheimer's

Cerebrospinal fluid samples from 410 volunteers (100 with mild Alzheimer's, 196 with MCI, and 114 cognitively normal older adults) have revealed that concentrations of the amyloid beta-42 peptide and tau protein successfully assessed brain status and predicted disease progression. The test diagnosed Alzheimer's with 96% accuracy, ruled out Alzheimer's with 95% accuracy, and predicted the conversion from MCI to Alzheimer's with 82% accuracy.

Shaw, L.M. et al. 2009. Cerebrospinal fluid biomarker signature in Alzheimer's disease neuroimaging initiative subjects. Annals of Neurology, Published Online March 18 2009.

http://www.eurekalert.org/pub_releases/2009-03/uops-pmp031609.php

Less risk of developing dementia than thought

Data from 41 studies has revealed that the risk of those with mild cognitive impairment developing dementia is much less than previously thought. MCI is found in about 1 in 6 people seen in general practice, and it was thought that the risk of developing dementia was up to 15% per year, making deterioration almost inevitable within 5 to 10 years. It now appears that the risk is about 10% per year in high-risk groups (9.6% for dementia overall; 8% for Alzheimer's; 2% for vascular dementia) and only about 5% per year in low-risk groups (5% for dementia overall; 7% for Alzheimer's; 1.6% for vascular dementia). More importantly, only 20-40% developed dementia even after 10 years, and the risk appeared to reduce slightly with time.

Mitchell, A.J. & Shiri-Feshki, M. 2009. Rate of progression of mild cognitive impairment to dementia – meta-analysis of 41 robust inception cohort studies. Acta Psychiatrica Scandinavica, 119 (4), 252-265.

http://www.eurekalert.org/pub_releases/2009-03/uol-nrh032309.php
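To see why the 20-40% ten-year figure is so much lower than a naive extrapolation of the earlier per-year estimates, here is a minimal sketch. It is purely illustrative and assumes a constant annual conversion rate, which the meta-analysis suggests is not how the risk actually behaves:

```python
# What a constant annual conversion rate would imply over 10 years,
# for comparison with the observed 20-40% cumulative conversion.
# Purely illustrative; the meta-analysis suggests annual risk declines over time.

def cumulative_risk(annual_rate: float, years: int) -> float:
    """Probability of converting within `years` if the annual rate stayed fixed."""
    return 1 - (1 - annual_rate) ** years

for rate in (0.05, 0.10, 0.15):
    print(f"{rate:.0%} per year -> {cumulative_risk(rate, 10):.0%} after 10 years")

# 5% per year  -> ~40% after 10 years
# 10% per year -> ~65% after 10 years
# 15% per year -> ~80% after 10 years
# The observed 20-40% sits at or below even the lowest of these projections,
# consistent with the conclusion that annual risk falls off with time.
```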

Brain atrophy pattern in some MCI patients predicts Alzheimer's

A study of 84 patients with mild Alzheimer's, 175 patients with MCI and 139 healthy controls has revealed a pattern of regional brain atrophy in patients with MCI that indicates a greater likelihood of progression to Alzheimer's. Brain scan results showed widespread cortical atrophy in some patients with MCI, most importantly atrophy in parts of the medial and lateral temporal lobes and in the frontal lobes, a pattern also present in the patients with mild Alzheimer's disease. Those exhibiting such atrophy declined significantly over a year and were more likely to progress to a probable diagnosis of Alzheimer's. MCI patients without that pattern of atrophy remained stable after a year. It should be noted that such atrophy affects not only memory, but also planning, organization, problem solving and language.

McEvoy, L.K. et al. 2009. Alzheimer Disease: Quantitative Structural Neuroimaging for Detection and Prediction of Clinical and Structural Changes in Mild Cognitive Impairment. Radiology, Published online February 6.

http://www.eurekalert.org/pub_releases/2009-02/rson-msb020309.php

Technique shows brain aging before symptoms appear

A new chemical marker called FDDNP, which binds to plaque and tangle deposits in the brain, has enabled PET scans to reveal exactly where these abnormal protein deposits are accumulating, and has found that older age correlated with higher concentrations of FDDNP in the medial and lateral temporal regions of the brain, areas involved with memory, where plaques and tangles usually collect. Of the 76 study volunteers, 34 carried the ‘Alzheimer’s gene’. This group demonstrated higher FDDNP levels in the frontal region of the brain than those without the gene variant. Thirty-six of the volunteers had mild cognitive impairment, and these had higher measures of FDDNP in the medial temporal brain regions than normal volunteers. Those who had both MCI and the APOE-4 gene also had higher concentrations of FDDNP in the medial temporal brain regions than those who had MCI but not APOE-4. The pilot study offers hope of early diagnosis of brain impairment, before symptoms show themselves.

Small, G.W. et al. 2009. Influence of Cognitive Status, Age, and APOE-4 Genetic Risk on Brain FDDNP Positron-Emission Tomography Imaging in Persons Without Dementia. Archives of General Psychiatry, 66(1), 81-87.

http://www.eurekalert.org/pub_releases/2009-01/uoc--uat010509.php

Occasional memory loss tied to lower brain volume

A study of 503 seniors (aged 50-85) with no dementia found that 453 of them (90%) reported having occasional memory problems, such as having trouble thinking of the right word or forgetting things that happened in the last day or two, or thinking problems, such as having trouble concentrating or thinking more slowly than they used to. Such problems have been attributed to white matter lesions, which are very common in older adults, but all of the participants in the study had white matter lesions in their brains, and the amount of lesions was not tied to occasional memory problems. However, it was found that those who reported having such problems had a smaller hippocampus than those who had no cognitive problems. This was most noteworthy in subjects with good objective cognitive performance.

van Norden, A.G.W. et al. 2008. Subjective cognitive failures and hippocampal volume in elderly with white matter lesions. Neurology, 71, 1152-1159.

http://www.eurekalert.org/pub_releases/2008-10/aaon-oml093008.php

Moderate exercise helps mild cognitive impairment

An Australian study involving 138 older adults (50 years and over) with mild cognitive impairment has found that those who undertook to achieve 2 ½ hours of physical activity each week (three 50-minute sessions of activities ranging from walking and ballroom dancing to swimming) for a six-month period continually outscored the control group on cognitive tests during the 18-month testing period, showing that memory improvement was still evident a year after the supervised exercise period.

Lautenschlager, N.T. et al. 2008. Effect of Physical Activity on Cognitive Function in Older Adults at Risk for Alzheimer Disease: A Randomized Trial. Journal of the American Medical Association, 300(9), 1027-1037.

http://www.eurekalert.org/pub_releases/2008-09/ra-wtp090108.php
http://www.eurekalert.org/pub_releases/2008-09/uom-aow090108.php
http://www.eurekalert.org/pub_releases/2008-09/jaaj-emh082808.php

New 'everyday cognition' scale tracks how older adults function in daily life

A new, carefully validated questionnaire called Everyday Cognition (ECog) has been developed by seven psychologists. The 39-question screening tool is designed to enable mild functional problems in older adults to be quickly and easily identified. The questionnaire needs to be filled out by someone who knows an older adult well, such as a spouse, adult child, or close friend. It looks at everyday function in seven key cognitive domains: memory, language, semantic (factual) knowledge, visuospatial abilities, planning, organization and divided attention. The test has been shown to be sensitive to early changes present in Mild Cognitive Impairment, and unlike other cognitive tests, does not appear to be strongly influenced by education level. The test even differentiated between people diagnosed with mild impairment in memory only and those mildly impaired in several areas.

Farias, S.T. et al. 2008. The Measurement of Everyday Cognition (ECog): Scale Development and Psychometric Properties. Neuropsychology, 22(4), 531-544.

http://www.eurekalert.org/pub_releases/2008-07/apa-nc062408.php

Mild cognitive impairment more likely in men

A study involving over 2000 people between 70 and 89 years old, found 15% had mild cognitive impairment, and men were one-and-a-half times more likely to have MCI than women.

The research was presented at the American Academy of Neurology Annual Meeting in Chicago, April 12–19.

http://www.eurekalert.org/pub_releases/2008-04/aaon-mml040208.php

High blood pressure associated with risk for mild cognitive impairment

A study of nearly 1000 older adults (average age 76.3) without mild cognitive impairment at the start of the study found that over the follow-up period (average: 4.7 years), 334 individuals developed mild cognitive impairment, of which 160 were amnestic (reduced memory) and 174 were non-amnestic. Hypertension (high blood pressure) was associated with an increased risk of non-amnestic mild cognitive impairment; but not with amnestic mild cognitive impairment.

Reitz, C. et al. 2007. Hypertension and the Risk of Mild Cognitive Impairment. Archives of Neurology, 64(12), 1734-1740.

http://www.eurekalert.org/pub_releases/2007-12/jaaj-hbp120607.php

Difficulty identifying odors may predict cognitive decline

Older adults who have difficulty identifying common odors may have a greater risk of developing mild cognitive impairment, increasingly recognized as a precursor to Alzheimer's disease. A study of nearly 600 older adults (average age 79.9) found that 30.1% developed mild cognitive impairment over the five-year period of the study. The risk of developing mild cognitive impairment was greater for those who scored worse on an odor identification test given at the start of the study. For example, those who scored below average (eight out of 12) were 50% more likely to develop MCI than those who scored above average (11 out of 12). This association did not change when stroke, smoking habits or other factors that might influence smell or cognitive ability were taken into account. Impaired odor identification was also associated with lower cognitive scores at the beginning of the study and with a more rapid decline in episodic memory (memory of past experiences), semantic memory (memory of words and symbols) and perceptual speed. The odor test involved identifying 12 familiar odors, with four possible alternatives to choose from for each.

Wilson, R.S., Schneider, J.A., Arnold, S.E., Tang, Y., Boyle, P.A. & Bennett, D.A. 2007. Olfactory Identification and Incidence of Mild Cognitive Impairment in Older Age. Archives of General Psychiatry, 64, 802-808.

http://www.eurekalert.org/pub_releases/2007-07/jaaj-dio062807.php

Seniors' memory complaints should be taken seriously

A study involving 120 people over 60 found that those who complained of significant memory problems but still performed normally on memory tests had a 3% reduction in gray matter density in their brains, compared to 4% in those diagnosed with mild cognitive impairment. This suggests that significant memory complaints may indicate a very early "pre-MCI" stage of dementia for some people.

Saykin, A.J. et al. 2006. Older adults with cognitive complaints show brain atrophy similar to that of amnestic MCI. Neurology, 67, 834-842.

http://www.eurekalert.org/pub_releases/2006-09/aaon-fym090506.php

Link between size of hippocampus and progression to Alzheimer's

A study of 20 older adults with mild cognitive impairment has found that the hippocampus was smaller in those who progressed to Alzheimer's during the three-year period.

Apostolova, L.G. et al. 2006. Conversion of Mild Cognitive Impairment to Alzheimer Disease Predicted by Hippocampal Atrophy Maps. Archives of Neurology, 63, 693-699.

http://www.eurekalert.org/pub_releases/2006-05/uoc--rml050406.php

Post-mortem brain studies reveal features of mild cognitive impairment

Autopsies have revealed that the brains of patients with mild cognitive impairment display pathologic features that appear to place them at an intermediate stage between normal aging and Alzheimer's disease. For instance, the patients had begun developing neurofibrillary tangles, but the number of plaques was similar to that in healthy individuals. All patients with mild cognitive impairment had abnormalities in their temporal lobes, which likely caused their cognitive difficulties, and many also had abnormalities in other areas that were not related to the features of Alzheimer's disease. In a second study, of 34 patients with mild cognitive impairment who had progressed to clinical dementia before their deaths, 24 were diagnosed (post-mortem) with Alzheimer's and 10 with other types of dementia. As in the first study, all patients had abnormalities in their temporal lobes.

Petersen, R.C. et al. 2006. Neuropathologic Features of Amnestic Mild Cognitive Impairment. Archives of Neurology, 63, 665-672.

Jicha, G.A. et al. 2006. Neuropathologic Outcome of Mild Cognitive Impairment Following Progression to Clinical Dementia. Archives of Neurology, 63, 674-681.

http://www.eurekalert.org/pub_releases/2006-05/jaaj-pbs050406.php

Risk of mild cognitive impairment increases with less education

A study of 3,957 people from the general population of Olmsted County, Minnesota, is currently under way to find out how many of those who do not have dementia might have mild cognitive impairment. A report on the findings so far suggests 9% of those aged 70 to 79 and nearly 18% of those aged 80 to 89 have MCI. Prevalence varied not only with age but also with years of education: 25% in those with up to eight years of education, 14% in those with nine to 12 years, 9% in those with 13 to 16 years, and 8.5% in those with more than 16 years.

Findings from this study were presented April 4 at the American Academy of Neurology meeting in San Diego.

http://www.eurekalert.org/pub_releases/2006-04/mc-mci033006.php

Two pathways lead to Alzheimer's disease

Mild cognitive impairment (MCI), a transitional stage between normal cognition and Alzheimer's disease, has been categorized into two sub-types on the basis of differing symptoms. Those with the amnesic subtype (MCI-A) have memory impairments only, while those with the multiple cognitive domain subtype (MCI-MCD) have other types of mild impairments, such as in judgment or language, and mild or no memory loss. Both sub-types progress to Alzheimer's disease at the same rate. A new imaging technique has now revealed that these types do in fact have different pathologies. The hippocampus of patients with MCI-A was not significantly different from that of Alzheimer's patients (who show substantial shrinkage), but the hippocampus of those with MCI-MCD was not significantly different from that of the healthy controls.

Becker, J.T. et al. 2006. Three-dimensional Patterns of Hippocampal Atrophy in Mild Cognitive Impairment. Archives of Neurology, 63, 97-101.

http://www.eurekalert.org/pub_releases/2006-01/uopm-tpf010606.php

Concussions increase chance of age-related cognitive impairment

A study involving retired National Football League players found that they had a 37% higher risk of Alzheimer's than other U.S. males of the same age. Some 60.8% of the retired players reported having sustained at least one concussion during their professional playing career, and 24% reported sustaining three or more concussions. Those with three or more concussions had a five-fold greater chance of having been diagnosed with mild cognitive impairment and a three-fold prevalence of reported significant memory problems compared to those players without a history of concussion. As the study was based on self-reported answers to the health questions, further studies are needed to confirm the findings, but it does seem likely that head injuries earlier in life increase the chance of developing dementia or mild cognitive impairment.

Guskiewicz, K.M., Marshall, S.W., Bailes, J., McCrea, M., Cantu, R.C., Randolph, C. & Jordan, B.D. 2005. Association between Recurrent Concussion and Late-Life Cognitive Impairment in Retired Professional Football Players. Neurosurgery, 57(4), 719-726.

http://www.eurekalert.org/pub_releases/2005-10/uonc-nsa101005.php

New computer program may enable early prediction of Alzheimer's risk

Researchers have developed a brain scan-based computer program that quickly and accurately measures metabolic activity in the hippocampus, a key brain region that shrinks with the development of Alzheimer’s. The study followed 53 normal subjects aged 54 to 80 for at least 9 years and in some cases for as long as 24 years, and found that hippocampal glucose metabolism was significantly reduced on the first scan of those 25 individuals who would later experience cognitive decline related to either mild cognitive impairment or to Alzheimer's. The findings bring hope of being able to predict who will develop Alzheimer’s at least 9 years ahead of symptoms.

Mosconi, L., Tsui, W-H., De Santi, S., Li, J., Rusinek, H., Convit, A., Li, Y., Boppana, M. & de Leon, M.J. 2005. Reduced hippocampal metabolism in MCI and AD: Automated FDG-PET image analysis. Neurology, 64, 1860-1867.

http://www.eurekalert.org/pub_releases/2005-06/nyum-ncp061505.php

Rate of brain volume loss predicts dementia

A new study has found that rates of total brain volume loss may help identify patients with mild cognitive impairment who are at high risk of developing dementia. The study followed 55 people over 14 years, and found that loss of volume in the hippocampus predicted which mildly cognitively impaired individuals would stay stable and which would decline to Alzheimer's with 70% accuracy, while the rate of total brain volume loss was 62% accurate in predicting cognitive outcome. Combining both variables produced the strongest model: 75% accuracy. The discovery could help doctors plan early treatment strategies and prevention studies.

The study was presented at the 56th annual meeting of the American Academy of Neurology in San Francisco.

http://www.eurekalert.org/pub_releases/2004-04/ohs-osr042804.php

More sensitive test norms better predict who might develop Alzheimer's disease

Early diagnosis of Alzheimer's is becoming more important with new medical and psychological interventions that can slow (but not stop) the course of the disease. Given this, it is suggested that more sensitive testing may be necessary for highly intelligent people, who, on average, show clinical signs of Alzheimer's later than the general population; once they show such signs, they decline much faster. A study of 42 older people with IQs of 120 or more used two different test norms to forecast problems: the standard norms, derived from a large cross-section of the population, or adjusted high-IQ norms that measured changes against the individual's higher ability level. The raised cutoffs predicted that 11 of the 42 individuals were at risk for future decline, whereas the standard cutoffs indicated they were normal. True to the former prediction, three and a half years later nine of those 11 people had declined. Six of them went on to develop mild cognitive impairment (MCI), a transitional stage between normal aging and dementia (of which one type is Alzheimer's). Five of these individuals have since received a diagnosis of Alzheimer's disease, in the two years since the study was submitted. It is also suggested that, at the other end of the scale, those with below-average intelligence have the potential to be misdiagnosed as 'demented' when they are not, and that the norms should be adjusted downwards accordingly.

Rentz, D.M., Huh, T.J., Faust, R.R., Budson, A.E., Scinto, L.F.M., Sperling, R.A. & Daffner, K.R. 2004. Use of IQ-Adjusted Norms to Predict Progressive Cognitive Decline in Highly Intelligent Older Individuals. Neuropsychology, 18 (1).

http://www.eurekalert.org/pub_releases/2004-01/apa-mst122903.php

Brief telephone questionnaire screens for early signs of dementia

Researchers have developed a brief telephone questionnaire that helps distinguish between persons with early signs of dementia and persons with normal cognitive function. The questionnaire provides a way to reach out to persons with dementia whose impairment might otherwise go undetected until substantial cognitive deterioration has occurred. It consists of a test of delayed recall and two questions that ask whether the person needs help with remembering to take medications or with planning a trip for errands. It is estimated that of 100 people who score positive on this test, 42 will actually have cognitive impairment. In other words, the test does not provide a diagnosis of Alzheimer's, but provides evidence that further evaluation is required. The rate of false positives compares favorably to other types of screening tests. A further study is under way to confirm the validity and reliability of the test.

Fillit, H. et al. 2003. A Brief Telephonic Instrument to Screen for Cognitive Impairment in a Managed Care Population. Journal of Clinical Outcomes Management, 419-429.

http://www.eurekalert.org/pub_releases/2003-09/twc-btq091603.php
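The "42 out of 100 positives" figure above is a positive predictive value, which depends not only on how accurate the questionnaire is but also on how common impairment is among the people screened. Here is a minimal sketch of that relationship; the sensitivity, specificity and prevalence values are hypothetical, chosen only so the output lands near the reported figure, and are not taken from the study:

```python
# Positive predictive value (PPV) of a screening test via Bayes' rule.
# The sensitivity, specificity and prevalence below are hypothetical,
# illustrative values; the study only reports the resulting estimate
# that roughly 42 of every 100 positives truly have cognitive impairment.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

print(f"PPV: {ppv(sensitivity=0.85, specificity=0.80, prevalence=0.15):.0%}")
# ~43% with these assumed values. The same test applied where impairment is
# rarer yields a lower PPV, which is why a positive screen is a prompt for
# further evaluation rather than a diagnosis.
```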

Early diagnosis of Alzheimer's

An analysis of data from 40 participants enrolled in a long-term study at the UCSD Alzheimer’s Disease Research Center (ADRC) found that "paper-and-pencil" cognitive skills tests administered to normal subjects averaging 75 years of age contained early signs of cognitive decline in those subjects who later developed Alzheimer’s disease. All participants were symptom-free when they took the test. The differences were quite subtle - only some performance measures were affected.

http://www.eurekalert.org/pub_releases/2002-04/uoc--trs040502.php

Brain scans predict cognitive impairment

A three-year study of 48 healthy people from 60 to 80 years old, by New York University School of Medicine researchers, predicted which healthy elderly men and women would develop memory impairment based on scans of their brains. At the beginning of the study, everyone scored within the normal range on a battery of tests typically used to detect early loss of memory and other mental skills. However, PET scans revealed a reduction in glucose metabolism in an area of the brain called the entorhinal cortex among 12 people. Three years later, 11 of these people had experienced mild cognitive impairment and one had developed Alzheimer's disease. "Our work extends the use of PET scanning to identifying in normal aging subjects the earliest metabolic abnormalities that may lead to the memory losses referred to as mild cognitive impairment (MCI). The diagnosis of MCI carries a high risk for future Alzheimer's disease."

The study is published in the September 11 issue of The Proceedings of the National Academy of Sciences.

http://www.eurekalert.org/pub_releases/2001-09/nyum-bps090701.php

 
