
Alzheimer's & other dementias

Inhibitory control deficits common in those with MCI

January, 2013

Impairment in executive function is apparently far more common in those with MCI than previously thought, with the most common and severe impairment occurring in inhibitory control.

Providing some support for the finding I recently reported — that problems with semantic knowledge in those with mild cognitive impairment (MCI) and Alzheimer’s might be rooted in an inability to inhibit immediate perceptual information in favor of conceptual information — a small study has found that executive function (and inhibitory control in particular) is impaired in far more of those with MCI than was previously thought.

The study involved 40 patients with amnestic MCI (single or multiple domain) and 32 healthy older adults. Executive function was tested across multiple sub-domains: divided attention, working memory, inhibitory control, verbal fluency, and planning.

As a group, those with MCI performed significantly more poorly in all 5 sub-domains. All MCI patients showed significant impairment in at least one sub-domain of executive functioning, with almost half performing poorly on all of the tests. The sub-domain most frequently and severely impaired was inhibitory control.

The finding is in sharp contrast with standard screening tests and clinical interviews, which have estimated executive function impairment in only 15% of those with MCI.

Executive function is crucial for many aspects of our behavior, from planning and organization to self-control to (as we saw in the previous news report) basic knowledge. It is increasingly believed that declining inhibitory control might be a principal cause of age-related cognitive decline, through its effect on working memory.

All this adds weight to the idea that we should be focusing our attention on ways to improve inhibitory control when it declines. Although training to improve working memory capacity has not been very successful, specific training targeted at inhibitory control might have more luck. Something to hope for!


Chewing ability linked to reduced dementia risk

January, 2013

A large study of older adults suggests that being able to bite into a hard food such as an apple puts you in a better state to fight cognitive decline and dementia.

Previous research has pointed to an association between tooth loss and a higher risk of cognitive decline and dementia. One reason might have to do with inflammation — inflammation is a well-established risk factor, and at least one study has linked gum disease to a higher dementia risk. Or it might have to do with the mechanical act of chewing itself: chewing increases blood flow to the brain, so losing the ability to chew reduces that blood flow. A new study has directly investigated chewing ability in older adults.

The Swedish study, involving 557 older adults (77+), found that those with multiple tooth loss, and those who had difficulty chewing hard food such as apples, had a significantly higher risk of developing cognitive impairments (cognitive status was measured using the MMSE). However, when adjusted for sex, age, and education, tooth loss was no longer significant, but chewing difficulties remained significant.
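For readers curious what "adjusted for sex, age, and education" means in practice, here is a minimal sketch of that kind of logistic-regression analysis. The data are simulated and the variable names hypothetical; the study's actual dataset and model specification are not given here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 557  # sample size reported for the Swedish study

# Simulated data; variable names are hypothetical, for illustration only
df = pd.DataFrame({
    "age": rng.uniform(77, 95, n),
    "female": rng.integers(0, 2, n),
    "education_yrs": rng.normal(9, 3, n),
    "chewing_difficulty": rng.integers(0, 2, n),
    "multiple_tooth_loss": rng.integers(0, 2, n),
})
# Make impairment depend on chewing difficulty and age (illustrative only)
logit_p = -8 + 0.08 * df["age"] + 0.9 * df["chewing_difficulty"]
df["cognitive_impairment"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# The adjusted model asks: does tooth loss remain a significant predictor
# once sex, age, and education are controlled for? (In the study it did
# not, while chewing difficulty did.)
adjusted = smf.logit(
    "cognitive_impairment ~ multiple_tooth_loss + chewing_difficulty"
    " + female + age + education_yrs",
    data=df,
).fit(disp=0)
print(adjusted.summary())
```

The general point: a predictor can look significant on its own yet lose significance once confounders enter the model, which is exactly what happened to tooth loss here.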

In other words, tooth loss itself wasn't what mattered. The important thing was maintaining chewing ability, whether with your own natural teeth or with dentures.

This idea that the physical act of chewing might affect your cognitive function (on a regular basis; I don’t think anyone is suggesting that you’re brighter when you chew!) is an intriguing and unexpected one. It does, however, give even more emphasis to the importance of physical exercise, which is a much better way of increasing blood flow to the brain.

The finding also reminds us that there are many things going on in the brain that may deteriorate with age and thus lead to cognitive decline and even dementia.


Ginkgo biloba doesn’t prevent Alzheimer’s

January, 2013

The second large-scale study investigating whether ginkgo biloba helps prevent Alzheimer’s has confirmed that it doesn’t.

Sad to say, another large study has given the thumbs down to ginkgo biloba preventing Alzheimer’s disease.

The randomized, double-blind trial took place over five years, involving 2,854 older adults (70+) who had presented to their primary care physician with memory complaints. Half were given a twice-daily dose of 120 mg of standardized ginkgo biloba extract, and half a placebo.

After five years, 4% of those receiving ginkgo biloba had been diagnosed with probable Alzheimer's disease, compared with 5% in the placebo group — a non-significant difference. There was no significant difference between the groups in mortality, stroke, or cardiovascular events, either.
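As a rough illustration of why a one-percentage-point difference can be non-significant at this sample size, here is a two-proportion test on counts reconstructed from the reported percentages (approximations only, not the trial's raw data):

```python
from statsmodels.stats.proportion import proportions_ztest

# ~1427 per arm; case counts back-calculated from the reported 4% and 5%
n_per_arm = 2854 // 2
cases = [round(0.04 * n_per_arm), round(0.05 * n_per_arm)]  # ginkgo, placebo
stat, p = proportions_ztest(cases, [n_per_arm, n_per_arm])
print(f"z = {stat:.2f}, p = {p:.2f}")  # p is around 0.2, well above 0.05
```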

The French study confirms the findings of an earlier American trial, and is also consistent with another large, long-running study that found no benefits of ginkgo biloba for age-related cognitive decline.


Popular cognitive test for Alzheimer’s insufficiently sensitive

January, 2013

The most common cognitive test used in clinical trials for Alzheimer’s treatments has been shown to have significant flaws that underestimate cognitive change.

New research suggests that reliance on the standard Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) to measure cognitive changes in Alzheimer’s patients is a bad idea. The test is the most widely used measure of cognitive performance in clinical trials.

Using a sophisticated method of analysis (Rasch analysis) on ADAS-Cog data from the Alzheimer's Disease Neuroimaging Initiative (675 measurements from people with mild Alzheimer's disease, across four time points over two years), the researchers found that, although total scores seemed reasonable, at the component level eight of the 11 parts of the ADAS-Cog showed a ceiling effect for many patients (32-83%).

Additionally, for six components (commands, constructional praxis, naming objects and fingers, ideational praxis, remembering test instructions, spoken language), the thresholds (the points of transition between response categories) were not ordered sequentially. The upshot is that, for these components, a higher score did not necessarily indicate greater cognitive impairment.

The ADAS-Cog has 11 component parts, covering areas such as memory, language, naming objects, and responding to commands. Patients get a score for each section, and these are summed into a single overall figure; different sections have different score ranges. The total runs from 0 to 70, with lower scores signaling better cognitive performance and 70 being the worst.
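To make the scoring structure concrete, here is a small sketch. The component names and maximum scores follow the commonly published description of the ADAS-Cog, but treat them as an assumption rather than the official scoring manual; the check at the end shows the kind of ceiling problem the Rasch analysis identified.

```python
# component -> (patient score, maximum possible score); the patient
# values are hypothetical
scores = {
    "word recall": (4, 10),
    "naming objects and fingers": (0, 5),
    "commands": (0, 5),
    "constructional praxis": (1, 5),
    "ideational praxis": (0, 5),
    "orientation": (2, 8),
    "word recognition": (5, 12),
    "remembering test instructions": (0, 5),
    "spoken language": (0, 5),
    "word-finding difficulty": (1, 5),
    "comprehension": (0, 5),
}

# Error scores on each component are summed: 0 (best) to 70 (worst)
total = sum(score for score, _ in scores.values())

# A component is "at ceiling" when a patient makes no errors at all: the
# component is too easy to register differences or further change
at_ceiling = [name for name, (score, _) in scores.items() if score == 0]

print(f"total = {total}/70")
print(f"components unable to register change: {at_ceiling}")
```

This is why the researchers recommend making a number of components more difficult: for mild patients, many components sit at their best possible score and so cannot detect change.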

It seems clear from this that the test seriously underestimates cognitive differences between people and changes over time. Given that this is the most common cognitive test used in clinical trials, we have to consider whether these flaws account for the failure of so many drug trials to find significant benefits.

Among the recommended ways to improve the ADAS-Cog (including the need to clearly define what is meant by cognitive performance!), the researchers suggest that a number of the components should be made more difficult, and that the scoring of those six components needs to be investigated.


Simple semantic task reveals early cognitive problems in older adults

January, 2013

A study finds early semantic problems in those with MCI, correlating with a reduced capacity to carry out everyday tasks.

A small study shows how those on the road to Alzheimer’s show early semantic problems long before memory problems arise, and that such problems can affect daily life.

The study compared 25 patients with amnestic MCI, 27 patients with mild-to-moderate Alzheimer's and 70 cognitively fit older adults (aged 55-90), on a non-verbal task involving size differences (for example, “What is bigger: a key or a house?”; “What is bigger: a key or an ant?”). The comparisons were presented in three different ways: as words; as images reflecting real-world differences; as incongruent images (e.g., a big ant and a small house).

Both those with MCI and those with AD were significantly less accurate, and significantly slower, in all three conditions compared to healthy controls, and they had disproportionately more difficulty on those comparisons where the size distance was smaller. But MCI and AD patients experienced their biggest problems when the images were incongruent – the ant bigger than the house. Those with MCI performed at a level between that of healthy controls and those with AD.

This suggests that perceptual information is having undue influence in a judgment task that requires conceptual knowledge.

Because semantic memory is organized according to relatedness, and because this sort of basic information was acquired long ago, this simple test is quite a good way to test semantic knowledge. As previous research has indicated, the problem doesn’t seem to be a memory (retrieval) one, but one reflecting an actual loss or corruption of semantic knowledge. But perhaps, rather than a loss of data, it reflects a failure of selective attention/inhibition — an inability to inhibit immediate perceptual information in favor of more relevant conceptual information.

How much does this matter? Poor performance on the semantic distance task correlated with impaired ability to perform everyday tasks, accounting (together with delayed recall) for some 35% of the variance in everyday functional capacity, while other cognitive abilities, such as processing speed, executive function, verbal fluency, and naming, did not have a significant effect. Everyday functional capacity was assessed using a short form of the UCSD Performance-Based Skills Assessment (a tool generally used to identify everyday problems in patients with schizophrenia), which presents scenarios such as planning a trip to the beach, determining a route, dialing a telephone number, and writing a check.
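As a sketch of what "accounting for some 35% of the variance" means, here is a toy regression with simulated data tuned so that the two predictors explain roughly that share. All numbers and variable names are illustrative, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 122  # 25 MCI + 27 AD + 70 controls

semantic_accuracy = rng.normal(0.8, 0.1, n)  # semantic distance task
delayed_recall = rng.normal(8, 3, n)         # delayed recall score
# Everyday functional capacity, with noise tuned so R^2 lands near 0.35
functional_capacity = (30 * semantic_accuracy + 0.8 * delayed_recall
                       + rng.normal(0, 5.2, n))

X = sm.add_constant(np.column_stack([semantic_accuracy, delayed_recall]))
fit = sm.OLS(functional_capacity, X).fit()
print(f"R^2 = {fit.rsquared:.2f}")  # roughly the 35% reported above
```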

The finding indicates that semantic memory problems are starting to occur early in the deterioration, and may be affecting general cognitive decline. However, if the problems reflect an access difficulty rather than data loss, it may be possible to strengthen these semantic processing connections through training — and thus improve general cognitive processing (and ability to perform everyday tasks).


Feeling lonely linked to increased dementia risk

January, 2013

A study that attempts to separate the effects of social isolation from subjective feelings of loneliness concludes that feelings of loneliness have a greater effect on dementia risk.

There's quite a bit of evidence now that socializing — having frequent contact with others — helps protect against cognitive impairment in old age. We also know that depression is a risk factor for cognitive impairment and dementia. There have been hints that loneliness might also be a risk factor. But here’s the question: is it being alone, or feeling lonely, that is the danger?

A large Dutch study, following 2173 older adults for three years, suggests that it is the feeling of loneliness that is the main problem.

At the start of the study, some 46% of the participants were living alone, and some 50% were no longer or never married (presumably the discrepancy is because many older adults have a spouse in a care facility). Some 73% said they had no social support, while 20% reported feelings of loneliness.

Those who lived alone were significantly more likely to develop dementia over the three-year study period (9.3% compared with 5.6% of those who lived with others). The unmarried were also significantly more likely to develop dementia (9.2% vs 5.3%).

On the other hand, among those without social support, 5.6% developed dementia, compared with 11.4% of those with social support! This seems to contradict everything we know, not to mention the other results of the study, but the answer presumably lies in what is meant by ‘social support’. Social support was assessed by the question: do you get help from family, neighbours or home support? This asks not about social networks, but about how much help you currently need — not whether help would be there if you needed it. This interpretation is supported by the finding that those receiving social support had more health problems.

So, although the researchers originally counted this question as part of the measure of social isolation, it is clearly a poor reflection of it. Effectively, then, that leaves cohabitation and marriage as the only indices of social isolation, which is obviously inadequate.

However, we still have the interesting question regarding loneliness. The study found that 13.4% of those who said they felt lonely developed dementia, compared with 5.7% of those who didn’t feel this way. This is a greater difference than that found with the ‘socially isolated’ (as measured!). Moreover, once other risk factors, such as age, education, and other health factors, were accounted for, the association between living alone and dementia disappeared, while the association with feelings of loneliness remained.
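Putting the reported percentages side by side as crude risk ratios (unadjusted, so indicative only) makes the contrast plain:

```python
# Crude ratios from the published percentages, not adjusted estimates
rates = {
    "living alone":   (9.3, 5.6),
    "unmarried":      (9.2, 5.3),
    "feeling lonely": (13.4, 5.7),
}
for factor, (exposed, unexposed) in rates.items():
    print(f"{factor:>14}: {exposed / unexposed:.2f}x the dementia rate")
# feeling lonely comes out around 2.4x, vs roughly 1.7x for the
# social-isolation measures
```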

Of course, this still doesn’t tell us what the association is! It may be that feelings of loneliness simply reflect cognitive changes that precede Alzheimer’s, but it may be that the feelings themselves are decreasing cognitive and social activity. It may also be that those who are prone to such feelings have personality traits that are in themselves risk factors for cognitive impairment.

I would like to see another large study using better metrics of social isolation, but, still, the study is interesting for its distinction between being alone and feeling lonely, and its suggestion that it is the subjective feeling that is more important.

This is not to say there is no value in having people around! For a start, as discussed, the measures of social isolation are clearly inadequate. Moreover, other people play an important role in helping with health issues, which in turn greatly impact cognitive decline.

Although there was a small effect of depression, the relationship between feeling lonely and dementia remained after this was accounted for, indicating that loneliness is a separate factor (on the other hand, feelings of loneliness were a risk factor for depression).

A decrease in cognitive score (MMSE) was also significantly greater for those experiencing feelings of loneliness, suggesting that this is also a factor in age-related cognitive decline.

The point is not so much that loneliness is more detrimental than being alone, but that loneliness in itself is a risk factor for cognitive decline and dementia. This suggests that we should develop a better understanding of loneliness, how to identify the vulnerable, and how to help them.


Timing of hormone therapy critical for Alzheimer's risk

November, 2012

A large long-running study adds to evidence that the timing of hormone therapy is critical in deciding whether it reduces or increases the risk of developing Alzheimer’s.

It’s been unclear whether hormone therapy helps older women reduce their risk of Alzheimer’s or in fact increases it. To date, the research has been inconsistent, with observational studies showing a reduced risk and a large randomized controlled trial showing an increased risk. As mentioned before, the answer to the inconsistency may lie in the timing of the therapy. A new study supports this view.

The 11-year study (part of the Cache County Study) involved 1,768 older women (65+), of whom 1,105 women had used hormone therapy (either estrogen alone or in combination with a progestin). During the study, 176 women developed Alzheimer's disease. This included 87 (7.9%) of the 1,105 women who had taken hormone therapy, and 89 (13.4%) of the 663 others.

Women who began hormone therapy of any kind within five years of menopause had a 30% lower risk of developing Alzheimer's within the study period (especially if they continued the therapy for 10 or more years). Those who began treatment more than five years after menopause had a ‘normal’ risk (neither reduced nor increased). However, those who had started a combined therapy of estrogen and progestin when they were at least 65 years old had a significantly higher risk of developing Alzheimer’s.
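The raw counts are easy to check against the reported percentages. Note that the crude ratio below is not the adjusted, timing-stratified 30% figure, which cannot be reproduced from these counts alone:

```python
# 87 of 1,105 hormone-therapy users vs 89 of 663 non-users developed AD
users, user_cases = 1105, 87
nonusers, nonuser_cases = 663, 89

rate_users = user_cases / users            # ~0.079, i.e. 7.9%
rate_nonusers = nonuser_cases / nonusers   # ~0.134, i.e. 13.4%
print(f"{rate_users:.1%} vs {rate_nonusers:.1%}; "
      f"crude risk ratio = {rate_users / rate_nonusers:.2f}")  # ~0.59
```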

The findings support the idea that the timing of hormone therapy, and the type, are critical factors, although the researchers cautiously note that more research is needed before they can make new clinical recommendations.


Are sleep problems a key factor in Alzheimer’s?

October, 2012

A mouse study shows that sleep deprivation and aggregation of amyloid beta go hand in hand, and may be key players on the road to Alzheimer’s.

I reported a few months ago on some evidence of a link between disturbed sleep and the development of Alzheimer’s. Now a mouse study adds to this evidence.

The mouse study follows on from an earlier study showing that brain levels of amyloid beta naturally rise when healthy young mice are awake and drop after they go to sleep, and that sleep deprivation disrupted this cycle and accelerated the development of amyloid plaques. This natural rhythm was confirmed in humans.

In the new study, it was found that this circadian rhythm showed the first signs of disruption as soon as Alzheimer’s plaques began forming in the mice’s brains. When the genetically engineered mice were given a vaccine against amyloid beta, the mice didn’t develop plaques in old age, the natural fluctuations in amyloid beta levels continued, and sleep patterns remained normal.

Research with humans is now underway to see whether patients with early markers of Alzheimer’s show sleep problems, and what the nature of these problems is.

Just to make it clear: the point is not so much that Alzheimer’s patients are more likely to have sleep problems, but that the sleep problems may in fact be part of the cause of Alzheimer’s disease development. The big question, of course, is whether you can prevent its development by attacking the dysfunction in circadian rhythm. (See more on this debate at Biomed)


Cut ‘visual clutter’ to help MCI & Alzheimer’s

October, 2012

A small study shows that those with MCI perform poorly on a visual discrimination task under high interference conditions, suggesting that reducing interference may improve cognitive performance.

Memory problems in those with mild cognitive impairment may begin with problems in visual discrimination and vulnerability to interference — a hopeful discovery in that interventions to improve discriminability and reduce interference may have a flow-on effect to cognition.

The study compared the performance on a complex object discrimination task of 7 patients diagnosed with amnestic MCI, 10 older adults considered to be at risk for MCI (because of their scores on a cognitive test), and 19 age-matched controls. The task involved the side-by-side comparison of images of objects, with participants required to say, within 15 seconds, whether the two objects were the same or different.

In the high-interference condition, the objects were blob-like and presented as black-and-white line drawings, with some comparison pairs identical while others varied only slightly in shape or fill pattern. Objects were rotated to discourage a simple feature-matching strategy. In the low-interference condition, these line drawings were interspersed with color photos of everyday objects, which were dramatically easier to discriminate. The two conditions were separated by a short break, with the low-interference condition run in two blocks, before and after the high-interference condition.

A control task, in which the participants compared two squares that could vary in size, was run at the end.

The study found that those with MCI, as well as those at risk of MCI, performed significantly worse than the control group in the high-interference condition. There was no difference in performance between those with MCI and those at risk of MCI. Neither group was impaired in the first low-interference condition, although the at-risk group did show significant impairment in the second low-interference condition. It may be that they had trouble recovering from the high-interference experience. However, the degree of impairment was much less than it was in the high-interference condition. It’s also worth noting that the performance on this second low-interference task was, for all groups, notably higher than it was on the first low-interference task.

There was no difference between any of the groups on the control task, indicating that fatigue wasn’t a factor.

The interference task was specifically chosen as one that involved the perirhinal cortex, but not the hippocampus. The task requires the conjunction of features — that is, you need to be able to see the object as a whole (‘feature binding’), not simply match individual features. The control task, which required only the discrimination of a single feature, shows that MCI doesn’t interfere with this ability.

I do note that the amount of individual variability on the interference tasks was noticeably greater in the MCI group than the others. The MCI group was of course smaller than the other groups, but variability wasn’t any greater for this group in the control task. Presumably this variability reflects progression of the impairment, but it would be interesting to test this with a larger sample, and map performance on this task against other cognitive tasks.

Recent research has suggested that the perirhinal cortex may provide protection from visual interference by inhibiting lower-level features. The perirhinal cortex is strongly connected to the hippocampus and entorhinal cortex, two brain regions known to be affected very early in MCI and Alzheimer’s.

The findings are also consistent with other evidence that damage to the medial temporal lobe may impair memory by increasing vulnerability to interference. For example, one study has found that story recall was greatly improved in patients with MCI if they rested quietly in a dark room after hearing the story, rather than being occupied in other tasks.

There may be a working memory component to all this as well. Comparison of two objects does require shifting attention back and forth. This, however, is separate from what the researchers see as the primary issue: a perceptual deficit.

All of this suggests that reducing “visual clutter” could help MCI patients with everyday tasks. For example, buttons on a telephone tend to be the same size and color, with the only difference lying in the numbers themselves. Perhaps those with MCI or early Alzheimer’s would be assisted by a phone with varying sized buttons and different colors.

The finding also raises the question: to what extent is the difficulty Alzheimer’s patients often have in recognizing a loved one’s face a discrimination problem rather than a memory problem?

Finally, the performance of the at-risk group — people who had no subjective concerns about their memory, but who scored below 26 on the MoCA (Montreal Cognitive Assessment — a brief screening tool for MCI) — suggests that vulnerability to visual interference is an early marker of cognitive impairment that may be useful in diagnosis. It’s worth noting that, across all groups, MoCA scores predicted performance on the high-interference task, but not on any of the other tasks.

So how much cognitive impairment rests on problems with interference?

Reference: 

Newsome, R. N., Duarte, A., & Barense, M. D. (2012). Reducing perceptual interference improves visual discrimination in mild cognitive impairment: Implications for a model of perirhinal cortex function. Hippocampus, 22, 1990–1999. doi:10.1002/hipo.22071

Della Sala, S., Cowan, N., Beschin, N., & Perini, M. (2005). Just lying there, remembering: Improving recall of prose in amnesic patients with mild cognitive impairment by minimising interference. Memory, 13, 435–440.


Why HIV-associated dementia occurs & implications for other disorders

October, 2012

A new understanding of why dementia sometimes occurs with HIV, even when treated, may also suggest a new approach to other neurological disorders, including age-related cognitive decline.

HIV-associated dementia occurs in around 30% of untreated HIV-positive patients. Surprisingly, it also is occasionally found in some patients (2-3%) who are being successfully treated for HIV (and show no signs of AIDS).

A new study may have the answer to this mystery, and may suggest a solution. Moreover, the answer may have general implications for those experiencing cognitive decline in old age.

The study found that HIV, although it doesn’t directly infect neurons, interferes with the production of mature BDNF (brain-derived neurotrophic factor), a protein long known to be crucial for memory and learning. Reduced production of mature BDNF causes axons and dendrites to shorten, meaning connections between neurons are lost. That, in turn, brings about the death of some neurons.

It seems that the virus interferes with the normal maturation of BDNF, whereby an earlier form, called proBDNF, is cut by certain enzymes into mature BDNF. It is the mature form that has the beneficial effect on neuron growth; unfortunately, the earlier form is toxic to neurons.

This imbalance in the proportions of mature BDNF and proBDNF also appears to occur as we age, and in depression. It may also be a risk factor in Parkinson's and Huntington's diseases.

However, these findings suggest a new therapeutic approach.

Compounds in green tea and chocolate may help protect brain cells

In that context, it is interesting to note another new study, which analyzed the effects on brain cells of 2,000 compounds, both natural and synthetic. Of the 256 that looked to have protective effects, nine were related to epicatechin, which is found in cocoa and green tea leaves.

While we’ve been aware for some time of these positive qualities, the study specifically identified epicatechin and epigallocatechin gallate (EGCG) as being the most effective at helping protect neurons by inducing production of BDNF.

One of the big advantages these compounds have is in their ability to cross the blood-brain barrier, making them a good candidate for therapy.

While green tea, dark chocolate, and cocoa are particularly good sources, many fruits also have good levels — in particular, black grapes, blackberries, apples, cherries, pears, and raspberries. (See this University of California, Davis document (PDF) for more detail.)

