attention training

This brain training program cuts dementia risk

  • A large 10-year study investigating the benefits of a brain training program for older adults found that training designed to improve processing speed and visual attention, in particular, reduced dementia risk.

Findings from the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study, which followed 2,802 healthy older adults for 10 years, show that those who participated in computer training designed to improve processing speed and visual attention had a 29% lower risk of developing dementia than controls, with more training producing lower risk. Those who received instruction in memory or reasoning strategies showed no change in dementia risk.

Participants were randomly placed into a control group or one of three different cognitive training groups. One was instructed in memory strategies, another in reasoning strategies, and one was given individualized, computerized speed of processing training.

There were 10 initial sessions of training, each 60 to 75 minutes, over six weeks. Participants were assessed at the beginning of the study, after the first six weeks, and at one, two, three, five, and 10 years. Some of each group received four additional “booster” training sessions in months 11 and 35.

Among those who completed the most sessions (5 or more booster sessions), indicators of dementia were evident in 5.9% of the computerized speed training group, 9.7% of the memory strategy group, and 10.1% of the reasoning strategy group. The control group had a dementia incidence rate of 10.8%.

14% of those who received no training developed dementia in the next 10 years, compared with 12.1% of those who received the initial processing speed training, and 8.2% of those who also received the additional booster training.

A decade after training began, the scientists found that 22.7% of people in the speed training group had dementia, compared with 24.2% in both memory and reasoning groups. In a control group of people who had no training, the dementia rate was 28.8%. This effect is greater than the protection offered by antihypertensive medications against major cardiovascular events.

It's suggested that some of the reason for this effect may be that the training builds up brain reserve, perhaps by improving brain efficiency, or in some way improving the health of brain tissue.

Some of the participants told researchers that the training encouraged them to enroll in classes at a local college or keep driving, and it’s possible that the motivational boost for continued social and intellectual engagement might also help explain the benefits.

Other research has found that processing speed training is associated with a lower risk of depression and improved physical function, as well as better everyday functioning.

The processing speed training was designed to improve the speed and accuracy of visual attention, with both divided and selective attention exercises. In the divided attention training task, participants identified a central object (such as a truck) while simultaneously locating a target object (such as a car) in the periphery. Presentation speed increased as participants mastered each set. In the more difficult training tasks, adding distracting objects made the task even more challenging, thus engaging selective attention.

The training program is available as the “Double Decision” exercise in the BrainHQ.com commercial product.

Of the 1,220 participants who completed the 10-year follow-up, 260 developed dementia during that period.

http://www.futurity.org/speed-of-processing-training-dementia-1613322/

https://www.eurekalert.org/pub_releases/2017-11/uosf-ibf111417.php

https://www.theguardian.com/society/2017/nov/16/can-brain-training-reduce-dementia-risk-despite-new-research-the-jury-is-still-out

http://www.scientificamerican.com/article/brain-training-cuts-dementia-risk-a-decade-later/

Reference: 

Edwards, J. D., Xu, H., Clark, D. O., Guey, L. T., Ross, L. A., & Unverzagt, F. W. (2017). Speed of processing training results in lower risk of dementia. Alzheimer's & Dementia: Translational Research & Clinical Interventions, 3(4), 603-611.

Full text available at https://www.trci.alzdem.com/article/S2352-8737(17)30059-8/fulltext


Adding ADHD drug to therapy improves cognitive outcomes in TBI patients

  • A preliminary study suggests adults with persistent difficulties after TBI may benefit from a metacognitive training program in conjunction with use of Ritalin.

A small study involving 71 adults who struggled with persistent cognitive difficulties after suffering a traumatic brain injury at least four months before has compared two cognitive training programs with and without drug therapy.

The two six-week programs were

  • Memory and Attention Adaptation Training program, a brief cognitive-behavioral therapy aimed at enhancing skills for self-managing and coping with cognitive failures in daily life. It includes four components:
    • education regarding ‘normal’ cognitive failures, as well as potential effects of TBI on cognitive function
    • self-awareness training to identify ‘at-risk’ situations where cognitive failures are likely to occur
    • self-regulation training emphasizing applied relaxation techniques and stress management
    • cognitive compensatory strategy training
  • Attention Builders Training, involving
    • repetitive cognitive tasks to build skills through ‘mental exercise’
    • an educational component discussing common cognitive symptoms after TBI

Participants of both groups also received either the drug methylphenidate (Ritalin) or a placebo.

The best improvement (still modest) was noted in those who received methylphenidate along with the Memory and Attention Adaptation Training. They were better able to learn lists of words, while their working memory and their attention improved.

Do note, however, that these findings must be considered preliminary, due to the relatively small number of participants in each group (17-19 people).

https://www.eurekalert.org/pub_releases/2016-11/s-hfp112216.php

https://www.eurekalert.org/pub_releases/2016-11/iu-aad112216.php

Paper available at https://www.nature.com/articles/npp2016261

Reference: 

McDonald, B. C., et al. (2016). Methylphenidate and Memory and Attention Adaptation Training for persistent cognitive symptoms after traumatic brain injury: a randomized, placebo-controlled trial. Neuropsychopharmacology. doi: 10.1038/npp.2016.261


Some cognitive training helps less-educated older adults more

  • A large study in which older adults underwent various types of cognitive training has found that less-educated adults benefited more from training designed to improve processing speed.

Data from 2,800 participants (aged 65+) in the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study has revealed that one type of cognitive training benefits less-educated people more than it does the more-educated.

While the effects of reasoning and memory training did not differ as a function of how much education the individual had, those older adults with less than a complete high school education experienced a 50% greater benefit from speed of information processing training than college graduates. This advantage was maintained for three years after the end of the training.

The training involved ten 60 to 75-minute sessions over six weeks that focused on visual search and processing information in shorter and shorter times.

Both reasoning and information processing speed training resulted in improved targeted cognitive abilities for 10 years among participants, but memory training did not. Memory training focused on mnemonic strategies for remembering lists and sequences of items, text material, and main ideas and details of stories and other text-based information. Reasoning training focused on improving the ability to solve problems containing a serial pattern.

The researchers speculate that speed of information processing training might help those with less than 12 years of education, who are at greater risk of dementia, close the gap between them and those with more education.

The training modules have been translated into online games delivered by Posit Science.

Less educated study participants were slightly older, less likely to be married, more likely to be African-American, and more likely to have hypertension or diabetes as well as heart disease than the more educated older adults.

http://www.eurekalert.org/pub_releases/2016-01/iu-irs012816.php


Review shows computerized training can help TBI and stroke victims

  • The first review of computerized training programs to improve attention in those who have suffered a brain injury has reported favorably.

A systematic literature review of computerized training for attention and executive function in adults who suffered a brain injury (TBI or stroke) has concluded that there is encouraging evidence that such programs can help.

The review found 23 of 28 studies reported significant improvements in attention and executive function, with the remaining five showing promising trends. The studies included 11 that focused on TBI, of which 8 reported significant improvement; 5 that focused on stroke, all of which showed significant improvement; and 12 with mixed populations, of which 10 showed significant improvement.

Further studies are needed to confirm these results, as various methodological issues, such as a small number of participants, and inadequate controls, need to be addressed. The 28 studies included 9 that were rated as "class I" (the highest standard), 9 class II, and 7 that were class III (no controls). Almost all (26/28) of the studies involved fewer than 50 participants, with some having as few as 1 to 4. Most studies didn't specify how severe the injuries were, something which makes a big difference to treatment and expectations. Over a third of the studies (11) didn't have any control group, and only a few used the best sort of control - a comparable activity (as opposed to, say, no treatment). Only four studies provided any long-term follow-up.

As you can see, a lot of work is needed yet. Moreover, most programs were unique to the study, so we're still some way off producing recommended protocols. Only one program was used on multiple occasions (5): Cogmed QM (originally called RoboMemo).

Still, notwithstanding all these caveats, the review does support the value of specific training for those suffering brain injury.

http://www.eurekalert.org/pub_releases/2016-02/bumc-cra021016.php


Why older adults lose working memory capacity

The root of age-related cognitive decline may lie in a reduced ability to ignore distractors. A new study indicates that older adults put more effort into focusing during encoding, in order to compensate for a reduced ability to hold information in working memory. The finding suggests a multi-pronged approach to improving cognitive ability in older adults.

I've reported before on the idea that the drop in working memory capacity commonly seen in old age is related to the equally typical increase in distractibility. Studies of brain activity have also indicated that lower WMC is correlated with greater storage of distractor information. So those with higher WMC, it's thought, are better at filtering out distraction and focusing only on the pertinent information. Older adults may show a reduced WMC, therefore, because their ability to ignore distraction and irrelevancies has declined.

Why does that happen?

A new, large-scale study using a smartphone game suggests that the root cause is a change in the way we hold items in working memory.

The study involved 29,631 people aged 18-69, who played a smartphone game in which they had to remember the positions of an increasing number of red circles. Yellow circles, which had to be ignored, could also appear — either at the same time as the red circles, or after them. Data from this game revealed both WMC (how many red circle locations the individual could remember), and distractability (how many red circle locations they could remember in the face of irrelevant yellow circles).

Now this game isn't simply a way of measuring WMC. It enables us to make an interesting distinction based on the timing of the distraction. If the yellow circles appeared at the same time as the red ones, they are providing distraction when you are trying to encode the information. If they appear afterward, the distraction occurs when you are trying to maintain the information in working memory.

It might seem commonsensical that distraction at the time of encoding would be the main problem, but the fascinating finding of this study is that it was distraction during the delay (while the information is being maintained in working memory) that was the greater problem. And it was this distraction that became more and more marked with increasing age.

The study is a follow-up to a smaller 2014 study that included two experiments: a lab experiment involving 21 young adults, and data from the same smartphone game involving only the younger cohort (18-29 years; 3247 participants).

This study demonstrated that distraction during encoding and distraction during delay were independent contributory factors to WMC, suggesting that separate mechanisms are involved in filtering out distraction at encoding and maintenance.

Interestingly, analysis of the data from the smartphone game did indicate some correlation between the two in that context. One reason may be that participants in the smartphone game were exposed to higher load trials (the lab study kept WM load constant); another might be that they were in more distracting environments.

While in general researchers have till now assumed that the two processes are not distinct, it has been theorized that distractor filtering at encoding may involve a 'selective gating mechanism', while filtering during WM maintenance may involve a shutting down of perception. The former has been linked to a gating mechanism in the striatum in the basal ganglia, while the latter has been linked to an increase in alpha waves in the frontal cortex, specifically, the left middle frontal gyrus. The dorsolateral prefrontal cortex may also be involved in distractor filtering at encoding.

To return to the more recent study:

  • there was a significant decrease in WMC with increasing age in all conditions (no distraction; encoding distraction; delay distraction)
  • for older adults, the decrease in WMC was greatest in the delay distraction condition
  • when 'distraction cost' was calculated (((ND score − (ED or DD score))/ND score) × 100), there was a significant correlation between delay distraction cost and age, but not between encoding distraction cost and age
  • for older adults, performance in the encoding distraction condition was better predicted by performance in the no distraction condition than it was among the younger groups
  • this correlation was significantly different between the 30-39 age group and the 40-49 age group, between the 40s and the 50s, and between the 50s and the 60s — showing that this is a progressive change
  • older adults with a higher delay distraction cost (ie, those more affected by distractors during delay) also showed a significantly greater correlation between their no-distraction performance and encoding-distraction performance.
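The distraction-cost formula quoted above can be sketched in a few lines of Python (a toy illustration; the function name and the scores are hypothetical, not the study's data):

```python
def distraction_cost(nd_score, distracted_score):
    """Percentage drop in performance caused by distraction,
    relative to the no-distraction (ND) baseline.

    `distracted_score` is the score under either encoding
    distraction (ED) or delay distraction (DD):
        cost = ((ND - (ED or DD)) / ND) * 100
    """
    return (nd_score - distracted_score) / nd_score * 100

# Hypothetical example: 6.0 circle locations recalled with no
# distraction, but only 4.5 when distractors appeared during the delay.
print(distraction_cost(6.0, 4.5))  # → 25.0
```

A positive cost means distraction hurt performance; the age effect reported here is that this cost rises with age for delay distraction, but not for encoding distraction.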

All of this suggests that older adults are focusing more attention during encoding even when there is no distraction, and they are doing so to compensate for their reduced ability to maintain information in working memory.

This suggests several approaches to improving older adults' ability to cope:

  • use perceptual discrimination training to help improve WMC
  • make working memory training more about learning to ignore certain types of distraction
  • reduce distraction — modify daily tasks to make them more "older adult friendly"
  • (my own speculation) use meditation training to improve frontal alpha rhythms.

You can participate in the game yourself, at http://thegreatbrainexperiment.com/

http://medicalxpress.com/news/2015-05-smartphone-reveals-older.html

Reference: 

McNab, F., Zeidman, P., Rutledge, R. B., Smittenaar, P., Brown, H. R., Adams, R. A., et al. (2015). Age-related changes in working memory and the ability to ignore distraction. Proceedings of the National Academy of Sciences, 112(20), 6515-6518.

McNab, F., & Dolan, R. J. (2014). Dissociating distractor-filtering at encoding and during maintenance. Journal of Experimental Psychology: Human Perception and Performance, 40(3), 960-967. doi: 10.1037/a0036013


New direction for cognitive training in the elderly

October, 2012

A pilot study suggests declines in temporal processing are an important part of age-related cognitive decline, and shows how temporal training can significantly improve some cognitive abilities.

Here’s an exciting little study, implying as it does that one particular aspect of information processing underlies much of the cognitive decline in older adults, and that this can be improved through training. No, it’s not our usual suspect, working memory, it’s something far less obvious: temporal processing.

In the study, 30 older adults (aged 65-75) were randomly assigned to three groups: one that received ‘temporal training’, one that practiced common computer games (such as Solitaire and Mahjong), and a no-activity control. Temporal training was provided by a trademarked program called Fast ForWord Language® (FFW), which was developed to help children who have trouble reading, writing, and learning.

The training, for both training groups, occupied an hour a day, four days a week, for eight weeks.

Cognitive assessment, carried out at the beginning and end of the study, and for the temporal training group again 18 months later, included tests of sequencing abilities (how quickly two sounds could be presented and still be accurately assessed for pitch or direction), attention (vigilance, divided attention, and alertness), and short-term memory (working memory span, pattern recognition, and pattern matching).

Only in the temporal training group did performance on any of the cognitive tests significantly improve after training — on the sequencing tests, divided attention, matching complex patterns, and working memory span. These positive effects still remained after 18 months (vigilance was also higher at the end of training, but this improvement wasn’t maintained).

This is, of course, only a small pilot study. I hope we will see a larger study, and one that compares this form of training against other computer training programs. It would also be good to see some broader cognitive tests — ones that are less connected to the temporal training. But I imagine that, as I’ve discussed before, an effective training program will include more than one type of training. This may well be an important component of such a program.

Reference: 

Szelag, E., & Skolimowska, J. (2012). Cognitive function in elderly can be ameliorated by training in temporal information processing. Restorative Neurology and Neuroscience, 30(5), 419-434.


Video gamers don’t become expert multitaskers

August, 2012

A comparison of skilled action gamers and non-gamers reveals that all that multitasking practice doesn’t make you any better at multitasking in general.

The research is pretty clear by this point: humans are not (with a few rare exceptions) designed to multitask. However, it has been suggested that the modern generation, with all the multitasking they do, may have been ‘re-wired’ to be more capable of this. A new study throws cold water on this idea.

The study involved 60 undergraduate students, of whom 34 were skilled action video game players (all male) and 26 did not play such games (19 men and 7 women). The students were given three visual tasks, each of which they did on its own and then again while answering Trivial Pursuit questions over a speakerphone (designed to mimic talking on a cellphone).

The tasks included a video driving game (“TrackMania”), a multiple-object tracking test (similar to a video version of a shell game), and a visual search task (hidden pictures puzzles from Highlights magazine).

While the gamers were (unsurprisingly) significantly better at the video driving game, the non-gamers were just as good as them at the other two tasks. In the dual-tasking scenarios, performance declined on all the tasks, with the driving task most affected. While the gamers were affected less by multitasking during the driving task compared to the non-gamers, there was no difference in the amount of decline between gamers and non-gamers on the other two tasks.

Clearly, the smaller effect of dual-tasking on the driving game for gamers is a product of their greater expertise at the driving game, rather than their ability to multitask better. It is well established that the more skilled you are at a task, the more automatic it becomes, and thus the less working memory capacity it will need. Working memory capacity / attention is the bottleneck that prevents us from being true multitaskers.

In other words, the oft-repeated (and somewhat depressing) conclusion remains: you can’t learn to multitask in general, you can only improve specific skills, enabling you to multitask reasonably well while doing those specific tasks.

Reference: 

Donohue, S., James, B., Eslick, A., & Mitroff, S. (2012). Cognitive pitfall! Videogame players are not immune to dual-task costs. Attention, Perception, & Psychophysics, 74(5), 803-809.


How action videogames change some people’s brains

May, 2012

A small study has found that ten hours of playing action video games produced significant changes in brainwave activity and improved visual attention for some (but not all) novices.

Following on from research finding that people who regularly play action video games show visual attention related differences in brain activity compared to non-players, a new study has investigated whether such changes could be elicited in 25 volunteers who hadn’t played video games in at least four years. Sixteen of the participants played a first-person shooter game (Medal of Honor: Pacific Assault), while nine played a three-dimensional puzzle game (Ballance). They played the games for a total of 10 hours spread over one- to two-hour sessions.

Selective attention was assessed through an attentional visual field task, carried out prior to and after the training program. Individual learning differences were marked, and because of visible differences in brain activity after training, the action gamers were divided into two groups for analysis — those who performed above the group mean on the second attentional visual field test (7 participants), and those who performed below the mean (9). These latter individuals showed similar brain activity patterns as those in the control (puzzle) group.

In all groups, early-onset brainwaves were little affected by video game playing. This suggests that game-playing has little impact on bottom–up attentional processes, and is in keeping with earlier research showing that players and non-players don’t differ in the extent to which their attention is captured by outside stimuli.

However, later brainwaves — those thought to reflect top–down control of selective attention via increased inhibition of distracters — increased significantly in the group who played the action game and showed above-average improvement on the field test. Another increased wave suggests that the total amount of attention allocated to the task was also greater in that group (i.e., they were concentrating more on the game than the below-average group, and the control group).

The improved ability to select the right targets and ignore other stimuli suggests, too, that these players are also improving their ability to make perceptual decisions.

The next question, of course, is what personal variables underlie the difference between those who benefit more quickly from the games, and those who don’t. And how much more training is necessary for this latter group, and are there some people who won’t achieve these benefits at all, no matter how long they play? Hopefully, future research will be directed to these questions.

Reference: 

Wu, S., Cheng, C. K., Feng, J., D'Angelo, L., Alain, C., & Spence, I. (2012). Playing a first-person shooter video game induces neuroplastic change. Journal of Cognitive Neuroscience, 24(6), 1286-1293.


Pycnogenol improves cognition in college students in small trial

March, 2012

Another small study indicates that the plant extract Pycnogenol may improve working memory.

Back in 2008, I reported on a small study that found that daily doses of Pycnogenol® for three months improved working memory in older adults, and noted research indicating that the extract from the bark of the French maritime pine tree had reduced symptoms in children with ADHD. Now another study, involving 53 Italian university students, has found that cognitive performance improved in those taking 100 mg of Pycnogenol every day for eight weeks.

Students taking the supplement had higher scores on university exams than the control group, and they were apparently happier, less anxious, and more alert. It seems plausible that the improvement in academic performance results from working memory benefits.

The plant extract is an antioxidant, and benefits may have something to do with improved vascular function and blood flow in the brain.

However, the control group was apparently not given a placebo (I’m relying on the abstract and press release here, as this journal is not one to which I have access), they were simply “a group of equivalent students”. I cannot fathom why a double-blind, placebo procedure wasn’t followed, and it greatly lessens the conclusions of this study. Indeed, I wouldn’t ordinarily report on it, except that I have previously reported on this dietary supplement, and I am in hopes that a better study will come along. In the meantime, this is another small step, to which I wouldn’t give undue weight.

Reference: 

Luzzi, R., Belcaro, G., Zulli, C., Cesarone, M. R., Cornelli, U., Dugall, M., Hosoi, M., & Feragalli, B. (2011). Pycnogenol® supplementation improves cognitive function, attention and mental performance in students. Panminerva Medica, 53(3 Suppl 1), 75-82.


Running faster changes brain rhythms associated with learning

September, 2011

A mouse study finds that gamma waves in the hippocampus, critically involved in learning, grow stronger as mice run faster.

I’ve always felt that better thinking was associated with my brain working ‘in a higher gear’ — literally working at a faster rhythm. So I was particularly intrigued by the findings of a recent mouse study that found that brainwaves associated with learning became stronger as the mice ran faster.

In the study, 12 male mice were implanted with microelectrodes that monitored gamma waves in the hippocampus, then trained to run back and forth on a linear track for a food reward. Gamma waves are thought to help synchronize neural activity in various cognitive functions, including attention, learning, temporal binding, and awareness.

We know that the hippocampus has specialized ‘place cells’ that record where we are and help us navigate. But to navigate the world, to create a map of where things are, we need to also know how fast we are moving. Having the same cells encode both speed and position could be problematic, so researchers set out to find how speed was being encoded. To their surprise and excitement, they found that the strength of the gamma rhythm grew substantially as the mice ran faster.

The results also confirmed recent claims that the gamma rhythm, which oscillates between 30 and 120 times a second, can be divided into slow and fast signals (20-45 Hz vs 45-120 Hz for mice, consistent with the 30-55 Hz vs 45-120 Hz bands found in rats) that originate from separate parts of the brain. The slow gamma waves in the CA1 region of the hippocampus were synchronized with slow gamma waves in CA3, while the fast gamma in CA1 were synchronized with fast gamma waves in the entorhinal cortex.
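The mouse band boundaries just described can be captured in a minimal classifier (an illustration only; the function name and the convention of assigning the 45 Hz boundary to the fast band are my own assumptions):

```python
def mouse_gamma_band(freq_hz):
    """Classify a hippocampal oscillation frequency (Hz) using the
    mouse gamma bands reported here: slow gamma (20-45 Hz, synchronized
    CA3 -> CA1) vs fast gamma (45-120 Hz, entorhinal cortex -> CA1).
    Assigning exactly 45 Hz to the fast band is an arbitrary choice.
    """
    if 20 <= freq_hz < 45:
        return "slow gamma"
    if 45 <= freq_hz <= 120:
        return "fast gamma"
    return "outside gamma range"

print(mouse_gamma_band(35))  # → slow gamma
print(mouse_gamma_band(80))  # → fast gamma
```

Note that the rat bands quoted (30-55 Hz vs 45-120 Hz) overlap, so a hard cutoff like this only makes sense for the mouse figures.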

The two signals became increasingly separated with increasing speed, because the two bands were differentially affected by speed. While the slow waves increased linearly, the fast waves increased logarithmically. This differential effect could have to do with mechanisms in the source regions (CA3 and the medial entorhinal cortex, respectively), or with mechanisms in the different regions of CA1 where the inputs terminate (the waves coming from CA3 and the entorhinal cortex enter CA1 in different places).

In the hippocampus, gamma waves are known to interact with theta waves. Further analysis of the data revealed that the effects of speed on gamma rhythm only occurred within a narrow range of theta phases — but this ‘preferred’ theta phase also changed with running speed, more so for the slow gamma waves than the fast gamma waves (which is not inconsistent with the fact that slow gamma waves are more affected by running speed than fast gamma waves). Thus, while slow and fast gamma rhythms preferred similar phases of theta at low speeds, the two rhythms became increasingly phase-separated with increasing running speed.

What’s all this mean? Previous research has shown that if inputs from CA3 and the entorhinal cortex enter CA1 at the same time, the kind of long-term changes at the synapses that bring about learning are stronger and more likely in CA1. So at low speeds, synchronous inputs from CA3 and the entorhinal cortex at similar theta phases make them more effective at activating CA1 and inducing learning. But the faster you move, the more quickly you need to process information. The stronger gamma waves may help you do that. Moreover, the theta phase separation of slow and fast gamma that increases with running speed means that activity in CA3 (slow gamma source) increasingly anticipates activity in the medial entorhinal cortex (fast gamma source).

What does this mean at the practical level? Well, at this point it can only be speculation that moving or exercising can affect learning and attention, but I personally am taking it on board. Most of us think better when we walk. This suggests that if you're having trouble focusing and don't have time for that, maybe walking down the hall or even jogging on the spot will help bring your brain cells into order!

Pushing speculation even further, I note that meditation by expert meditators has been associated with changes in gamma and theta rhythms. And in an intriguing comparison of the effect of spoken versus sung presentation on learning and remembering word lists, the group that sang showed greater coherence in both gamma and theta rhythms (in the frontal lobes, admittedly, but they weren’t looking elsewhere).

So, while we’re a long way from pinning any of this down, it may be that all of these — movement, meditation, music — can be useful in synchronizing your brain rhythms in a way that helps attention and learning. This exciting discovery will hopefully be the start of an exploration of these possibilities.

