episodic memory

How your brain chunks ‘moments’ into ‘events’

We talk about memory for ‘events’, but how does the brain decide what an event is? How does it decide what is part of an event and what isn’t? A new study suggests that our brain uses categories it creates based on temporal relationships between people, objects, and actions — i.e., items that tend to—or tend not to—pop up near one another at specific times.

Self-imagination helps memory in both healthy and memory-impaired people

A small study involving patients with TBI has found that the best learning strategies are those that call on the self-schema rather than episodic memory, and that the best of these involves self-imagination.

Some time ago, I reported on a study showing that older adults could improve their memory for a future task (remembering to regularly test their blood sugar) by picturing themselves going through the process. Imagination has been shown to be a useful strategy for improving memory (and also motor skills). A new study extends and confirms previous findings by testing free recall and comparing self-imagination to more traditional strategies.

The study involved 15 patients with acquired brain injury who had impaired memory and 15 healthy controls. Participants memorized five lists of 24 adjectives that described personality traits, using a different strategy for each list. The five strategies were:

  • think of a word that rhymes with the trait (baseline),
  • think of a definition for the trait (semantic elaboration),
  • think about how the trait describes you (semantic self-referential processing),
  • think of a time when you acted out the trait (episodic self-referential processing), or
  • imagine acting out the trait (self-imagining).

For both groups, self-imagination produced the highest rates of free recall of the list (an average of 9.3 for the memory-impaired, compared to 3.2 using the baseline strategy; 8.1 vs 3.2 for the controls — note that while the controls were given each 24-item list as a whole, the memory-impaired learned the items in four sub-lists of six).

Additionally, those with impaired memory did better using semantic self-referential processing than episodic self-referential processing (7.3 vs 5.7). In contrast, the controls did much the same in both conditions. This adds to the evidence that patients with brain injury often have a particular problem with episodic memory (knowledge about specific events). Episodic memory is also particularly affected in Alzheimer’s, as well as in normal aging and depression.

It’s also worth noting that all the strategies that involved the self were more effective than the two strategies that didn’t, for both groups (also, semantic elaboration was better than the baseline strategy).

The researchers suggest self-imagination (and semantic self-referential processing) might be of particular benefit for memory-impaired patients, by encouraging them to use information they can more easily access (information about their own personality traits, identity roles, and lifetime periods — what is termed the self-schema), and that future research should explore ways in which self-imagination could be used to support everyday memory tasks, such as learning new skills and remembering recent events.

Dopamine decline underlies episodic memory decline in old age

Findings supporting dopamine’s role in long-term episodic memory point to a decline in dopamine levels as part of the reason for cognitive decline in old age, and perhaps in Alzheimer’s.

The neurotransmitter dopamine is found throughout the brain and has been implicated in a number of cognitive processes, including memory. It is well-known, of course, that Parkinson's disease is characterized by low levels of dopamine, and is treated by raising dopamine levels.

A new study of older adults has now demonstrated the effect of dopamine on episodic memory. In the study, participants (aged 65-75) were shown black and white photos of indoor scenes and landscapes. The subsequent recognition test presented them with these photos mixed in with new ones, and required them to note which photos they had seen before. Half of the participants were first given Levodopa (‘L-dopa’), and half a placebo.

Recognition tests were given two and six hours after participants viewed the photos. There was no difference between the groups at the two-hour test, but at the six-hour test, those given L-dopa recognized up to 20% more photos than controls.

The failure to find a difference at the two-hour test was expected if dopamine’s role is to help strengthen the memory code for long-term storage, a process that occurs some 4-6 hours after encoding.

Individual differences indicated that the ratio between the amount of Levodopa taken and body weight is key for an optimally effective dose.

The findings therefore suggest that at least part of the decline in episodic memory typically seen in older adults is caused by declining levels of dopamine.

Given that episodic memory is one of the first and greatest types of memory hit by Alzheimer’s, this finding also has implications for Alzheimer’s treatment.

Caffeine improves recognition of positive words

Another recent study also demonstrates, rather more obliquely, the benefits of dopamine. In this study, 200 mg of caffeine (equivalent to 2-3 cups of coffee), taken 30 minutes earlier by healthy young adults, was found to improve recognition of positive words, but had no effect on the processing of emotionally neutral or negative words. Positive words are consistently processed faster and more accurately than negative and neutral words.

Because caffeine is linked to an increase in dopamine transmission (an indirect effect, stemming from caffeine’s inhibitory effect on adenosine receptors), the researchers suggest that this effect of caffeine on positive words demonstrates that the processing advantage enjoyed by positive words is driven by the involvement of the dopaminergic system.

Should ‘learning facts by rote’ be central to education?

Being able to read or discuss a topic requires you to have certain concepts well-learned, so that they are readily accessible when needed.

Rote memorization is a poor tool for acquiring this base knowledge.

‘Core’ knowledge is smaller than you might think.

Building up strong concepts is best done by working through many, diverse examples.

Education is not solely or even mainly about stuffing your head with ‘facts’. Individualized knowledge, built up from personally relevant examples illuminating important concepts, needs to be matched by an equal emphasis on curating knowledge, and practice in replacing outdated knowledge.

Michael Gove is reported as saying that ‘Learning facts by rote should be a central part of the school experience’, a philosophy which apparently underpins his shakeup of school exams. Arguing that "memorisation is a necessary precondition of understanding", he believes that exams that require students to memorize quantities of material ‘promote motivation, solidify knowledge, and guarantee standards’.

How stress affects your learning

A small study shows that stress makes it more likely that learning will rely on more complicated, subconscious processes involving brain regions associated with habit and procedural learning.

We know that stress has a complicated relationship with learning, but in general its effect is negative, and part of that is due to stress producing anxious thoughts that clog up working memory. A new study adds another perspective to that.

The brain scanning study involved 60 young adults, of whom half were put under stress by having a hand immersed in ice-cold water for three minutes under the supervision of a somewhat unfriendly examiner, while the other group immersed their hand in warm water without such supervision (cortisol and blood pressure tests confirmed the stress difference).

About 25 minutes after this (cortisol reaches peak levels around 25 minutes after stress), participants’ brains were scanned while participants alternated between a classification task and a visual-motor control task. The classification task required them to look at cards with different symbols and learn to predict which combinations of cards announced rain and which sunshine. Afterward, they were given a short questionnaire to determine their knowledge of the task. The control task was similar but there were no learning demands (they looked at cards on the screen and made a simple perceptual decision).

In order to determine the strategy individuals used to do the classification task, ‘ideal’ performance was modeled for four possible strategies, of which two were ‘simple’ (based on single cues) and two ‘complex’ (based on multiple cues).
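The logic of this model-fitting step can be sketched in code. This is a minimal, hypothetical illustration of the general idea — compute each candidate strategy’s ‘ideal’ responses, then pick the strategy that best matches what the participant actually did. The cues, weights, and strategy names here are made up for illustration and are not the study’s actual models.

```python
# Sketch of strategy identification by model fitting (illustrative only):
# for each candidate strategy, generate the responses an 'ideal' user of
# that strategy would give, then score a participant's actual responses
# against each model and keep the best-matching one.
from itertools import product

def single_cue(pattern, cue=0):
    """Simple strategy: predict 'rain' whenever one particular cue is present."""
    return "rain" if pattern[cue] else "sun"

def multi_cue(pattern, weights=(0.4, 0.3, 0.2, 0.1)):
    """Complex strategy: weigh all cues and predict 'rain' if evidence > 0.5."""
    evidence = sum(w for w, present in zip(weights, pattern) if present)
    return "rain" if evidence > 0.5 else "sun"

# Two 'simple' models (single cues) and one 'complex' model (multiple cues).
strategies = {
    "single-cue-0": lambda p: single_cue(p, 0),
    "single-cue-1": lambda p: single_cue(p, 1),
    "multi-cue": multi_cue,
}

def best_fit(responses):
    """Return the strategy whose ideal predictions agree most often with
    `responses`, a dict mapping each 4-cue pattern to the participant's answer."""
    scores = {
        name: sum(model(p) == r for p, r in responses.items())
        for name, model in strategies.items()
    }
    return max(scores, key=scores.get)

# A participant who always follows the first cue is classified as single-cue.
patterns = list(product([0, 1], repeat=4))
responses = {p: single_cue(p, 0) for p in patterns}
print(best_fit(responses))  # prints: single-cue-0
```

In the study itself this comparison was done with formal mathematical models of the four strategies, but the principle is the same: the strategy whose predictions deviate least from the participant’s responses is taken as the one being used.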

Here’s the interesting thing: while both groups were successful in learning the task, the two groups learned to do it in different ways. Far more of the non-stressed group activated the hippocampus to pursue a simple and deliberate strategy, focusing on individual symbols rather than combinations of symbols. The stressed group, on the other hand, were far more likely to use the striatum only, in a more complex and subconscious processing of symbol combinations.

The stressed group also remembered significantly fewer details of the classification task.

There was no difference between the groups on the (simple, perceptual) control task.

In other words, it seems that stress interferes with conscious, purposeful learning, causing the brain to fall back on more ‘primitive’ mechanisms that involve procedural learning. Striatum-based procedural learning is less flexible than hippocampus-based declarative learning.

Why should this happen? Well, the non-conscious procedural learning going on in the striatum is much less demanding of cognitive resources, freeing up your working memory to do something important — like worrying about the source of the stress.

Unfortunately, such learning will not become part of your more flexible declarative knowledge base.

The finding may have implications for stress disorders such as depression, addiction, and PTSD. It may also be relevant to the memory phenomenon known as “forgotten baby syndrome”, in which parents forget their babies in the car; the stress they are experiencing may push them into relying on non-declarative memory.

Reference: 

[3071] Schwabe, L., & Wolf, O. T. (2012). Stress modulates the engagement of multiple memory systems in classification learning. The Journal of Neuroscience, 32(32), 11042–11049.

Sleep learning making a comeback?

Two new studies provide support for the judicious use of sleep learning — as a means of reactivating learning that occurred during the day.

Back when I was young, sleep learning was a popular idea. The idea was that a tape would play while you were asleep, and learning would seep into your brain effortlessly. It was particularly advocated for language learning. Subsequent research, unfortunately, rejected the idea, and gradually it has faded (although not completely). Now a new study may presage a comeback.

In the study, 16 young adults (mean age 21) learned how to ‘play’ two artificially-generated tunes by pressing four keys in time with repeating 12-item sequences of moving circles — the idea being to mimic the sort of sensorimotor integration that occurs when musicians learn to play music. They then took a 90-minute nap. During slow-wave sleep, one of the tunes was repeatedly played to them (20 times over four minutes). After the nap, participants were tested on their ability to play the tunes.

A separate group of 16 students experienced the same events, but without the tune being played during sleep. A third group stayed awake for the 90-minute period, during which they performed a demanding working memory task; white noise was played in the background, with the melody covertly embedded in it.

Consistent with the idea that sleep is particularly helpful for sensorimotor integration, and that reinstating information during sleep produces reactivation of those memories, the sequence ‘practiced’ during slow-wave sleep was remembered better than the unpracticed one. Moreover, the amount of improvement was positively correlated with the proportion of time spent in slow-wave sleep.

Among those who didn’t hear any sounds during sleep, improvement likewise correlated with the proportion of time spent in slow-wave sleep. The level of improvement for this group was intermediate to that of the practiced and unpracticed tunes in the sleep-learning group.

The findings add to growing evidence of the role of slow-wave sleep in memory consolidation. Whether the benefits for this very specific skill extend to other domains (such as language learning) remains to be seen.

However, another recent study carried out a similar procedure with object-location associations. Fifty everyday objects were associated with particular locations on a computer screen, and presented at the same time with characteristic sounds (e.g., a cat with a meow and a kettle with a whistle). The associations were learned to criterion, before participants slept for 2 hours in an MR scanner. During slow-wave sleep, auditory cues related to half the learned associations were played, as well as ‘control’ sounds that had not been played previously. Participants were tested after a short break and a shower.

A difference in brain activity was found for associated sounds and control sounds — associated sounds produced increased activation in the right parahippocampal cortex — demonstrating that even in deep sleep some sort of differential processing was going on. This region overlapped with the area involved in retrieval of the associations during the earlier, end-of-training test. Moreover, when the associated sounds were played during sleep, parahippocampal connectivity with the visual-processing regions increased.

All of this suggests that, indeed, memories are being reactivated during slow-wave sleep.

Additionally, brain activity in certain regions at the time of reactivation (mediotemporal lobe, thalamus, and cerebellum) was associated with better performance on the delayed test. That is, those who had greater activity in these regions when the associated sounds were played during slow-wave sleep remembered the associations best.

The researchers suggest that successful reactivation of memories depends on responses in the thalamus, which if activated feeds forward into the mediotemporal lobe, reinstating the memories and starting the consolidation process. The role of the cerebellum may have to do with the procedural skill component.

The findings are consistent with other research.

All of this is very exciting, but of course this is not a strategy for learning without effort! You still have to do your conscious, attentive learning. But these findings suggest that we can increase our chances of consolidating the material by replaying it during sleep. Of course, there are two practical problems with this: the material needs an auditory component, and you somehow have to replay it at the right time in your sleep cycle.

Alzheimer’s biomarkers present decades before symptoms

People with a strong genetic risk of early-onset Alzheimer’s have revealed a progression of brain changes that begin 25 years before symptoms are evident.

A study involving those with a strong genetic risk of developing Alzheimer’s has found that the first signs of the disease can be detected 25 years before symptoms are evident. Whether this is also true of those who develop the disease without having such a strong genetic predisposition is not yet known.

The study involved 128 individuals with a 50% chance of inheriting one of three mutations that are certain to cause Alzheimer’s, often at an unusually young age. On the basis of participants’ parents’ medical history, an estimate of age of onset was calculated.

The first observable brain marker was a drop in cerebrospinal fluid levels of amyloid-beta proteins, detectable 25 years before the anticipated age of onset. Relative to expected symptom onset, the progression of markers was:

  • 25 years before: drop in cerebrospinal fluid levels of amyloid-beta,
  • 15-20 years before: amyloid plaques in the precuneus visible on brain scans,
  • 15 years before: brain atrophy in the hippocampus,
  • 10-15 years before: elevated cerebrospinal fluid levels of the tau protein,
  • 10 years before: reduced use of glucose in the precuneus, and detectable slight impairments in episodic memory (as measured in the delayed-recall part of the Wechsler Logical Memory subtest),
  • 5 years before: global cognitive impairment (measured by the MMSE and the Clinical Dementia Rating scale),
  • 3 years after (on average): patients met diagnostic criteria for dementia.

Family members without the risky genes showed none of these changes.

The risky genes are PSEN1 (present in 70 participants), PSEN2 (11), and APP (7) — note that together these account for 30-50% of early-onset familial Alzheimer’s, although only 0.5% of Alzheimer’s in general. The ‘Alzheimer’s gene’ APOE4 (a risk factor for sporadic, not familial, Alzheimer’s) was no more likely to be present in these carriers (25%) than noncarriers (22%), and there were no gender differences. The average parental age of symptom onset was 46 (note that this pushes the first biomarker back to age 21! Can we speculate a connection to noncarriers having significantly more education than carriers — 15 years vs 13.9?).

The results paint a clear picture of how Alzheimer’s progresses, at least in this particular pathway. First come increases in the amyloid-beta protein, followed by amyloid pathology, tau pathology, brain atrophy, and decreased glucose metabolism. Following this biological cascade, cognitive impairment ensues.

The degree to which these findings apply to the far more common sporadic Alzheimer’s is not known, but evidence from other research is consistent with this progression.

It must be noted, however, that the findings are based on cross-sectional data — that is, pieced together from individuals at different ages and stages. A longitudinal study is needed to confirm.

The findings do suggest the importance of targeting the first step in the cascade — the over-production of amyloid-beta — at a very early stage.

Researchers encourage people with a family history of multiple generations of Alzheimer’s diagnosed before age 55 to register at http://www.DIANXR.org/, if they would like to be considered for inclusion in any research.

Reference: 

[2997] Bateman, R. J., Xiong, C., Benzinger, T. L. S., Fagan, A. M., Goate, A., Fox, N. C., et al. (2012). Clinical and biomarker changes in dominantly inherited Alzheimer's disease. New England Journal of Medicine, 367(9), 795–804.

Effect of blood pressure on the aging brain depends on genetics

For those with the Alzheimer’s gene, higher blood pressure, even though within the normal range, is linked to greater brain shrinkage and reduced cognitive ability.

I’ve reported before on the evidence suggesting that carriers of the ‘Alzheimer’s gene’, APOE4, tend to have smaller brain volumes and perform worse on cognitive tests, despite being cognitively ‘normal’. However, the research hasn’t been consistent, and now a new study suggests the reason.

The e4 variant of the apolipoprotein E (APOE) gene not only increases the risk of dementia, but also of cardiovascular disease. These effects are not unrelated: apolipoprotein E is involved in the transport of cholesterol. In older adults, other vascular risk factors (such as elevated cholesterol, hypertension, or diabetes) have been shown to worsen the cognitive effects of this gene variant.

This new study extends the finding, by looking at 72 healthy adults from a wide age range (19-77).

Participants were tested on various cognitive abilities known to be sensitive to aging and the effects of the e4 allele. Those abilities include speed of information processing, working memory and episodic memory. Blood pressure, brain scans, and of course genetic tests, were also performed.

There are a number of interesting findings:

  • The relationship between age and hippocampal volume was stronger for those carrying the e4 allele (shrinkage of this brain region occurs with age, and is significantly greater in those with MCI or dementia).
  • Higher systolic blood pressure was significantly associated with greater atrophy (i.e., smaller volumes), slower processing speed, and reduced working memory capacity — but only for those with the e4 variant.
  • Among those with the better and more common e3 variant, working memory was associated with lateral prefrontal cortex volume and with processing speed. Greater age was associated with higher systolic blood pressure, smaller volumes of the prefrontal cortex and prefrontal white matter, and slower processing. However, blood pressure was not itself associated with either brain atrophy or slower cognition.
  • For those with the Alzheimer’s variant (e4), older adults with higher blood pressure had smaller volumes of prefrontal white matter, and this in turn was associated with slower speed, which in turn linked to reduced working memory.

In other words, for those with the Alzheimer’s gene, age differences in working memory (which underpin so much of age-related cognitive impairment) were produced by higher blood pressure, reduced prefrontal white matter, and slower processing. For those without the gene, age differences in working memory were produced by reduced prefrontal cortex and prefrontal white matter.

Most importantly, these increases in blood pressure that we are talking about are well within the normal range (although at the higher end).

The researchers make an interesting point: that these findings are in line with “growing evidence that ‘normal’ should be viewed in the context of an individual’s genetic predisposition”.

What it comes down to is this: those with the Alzheimer’s gene variant (and no doubt other genetic variants) have a greater vulnerability to some of the risk factors that commonly increase as we age. Those with a family history of dementia or serious cognitive impairment should therefore pay particular attention to controlling vascular risk factors, such as hypertension and diabetes.

This doesn’t mean that those without such a family history can safely ignore such conditions! When they get to the point of being clinically diagnosed as problems, then they are assuredly problems for your brain regardless of your genetics. What this study tells us is that these vascular issues appear to be problematic for Alzheimer’s gene carriers before they get to that point of clinical diagnosis.

Event boundaries and working memory capacity

The brain breaks events and activities into segments, which are hierarchically organized.

The segments begin and end at points where significant changes occur — in movement, in time, in space, in goals, or in participating actors or objects.

Working memory has to be updated at these ‘event boundaries’.

Accordingly, your memory for something that occurred before an event boundary is poorer than your awareness of what is occurring during the current event.

Processing also slows down at event boundaries (because you’re busy updating working memory) — meaning that understanding is more difficult at these points.

However, these boundaries provide strong anchors for your long-term memory codes of these events.

This may explain why beginnings and endings are so much better remembered (the primacy and recency effects).

Finer segmentation is generally helpful for memory, but segmentation is hierarchical and ‘higher-order’ segments are useful too.

Texts are more easily understood, and new skills more quickly learned, when they are explicitly and appropriately structured: segmented at the most useful boundaries and organized hierarchically.

Segments are useful as shared elements between events. Transfer (the benefits of previously learned knowledge for new learning) reflects the degree of shared segments between the familiar event and the new.

Working memory capacity may reflect, at least in part, the ability to choose the most relevant event boundaries.

In a recent news report, I talked about how walking through doorways creates event boundaries, requiring us to update our awareness of current events and making information about the previous location less available. I commented that we should be aware of the consequences of event boundaries for our memory, and how these contextual factors are important elements of our filing system. I want to talk a bit more about that.

Walking through doorways causes forgetting

A series of experiments indicates that walking through doorways creates event boundaries, requiring us to update our awareness of current events and making information about the previous location less available.

We’re all familiar with the experience of going to another room and forgetting why we’ve done so. The problem has been largely attributed to a failure of attention, but recent research suggests something rather more specific is going on.

In a previous study, a virtual environment was used to explore what happens when people move through several rooms. The virtual environment was displayed on a very large (66 inch) screen to provide a more immersive experience. Each ‘room’ had one or two tables. Participants ‘carried’ an object, which they would deposit on a table, before picking up a different object. At various points, they were asked if the object was, say, a red cube (memory probe). The objects were not visible at the time of questioning. It was found that people were slower and less accurate if they had just moved to a new room.

To assess whether this effect depends on a high degree of immersion, a recent follow-up to this study replicated the study using standard 17” monitors rather than the giant screens. The experiment involved 55 students and once again demonstrated a significant effect of shifting rooms. Specifically, when the probe was positive, the error rate was 19% in the shift condition compared to 12% on trials when the participant ‘traveled’ the same distance but didn’t change rooms. When the probe was negative, the error rate was 22% in the shift condition vs 7% for the non-shift condition. Reaction time was less affected — there was no difference when the probes were positive, but a marginally significant difference on negative-probe trials.

The second experiment went to the other extreme. Rather than reducing the immersive experience, researchers increased it — to a real-world environment. Unlike the virtual environments, distances couldn’t be kept constant across conditions. Three large rooms were used, and no-shift trials involved different tables at opposite ends of the room. Six objects, rather than just one, were moved on each trial. Sixty students participated.

Once again, more errors occurred when a room-shift was involved. On positive-probe trials, the error rate was 28% in the shift condition vs 23% in the non-shift. On negative-probe trials, the error rate was 21% and 18%, respectively. The difference in reaction times wasn’t significant.

The third experiment, involving 48 students, tested the idea that forgetting might be due to the difference in context at retrieval compared to encoding. To do this, the researchers went back to using the more immersive virtual environment (the 66” screen), and included a third condition. In this, either the participant returned to the original room to be tested (return) or continued on to a new room to be tested (double-shift) — the idea being to hold the number of spatial shifts the same.

There was no evidence that returning to the original room produced the sort of advantage expected if context-matching was the important variable. Memory was best in the no-shift condition, next best in the shift and return conditions (no difference between them), and worst in the double shift condition. In other words, it was the number of new rooms entered that appears to be important.

This is in keeping with the idea that we break the action stream into separate events using event boundaries. Passing through a doorway is one type of event boundary. A more obvious type is the completion of an action sequence (e.g., mixing a cake — the boundary is the action of putting it in the oven; speaking on the phone — the boundary is the action of ending the call). Information being processed during an event is more available, foregrounded in your attention. Interference occurs when two or more events are activated, increasing errors and sometimes slowing retrieval.

All of this has greater ramifications than simply helping to explain why we so often go to another room and forget why we’re there. The broader point is that everything that happens to us is broken up and filed, and we should look for the boundaries to these events and be aware of the consequences of them for our memory. Moreover, these contextual factors are important elements of our filing system, and we can use that knowledge to construct more effective tags.

Reference: 

[2660] Radvansky, G. A., Krawietz, S. A., & Tamplin, A. K. (2011). Walking through doorways causes forgetting: Further explorations. The Quarterly Journal of Experimental Psychology, 64(8), 1632–1645.
