Memory capacity of brain 10 times greater than previously thought

  • New measurements have exploded the previous estimates of the human brain's memory capacity, and also help explain how neurons have such computational power when their energy use is so low.

The question of the brain's capacity usually prompts the remark that the human brain contains about 100 billion neurons. If each one has, say, 1,000 or more connections to other neurons, this produces some 100 trillion connections in which our memories can be held. These connections are synapses, which change in strength and size when activated. These changes are a critical part of the memory code. In fact, synaptic strength is analogous to the 1s and 0s that computers use to encode information.

But here's the thing: unlike the binary code of computers, synapses have more than two sizes available to them. Using the relatively imprecise tools previously available, researchers had identified three sizes: small, medium and large. They had also calculated that the largest synapses were about 60 times the size of the smallest.

Here is where the new work comes in: new techniques have enabled researchers to see that synapses have far more options open to them. Synapses can, it seems, vary in size by as little as 8%, creating as many as 26 distinct sizes, which corresponds to storing about 4.7 bits of information at each synapse, as opposed to one or two.
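
Where does the 4.7 come from? The number of bits a state can store is the base-2 logarithm of the number of distinguishable states. A minimal sketch of that arithmetic (the 26-state figure is the study's; the script is just illustration):

```python
import math

distinct_sizes = 26                    # distinguishable synapse sizes reported in the study
bits_per_synapse = math.log2(distinct_sizes)
print(f"{bits_per_synapse:.2f} bits")  # ~4.70 bits per synapse

# A binary, computer-style synapse would store only log2(2) = 1 bit.
```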

Despite the precision that this 8% figure implies, hippocampal synapses are notoriously unreliable: a signal typically activates the next neuron only 10-20% of the time. But this seeming unreliability is a feature, not a bug. It means a single spike isn't going to do the job; what's needed is a stable change in synaptic strength, which comes from repeated inputs averaged over time. Synapses are constantly adjusting, averaging out their success and failure rates.
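
A toy simulation (my own illustration, not part of the study) shows why averaging tames an unreliable channel: any single transmission is nearly uninformative, but the mean over many events converges on the synapse's true release probability.

```python
import random

def estimate_strength(release_prob, n_events):
    """Average many unreliable transmissions to estimate a synapse's strength."""
    successes = sum(random.random() < release_prob for _ in range(n_events))
    return successes / n_events

random.seed(1)
p = 0.15  # assumed release probability, within the reported 10-20% range
print(estimate_strength(p, 10))    # few events: noisy estimate
print(estimate_strength(p, 1500))  # many events: close to 0.15
```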

The researchers calculate that the smallest synapses need about 1,500 signaling events (roughly 20 minutes' worth) before they change size and strength, while the largest synapses need only a couple of hundred events (1 to 2 minutes' worth). In other words, every 2 to 20 minutes your synapses are stepping up or down to the next size, in response to the signals they're receiving.
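
Note what those figures imply: the incoming signal rate is broadly similar at both ends of the scale; what differs is how much evidence each synapse demands before committing to a change. A quick back-of-the-envelope check (the figures are from the paragraph above; the division is mine):

```python
# Implied signaling rates, in events per minute
smallest = 1500 / 20   # smallest synapses: ~75 events/min over ~20 minutes
largest = 200 / 1.5    # largest synapses: ~130 events/min, taking 1-2 min as ~1.5
print(smallest, largest)
```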

Based on this new information, researchers now estimate that the brain can hold at least a petabyte of information, about as much as the entire World Wide Web currently holds. This is ten times more than previously estimated.

At the moment, only hippocampal neurons have been investigated. More work is needed to determine whether the same is true across the brain.

In the meantime, the work has given us a better notion of how memories are encoded in the brain, has raised the estimated capacity of the human brain, and offers a new way of thinking about information networks that may enable engineers to build better, more energy-efficient computers.

http://www.eurekalert.org/pub_releases/2016-01/si-mco012016.php

http://www.scientificamerican.com/article/new-estimate-boosts-the-human-brain-s-memory-capacity-10-fold/

Full text at http://elifesciences.org/content/4/e10778v2
