Gesture & embodied cognition

A Canadian study involving French-speaking university students has found that repeating aloud, especially to another person, improves memory for words.

In the first experiment, 20 students read a series of words while wearing headphones that emitted white noise, in order to mask their own voices and eliminate auditory feedback. Four actions were compared:

  • repeating silently in their head
  • repeating silently while moving their lips
  • repeating aloud while looking at the screen
  • repeating aloud while looking at someone.

They were tested on their memory of the words after a distraction task. The memory test only required them to recognize whether or not the words had occurred previously.

There was a significant effect on memory: performance improved across the four conditions in the order listed above, with memory worst after silent repetition and best after repeating aloud to another person.

In the second experiment, 19 students went through the same process, except that the stimuli were pseudo-words. In this case, there was no memory difference between the conditions.

The effect is thought to be due to the benefits of motor sensory feedback, but the memory benefit of directing your words at a person rather than a screen suggests that such feedback goes beyond the obvious. Visual attention appears to be an important memory enhancer (no great surprise when we put it that way!).

Most of us have long ago learned that explaining something to someone really helps our own understanding (or demonstrates that we don’t in fact understand it!). This finding supports another, related, experience that most of us have had: the simple act of telling someone something helps our memory.

http://www.eurekalert.org/pub_releases/2015-10/uom-rat100615.php

A small study using an artificial language adds to evidence that new vocabulary is learned more easily when the learner uses gestures.

“Vimmish”, the artificial language used in the study, follows similar phonetic rules to Italian. The German-speaking participants were given abstract and concrete nouns to learn over the course of a week. In the first experiment, the 21 subjects heard the words and their translations under one of three conditions:

  • with a video showing a symbolic gesture of the word's meaning, which they imitated
  • with a picture illustrating the word's meaning, which they traced in the air
  • with no gestures or pictures.

On the 8th day, the participants were tested while their brain activity was monitored. The test involved hearing the foreign word, then selecting the correct translation from four written options.

The researchers were interested in learning whether they could predict the learning condition from the brain activity patterns displayed when the participants were tested. They found that the gesture condition and control could be distinguished in two brain regions: a visual area that processes biological motion (part of the right superior temporal sulcus), and the left premotor cortex. Activity in these regions was also significantly correlated with performance. The picture condition and control could be distinguished in a visual area that processes objects (the right anterior lateral occipital cortex). There was a trend for this activity to correlate with performance, but it didn't reach significance.

Paper-and-pencil translation tests two and six months after learning showed that learning with gestures was significantly better than the other conditions. But note that there was no advantage for any condition in a free recall task.

A second experiment compared gestures and pictures in the more typical passive scenario: participants only viewed the video or picture, with no imitation or tracing. Unsurprisingly, there was no motor cortex involvement in this scenario: gesture and control conditions were distinguished only by activity in the biological motion part of the right superior temporal sulcus. The correlation of activity in the right anterior LOC with performance in the picture condition this time reached significance. But most importantly, this time the picture condition led to better translation accuracy than the other two conditions.

However, the most significant result is this: when both experiments were evaluated together, the gesture benefit in experiment 1 (when the participant copied the gesture) was greater than the picture benefit in the second experiment.

The findings are in keeping with other evidence that foreign words are learned more easily when multiple senses are involved.

http://www.eurekalert.org/pub_releases/2015-02/m-lwa020415.php

This is just a preliminary study presented at a recent conference, so we can't give it too much weight, but the finding is consistent with what we know about working memory, and it has some practical usefulness.

The study tested the ability of young-adult native English speakers to store spoken words in short-term memory. The English words were spoken either with a standard American accent or with a pronounced but still intelligible Korean accent. Every now and then, the listeners (all unfamiliar with a Korean accent) would be asked to recall the last three words they had heard.

While there was no difference for the last and second-last words, the third word back was remembered significantly better when it was spoken in the familiar accent (80% vs 70%).

The finding suggests that the effort listeners needed to put into understanding the foreign accent used up some of their working memory, reducing their ability to hold onto the information.

The finding is consistent with previous research showing that people with hearing difficulties or who are listening in difficult circumstances (such as over a bad phone line or in a loud room) are poorer at remembering and processing the spoken information compared to individuals who are hearing more clearly.

On a practical level, this finding suggests that, if you're receiving important information (for example, medical information) from someone speaking with an unfamiliar accent, you should make special efforts to remember and process the information: for example, by asking the speaker to slow down, taking notes, and asking for clarification. Those providing such information should take on board the idea that if their listeners are likely to be unfamiliar with their accent, they need to take greater care to speak slowly and clearly, with appropriate levels of repetition and elaboration. Gestures are also helpful for reducing the load on working memory.

http://www.eurekalert.org/pub_releases/2015-05/asoa-htu050715.php

Van Engen, K. et al. 2015. Downstream effects of accented speech on memory. Presentation 1aSC4 at the 169th meeting of the Acoustical Society of America, held May 18-22, 2015 in Pittsburgh, Pennsylvania.

A new study claims to provide ‘some of the strongest evidence yet’ for the benefits of gesturing to help students learn.

The study involved 184 children aged 7-10, of whom half were shown videos of an instructor teaching math problems using only speech, while the rest were shown videos of the instructor teaching the same problems using both speech and gestures. The problems involved mathematical equivalence (e.g., 4+5+7=__+7), which is known to be critical to later algebraic learning.
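Equivalence problems of this kind have a mechanical solution worth spelling out: both sides must sum to the same total, so the blank is the left-hand total minus the addend already on the right. A minimal sketch (the function name is my own, purely for illustration):

```python
def solve_blank(left_addends, known_right_addend):
    """Solve equivalence problems of the form a + b + c = __ + d.

    Both sides must sum to the same total, so the blank is the
    left-hand total minus the addend already on the right.
    """
    return sum(left_addends) - known_right_addend

# 4 + 5 + 7 = __ + 7  ->  blank is 16 - 7 = 9
print(solve_blank([4, 5, 7], 7))  # 9
```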

Students who learned from the gesture videos performed substantially better on a test given immediately afterward than those who learned from the speech-only video (average proportion correct around 42% vs 31% — approximations because I’m eyeballing the graph), and, unlike the speech-only group, showed further improvement on a test 24 hours later (around 46% vs 30%). They also showed stronger transfer to different problem types (35% vs 22%).

http://www.futurity.org/society-culture/to-teach-kids-math-keep-hands-mo...

[3377] Cook SW, Duffy RG, Fenn KM. Consolidation and Transfer of Learning After Observing Hand Gesture. Child Development [Internet]. 2013 :n/a - n/a. Available from: http://onlinelibrary.wiley.com/doi/10.1111/cdev.12097/abstract

The relative ease with which children acquire language has produced much debate and theory, mirroring the similar quantity of debate and theory over how we evolved language. One theory of language evolution is that it began with gesture. A recent study looking at how deaf children learn sign language might perhaps be taken as partial support for this theory, and may also have wider implications for how children acquire language and how we can best support them.

The study, involving 31 deaf toddlers, looked at 89 specific signs understood and produced by the children. It was found that both younger (11-20 months) and older (21-30 months) toddlers understood and produced more signs that were iconic than signs that were less iconic. This benefit seemed to be greater for the older toddlers, supporting the idea that a certain amount of experience and/or cognitive development is needed to make the link between action and meaning.

Surprisingly, the benefits of iconicity did not seem to depend on how familiar, phonologically complex, or imageable the words were.

In contrast to spoken language, a high proportion of signs are iconic, that is, related to the concept being expressed (such as, bringing the hand to the mouth to indicate ‘eat’). Nevertheless, if iconicity is important in sign language, it is surely also important in spoken languages. This is supported by the role of gesture in speech.

The researchers suggest that iconic links between our perceptual-motor experience of the world and the form of a sign may provide an imitation-based mechanism that supports early sign acquisition, and that this might also apply to spoken language — with gestures, tone of voice, inflection, and facial expression helping make the link between words and their meanings less arbitrary.

This suggests that we can support children’s acquisition of language by providing and emphasizing such ‘scaffolding’.

I always like gesture studies. I think I’m probably right in saying that they started with language learning. Way back in 1980 it was shown that acting out action phrases meant they were remembered better than if the phrases had been only heard or read (the “enactment effect”). Enacted items, it turned out, “popped out” effortlessly in free recall tests — in other words, enactment had made the phrases highly accessible. Subsequent research found that this effect occurred for both older and younger adults, and in immediate and delayed recall tests — suggesting not only that such items are more accessible but that forgetting is slower.

Following these demonstrations, there have been a few studies that have specifically looked at the effect of gestures on learning foreign languages, which have confirmed the benefits of gestures. But there are various confounding factors that are hard to remove when using natural languages, which is why the present researchers have developed an artificial language (“Vimmi”) to use in their research. In their first study, as in most other studies, the words and phrases used related to actions. In a new study, the findings were extended to more abstract vocabulary.

In this study, 20 German-speakers participated in a six-day language class to study Vimmi. The training material included 32 sentences, each containing a subject, verb, adverb, and object. While the subject nouns were concrete agents (e.g., musician, director), the other words were all abstract. Here’s a couple of sample sentences (translated, obviously): (The) designer frequently shapes (the) style. (The) pilot really enjoys (the) view. The length of the words was controlled: nouns all had 3 syllables; verbs and adverbs all had two.

For 16 of the sentences, participants saw the word in Vimmi and heard it. The translation of the word appeared on the screen fractionally later, while at the same time a video appeared in which a woman performed the gesture relating to the word. The audio of the word was replayed, and participants were cued to imitate the gesture as they repeated the word. For the other 16 sentences, a video with a still image of the actress appeared, and the participants were simply cued to repeat the word when the audio was replayed.

While many of the words used gestures similar to their meaning (such as a cutting gesture for the word “cut”), the researchers found that the use of any gesture made a difference as long as it was unique and connected to a specific word. For example, the abstract word “rather” does not have an obvious gesture that would go with it. However, a gesture attached to this word also worked.

Each daily session lasted three hours. From day 2, sessions began with a free recall and a cued recall test. In the free recall test, participants were asked to write as many items as possible in both German and Vimmi. Items had to be perfectly correct to be counted. From day 4, participants were also required to produce new sentences with the words they had learned.

Right from the beginning, free recall of items which had been enacted was superior to those which hadn’t been — in German. However, in Vimmi, significant benefits from enactment occurred only from day 3. The main problem here was not forgetting the items, but correctly spelling them. In the cued recall test (translating from Vimmi to German, or German to Vimmi), again, the superiority of the enactment condition only showed up from day 3.

Perhaps the most interesting result came from the written production test. Here, people reproduced the same number of sentences they had learned on each of the three days of the test, and although enacted words were remembered at a higher rate, that rate didn’t alter, and didn’t reach significance. However, the production of new sentences improved each day, and the benefits of enactment increased each day. These benefits were significant from day 5.

The main question, however, was whether the benefits of enactment depended on word category. As expected, concrete nouns were remembered better than verbs, followed by abstract nouns, and finally adverbs. When all the tests were lumped together, there was a significant benefit of enactment for all types of word. However, the situation became a little more nuanced when the data were analyzed separately.

In free recall, for Vimmi, enactment was only of significant benefit for concrete nouns and verbs. In cued recall, for translating German into Vimmi, the enactment benefit was significant for all except concrete nouns (I’m guessing concrete nouns have enough ‘natural’ power not to need gestures in this situation). For translating Vimmi into German, the benefit was only significant for verbs and abstract nouns. In new sentence production, interestingly, participants used significantly more items of all four categories if they had been enacted. This is perhaps the best evidence that enactment makes items more accessible in memory.

What all this suggests is that acting out new words helps you learn them, but some types of words may benefit more from this strategy than others. But I think we need more research before being sure about such subtleties. The pattern of results makes it clear that we really need longer training, and longer delays, to get a better picture of the most effective way to use this strategy.

For example, it may be that adverbs, although they showed the most inconsistent benefits, are potentially the category that stands to gain the most from this strategy — because they are the hardest type of word to remember. Because any embodiment of such an abstract adverb must be arbitrary — symbolic rather than representational — it naturally is going to be harder to learn (yes, some adverbs could be represented, but the ones used in this study, and the ones I am talking about, are of the “rather”, “really”, “otherwise” ilk). But if you persist in learning the association between concept and gesture, you may derive greater benefit from enactment than you would from easier words, which need less help.

Here’s a practical discussion of all this from a language teacher’s perspective.

I always like studies about embodied cognition — that is, about how what we do physically affects how we think. Here are a couple of new ones.

The first study involved two experiments. In the first, 86 American college students were asked questions about gears in relation to each other. For example, “If five gears are arranged in a line, and you move the first gear clockwise, what will the final gear do?” The participants were videotaped as they talked their way through the problem. But here’s the interesting thing: half the students wore Velcro gloves attached to a board, preventing them from moving their hands. The control half were similarly prevented from moving their feet — giving them the same experience of restriction without the limitation on hand movement.

Those who gestured commonly used perceptual-motor strategies (simulation of gear movements) in solving the puzzles. Those who were prevented from gesturing, as well as those who chose not to gesture, used abstract, mathematical strategies much more often.
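The abstract strategy at issue can be made concrete: gears in a line alternate direction, so the answer depends only on whether the number of gears is odd or even, with no simulation needed. A minimal sketch of this parity rule (the function name and framing are my own, for illustration):

```python
def final_gear_direction(n_gears, first="clockwise"):
    """Direction of the last gear in a line of n meshed gears.

    Adjacent meshed gears turn in opposite directions, so the
    last gear matches the first exactly when the count is odd.
    """
    opposite = {"clockwise": "counterclockwise",
                "counterclockwise": "clockwise"}
    return first if n_gears % 2 == 1 else opposite[first]

# Five gears in a line: odd count, so the final gear turns
# the same way as the first.
print(final_gear_direction(5))  # clockwise
```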

The second experiment confirmed the results with 111 British adults.

The findings are consistent with the hypothesis that gestures highlight and structure perceptual-motor information, and thereby make such information more likely to be used in problem solving.

That can be helpful, but not always. Even when we are solving problems that have to do with motion and space, more abstract strategies may sometimes be more efficient, and thus an inability to use the body may force us to come up with better strategies.

The other study is quite different. In this study, college students searched for a single letter embedded within images of fractals and other complex geometrical patterns. Some did this while holding their hands close to the images; others kept their hands in their laps, far from the images. This may sound a little wacky, but previous research has shown that perception and attention are affected by how close our hands are to an object. Items near our hands tend to take priority.

In the first experiment, eight randomly chosen images were periodically repeated 16 times, while the other 128 images were only shown once. The target letter was a gray “T” or “L”; the images were colorful.

As expected, finding the target letter was faster the more times the image had been presented. Hand position didn’t affect learning.

In the second experiment, a new set of students were shown the same once-presented images, while 16 versions of each of the eight repeated images were created, varying in their color components. In this circumstance, learning was slower when hands were held near the images. That is, people found it harder to recognize the commonalities among identical but differently colored patterns, suggesting they were too focused on the details to see the similarities.

These findings suggest that processing near the hands is biased toward item-specific detail. This is in keeping with earlier suggestions that the improvements in perception and attention near the hands are item-specific. It may indeed be that this increased perceptual focus comes at the cost of higher-order functions such as memory and learning. This would be consistent with the idea that there are two largely independent visual streams, one of which is mainly concerned with visuospatial operations, and the other of which is primarily for more cognitive operations (such as object identification).

All this may seem somewhat abstruse, but it is worryingly relevant in these days of hand-held technological devices.

The point of both these studies is not that one strategy (whether of hand movements or hand position) is wrong. What you need to take away is the realization that hand movements and hand position can affect the way you approach problems, and the things you perceive. Sometimes you want to take a more physical approach to a problem, or pick out the fine details of a scene or object — in these cases, moving your hands, or holding something in or near your hands, is a good idea. Other times you might want to take a more abstract/generalized approach — in these cases, you might want to step back and keep your body out of it.

In the first of three experiments, 132 students were found to gesture more often when they had difficulties solving mental rotation problems. In the second experiment, 22 students were encouraged to gesture, while 22 were given no such encouragement, and a further 22 were told to sit on their hands to prevent gesturing. Those encouraged to gesture solved more mental rotation problems.

Interestingly, the amount of gesturing decreased with experience with these spatial problems, and when the gesture group were given new spatial visualization problems in which gesturing was prohibited, their performance was still better than that of the other participants. This suggests that the spatial computation supported by gestures becomes internalized. The third experiment showed that gesturing helped with a wider range of spatial visualization problems.

The researchers suggest that hand gestures may improve spatial visualization by helping a person keep track of an object in the mind as it is rotated to a new position, and by providing additional feedback and visual cues by simulating how an object would move if the hand were holding it.

[2140] Chu M, Kita S. The nature of gestures' beneficial role in spatial problem solving. Journal of Experimental Psychology: General [Internet]. 2011 ;140(1):102 - 116. Available from: http://psycnet.apa.org/journals/xge/140/1/102/

Full text of the article is available at http://www.apa.org/pubs/journals/releases/xge-140-1-102.pdf

In a recent study, volunteers were asked to solve a problem known as the Tower of Hanoi, a game in which you have to move stacked disks from one peg to another. Later, they were asked to explain how they did it (very difficult to do without using your hands). The volunteers then played the game again. But for some of them, the weight of the disks had secretly reversed, so that the smallest disk was now the heaviest and needed two hands.
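For readers unfamiliar with the puzzle, the Tower of Hanoi has a standard recursive solution (a generic sketch, not part of the study itself): to move n disks, move the top n-1 to the spare peg, move the largest disk to the target, then restack the n-1 on top of it.

```python
def hanoi(n, source="A", target="C", spare="B", moves=None):
    """Return the sequence of peg-to-peg moves solving an n-disk puzzle."""
    if moves is None:
        moves = []
    if n == 1:
        moves.append((source, target))
    else:
        hanoi(n - 1, source, spare, target, moves)  # clear the way
        moves.append((source, target))              # move the largest disk
        hanoi(n - 1, spare, target, source, moves)  # restack on top
    return moves

# A 3-disk puzzle takes the minimum 2**3 - 1 = 7 moves.
print(len(hanoi(3)))  # 7
```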

People who had used one hand in their gestures when talking about moving the small disk were in trouble when that disk got heavier. They took longer to complete the task than did people who used two hands in their gestures—and the more one-handed gestures they used, the longer they took.

Those who had not been asked to explain their solution (but had replayed the game in the interval) were unaffected by the change in disk weights. So even though they had repeated the action with the original weights, they weren’t thrown by the unexpected change, as those who gestured with one hand were.

The findings add to the evidence that gestures make thought concrete. Related research has indicated that children can come to understand abstract concepts in mathematics and science more readily if they gesture (and perhaps if their teachers gesture).

[2043] Beilock SL, Goldin-Meadow S. Gesture Changes Thought by Grounding It in Action. Psychological Science [Internet]. 2010 ;21(11):1605 - 1610. Available from: http://pss.sagepub.com/content/21/11/1605.abstract

Older news items (pre-2010) brought over from the old website

Connection between language and movement

A study of all three groups of birds with vocal learning abilities – songbirds, parrots and hummingbirds – has revealed that the brain structures for singing and learning to sing are embedded in areas controlling movement, and areas in charge of movement share many functional similarities with the brain areas for singing. This suggests that the brain pathways used for vocal learning evolved out of the brain pathways used for motor control. Human brain structures for speech also lie adjacent to, and even within, areas that control movement. The findings may explain why we talk with our hands as well as our voices, and could open up new approaches to understanding speech disorders in humans. They are also consistent with the hypothesis that spoken language was preceded by gestural language, or communication based on movements. Support comes from another very recent study finding that mice engineered to have a mutation to the gene FOXP2 (known to cause problems with controlling the formation of words in humans) had trouble running on a treadmill.
Relatedly, a study of young children found that 5-year-olds do better on motor tasks when they talk to themselves out loud (either spontaneously or when told to do so by an adult) than when they are silent. The study also showed that children with behavioral problems (such as ADHD) tend to talk to themselves more often than children without signs of behavior problems. The findings suggest that teachers should be more tolerant of this kind of private speech.

[436] Feenders G, Liedvogel M, Rivas M, Zapka M, Horita H, Hara E, Wada K, Mouritsen H, Jarvis ED. Molecular Mapping of Movement-Associated Areas in the Avian Brain: A Motor Theory for Vocal Learning Origin. PLoS ONE [Internet]. 2008 ;3(3):e1768 - e1768. Available from: http://dx.plos.org/10.1371/journal.pone.0001768

[1235] Winsler A, Manfra L, Diaz RM. "Should I let them talk?": Private speech and task performance among preschool children with and without behavior problems. Early Childhood Research Quarterly [Internet]. 2007 ;22(2):215 - 231. Available from: http://www.sciencedirect.com/science/article/B6W4B-4N08JHR-1/2/049d62f77f2fe3d1aa7588b8ddddd810

http://www.physorg.com/news124526627.html
http://www.sciam.com/article.cfm?id=song-learning-birds-shed
http://www.eurekalert.org/pub_releases/2008-03/gmu-pkd032808.php

Kids learn more when mother is listening

Research has already shown that children learn well when they explain things to their mother or a peer, but that could be because they’re getting feedback and help. Now a new study has asked 4- and 5-year-olds to explain their solution to a problem to their moms (with the mothers listening silently), to explain it to themselves, or simply to repeat the answer out loud. Explaining to themselves or to their moms improved the children's ability to solve similar problems, and explaining the answer to their moms helped them solve more difficult problems — presumably because explaining to mom made a difference in the quality of the child's explanations.

[416] Rittle-Johnson B, Saylor M, Swygert KE. Learning from explaining: Does it matter if mom is listening?. Journal of Experimental Child Psychology [Internet]. 2008 ;100(3):215 - 224. Available from: http://www.sciencedirect.com/science/article/B6WJ9-4R5H25T-1/2/b7ea82e5c515b292fd8448a3b3c392ed

http://www.physorg.com/news120320713.html

Gesturing helps grade-schoolers solve math problems

Two studies of children in late third and early fourth grade, who made mistakes in solving math problems, have found that children told to move their hands when explaining how they’d solve a problem were four times as likely as kids given no instructions to manually express correct new ways to solve problems. Even though they didn’t give the right answer, their gestures revealed an implicit knowledge of mathematical ideas, and the second study showed that gesturing set them up to benefit from subsequent instruction. The findings extend previous research that body movement not only helps people to express things they may not be able to verbally articulate, but actually to think better.

[1170] Broaders SC, Cook SW, Mitchell Z, Goldin-Meadow S. Making Children Gesture Brings Out Implicit Knowledge and Leads to Learning. Journal of Experimental Psychology: General [Internet]. 2007 ;136(4):539 - 550. Available from: http://www.sciencedirect.com/science/article/B6X07-4R6JMY1-1/2/579ba864e9fea606cec11df85f21afa8

http://www.eurekalert.org/pub_releases/2007-11/apa-ghg102907.php

Doodling can help memory recall

A study in which 40 academics were asked to listen to a two-and-a-half-minute tape giving several names of people and places, and to write down only the names of people going to a party, has found that those who were also asked to shade in shapes on a piece of paper recalled on average 7.5 names of people and places, compared to only 5.8 for those who were not asked to doodle. This supports the idea that a simple secondary task like doodling can be useful to stop your mind wandering when it’s doing something boring.

Andrade, J. 2009. What does doodling do? Applied Cognitive Psychology, Published online 27 February

http://www.eurekalert.org/pub_releases/2009-02/w-dd022509.php

Actors’ memory tricks help students and older adults

The ability of actors to remember large amounts of dialog verbatim is a marvel to most of us, and most of us assume they do it by painful rote memorization. But two researchers have been studying the way actors learn for many years and have concluded that the secret of actors' memories is in the acting: actors learn lines by focusing on the character’s motives and feelings; they get inside the character. To do this, they break a script down into a series of logically connected "beats" or intentions. The researchers call this process active experiencing, which uses "all physical, mental, and emotional channels to communicate the meaning of material to another person." This principle can be applied in other contexts. For example, students who imagined themselves explaining something to somebody else remembered more than those who tried to memorize the material by rote. Physical movement also helps: lines learned while doing something, such as walking across the stage, were remembered better than lines not accompanied with action. The principles have been found useful in improving memory in older adults: older adults who received a four-week course in acting showed significantly improved word-recall and problem-solving abilities compared to both a group that received a visual-arts course and a control group, and this improvement persisted four months afterward.

[2464] Noice H, Noice T. What Studies of Actors and Acting Can Tell Us About Memory and Cognitive Functioning. Current Directions in Psychological Science [Internet]. 2006 ;15(1):14 - 18. Available from: http://dx.doi.org/10.1111/j.0963-7214.2006.00398.x

http://www.eurekalert.org/pub_releases/2006-01/aps-bo012506.php

People remember speech better when it is accompanied by gestures

A recent study had participants watch someone narrating three cartoons. Sometimes the narrator used hand gestures and at other times they did not. The participants were then asked to recall the story. The study found that when the narrator used gestures as well as speech the participants were more likely to accurately remember what actually happened in the story rather than change it in some way.

The research was presented to the British Psychological Society Annual Conference in Bournemouth on Thursday 13 March.

Gesturing reduces cognitive load

Why is it that people cannot keep their hands still when they talk? One reason may be that gesturing actually lightens cognitive load while a person is thinking of what to say. Adults and children were asked to remember a list of letters or words while explaining how they solved a math problem. Both groups remembered significantly more items when they gestured during their math explanations than when they did not gesture.

[1300] Goldin-Meadow S, Nusbaum H, Kelly SD, Wagner S. Explaining math: gesturing lightens the load. Psychological Science: A Journal of the American Psychological Society / APS [Internet]. 2001 ;12(6):516 - 522. Available from: http://www.ncbi.nlm.nih.gov/pubmed/11760141