Language development

When you're reading a picture book to a very young child, it's easy to think it's obvious what picture, or part of a picture, is being talked about. But you know what all the words mean. It's not so easy when some of the words are new to you, and the open pages have more than one picture. A recent study has looked at the effect on word learning of having one vs two illustrations on a 2-page open spread.

The study comprised two experiments in which children were read a 10-page storybook that included two novel objects, each mentioned four times, but only incidentally. In the first experiment, 36 preschoolers (average age 3.5 years) were randomly assigned to one of three conditions:

  • one illustration (the illustration filled the page, with the text written as part of the illustration, and the opposing page blank)
  • two illustrations (each illustration filled its page, on opposing pages)
  • one large illustration (the page was twice the size of that found in the other conditions) — this was the control condition.

Children who were read stories with only one illustration at a time learned twice as many words as children who were read stories with two illustrations. There was no difference in reading time, or in the children's enjoyment of the story.

In a follow-up experiment, 12 preschoolers were shown the two-illustration books only, but this time the reader used a simple hand swipe gesture to indicate the correct illustration before the page was read to them. With this help, the children learned best of all.

In fact, the rate of word learning in this last condition was comparable to that observed in other studies using techniques such as pointing or asking questions. Asking questions is decidedly better than simply reading without comment, and yet this simple gesture was enough to match that level of learning.

Other studies have shown that various distractions added to picture books, like flaps to lift, reduce learning. All this is best understood in terms of cognitive load. The most interesting thing about this study is that it took so little to ameliorate the extra load imposed by the two illustrations.

One of the researchers has also written a blog post on the study.

We've seen a number of studies showing the value of music training for children's development of language skills. A new study has investigated what happens if the training doesn't begin until high school.

The study involved 40 Chicago-area high school students who were followed from the start of high school through their senior year. Nearly half of the students had enrolled in band classes, which involved two to three hours a week of instrumental group music instruction in school. The rest had enrolled in junior Reserve Officers' Training Corps (ROTC), which emphasized fitness exercises during a comparable period.

The music group showed more rapid maturation in the brain's response to sound, and demonstrated prolonged heightened brain sensitivity to sound details. While all students improved in language skills tied to sound-structure awareness, the improvement was greater for those in music classes.

The finding is encouraging in that it shows that adolescent brains are still receptive to music training.

It's also encouraging in involving students from low-income areas. Children from families of lower socioeconomic status have been found to process sound less efficiently, in part because of noisier environments and also due to linguistic deprivation. A previous small study by the same researchers looked at the benefits of a free community music program (the Harmony Project) for a group of disadvantaged students. In that study, students more engaged in the program (as assessed by attendance and participation) showed greater improvement after two years in how their brains processed speech and in their reading scores. Those who learned to play instruments also showed greater improvement than those who participated in music appreciation classes.

A study involving 124 teenagers has found that those who were most accurate at tapping along with a metronome also showed the most consistent brain responses to a synthesized speech sound "da". The finding is consistent with previous research showing links between reading ability and beat-keeping ability, and between reading ability and the consistency of the brain's response to sound. The finding also provides more support for the benefits of music training for both language skills and auditory processing.

I’d suggest that it might be particularly important for second language learning, raising the intriguing question: if you have problems learning another language, could you improve your abilities by working on your sense of rhythm?

You can find out more about the work of the Auditory Neuroscience Laboratory at

[3475] Tierney A, Kraus N. The Ability to Move to a Beat Is Linked to the Consistency of Neural Responses to Sound. The Journal of Neuroscience [Internet]. 2013 ;33(38):14981 - 14988. Available from:

A recent report from Autistica estimates that nearly a quarter (24%) of children with autism are non-verbal or minimally verbal — problems that can persist into adulthood.

A review of over 200 published papers and more than 60 different intervention studies has now concluded that:

  • Motor behaviors play a key role in language learning.
  • Children with autism show specific motor impairments, and less "mirroring" brain activity.
  • There is very little evidence that sign language training helps.
  • Picture exchange training can help.
  • Play-based approaches which employ explicit teaching strategies and are developmentally based are particularly successful.
  • New studies involving a focus on motor skills show promising preliminary results.

The paper can be freely downloaded at

[3418] McCleery JP, Elliott NA, Sampanis DS, Stefanidou CA. Motor development and motor resonance difficulties in autism: relevance to early intervention for language and communication skills. Frontiers in Integrative Neuroscience [Internet]. 2013 ;7. Available from:

A rat study has found that infant males have more of the Foxp2 protein (associated with language development) than females and that males also made significantly more distress calls than females. Increasing the protein level in females and reducing it in males reversed the gender differences in alarm calls.

A small pilot study with humans found that 4-year-old girls had more of the protein than boys. In both cases, it is the more communicative gender that has the higher level of Foxp2.

[3314] Bowers MJ, Perez-Pouchoulen M, Edwards SN, McCarthy MM. Foxp2 Mediates Sex Differences in Ultrasonic Vocalization by Rat Pups and Directs Order of Maternal Retrieval. The Journal of Neuroscience [Internet]. 2013 ;33(8):3276 - 3283. Available from:

The relative ease with which children acquire language has generated much debate and theory, mirroring the debate over how language itself evolved. One theory of language evolution is that it began with gesture. A recent study of how deaf children learn sign language might be taken as partial support for this theory, and may also have wider implications for how children acquire language and how we can best support them.

The study, involving 31 deaf toddlers, looked at 89 specific signs understood and produced by the children. It was found that both younger (11-20 months) and older (21-30 months) toddlers understood and produced more signs that were iconic than signs that were less iconic. This benefit seemed to be greater for the older toddlers, supporting the idea that a certain amount of experience and/or cognitive development is needed to make the link between action and meaning.

Surprisingly, the benefits of iconicity did not seem to depend on how familiar, phonologically complex, or imageable the words were.

In contrast to spoken language, a high proportion of signs are iconic, that is, related to the concept being expressed (such as, bringing the hand to the mouth to indicate ‘eat’). Nevertheless, if iconicity is important in sign language, it is surely also important in spoken languages. This is supported by the role of gesture in speech.

The researchers suggest that iconic links between our perceptual-motor experience of the world and the form of a sign may provide an imitation-based mechanism that supports early sign acquisition, and that this might also apply to spoken language — with gestures, tone of voice, inflection, and facial expression helping make the link between words and their meanings less arbitrary.

This suggests that we can support children’s acquisition of language by providing and emphasizing such ‘scaffolding’.

A study involving 120 toddlers, tested at 14, 24, and 36 months, has assessed language skills (spoken vocabulary and talkativeness) and the development of self-regulation. Self-regulation is an important skill that predicts later academic and social success. Previous research has found that language skills (and vocabulary in particular) help children regulate their emotions and behavior. Boys have also been shown to lag behind girls in both language and self-regulation.

The present study hoped to explain inconsistencies in previous research findings by accounting for general cognitive development and possible gender differences. It found that vocabulary was more important than talkativeness, and 24-month vocabulary predicted the development of self-regulation even when general cognitive development was accounted for. However, girls seemed ‘naturally’ better able to control themselves and focus, but the ability in boys was much more associated with language skills. Boys with a strong vocabulary showed a dramatic increase in self-regulation, becoming comparable to girls with a strong vocabulary.

These gender differences suggest that language skills may be more important for boys, and that more emphasis should be placed on encouraging young boys to use words to solve problems, rather than accepting that ‘boys will be boys’.

A new automated vocal analysis technology can identify the pre-verbal vocalizations of very young children with autism with 86% accuracy. The LENA™ (Language Environment Analysis) system also differentiated both typically developing children and children with autism from children with language delay. The processor fits into the pocket of specially designed children's clothing and records everything the child vocalizes. LENA could not only enable better early diagnosis of autism spectrum disorders, but also allow parents to continue and supplement language enrichment therapy at home and assess their own effectiveness.

An imaging study reveals that different brain regions are involved in learning nouns and verbs. Nouns activate the left fusiform gyrus, while learning verbs activates instead the left inferior frontal gyrus and part of the left posterior medial temporal gyrus. The latter two regions are associated with grammatical and semantic information, respectively, while the former is associated with visual and object processing. The finding is consistent with several findings that distinguish nouns and verbs: children learn nouns before verbs; adults process nouns faster; brain damage can differentially affect nouns and verbs.

Older news items (pre-2010) brought over from the old website

Children under 3 need adult help to learn action words from TV

Three studies of 96 children aged 30–42 months have explored their ability to learn from educational television. The studies looked specifically at the learning of verbs, since these are generally harder for children to learn than names of objects. The children saw characters performing unfamiliar actions that were labeled with invented words. It was found that children under age 3 could not learn the words directly from the program, without an adult providing help. Children over 3, however, could learn the verbs directly from the video program, without adult assistance. The findings sound a note of caution about the supposed benefits of educational videos for infants and toddlers.

Roseberry, S. et al. 2009. Live Action: Can Young Children Learn Verbs From Video? Child Development, 80 (5), 1360-1375.

Babies' language learning starts from the womb

Analysis of the cries of 60 healthy newborns (three to five days old), 30 born into French-speaking families and 30 born into German-speaking families, has revealed clear differences in the shape of the melodies, based on their mother tongue. Consistent with characteristic differences between the two languages, French newborns tend to cry with a rising melody contour, whereas German newborns seem to prefer a falling melody contour in their crying. It’s speculated that melody contour may be the only aspect of their mother's speech that newborns are able to imitate. Earlier studies have shown that human fetuses are able to memorize sounds from the external world by the last trimester of pregnancy, with a particular sensitivity to melody contour in both music and language. The finding is consistent with the idea that cry melody is the beginning of language development.

Mampe, B. et al. 2009. Newborns' Cry Melody Is Shaped by Their Native Language. Current Biology, Published online 05 November 2009.

Interestingly, another recent study reported that five-month-old infants matched speech, but not human nonspeech vocalizations, specifically to humans, and monkey calls to monkey faces, but not duck vocalizations to duck faces, even though infants likely have more experience with ducks than monkeys.

Vouloumanos, A. et al. 2009. Five-month-old infants' identification of the sources of vocalizations. PNAS 106 ( 44), 18867-18872.

The effect of gamma waves on cognitive and language skills in children

Gamma waves are fast, high-frequency brainwaves that spike when higher cognitive processes are engaged. Research suggests that lower levels of gamma power might hinder the brain's ability to bind thoughts together. In the first study of the "resting" gamma power in the frontal cortex in young children (16, 24 and 36 months old), it’s been revealed that those with higher language and cognitive abilities had correspondingly higher gamma power than those with poorer language and cognitive scores. Children with better attention and inhibitory control also had higher gamma power. There were no differences in gamma power based on gender or socio-economic status, but children with a family history of language impairments showed lower levels of gamma activity. The finding may enable more accurate pinpointing of a child’s development, enabling earlier, and better targeted, intervention.

Benasich, A.A. et al. 2008. Early cognitive and language skills are linked to resting frontal gamma power across the first 3 years. Behavioral Brain Research, 195 (2), 215-222.

Kids learn more when mother is listening

Research has already shown that children learn well when they explain things to their mother or a peer, but that could be because they’re getting feedback and help. Now a new study has asked 4- and 5-year-olds to explain their solution to a problem to their moms (with the mothers listening silently), to themselves or to simply repeat the answer out loud. Explaining to themselves or to their moms improved the children's ability to solve similar problems, and explaining the answer to their moms helped them solve more difficult problems — presumably because explaining to mom made a difference in the quality of the child's explanations.

Rittle-Johnson, B., Saylor, M. & Swygert, K.E. 2008. Learning from explaining: Does it matter if mom is listening? Journal of Experimental Child Psychology, In press.

Connection between language and movement

A study of all three groups of birds with vocal learning abilities – songbirds, parrots and hummingbirds – has revealed that the brain structures for singing and learning to sing are embedded in areas controlling movement, and areas in charge of movement share many functional similarities with the brain areas for singing. This suggests that the brain pathways used for vocal learning evolved out of the brain pathways used for motor control. Human brain structures for speech also lie adjacent to, and even within, areas that control movement. The findings may explain why humans talk with our hands and voice, and could open up new approaches to understanding speech disorders in humans. They are also consistent with the hypothesis that spoken language was preceded by gestural language, or communication based on movements. Support comes from another very recent study finding that mice engineered to have a mutation to the gene FOXP2 (known to cause problems with controlling the formation of words in humans) had trouble running on a treadmill.

Relatedly, a study of young children found that 5-year-olds do better on motor tasks when they talk to themselves out loud (either spontaneously or when told to do so by an adult) than when they are silent. The study also showed that children with behavioral problems (such as ADHD) tend to talk to themselves more often than children without signs of behavior problems. The findings suggest that teachers should be more tolerant of this kind of private speech.

Feenders, G. et al. 2008. Molecular Mapping of Movement-Associated Areas in the Avian Brain: A Motor Theory for Vocal Learning Origin. PLoS ONE, 3(3), e1768.

Winsler, A., Manfra, L. & Diaz, R.M. 2007. “Should I let them talk?”: Private speech and task performance among preschool children with and without behavior problems. Early Childhood Research Quarterly, 22(2), 215-231.

Different educational approaches appropriate for boys and girls?

An imaging study of some 50 children aged 9 to 15 revealed that girls showed significantly greater activation of the language areas of the brain when doing a language task than did boys. The boys showed greater activation of the specific sensory brain areas (visual or auditory) required by the task. This pattern suggests that girls rely on a supramodal language network, whereas boys process visual and auditory words differently. This difference may reflect the fact that males take longer to mature than females, rather than a lifelong gender difference, but it does have implications for education.

Burman, D.D., Bitan, T. & Booth, J.R. 2008. Sex differences in neural processing of language among children. Neuropsychologia, In Press, Corrected Proof, Available online 4 January 2008

Why do children experience a vocabulary explosion at 18 months of age?

At about 18 months, children experience a vocabulary explosion, suddenly learning words at a much faster rate. A new study using computer simulations suggests that this has little to do with brain maturity or cognitive development, but is instead the result of several simple factors: the repetition of words over time, the fact that children learn many words in parallel, and the fact that words vary in difficulty. The last factor, that children are learning a greater number of moderate and difficult words than easy words, proves crucial.

McMurray, B. 2007. Defusing the Childhood Vocabulary Explosion. Science, 317 (5838), 631.
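The proposed mechanism can be illustrated with a toy simulation (a hypothetical sketch of the idea, not the paper's actual model): give each word a difficulty threshold drawn from a distribution with more moderate and hard words than easy ones, let exposure to all words accumulate in parallel, and count a word as "learned" once accumulated exposure passes its threshold. The number of words learned then accelerates over time without any change in the underlying learning mechanism.

```python
import random

random.seed(1)

# Each word gets a difficulty threshold; the distribution is chosen so that
# moderate and hard words outnumber easy ones (the crucial assumption).
NUM_WORDS = 1000
difficulties = [max(1.0, random.gauss(100, 25)) for _ in range(NUM_WORDS)]

def words_learned_by(t):
    """Words whose difficulty threshold has been reached by time t,
    assuming exposure accumulates at the same rate for every word."""
    return sum(1 for d in difficulties if d <= t)

# Learning rate in an early window vs an equally long later window:
early = words_learned_by(50) - words_learned_by(25)
late = words_learned_by(100) - words_learned_by(75)
print(f"words learned in early window: {early}, in late window: {late}")
```

Because most thresholds sit in the moderate-to-hard range, far more words cross their threshold in the later window than in the earlier one, producing the appearance of a sudden "burst" from steady, parallel accumulation alone.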

Baby DVDs may hinder, not help, infants' language development

Random telephone interviews with more than 1,000 families found that for every hour per day spent watching baby DVDs and videos, infants eight to 16 months of age understood an average of six to eight fewer words than infants who did not watch them. Baby DVDs and videos had no positive or negative effect on the vocabularies of toddlers 17 to 24 months of age. Daily reading and storytelling by parents were, however, associated with slight increases in language skills. The researchers believe the content of baby DVDs and videos differs from other types of programming because it tends to have little dialogue, short scenes, disconnected pictures, and images that are hard to describe linguistically.

Zimmerman, F.J., Christakis, D.A. & Meltzoff, A.N. 2007. Associations between Media Viewing and Language Development in Children Under Age 2 Years. Journal of Pediatrics, 151 (4), 364-368.

Kids learn words best by working out meaning

An undergraduate project involving 100 children aged 3 to 3½ provides evidence that children learn words better when they figure out the words' meaning for themselves than when they are simply told their meaning.

Fathers influence child language development more than mothers

A study of parents’ contribution to children’s language skills found that, in families with two working parents, fathers had greater impact than mothers on their children's language development between ages 2 and 3. Observations of the language interactions between parents and child revealed that 2-year-old children whose fathers used more diverse vocabularies had greater language development when they were tested one year later, but the mothers' vocabulary did not significantly affect a child's language skills. The study also found that high-quality child care during the first three years of life was associated with higher scores at age 3 on a test of expressive language development, but this was less important than family language.

Pancsofar, N. & Vernon-Feagans, L. 2006. Mother and father language input to young children: Contributions to later language development. Journal of Applied Developmental Psychology, 27 (6), 571-587.

Skills related to early language learning

A study of more than 120 children aged 21 months — a peak time for language learning — has found a link between language learning and several motor and cognitive skills. Children who were poor at moving their mouths (for example, not being able to lick their lips or blow bubbles) were particularly weak at language skills, while those who were good at these movements showed a range of language abilities. Children who were good at pretending that one object is another, such as using a block for a car, or a box for a doll's bed, or giving a doll a tea party, were also better at language, but there was no relationship with more general thinking skills, such as doing puzzles. Children who could say new words an adult asked them to repeat were best at language. Being able to listen to a new word or a funny sound and work out which picture it went with also distinguished children with advanced language from those with weaker abilities.

Alcock, K. 2006. The Vocabulary Burst and Individual Differences. Study funded by the Economic and Social Research Council (ESRC).

Early gaze-following associated with early language

The ability to detect the direction of another's glance has long been recognized as a crucial component of human social interaction. New research now reveals that babies start to follow the movement of another person's head at around 9 months, and by 10-11 months they follow the head and eyes. Sometimes they will make sounds as they follow the gaze. Those who simultaneously followed the eyes of the researcher and made vocalizations when they were 10 or 11 months old understood an average of 337 words at 18 months, while the other babies understood an average of only 195 words.

Brooks, R. & Meltzoff, A.N. 2005. The development of gaze following and its relation to language. Developmental Science, 8(6), 535.

Too much knowledge can be bad for some types of memory

Following on from an earlier study reported last year, in which children were found to have better memories than adults in certain circumstances, researchers have found that adults did better remembering pictures of imaginary animals than they did remembering pictures of real cats. The reason has to do with the effects of categorization. While categorization is often vital, it can lead people to ignore individual details. The trick is to know when it’s important to categorize and when it’s better to note specific details. The new study added to the earlier findings by showing that there is a gradual decrease in recognition memory from children to adults, rather than an abrupt change in the way people see the world. Moreover, the difference in how adults and children perceive and remember objects is not a developmental difference, but one caused by differences in knowledge. Adults performed like children when shown imaginary animals.

Fisher, A.V. & Sloutsky, V.M. 2005. When Induction Meets Memory: Evidence for Gradual Transition From Similarity-Based to Category-Based Induction. Child Development, 76(3), 583.

Language cues help visual learning in children

A study of 4-year-old children has found that language, in the form of specific kinds of sentences spoken aloud, helped them remember mirror image visual patterns. The children were shown cards bearing red and green vertical, horizontal and diagonal patterns that were mirror images of one another. When asked to choose the card that matched the one previously seen, the children tended to mistake the original card for its mirror image, showing how difficult it was for them to remember both color and location. However, if they were told, when viewing the original card, a mnemonic cue such as ‘The red part is on the left’, they performed “reliably better”.

The paper was presented by a graduate student at the 17th annual meeting of the American Psychological Society, held May 26-29 in Los Angeles.

Baby talk helps infants learn to speak

Most adults speak to infants using so-called infant-directed speech: short, simple sentences coupled with higher pitch and exaggerated intonation. Researchers have long known that babies prefer to be spoken to in this manner. A new study of 8-month-old infants reveals that infant-directed speech also helps infants learn words more quickly than normal adult speech does. The study may also explain why many adults struggle to learn a second language.

The study was published in the March issue of Infancy.

Children process words by sound while adults process by meaning

A study into the question of how false memories are formed has found evidence of an age-related, developmental shift in language, suggesting that younger children process words primarily on the basis of phonology, or sound, while older children and adults process words primarily on the basis of semantics, or meaning.

Dewhurst, S. & Robinson, C. 2004. False Memories in Children: Evidence for a Shift from Phonological to Semantic Associations. Psychological Science, 15 (11), 782-6.

Children outperform adults in memory study

An example of the perils of knowing too much: under specific conditions, young children can beat most adults on a recognition memory test. The study compared young children (average age 5 years) with college students. Without being told what was being tested, participants were shown pictures of cats, bears and birds. Some of them were first shown a picture of a cat, and told that it had "beta cells inside its body". They were then shown other pictures, and asked whether these animals also had beta cells. After this, they were shown further pictures, and asked whether they had been shown them before. The children were accurate on average 31% of the time; the college students only 7% of the time. The researchers suggested the reason was that the children used similarity-based induction: when asked whether each pictured animal had "beta cells", they looked carefully to see if the animal looked similar to the original cat. The adults, on the other hand, used category-based induction: once they determined whether the animal pictured was a cat or not, they paid no more attention. Thus, when they were tested later, the adults didn't know the pictures as well as the children. A subsequent study taught the children to use category-based induction; their performance then dropped to the level of the adults. Another study, in which participants were simply shown the pictures of the 30 animals and told to remember them for a recognition test, found adults were accurate 42% of the time, compared to only 27% for the children.

Sloutsky, V.M. & Fisher, A.V. 2004. When Development and Learning Decrease Memory: Evidence Against Category-Based Induction in Children. Psychological Science, 15 (8), 553-558.

Language learning declines after second year of life

A study involving 96 deaf children who had received cochlear implants during their first four years of life has found that the rate of language learning was greatest for those given implants before they turned two. Children given implants at three or four years of age acquired language skills more slowly. The finding supports the idea that there is a 'sensitive period' for language learning, and suggests that deaf children should get cochlear implants sooner (it is still relatively rare for them to be given to children younger than two).

The findings were presented on 16 May at the Acoustical Society of America conference in Vancouver, Canada.

Childhood "amnesia" linked to vocabulary

"Childhood amnesia" is the term given to the well-known phenomenon of our almost complete lack of memory for the experiences of our very early childhood. Exactly why it occurs has long been a subject of debate. New research suggests the answer may lie in the very limited vocabulary of very young children. A study of 2- and 3-year-old children found that children can only describe memories of events using words they knew when the experience occurred. When asked about the experimental situation (involving a "magic shrinking machine") a year later, the children easily remembered how to operate the device, but were only able to describe the machine in words they knew when they first learned how to operate it.

Simcock, G. & Hayne, H. 2002. Breaking the Barrier? Children Fail to Translate Their Preverbal Memories Into Language. Psychological Science, 13 (3), 225-231.

Children's brains process words differently

An imaging study looked at brain activity in 19 children (7-10 years old) while they said a word in response to a written word. These images were compared with those from 22 adults (average age 25). The study highlighted two brain regions in particular: regions in the left frontal and left extrastriate cortex that are known to be critical in language processing and thought to undergo substantial development between childhood and adulthood. Six subregions within these areas were identified, and two of these revealed differences in brain activity between the children and the adults. Children showed less activation in a left frontal region and greater activation in posterior left extrastriate cortex than adults. It may be that the left frontal region is immature in children, leading to an alternative strategy that produces more activation in extrastriate regions. Or it may be that more experience is needed before the processing resources of this region can be used.

Schlaggar, B.L., Brown, T.T., Lugar, H.M., Visscher, K.M., Miezin, F.M., Petersen, S.E. 2002. Functional neuroanatomical differences between adults and school-age children in the processing of single words. Science, 296, 1476-9.