speech perception

Half-heard phone conversations reduce cognitive performance

September 2010

A new study finds that overheard cell phone conversations are particularly distracting because we can't predict what will be said next.

Why are other people’s phone conversations so annoying? A new study suggests that hearing only half a conversation is more distracting than hearing the whole thing because, missing the other side of the story, we can’t predict the flow of the conversation. This finding suggests that driving might be impaired not only by the driver talking on the phone, but also by a passenger talking on theirs.

It also tells us something about the way we listen to people talking: we’re actively predicting what the speaker is going to say next. This helps explain something I’ve always wondered about. Listen to people talking in a language you don’t know and you’re often amazed at how fast they seem to talk. Look at a recording of the soundwaves and you’ll wonder how anyone knows where one word ends and the next begins. Understanding speech is not as easy as we believe it is; it takes a lot of experience. An important part of that experience, it seems, is learning the patterns of people’s speech, so we can predict what’s coming next.
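To make the idea of prediction concrete, here is a toy sketch in Python of next-word prediction using bigram counts: learn which word tends to follow which, then guess the most likely continuation. This is purely illustrative (the study involved no such model, and the function names and corpus are my own); it just shows how hearing more of a conversation gives you more material to predict from.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus: str) -> dict:
    """Count, for each word, how often each following word occurs."""
    counts = defaultdict(Counter)
    words = corpus.lower().split()
    for current, following in zip(words, words[1:]):
        counts[current][following] += 1
    return counts

def predict_next(counts: dict, word: str):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = counts.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

# Train on a tiny made-up exchange, then predict continuations.
model = train_bigrams("see you at the station at noon see you there")
print(predict_next(model, "see"))  # -> 'you' (seen twice after 'see')
print(predict_next(model, "at"))   # -> 'the' ('the' and 'noon' tie; first seen wins)
```

With only half the exchange available, the counts are sparser and the guesses correspondingly worse, which is the intuition behind why a half-heard conversation is harder to tune out.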

The study showed that people overhearing half of a cell phone conversation did more poorly on everyday tasks demanding attention than people overhearing both sides of the conversation, which produced no drop in performance. By controlling for other acoustic factors, the researchers demonstrated that it was the unpredictable information content of the half-heard conversation that was so distracting.

Reference: 

Emberson, L.L., Lupyan, G., Goldstein, M.H., & Spivey, M.J. (2010). Overheard cell-phone conversations: When less speech is more distracting. Psychological Science, published online September 3, 2010. doi:10.1177/0956797610382126


Changing sounds are key to understanding speech

July 2010

New research reveals that understanding speech relies on detecting sound changes, making "low" vowels the most informative sounds and "stop" consonants the least.

As I get older, the question of how we perceive speech becomes more interesting (people don’t talk as clearly as they used to!). So I was intrigued by this latest research, which reveals that it is not so much a question of whether consonants or vowels are more important (although consonants do appear to be less important than vowels, the opposite of what is true for written language), but a matter of transitions. What matters are the very brief changes in amplitude and frequency that make sound-handling neurons fire more often and more easily; after all, as we know from other perception research, we’re designed to respond to change. The sounds most likely to rate as high-change are "low" vowels, sounds like the "ah" in "father" or "top" that draw the jaw and tongue downward. Least likely to cause much change are "stop" consonants like the "t" and "d" in "today." The physical measure of change corresponds closely with the linguistic construct of sonority (or vowel-likeness).
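To give a feel for what a "physical measure of change" might look like, here is a minimal Python sketch: slice a signal into short frames and score each moment by how much its magnitude spectrum differs from the previous frame’s. This is only a crude proxy for the paper’s cochlea-scaled entropy measure, which uses cochlea-like frequency filters rather than a plain FFT; the function name, frame length, and demo signals are my own.

```python
import numpy as np

def spectral_change(signal: np.ndarray, frame_len: int = 256) -> np.ndarray:
    """Euclidean distance between magnitude spectra of adjacent frames."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    # Hann window each frame, then take its magnitude spectrum.
    spectra = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1))
    # How much does each frame's spectrum differ from the one before?
    return np.linalg.norm(np.diff(spectra, axis=0), axis=1)

# Demo: a rising-pitch tone (constantly changing) vs. a steady tone.
sr = 16000
t = np.arange(sr) / sr
glide = np.sin(2 * np.pi * (200 + 1000 * t) * t)  # pitch sweeps upward
steady = np.sin(2 * np.pi * 200 * t)              # fixed 200 Hz tone
print(spectral_change(glide).mean() > spectral_change(steady).mean())  # True
```

On this crude measure, stretches of speech where the spectrum is in motion (the glides into and out of low vowels) score high, while near-steady stretches score low, which matches the paper’s finding that the high-change portions carry most of the intelligibility.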

Reference: 

Stilp, C.E., & Kluender, K.R. (2010). Cochlea-scaled entropy, not consonants, vowels, or time, best predicts speech intelligibility. Proceedings of the National Academy of Sciences, 107(27), 12387-12392.
