University of Rochester

Cognitive Science

Study: Learning Language Requires Heavy Statistics

The human brain makes much more extensive use of highly complex statistics when learning a language than scientists had previously realized, a Rochester team has found.

The research, published in Cognitive Psychology, shows that the brain is wired to quickly grasp certain relationships between spoken sounds even though those relationships may be so complicated they’re beyond our ability to consciously comprehend.

“We’re starting to learn just how intuitively our minds are able to analyze amazingly complex information without our even being aware of it,” says Elissa Newport, professor of brain and cognitive sciences and lead author of the study. “There is a powerful correlation between what our brains are able to do and what language demands of us.”

For the study, Newport and Richard Aslin, professor of brain and cognitive sciences, looked at how people manage to pick out individual words when spoken language is really an unbroken stream of syllables. That’s part of the reason why speakers of foreign languages often seem to talk very quickly: listeners who don’t know the language can’t perceive the word boundaries.

So how is a baby supposed to make out where one word begins and another ends?

Newport and Aslin devised a test in which babies and adults listened to snippets of a synthetic language: a few syllables arranged into nonsense words and played in random order for 20 minutes. During that time, the listeners were taking in information about the syllables, such as how often each occurred and how often each occurred in relation to other syllables. For instance, in the phrase “pretty baby,” the syllable “pre” is followed by “ty” far more often in English than the syllable “ty” is followed by “ba.” The brain registers that “ty” is strongly associated with “pre” and only weakly with “ba,” and so English speakers hear a word boundary between “ty” and “ba.”
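The syllable-pair statistic described above can be sketched as a transitional probability: how likely one syllable is to follow another. The snippet below is a minimal illustration of that idea, not the researchers’ actual method or materials; the toy syllable stream built from “pre-ty” and “ba-by” is invented for demonstration.

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Estimate P(next | current) for each adjacent syllable pair:
    count(current, next) / count(current)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Toy stream built from two "words" -- pre-ty and ba-by -- in a fixed order.
words = [("pre", "ty"), ("ba", "by")]
order = [0, 1, 0, 0, 1]
stream = [syll for i in order for syll in words[i]]

probs = transitional_probabilities(stream)
print(probs[("pre", "ty")])  # within-word pair: high probability
print(probs[("ty", "ba")])   # across-word pair: lower probability
```

In this toy stream, “ty” always follows “pre” (probability 1.0), while “ba” follows “ty” only some of the time, so the dip in transitional probability marks the word boundary, mirroring the intuition in the article.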

After listening to the synthesized syllables for the full 20 minutes, adults were played words from the synthetic language along with nonwords, syllable sequences that had never formed a word in the stream. More than 85 percent of the time, adults correctly distinguished the words from the nonwords. Five-year-olds also reliably told words and nonwords apart.

“If you were given paper and a calculator, you’d be hard-pressed to figure out the statistics involved,” says Newport. “Yet after listening for a while, certain syllables just pop out at you and you start imagining pauses between the ‘words.’ It’s a reflection of the fact that somewhere in your brain you’re actually absorbing and processing a staggering amount of information.”

In a further aspect of the study, Newport and Aslin explored relationships between other language elements, including elements that depend on one another but are not adjacent in speech.

“These results suggest that human learning ability is not just limited to a few elementary computations, but encompasses a variety of mechanisms,” says Newport. “A question to explore now is, How complex and extensive are these learning mechanisms, and what kinds of computational abilities do people bring to the process of learning languages?”

—Jonathan Sherwood