Learning a new language is a multi-step, often multi-year process: listening to new sounds, reading new word structures, speaking in different patterns or inflections.
But the chances of picking up that new language -- even unintentionally -- may be better if you're exposed to a variety of languages, not just your native tongue.
A new study from the University of Washington finds that, based on brain activity, people who live in communities where multiple languages are spoken can identify words in yet another language better than those who live in a monolingual environment.
"This study shows that the brain is always working in the background. When you're overhearing conversations in other languages, you pick up that information whether you know it or not," said Kinsey Bice, a postdoctoral fellow in the UW Department of Psychology and the Institute for Learning and Brain Sciences and lead author of the study, which is published in the September issue of the journal Brain and Language.
The finding itself came about somewhat by happenstance, Bice explained. The research launched in a community where English was the predominant language, but a cross-country lab move -- for unrelated reasons -- resulted in an additional study sample in a community with a diversity of languages.
Yet the task for participants remained the same: Identify basic words and vowel patterns in an unfamiliar language -- in this case, Finnish. While some of the classroom test results were similar between the two groups, the brain activity of those in the diverse-language setting was measurably greater when it came to identifying words they hadn't seen before.
The work started in the community around Pennsylvania State University. According to Census data, the surrounding county is 85% white, and statewide, about 10% of residents speak a language other than English at home. For this study, researchers enrolled 18 people who were "functionally monolingual," based on their self-reported lack of proficiency in any language other than English.
Then, faculty author Judith Kroll, now of the University of California, Irvine, had an opportunity to relocate her lab to Southern California, and the team decided to see what results the experiment might yield there.
The second study sample of 16 people -- all monolingual English speakers -- came from the community around the University of California, Riverside. Census data show that 35% of the surrounding county is white, and statewide, 44% of people ages 5 and older live in households where English is not the primary language.
The researchers chose Finnish because it wasn't common to either study location and relies on vowel-harmony rules that can be challenging for learners. Essentially, the vowels "ä," "ö" and "y" -- known as "front vowels" because they are formed in the front of the speaker's mouth -- cannot appear in the same words as the "back vowels": "a," "o" and "u." For instance, "lätkä," the word for "hockey," contains only front vowels, while "naula," the word for "nail," contains only back vowels.
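To make the vowel-harmony rule concrete, here is a minimal Python sketch that checks whether a word mixes the front and back vowels named above. The function name and sample words are illustrative, not part of the study materials, and the check deliberately ignores finer points of Finnish phonology (such as neutral vowels).

```python
# Illustrative sketch of the vowel-harmony rule described above:
# the front vowels "ä", "ö", "y" and the back vowels "a", "o", "u"
# should not co-occur within a single word.

FRONT_VOWELS = set("äöy")
BACK_VOWELS = set("aou")

def obeys_vowel_harmony(word: str) -> bool:
    """Return True if the word does not mix front and back vowels."""
    letters = set(word.lower())
    has_front = bool(letters & FRONT_VOWELS)
    has_back = bool(letters & BACK_VOWELS)
    return not (has_front and has_back)

if __name__ == "__main__":
    # "lätkä" and "naula" are the examples from the article;
    # "lätka" is a made-up string that violates the rule.
    for word in ["lätkä", "naula", "lätka"]:
        print(word, obeys_vowel_harmony(word))
```

Under this simplified check, "lätkä" and "naula" pass because each uses only one vowel class, while the invented "lätka" fails because it mixes "ä" with "a".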
Across two hour-long sessions, the participants were introduced to 90 Finnish vocabulary words through cards labeled with the word, a picture of what the word represented and an audio recording of a native speaker pronouncing the word. They also were asked to distinguish between nonsense and actual Finnish words to help them infer the vowel patterns. At the conclusion of training, participants were tested on words they had learned, as well as new and "fake" Finnish words. For the test portion, participants wore a headpiece equipped with special sensors that measure brain activity by detecting minute electrical signals on the scalp, a noninvasive technique called electroencephalography (EEG).
Both groups demonstrated similar abilities to identify the Finnish words they had studied and to distinguish them from the nonsense words they encountered during the training sessions. Neither group, however, was particularly adept at telling the difference between real words and nonsense words they had not seen before.
The EEG results, however, showed that the brains of the California participants could tell the difference when shown the unknown words, both real and nonsense.
Brain and behavior data capture different time scales of information, Bice explained. Neurological measures show, millisecond by millisecond, how the brain processes what a person perceives. Behavioral measures lag slightly behind what is happening in the brain, because cognitive processes such as decision-making and retrieving information from memory occur before a person answers a question or takes some kind of action.
The results suggest an effect of ambient exposure to other languages, Bice said. The groups were broadly matched in demographics and in their lack of proficiency in languages other than English; the main differences were socioeconomic status and language environment. If anything, the higher levels of education and income in the Pennsylvania community would normally be associated with stronger language learning. That leaves the language environment as the likely explanation.
The difference in brain activity among the California participants is reminiscent of past research at the UW that shows neurological results often "outpace" behavioral, or classroom, results, Bice said.
In the end, because of the lab relocation, the study findings were serendipitous, Bice said. Further research could more formally control for various factors and expand the study pool. But this study points to ways the human brain may absorb another language, a useful skill in a globalizing society, she added.
"It's exciting to be reminded that our brains are still plastic and soaking in information around us, and we can change ourselves based on the context we place ourselves in," Bice said.