Répétez-vous?

I knew from the minute I set foot in the French customs line at Charles de Gaulle Airport that perhaps I didn’t know French as well as I thought I did. Every conversation around me sounded oddly like gibberish, except for that of the Americans I had followed off the plane. In keeping with my nosy personality, I sidled a little closer to the French couple behind me to see if I could eavesdrop on a word or two. Nada. One would think that five years of French classes would have gotten me a little farther than that.

Image Courtesy of Google Maps

I still remember my reaction in the first few minutes of French 201 at Emory. My professor greeted all the students in French as we walked in the door. Oh, that’s cute, I thought. But when 11:30 hit, class officially started, and she kept right on speaking French, my mouth actually dropped open. How was I supposed to understand her? I could barely understand a word of spoken French. The nerve of my French professor, actually speaking French! Initially, the biggest thing on my mind was figuring out how to get the best possible refund on my newly purchased French textbooks.

Fortunately, a mix of procrastination in dropping the course and unyielding determination (a quitter I was not) led me to tough it out in French for the year. Good thing I did, because a few months later I would find myself in one of the largest French-speaking cities in the world.

Setting foot in Paris a few weeks ago took me right back to how I felt on the first day of French 201. As the weeks went by, I practiced, spoke to a few French natives, and, most importantly, I listened. I started getting lost in the raw melodies of the French language; often I would find myself listening to the intonations of the speech rather than actually paying attention to what was said. I started comparing French to other languages, like English. Would a newcomer to English appreciate the melodies that are simply words to us? How does the brain process them? I know that some languages are built on totally different basic sound units (phonemes); can a person who is not native to the language even process those units? The budding neuroscientist in me had so many questions.

I looked up a super cool graphic from Medical Daily that basically told me that yes, language changes the way we think.* For example, because the Japanese language has more words for shades of blue, from dark to light, than English does, Japanese speakers perceive more distinctions among those colors than we do. Conversely, languages with fewer color terms have the opposite effect: their native speakers perceive even fewer colors (Medical Daily). And it doesn’t stop at color perception; there are nearly endless differences between languages that could change the way we think. Do these differences mean that the brain of one native speaker is set up a little differently from the next? I wondered: do differences in language between native speakers have any effect on the brain?

An article by Ge et al. (2015) asks how native speakers of different languages process their own language, comparing Mandarin Chinese and English specifically. Why did the experimenters compare English with Chinese and not, say, the second best language on Earth, French? Apart from English and Chinese being the two most widely used languages in the world, Chinese is a tonal language, meaning that the intonation used determines the meaning of a word. English, on the other hand, is atonal (hence the reason high school geography teachers could get away with fully monotone class sessions). The researchers scanned 30 native Chinese speakers and 26 native English speakers with fMRI and used dynamic causal modeling (DCM), which is essentially a way to build a model of how brain regions interact. Our lovely subjects were presented with either intelligible or unintelligible speech in their native language while being scanned, and the data were then compared between the two groups.
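To make the DCM idea a little more concrete, here’s a toy sketch in Python. To be clear, this is not the authors’ actual analysis (they fit full Bayesian DCMs to their fMRI data), and every region name and number below is made up for illustration; the three regions are the ones the paper cares about, which we’ll meet properly in a minute. The core idea survives, though: each hypothesis about “who talks to whom” becomes a different coupling matrix in a small dynamical system.

```python
# Toy sketch of the idea behind dynamic causal modeling: brain regions
# become a small dynamical system, and each hypothesis about "who talks
# to whom" becomes a different coupling matrix A. All values illustrative.
import numpy as np

regions = ["P", "A", "F"]  # posterior STG, anterior STG, IFG (see below)

def simulate(A, stimulus, dt=0.1, steps=200):
    """Euler-integrate dx/dt = A @ x + stimulus(t) and return the trace."""
    x = np.zeros(len(regions))
    trace = []
    for t in range(steps):
        x = x + dt * (A @ x + stimulus(t))
        trace.append(x.copy())
    return np.array(trace)

def speech_input(t):
    """External 'speech' drive: hits region P for the first 10 steps."""
    u = np.zeros(len(regions))
    if t < 10:
        u[0] = 1.0
    return u

# Hypothesis 1: intelligible speech only engages the P -> A connection.
A1 = np.array([[-1.0,  0.0,  0.0],   # each region decays back to baseline
               [ 0.8, -1.0,  0.0],   # row A, column P: P -> A coupling
               [ 0.0,  0.0, -1.0]])

# Hypothesis 2: P -> A plus an onward A -> F connection.
A2 = A1.copy()
A2[2, 1] = 0.8                       # row F, column A: A -> F coupling

for name, A in [("P->A only", A1), ("P->A and A->F", A2)]:
    peaks = simulate(A, speech_input).max(axis=0).round(3)
    print(name, dict(zip(regions, peaks)))
```

In the real method, each candidate coupling matrix is fitted to the measured fMRI time series, and the competing models are scored by how well they explain the data; the winning wiring diagram is the one that gets reported.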

Classic language areas

Now, before we delve into the scintillating results of this study, let’s talk a little about how the brain areas related to language actually work. Most language processing happens in the left hemisphere of the brain. This classic language area contains structures like Broca’s area and Wernicke’s area, big names to the brain-language nerds. Perhaps more relevant to this article is the pathway associated with the sound-meaning map, in which language processing is thought to start in the temporal lobe, move to its anterior reaches, and end up in the frontal lobe. In this paper, the researchers predicted that this sound-meaning pathway would be more active in native Chinese speakers, since their language relies so heavily on sounds and intonations for understanding speech.

Now for the exciting part: were the researchers right? They found that while the regions themselves that process speech are mostly the same across languages, the pathways through which these regions interact may be different. The brain areas commonly associated with language are the left posterior part of the superior temporal gyrus (pSTG), the anterior part of the superior temporal gyrus (aSTG), and the inferior frontal gyrus (IFG), which—for the purposes of this study—were named regions P, A, and F, respectively.

Essentially, the data showed that when hearing intelligible speech, both the Chinese (tonal) and English (atonal) brains showed increased activation in the P-to-A pathway; that’s the green arrow on the first brain in the diagram below. Chinese speakers showed more activation than English speakers when listening to intelligible speech in both of the pathways coming out of the A area (the red arrows on the middle brain). This may be due to the further semantic processing needed for word identification in Chinese, and it also happens to be part of the sound-meaning map we talked about before. So yes, the researchers were right in their hypothesis (big surprise): the Chinese brain had more activation in this particular pathway than the English brain did. Finally, good ol’ English speakers showed more activation than Chinese speakers when listening to intelligible speech in the P-to-F pathway (the red arrow on the final brain). This pathway is usually implicated in phonological speech processing (Obleser et al., 2007), where the first phonological features are usually enough to identify words in atonal languages. Long story short, the data tell us that while the two languages share common pathways for understanding speech, some of the pathways between the brain regions differ. To the big language nerds, and now to us, that’s pretty exciting stuff.
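Because the finding is really about edges in a directed graph, it can help to write the pattern down as data before looking at the figure. A minimal sketch: the edge lists follow my reading of the description above and of Figure 2A, not the paper’s statistics, so check the figure itself for the definitive picture.

```python
# The three regions and the pathway differences described above, written
# as tiny directed graphs. Edge lists summarize the pattern, not the stats.
regions = {"P": "posterior STG", "A": "anterior STG", "F": "IFG"}

shared_boost  = {("P", "A")}              # intelligible speech, both groups
chinese_boost = {("A", "P"), ("A", "F")}  # pathways out of A, stronger in Chinese
english_boost = {("P", "F")}              # stronger in English

for name, edges in [("both groups", shared_boost),
                    ("Chinese > English", chinese_boost),
                    ("English > Chinese", english_boost)]:
    arrows = ", ".join(f"{src} -> {dst}" for src, dst in sorted(edges))
    print(f"{name}: {arrows}")
```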

Figure 2A from Ge et al. (2015)

What’s great about this paper is that it uses two languages with really clear differences: tonal Chinese versus atonal English. Experiments usually work best with wide, clear-cut variables like these, so the choice of languages was a smart one. However, because of the way the experiment was designed, we don’t know whether the results really have anything to do with the subjects being native speakers. Is the difference in activated pathways due to the inherent way a native speaker’s brain works, or is it due to the pathways required to understand that particular language, regardless of whose brain is doing the processing? Since each group listened only to its own native language, the two explanations can’t be pulled apart. In defense of the article, their question may not have been this complex. Maybe in the future, researchers could run a follow-up experiment with native English speakers who also understand Chinese (or vice versa), and compare the pathways activated when they hear intelligible Chinese with those activated in a native Chinese speaker.
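To see the confound at a glance, here’s a minimal sketch. The group labels are mine, and the third group is the proposed follow-up, not a group the study actually ran.

```python
# In the original design, "native language" and "language heard" always
# match, so any pathway difference between groups could come from either
# factor. Adding bilingual listeners breaks that perfect alignment.
study_groups = [
    {"native": "Chinese", "heard": "Chinese"},  # Ge et al., group 1
    {"native": "English", "heard": "English"},  # Ge et al., group 2
]
proposed_group = {"native": "English", "heard": "Chinese"}  # bilinguals

def perfectly_aligned(groups):
    """True if the two factors never vary independently across groups."""
    return all(g["native"] == g["heard"] for g in groups)

print(perfectly_aligned(study_groups))                     # True: factors inseparable
print(perfectly_aligned(study_groups + [proposed_group]))  # False: now they come apart
```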

Either way, it’s definitely interesting to know that different languages require different brain pathways for processing. Maybe one day—preferably after an especially delicious Nutella crepe—the language pathways in my brain used for understanding French will become activated, and I can finally eavesdrop on all the airport conversations I want.

-Ngozi

*http://www.medicaldaily.com/pulse/how-learning-new-language-changes-your-brain-and-your-perception-362872

Image #2 from: thebrain.mcgill.ca

References

Crinion JT, et al. (2009) Neuroanatomical markers of speaking Chinese. Hum Brain Mapp 30(12):4108–4115.

Ge J, Peng G, Lyu B, Wang Y, Zhuo Y, Niu Z, Tan LH, Leff AP, Gao JH (2015) Cross-language differences in the brain network subserving intelligible speech. Proc Natl Acad Sci USA 112(10):2972–2977.

Obleser J, Wise RJ, Dresner MA, Scott SK (2007) Functional integration across brain regions improves speech perception under adverse listening conditions. J Neurosci 27(9):2283–2289.
