Author Archives: Ngozi Valerie Nwabueze


I knew from the minute I set foot in the French customs line at Charles de Gaulle Airport that perhaps I didn’t know French as well as I thought I did. Every conversation around me—except for that of the Americans I had followed off the plane—sounded oddly like gibberish. In keeping with my nosy personality, I sidled a little closer to the French couple behind me to see if I could eavesdrop on a word or two—nada. One would think that five years of French classes would have gotten me a little farther than that.

Image Courtesy of Google Maps

I still remember my reaction in the first few minutes of French 201 at Emory. My professor greeted all the students in French as we walked in the door. Oh, that’s cute, I thought. But when 11:30 hit and class officially started, she kept speaking in French, and my mouth actually dropped open. How was I supposed to understand her? I could barely understand a word of spoken French. The nerve of my French professor to actually speak in French! Initially, the biggest thing on my mind was finding a way to get the biggest bang for my buck when returning my newly purchased French textbooks.

Fortunately, a mix of procrastination in dropping the course and unyielding determination—a quitter I was not—led me to eventually decide to tough it out in French for the year. Good thing I did, because a few months later I would find myself in the largest French-speaking city in the world.

Setting foot in Paris a few weeks ago brought back the feelings of that first day of French 201. As the weeks went by, I practiced, spoke to a few French natives, and most importantly, I listened. I started getting lost in the raw melodies of the French language—often I found myself listening to the intonations of the speech rather than actually paying attention to what was said. I started comparing French to other languages, like English. Would someone foreign to English appreciate the melodies that are simply words to us? How does the brain process them? I know that some languages have totally different basic sound units—can a person who is not a native speaker even process those units? The budding neuroscientist in me had so many questions.

I looked up a super cool graphic from Medical Daily that basically told me that yes, language changes the way we think. For example, because Japanese has more words for shades of blue, from dark to light, than English does, Japanese speakers perceive more distinctions among those colors than we do. Conversely, languages with fewer color terms have the opposite effect—their native speakers perceive even fewer distinctions (Medical Daily). It doesn’t stop at color perception—there are nearly infinite differences between languages that could change the way we think. Do these differences mean that the brain of a native speaker of one language is set up a little differently from the next? I wondered: do differences in language between native speakers have any effect on the brain?

An article by Ge et al. (2015) asks how native speakers of different languages process their own language—specifically Mandarin Chinese and English. Why did the experimenters compare English with Chinese and not, say, the second-best language on Earth—French? Apart from English and Chinese being the two most widely used languages in the world, Chinese is a tonal language, meaning that the intonation used determines the meanings of words. English, on the other hand, is atonal (hence the reason high school geography teachers could get away with fully monotone class sessions). The researchers placed 30 native Chinese speakers and 26 native English speakers in fMRI scanners and used dynamic causal modeling (DCM), which is essentially a way to model how brain regions influence one another. Our lovely subjects were presented with either intelligible or unintelligible speech in their native languages while being scanned, and the data were then compared between the two groups.
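For the curious, the gist of DCM can be sketched in a few lines of code. This is a conceptual toy only, not the authors’ actual model: each region’s activity is driven by an external stimulus and by directed coupling from other regions, and it’s those coupling strengths that DCM estimates from fMRI data. All coupling values and names below are made up for illustration.

```python
import numpy as np

def simulate_network(coupling, stim, dt=0.01, steps=500):
    """Euler-integrate a toy 3-region linear network (regions P, A, F)."""
    z = np.zeros(3)                       # region states: [P, A, F]
    trace = np.zeros((steps, 3))
    for t in range(steps):
        u = stim if t < steps // 2 else 0.0   # stimulus on, then off
        drive = np.array([u, 0.0, 0.0])       # speech input enters at P
        z = z + dt * (coupling @ z + drive - 0.5 * z)  # leaky dynamics
        trace[t] = z
    return trace

# Hypothetical coupling matrices: entry (i, j) = influence of region j
# on region i. The "tonal" network strengthens the P->A and A->F route;
# the "atonal" network strengthens the direct P->F route.
tonal = np.array([[0.0, 0.0, 0.0],
                  [0.8, 0.0, 0.0],     # strong P -> A
                  [0.1, 0.8, 0.0]])    # strong A -> F
atonal = np.array([[0.0, 0.0, 0.0],
                   [0.3, 0.0, 0.0],
                   [0.8, 0.1, 0.0]])   # strong P -> F

tonal_trace = simulate_network(tonal, stim=1.0)
atonal_trace = simulate_network(atonal, stim=1.0)

# With a stronger P->A connection, region A responds more in the
# "tonal" network than in the "atonal" one.
print(tonal_trace[:, 1].max() > atonal_trace[:, 1].max())  # → True
```

The real DCM works in the opposite direction: rather than simulating from known couplings, it fits coupling matrices like these to the measured fMRI signals and asks which connection strengths best explain the data.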

Classic language areas

Now, before we delve into the scintillating results of this study, let’s talk a little about how the brain areas related to language actually work. Most language is processed in the left hemisphere of the brain. This classic language area contains structures like Broca’s area and Wernicke’s area, big names to the brain-language nerds. Perhaps more relevant to this article is the pathway associated with the sound-meaning map, in which language processing starts in the temporal lobe, travels to its anterior reaches, and ends in the frontal lobe. In this paper, the researchers predicted that this sound-meaning pathway would be more strongly activated in native Chinese speakers, since their language relies so heavily on sounds and intonations for understanding speech.

Now for the exciting part: were the researchers right? They found that while the regions themselves that process speech are mostly the same across languages, the pathways through which these regions interact may be different. The brain areas commonly associated with language are the left posterior part of the superior temporal gyrus (pSTG), the anterior part of the superior temporal gyrus (aSTG), and the inferior frontal gyrus (IFG), which—for the purposes of this study—were named regions P, A, and F, respectively.

Essentially, the data showed that when hearing intelligible speech, both the Chinese (tonal) and English (atonal) brains showed increased activation in the P-to-A pathway—that’s the green arrow on the first brain in the diagram below. Chinese speakers showed more activation than English speakers when listening to intelligible speech in both of the pathways coming out of the A region (red arrows in the middle brain), possibly because further semantic processing is needed for word identification in Chinese. This also happens to be part of the sound-meaning map that we talked about before. So yes, the researchers’ hypothesis was right (big surprise)—the Chinese brain showed more activation in this particular pathway than the English brain did. Finally, good ol’ English speakers showed more activation than Chinese speakers when listening to intelligible speech in the P-to-F pathway (the red arrow on the final brain). This pathway is usually implicated in phonological speech processing (Obleser et al., 2007), where the first phonological features are usually enough to identify words in atonal languages. Long story short, the data tell us that while there are common pathways for understanding speech in both languages, some of the pathways between the brain regions differ. To the big language nerds—and now to us—that’s pretty exciting stuff.

Figure 2A from Ge et al. (2015)


What’s great about this paper is that it uses two languages with really clear differences—tonal Chinese vs. atonal English. Scientific experiments usually work best with wide, clear-cut contrasts like the one between English and Chinese, so the choice of languages here was excellent. However, because of the way the experiment was designed, we don’t know whether the differences observed really have anything to do with being a native speaker at all. We don’t know if the pathway activation we saw was due to differences in how a given subject’s brain generally functions, or simply due to understanding a language that requires certain pathways to be activated. In other words, is the difference in activated pathways due to the inherent way a native speaker’s brain works, or is it due to the pathways required to understand the language—regardless of the brain doing the processing? In the article’s defense, its question may not have been this complex. Maybe in the future, researchers could run a further experiment with native English speakers who also understand Chinese (or vice versa), and compare the pathways activated when they hear intelligible Chinese to those activated in native Chinese speakers.

Either way, it’s definitely interesting to know that different languages require different brain pathways for processing. Maybe one day—preferably after an especially delicious Nutella crepe—the language pathways in my brain used for understanding French will become activated, and I can finally eavesdrop on all the airport conversations I want.





Image #2 from: Crinion JT, et al. (2009) Neuroanatomical markers of speaking Chinese. Hum Brain Mapp 30(12):4108–4115.

Ge J, Peng G, Lyu B, Wang Y, Zhuo Y, Niu Z, Tan LH, Leff A, Gao J (2015) Cross-language differences in the brain network subserving intelligible speech. PNAS 112(10):2972–2977.

Obleser J, Wise RJ, Dresner MA, Scott SK (2007) Functional integration across brain regions improves speech perception under adverse listening conditions. J Neurosci 27(9):2283–2289.





Monkey See, Monkey Do

Thinking back to fifth-grade field trips—the long lines, the sweltering hot Louisiana sun, and the teachers who thought the visit would be the pinnacle of fifth-grade achievement—I became accustomed to disliking trips to public places like zoos, museums, and aquariums. That being said, I never would have imagined paying 16.50 euros to go to a zoo here in Paris—yet there I was last Saturday, in that very position. Little did I know that the trip would become one of the most memorable of my first week in Paris.

Photo Courtesy of Google Maps


Parc Zoologique Sign


It all started when I met Bruce, a zebra I decided to name myself. There I was, staring at two especially unexciting rhinos and wondering whether any animal in this zoo did anything other than sleep through the day, when, as if on cue, Bruce appeared out of the brush and proceeded to feast on a nearby tuft of grass. He was almost close enough to touch, the intricate patterning of his dust-covered black and white stripes catching my eye. Never in my young adult life did I think I would be so excited to see a zebra, but there I was, oohing and aahing with the eight-year-olds beside me.

Bruce and I


Bruce wasn’t the only new friend I made that Saturday afternoon. I had yet to see my favorite exhibit—the baboons. As I approached, the first thing I noticed was the large number of people watching from the viewing platforms. This must be a good one, I thought. As I peered into the enclosure, I was surprised by the cacophonous action. There was always some baboon doing something hilarious somewhere in the cage—one chewing on some type of plastic, another swinging from tree to tree, yet another sitting up straight, hands folded peculiarly in its lap, looking like an old-time English professor.


The English Professor

However, I am quite sure I got stars in my eyes when I caught sight of a newborn baboon with its mother. The baby—I named him Johnny—climbed onto his mother’s back for a piggyback ride to the watering hole. When they arrived, Mom sat down, whipped Johnny around to her front (her back to the viewing platform), and began breastfeeding. Soon enough, two other mothers arrived and started breastfeeding their babies, until there was a ring of nurturing mothers and their children. I was struck by the similarities between these baboons and human mothers—the way they cradled their children, the way they stroked their newborns’ fur as the babies suckled, the protective sidling of a mother whenever a male encroached on her area. With every minute I watched, I could better see why primates are our closest animal relatives.

Mothering Baboons


Being the budding neuroscientist that I am, I started to consider the brain mechanisms behind this behavior. In almost every animal species, mothers innately care for their young. That obviously makes sense from an evolutionary perspective, but what are the brain mechanisms for this behavior, and what motivates it? What is the neuroscience behind it all?

An article by Kikuchi et al. (2015) addresses the neuroscience of maternal love. The experiment was designed to look at the brain activity of young mothers viewing video clips of their 16-month-old infants showing attachment behaviors. Mothers in an fMRI scanner were shown either a clip of their infant smiling at them while they played together (play situation, PS) or of their infant in distress after the mother left the room (separation situation, SS). The researchers hypothesized that the parts of the brain mediating maternal behavior would be more activated when the infant was in distress (SS) than when the infant was not (PS). After the fMRI, mothers were asked to rate their subjective feelings (happy, motherly, joyful, warm, love, calm, excited, anxious, irritated, worry, and pity) in response to the PS and SS clips of their own infants and of other infants.

So, what did the researchers find? Four brain regions turned out to be specifically involved in feelings of motherly love: the right orbitofrontal cortex (OFC), the anterior insula, the periaqueductal gray (PAG), and the striatum. After identifying these regions, the researchers had to interpret what the results actually meant. Essentially, the OFC and the striatum are part of the dopamine reward system, which underlies the mother’s motivation to care for her infant. The OFC, insula, and PAG form an information-processing system mediating the mother’s homeostatic emotions and the realization of motherhood itself. Since the OFC is involved in both systems, it is thought to play an important role in mediating between the two.

Activated regions from Kikuchi et al.

Schematic of brain activation from Kikuchi et al.


After all was said and done, the researchers found that mothers did, indeed, show higher activation in these regions in the SS than in the PS. And as subjective ratings of worry in the SS increased, so did activity in the right OFC. Good to know that mom cares.
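That last finding—worry ratings tracking right-OFC activity—is, at heart, a correlation between a subjective rating and a region-of-interest signal across mothers. Here is a minimal sketch of that kind of analysis; the numbers are entirely made up for demonstration and are not the study’s data.

```python
import numpy as np

# Hypothetical values: each pair is one mother's worry rating in the
# separation situation and her right-OFC response (made-up units).
worry_ratings = np.array([2.0, 3.5, 4.0, 5.5, 6.0, 7.5])
ofc_activity = np.array([0.1, 0.3, 0.35, 0.5, 0.55, 0.7])

# Pearson correlation: +1 means the two rise together in lockstep,
# 0 means no linear relationship, -1 means they move oppositely.
r = np.corrcoef(worry_ratings, ofc_activity)[0, 1]
print(f"r = {r:.2f}")  # a value near +1 indicates a strong positive link
```

In the real study the analysis is of course more involved (fMRI signals need modeling and correction for multiple comparisons), but the logic—more worry, more right-OFC activity—is the same.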

What’s great about this article is that it provides a simple and straightforward model for measuring mothers’ brain activation in response to their babies, and it asks about the mothers’ subjective feelings in addition to the neurological measurements. However, the method might also be too straightforward—mothers undoubtedly face more than two situations of maternal love and attachment. Perhaps in the future, the authors could consider a more complicated model, with added situations like feeding or stress. It is also a real challenge to quantify love scientifically, as this article attempts to do—perhaps love is too poetic for such a field.

Poetic or not, there is an undeniable beauty in the way a mother cares for her child—whether that mother is human, baboon, or zebra. Either way you look at it, Johnny the baboon is certainly well accounted for.




Kikuchi Y, et al. (2015) The neuroscience of maternal love. Neurosci Commun 1:e991. doi: 10.14800/nc.991.

Photos taken by yours truly.