Tag Archives: Language

A Symphony of Birds

Paris is a city of lights, but it is also a city of sound. The peacefulness of the gardens scattered across the cityscape is no match for the hustle and bustle of everyday city life. Sometimes the sound is welcome, such as a talented neighbor’s piano playing or an excellent street musician’s violin performance under the Arc de Triomphe. Sometimes it is less welcome, such as a taxi honking or an amateur trumpeter interjecting himself into my metro ride. Even so, I absolutely love the sounds around the city. An overlooked but equally important part of the city’s music is the music of its animal residents. Every morning, I walk outside my apartment and generally hear some animal within five minutes of stepping out my door. Whether it is two pigeons fighting over food by a bakery or two dogs barking as they pass each other, it is clear that each species has its own unique way of communicating.

1 Metro performer during daily ride on line 8


One of my favorite sounds in Paris is the tweeting of birds up in the trees while I walk below on the street. During my time here, my classes have introduced me to bird songs and how those songs act as communication between birds. One question asked in class was, “If some animals can be shown to have language, do they also create art?” When I first heard this question, I immediately imagined monkeys holding paintbrushes in front of paint-splattered canvases and thought that I wasn’t so sure it could be considered art. On further reflection, I realized that art can be more than drawing; it can also be dancing or singing. Instantly I started wondering whether some birds might actually sing for aesthetic purposes, or simply for their own entertainment. I knew that songbirds, like canaries and finches, have neural circuitry that is selective about which songs from other birds they process, allowing them to rely on their memories for song learning (Phan et al., 2006). I then began to investigate whether birds have been shown to exhibit any capacity for artistic expression and found an article by Gupfinger and Kaltenbrunner (2017) that examined the auditory skills and musical preferences of grey parrots in captivity.

2 What I initially thought of as animals creating art

According to Gupfinger and Kaltenbrunner (2017), grey parrots are quite intelligent, with strong auditory skills and musical talents. Male parrots are even known to have songs specific to themselves and can pass highly trained song learning on to their offspring (Berg et al., 2011). The aim of the study was to determine how music and musical instruments would influence the activity of grey parrots and add to their auditory enrichment. A central experiment of the study focused on how the parrots would interact with and manipulate a music-producing joystick device. The parrots could freely manipulate two joysticks with their beaks and legs in two different experimental setups. In the first setup, one joystick produced sound and the other remained silent. The parrots’ preference for activating the sound-producing joystick over the silent one demonstrates that parrots are more inclined toward auditory stimulation than toward silence (Gupfinger and Kaltenbrunner, 2017). In the second setup, both joysticks were active, one set to 90 beats per minute and the other to 120 beats per minute. This setup was used to gain a better understanding of the musical and auditory preferences of individual grey parrots. The results from the second setup show that the parrots preferred to play beats at 90 beats per minute over 120 beats per minute. The parrots’ spontaneous interaction with the joystick device suggests that they have a potential capacity for musical expression.

3 Joystick Test Device used by Gupfinger and Kaltenbrunner (2017)

The real-world application of the Gupfinger and Kaltenbrunner (2017) study is that musical instruments can significantly benefit grey parrots in captivity by giving them a creative outlet for expression. The strength of this experiment was its use of two different setups. By first comparing sound to silence, and then strengthening that result (birds prefer auditory stimuli to silence) with a specific measure of which beat the grey parrots prefer, the study helps curious readers (including me) accept its conclusion that grey parrots not only have vocal singing capabilities, but can also consciously process music and manipulate a simple form of musical instrument. While I believe the experiment was, for the most part, well thought through, there is one aspect of the experimental design that I find questionable. Gupfinger and Kaltenbrunner (2017) state that, to ensure the birds acknowledged and used the musical joystick, a person stayed with the parrots and motivated them to engage with it. This strikes me as a possible confounding variable, since the authors do not describe in depth what their specific methods of motivating the birds were. The idea of measuring grey parrots’ beat and frequency preferences proved insightful, and it leads me to a further question: could birds, songbirds and non-songbirds alike, be shown to synthesize the beats they prefer and make a music all their own?


Works Cited

Berg, K. S., Delgado, S., Cortopassi, K. A., Beissinger, S. R., & Bradbury, J. W. (2011). Vertical transmission of learned signatures in a wild parrot. Proceedings of the Royal Society B: Biological Sciences, 279(1728), 585-591.

Gupfinger, R., & Kaltenbrunner, M. (2017, November). Sonic experiments with grey parrots: A report on testing the auditory skills and musical preferences of grey parrots in captivity. In Proceedings of the Fourth International Conference on Animal-Computer Interaction (p. 3). ACM.

Phan, M. L., Pytte, C. L., & Vicario, D. S. (2006). Early auditory experience generates long-lasting memories that may subserve vocal learning in songbirds. Proceedings of the National Academy of Sciences, 103(4), 1088-1093.


Image 1: taken by me

Image 2: taken from: https://www.google.com/search?q=monkeys+painting&source=lnms&tbm=isch&sa=X&ved=0ahUKEwjzk8a914zjAhVJWBoKHVw0BJcQ_AUIECgB&biw=1366&bih=665#imgrc=CjN9G5sHYDmNFM:


Image 3 taken from: Gupfinger, R., & Kaltenbrunner, M. (2017, November).

Accents away from Accent

This weekend I went on a crazy, fun, whirlwind trip to London along with Shelby, Kendall, Jamie, Alyssa, and Merry. While we were only there for a day and a half, we managed to see Buckingham Palace, Westminster Abbey, Big Ben, London Bridge, and most of the other major famous sites. As we raced all over the city on the Underground, I kept accidentally saying “pardonne-moi” and “désolé” to everyone I bumped into. Only, for the first time in weeks, everyone around us was speaking English. But even though we all speak English, the way the locals around us pronounced words and phrases was still different from our own speech.


Of course, from the moment we arrived in England, we were surrounded by English accents. Several of us found ourselves fascinated by these accents and, when we were safely out of earshot, we even did our best to imitate them. Yesterday morning, as I sat on the train back to Paris, I decided to try to find out what it is about our brains that allows us to recognize, use, and understand different accented versions of the same language.

Westminster Abbey

Determining exactly which parts of the brain allow us to understand unfamiliar accents is a difficult task, but there is a growing body of research on this topic. Many studies of accent comprehension use functional magnetic resonance imaging (fMRI) to detect changes in brain activity as subjects listen to sounds or sentences in different accents (Ghazi-Saidi et al., 2015).

A recent review of this research found that researchers have identified areas like the left inferior frontal gyrus, the insula, and the superior temporal sulci and gyri as having higher activity when listening to accented speakers produce sounds (Callan et al., 2014; Adank et al., 2015). Interestingly, many of these brain areas are the same regions that have been identified as important for understanding foreign languages (Perani and Abutalebi, 2005; Hesling et al., 2012). Some of the areas that are important for understanding unfamiliar accents – including the insula, motor cortex, and premotor cortex – have also been implicated in the production of accented speech (Adank et al., 2012a; Callan et al., 2014; Ghazi-Saidi et al., 2015).

Investigating the production of accented speech is also an exciting field of study. Interestingly, one of the main ways we have learned about accent production is through case studies of patients with Foreign Accent Syndrome (FAS). FAS is a fascinating motor speech disorder where patients speak in a different accent than they originally used, typically following brain damage (Keulen et al., 2016). This condition was actually first identified here in Paris by Pierre Marie¹, a French neurologist (Keulen et al., 2016). After recovering from a brain hemorrhage, Marie’s patient had an Alsatian French accent instead of his original Parisian one (Marie, 1907). Since then, nearly 200 cases of this rare disease have been identified (Mariën et al., 2019).

Pierre Marie

However, it is hard to draw conclusions from individual case studies with just one patient. In a recent meta-analysis (a procedure in which data from multiple studies are combined and analyzed), Mariën et al. (2019) looked at 112 different published cases of FAS to draw larger conclusions about this rare disease. The authors were particularly interested in cases of FAS that occurred after a stroke, but they analyzed case studies from patients with all different kinds of brain damage.

To review these cases, Mariën et al. first compiled published case studies that reported the cause and symptoms of a patient’s FAS from Pierre Marie’s case in 1907 through October 2016. They then calculated and analyzed the demographic, anatomical, and symptomatic features of these FAS patients to look for larger trends across the different cases.

The authors found that there were significantly more female patients (68% of cases) than male patients among these 112 FAS cases. Additionally, an overwhelming majority (97%) of cases were in adults. In more than half of the patients (53%), FAS appeared following a stroke.

For those patients who developed FAS following a stroke, the authors also analyzed where in the brain the vascular damage was. The most commonly damaged brain areas (in 60% of vascular FAS patients) were the primary motor cortex, premotor cortex, and basal ganglia, which are all important for the physical ability to produce voluntary speech (Brown, Schneider, & Lidsky, 1997). The authors also found that 13% of these vascular FAS patients had damage in the insula, an area that has also been identified as important for accented speech production in studies of healthy subjects (Ghazi-Saidi et al., 2015).

The Insula

I think FAS is a fascinating disorder, but it is important to remember that, like all case studies, these reports have a limited ability to tell us how healthy people produce accented speech. The naturally occurring brain damage in these FAS patients is not necessarily localized, and brain areas beyond the primary lesion location could have been affected by the damage. Furthermore, there are some cases of psychological (as opposed to neurological) FAS, which complicates our understanding of the onset of this disease (Keulen et al., 2016).

While there is still a lot to learn about how we construct and comprehend accented speech, studies of FAS patients, particularly large meta-analyses like this one, have begun to identify some of the key brain areas reliably implicated in accent production. These findings provide a good starting point for future researchers to analyze these brain areas further and possibly study their role in healthy people’s accents, which can help us all understand each other a little better.



1 – As a side note for my NBB 301 classmates: Pierre Marie is the “Marie” in Charcot-Marie-Tooth disease, a glial disease that affects Schwann cells. He was also a student of Jean-Martin Charcot and was one of the people depicted in the famous painting A Clinical Lesson at the Salpêtrière that we saw at the Musée de l’Histoire de la Médecine today.



Westminster Abbey: taken by me

Pierre Marie: https://upload.wikimedia.org/wikipedia/commons/thumb/a/a4/PierreMarie.jpg/230px-PierreMarie.jpg

Insula: https://upload.wikimedia.org/wikipedia/commons/b/b4/Sobo_1909_633.png



Adank P, Davis M, Hagoort P (2012a). Neural dissociation in processing noise and accent in spoken language comprehension. Neuropsychologia, 50, 77–84.

Adank P, Nuttall HE, Banks B, & Kennedy-Higgins D (2015). Neural bases of accented speech perception. Frontiers in Human Neuroscience, 9, 558. doi:10.3389/fnhum.2015.00558

Brown L, Schneider JS, & Lidsky TI (1997). Sensory and cognitive functions of the basal ganglia. Current Opinion in Neurobiology, 7, 157–163.

Callan D, Callan A, & Jones JA (2014). Speech motor brain regions are differentially recruited during perception of native and foreign-accented phonemes for first and second language listeners. Frontiers in Neuroscience, 8, 275. doi:10.3389/fnins.2014.00275

Ghazi-Saidi L, Dash T, Ansaldo AI (2015). How native-like can you possibly get: fMRI evidence in a pair of linguistically close languages, special issue: language beyond words: the neuroscience of accent. Front. Neurosci. 9:587.

Hesling I, Dilharreguy B, Bordessoules M, Allard M. (2012). The neural processing of second language comprehension modulated by the degree of proficiency: a listening connected speech FMRI study. Open Neuroimag. J. 6, 1–11.

Keulen S, Verhoeven J, De Witte E, De Page L, Bastiaanse R, & Mariën P (2016). Foreign Accent Syndrome As a Psychogenic Disorder: A Review. Frontiers in human neuroscience, 10, 168.

Marie P (1907). Un cas d’anarthrie transitoire par lésion de la zone lenticulaire. In P. Marie, Travaux et Mémoires, Bulletins et Mémoires de la Société Médicale des Hôpitaux, 1906, Vol. I. Paris: Masson, pp. 153–157.

Mariën P, Keulen S, Verhoeven J (2019). Neurological aspects of foreign accent syndrome in stroke patients. Journal of Communication Disorders, 77, 94–113.

Perani D, Abutalebi J (2005). The neural basis of first and second language processing. Curr. Opin. Neurobiol. 15, 202–206.

What Colorful Language!

We always see it in the movies: a young child and her father lying together in the grass, gazing up at the midday sky. She asks what color the sky is, and he says blue without hesitation. Such a simple answer to what is, in reality, such a complex question. Over the past few weeks, to combat my occasional homesickness, I’ve found myself looking up at the sky, wondering whether my parents can see the same sky back home in Georgia. When we discussed the colors of the sky in class, it encouraged me to investigate the simple answer to the question: what color is the sky?

Just one example of the types of colorful skies one could witness here in Paris.

The real answer, it turns out, depends on a variety of factors: the time of day, the location of the viewer, the location of the sun, the viewer’s visual abilities, language, mood, and more. From personal experience, I believe the same sky can be different colors to the same viewer in different states of mind. For example, individuals experiencing sadness have a greater tendency to “focus on the tree instead of the forest” (Gasper & Clore, 2002), meaning they do not take in the full visual picture and instead fixate on visual detail, such as the shade of one item rather than the collective colors in a room. In a more concrete sense, a red-green colorblind viewer would have a different visual experience of a sunset than a normally sighted individual. But what about language?

Interestingly enough, language and culture also exert a large influence on color perception; different languages have different words for different colors, and some have only one word for a whole category of colors. The color category perception effect (Zhang et al., 2018) describes this phenomenon, in which people are more likely to distinguish colors drawn from different lexical categories than colors that fall within the same category. Under this theory, speakers of languages with more words for different colors would be better able to distinguish various shades than speakers of languages with fewer color words. Based on this view of color perception, two people from different cultures could see the sky in different shades. The figure below displays how color wheels based on the English and Greek lexicons differ in their groupings.

Color wheels grouped according to the English and Greek lexicons

There is evidence that language centers in the brain are activated during color perception. In an experiment performed by Siok et al. (2009), when stimuli from different linguistic categories were observed, there was greater activation of visual cortex areas V2/3, the areas responsible for color vision. This enhanced V2/3 activity coincided with enhanced activity in the left posterior temporoparietal language region, which suggests top-down control from the language center modulating the visual cortex (Siok et al., 2009). In other words, increased activity in language areas of the brain correlates with increased modulation of color vision before you’ve had the chance to pay conscious attention (Athanasopoulos et al., 2010).

This is especially relevant in Paris; as an English-only speaker in a world of French speakers, I can’t help but wonder how differences in our color-related vocabulary translate to questions like that of the sky’s color. It is known that language affects sensory perception in its earliest stages (Athanasopoulos et al., 2010), but would learning French color vocabulary change my perception of the colors I see? A previous experiment (Thierry et al., 2009) demonstrated a difference in brain activity between a native Greek speaker and a native English speaker, the former of whom makes a lexical distinction between light blue (ghalazio) and dark blue (ble). This is shown in the figure below, which demonstrates a greater visual mismatch negativity response for the Greek participants when they observed a blue stimulus, due to greater lexical representation of this color.

A report of differences in early color perception between speakers of different languages. The shaded area represents the time window of a specific marker between 170 and 220 milliseconds post-stimulus. Notice the difference in the negative response between native English and native Greek speakers for the color blue.

In summary, the influence of language is one often underestimated when considering why we see the colors we do. I believe perception of color is a uniquely integrative experience, combining elements of culture, background, language, personality, and individuality to create specific visuals distinctive to one person. This seems all the more evident in Paris; everything is so new, so fresh and exciting that I cannot help but feel that the very colors of Paris hold something special that I have not seen elsewhere. So what color is the sky? You may be surprised, as I was, to find your answer constantly changes.


Athanasopoulos, P., Dering, B., Wiggett, A., Kuipers, J., & Thierry, G. (2010). Perceptual shift in bilingualism: Brain potentials reveal plasticity in pre-attentive colour perception. Cognition, 116(3), 437-443. doi:10.1016/j.cognition.2010.05.016

Gasper, K., & Clore, G. L. (2002). Attending to the Big Picture: Mood and Global Versus Local Processing of Visual Information. Psychological Science, 13(1), 34-40. doi:10.1111/1467-9280.00406

Siok, W. T., Kay, P., Wang, W. S., Chan, A. H., Chen, L., Luke, K., & Tan, L. H. (2009). Language regions of brain are operative in color perception. Proceedings of the National Academy of Sciences, 106(20), 8140-8145. doi:10.1073/pnas.0903627106

Thierry, G., Athanasopoulos, P., Wiggett, A., Dering, B., & Kuipers, J. (2009). Unconscious effects of language-specific terminology on preattentive color perception. Proceedings of the National Academy of Sciences, 106(11), 4567-4570. doi:10.1073/pnas.0811155106

Zhang, J., Chen, X., You, N., & Wang, B. (2018). On how conceptual connections influence the category perception effect of colors: Another evidence of connections between language and cognition. Acta Psychologica Sinica, 50(4), 390. doi:10.3724/sp.j.1041.2018.00390


Do we see as well as we think we see?

Picture of the sky over Pont du Gard

On the first day of Arts on the Brain, we were told to write freely about the prompt “What color is the sky?” I immediately remembered a podcast about a man, Guy Deutscher, who asked his daughter every day what color the sky is, and she didn’t answer blue. The podcast, by Jad Abumrad and Jim Gleick, starts off by talking about Homer and the absence of the word blue in his texts, then moves on to other old texts that don’t mention blue. It then discusses the order in which color words enter languages: blue is always the last one, and the theory was that this had to do with having the ability to make the color. They then talked about a researcher who brought a test to a group of people whose language has no word for blue, and those people had trouble identifying a blue box among green boxes. This seemed like proof that language impacts perception. Finally, they got to Deutscher’s experiment with his daughter. He made sure that no one told her the sky was blue, but also that she did know the color blue. At first, she refused to answer the question about the sky’s color, until one day she answered white, and eventually she said blue. This seemed to explain why languages wouldn’t find it terribly important to add a word for the color blue.

This made me wonder: how much does language impact perception? Do French people experience the world differently than I do? Unlike in America, many people here speak more than one language; would that impact their perception as well?

Photo from https://theophthalmologist.com/fileadmin/_processed_/0/a/csm_0614-201-brain_b506a2a191.png

Broca’s and Wernicke’s areas, outlined above, are two of the major regions associated with speech. The visual cortex at the back of the brain is where the majority of visual processing happens. At first glance, the visual cortex seems far away from the rest of sensory processing and anything involving language. However, information in the brain travels through multiple areas. Here is the path that visual signals take after light enters the eye:

Photo from http://brainmind.com/images/VisualCortexOptic.jpg

Once the visual cortex has processed the signal, it projects out to other regions of the brain.

Photo from https://nba.uth.tmc.edu/neuroscience/m/s2/images/html5/s2_15_10.jpg

Language and speech also move through multiple regions, as in the picture below.

Photo from https://michellepetersen76.files.wordpress.com/2015/06/redrawing-language-map-of-brain-neuroinnovations.png

With all of this and other information moving through the brain, it doesn’t seem super far-fetched to me that language could impact our perception. Bhatara et al. (2015) showed that learning a second language is associated with rhythm perception in native French speakers. Work by Ardila et al. (2015) shows that one region of the brain is involved in both recognizing what you see and attaching a word to it. They also showed that this region connects with regions that play roles in thinking, categorization, and memory.

More recent research by He et al. (2019) compared color perception between Mongolian and Mandarin speakers. According to the study, both languages have only one word covering light and dark green, but Mongolian divides light and dark blue into two different words while Mandarin has only one word for both. The researchers showed the subjects greens and blues and asked them to sort each color into one of the two or three categories. The subjects were then asked to arrange the colors so that similar ones were grouped together. The Mongolian speakers grouped the colors more closely together than the Mandarin speakers did. The researchers also ran an experiment timing how long it took participants to find which color was different from the rest, and they found differences between the two groups. These experiments further show that language does have an impact on how we perceive color.

It would be interesting to find out whether language or culture has more of an impact on color perception. However, because the two heavily influence each other and are nearly impossible to separate completely, it may be impossible to know which plays the larger role. I would also be interested to know whether language’s impact on color perception means that I see artwork differently than a native speaker of a different language does. Did all of the artists we’re learning about in Arts on the Brain see their paintings differently than I do? Would a bilingual person categorize colors according to their first language or the language they speak with the most color terms? Would common terms like light blue versus dark blue play a role, or would both simply be considered blue? I think the impact that language can have on perception is fascinating, and I will definitely keep it in mind the next time I’m looking at paintings in a museum.

Works Cited

Ardila, A., Bernal, B., & Rosselli, M. (2015). Language and visual perception associations: meta-analytic connectivity modeling of Brodmann area 37. Behavioural neurology, 2015, 565871. doi:10.1155/2015/565871

Bhatara, A., Yeung, H. H., & Nazzi, T. (2015). Foreign language learning in French speakers is associated with rhythm perception, but not with melody perception. [Abstract]. Journal of Experimental Psychology: Human Perception and Performance, 41(2), 277-282. doi:10.1037/a0038736

He, H., Li, J., Xiao, Q., Jiang, S., Yang, Y., & Zhi, S. (2019). Language and Color Perception: Evidence From Mongolian and Chinese Speakers. Frontiers in psychology, 10, 551. doi:10.3389/fpsyg.2019.00551

Radiolab – Why Isn’t the Sky Blue? [Jules Davidoff and Guy Deutscher] [Audio blog review]. (2018, January 2). Retrieved June 9, 2019, from https://www.youtube.com/watch?v=um6j_WRDggs

Fake It till you Learn It

Bonjour! Comment allez-vous? (That’s French for Hi! How are you?) During my first week abroad, there have been so many changes: living with new people, exploring a new city, immersing myself in an unknown culture. Through all these changes, the hardest one to adjust to has been learning a new language that I haven’t heard or seen since the fourth grade. Even though it has been such a short amount of time, I feel that it has gotten easier for me to communicate and understand conversations in French. I came into this trip knowing almost no French, but in just seven days, I notice myself recognizing words at the supermarket, and knowing how to respond to people who speak French fluently. I was actually amazed at how quickly I was able to start learning a new language!

Purchasing food at the local market

Language cognition has been studied to better understand how and where language processing occurs. New models of language cognition demonstrate the use of procedural memory (long-term memory for how to do things) and declarative memory (memory for things that can be consciously recalled) in learning a new language (Ullman, 2016). Previous studies have noted that word learning is a product of our declarative memory, while grammar is heavily dependent on our procedural memory (Davachi et al., 2003; Lum et al., 2014). This process of learning new languages is important, but perhaps not the only thing that has helped me during my first week in France.

Although these types of memory play an important role in learning new languages, one of the reasons I have been able to grasp French this quickly is gesture and its role in language learning. Gestures use the body to convey meaning. Recently, I have noticed that I use my hands much more than usual while conversing with people. When I see people in the grocery store or the chocolate shops in Belgium, I can communicate with them through gestures that supplement the little French I do know. This helps me learn new words while communicating effectively with people who would not understand me otherwise. Gestures have become a prominent part of how I communicate because they convey a different type of speech and help me produce speech (Goldin-Meadow and Alibali, 2013).

In an fMRI study done by Weisberg et al. (2017), activation of language regions in the brain (shown below) was reduced when related gestures accompanied speech, as shown in the fMRI data below.


Decrease in activation of speech with gesture compared to speech alone and gesture alone

Language regions in the brain

However, when gestures were used alone, there was greater activation in language comprehension areas. The figure shows that speech accompanied by meaningful gestures does not require as many neural resources, and thus there is less activation in regions associated with action representation or language comprehension (Weisberg et al., 2017). The two systems rely on each other to create a more efficient method of communicating that uses fewer resources.

There is also evidence that gestures increase the activation of the word they describe, making it easier for the speaker to access that word (Krauss, 1998). Krauss called this the lexical gesture process model. In further studies, Krauss found that, in both spontaneous and rehearsed speech, a gesture is activated prior to or simultaneously with its lexical affiliate, the word the gesture describes. The figure below shows the difference between the onset time for speech and the onset time for gesture: the gesture either begins simultaneously with the word or is activated before it. This helps show that gestures act as an aid to spoken communication, because they are activated before the words (Krauss, 1998). Thank goodness for these gestures guiding me through all these new changes and helping me learn the words!



I am so lucky to have these gestures as a part of my communication vocabulary because it has made it easier to learn French words and gotten me through the first week. Although I plan on learning more of the language, I am grateful for the grace gestures have given me as I attempt to blend in and communicate with others.


  1. Davachi, L., Mitchell, J. P., & Wagner, A. D. (2003). Multiple routes to memory: Distinct medial temporal lobe processes build item and source memories. Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/12578977
  2. Goldin-Meadow, S., & Alibali, M. W. (2013). Gesture’s role in speaking, learning, and creating language. Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/22830562
  3. Krauss, R. (1998). Why Do We Gesture When We Speak? Current Directions in Psychological Science, 7(2), 54–60. Retrieved from http://www.jstor.org/stable/20182502
  4. Krauss, R. M., Chen, Y., & Chawla, P. (2008). Nonverbal Behavior and Nonverbal Communication: What Do Conversational Hand Gestures Tell Us? Retrieved from https://www.sciencedirect.com/science/article/pii/S0065260108602415
  5. Lum, J. A., Conti-Ramsden, G., Morgan, A. T., & Ullman, M. T. (2014). Procedural learning deficits in specific language impairment (SLI): A meta-analysis of serial reaction time task performance. Cortex, 51, 1–10. doi:10.1016/j.cortex.2013.10.011
  6. Weisberg, J., Hubbard, A. L., & Emmorey, K. (2017). Multimodal integration of spontaneously produced representational co-speech gestures: An fMRI study. Language, Cognition and Neuroscience, 32(2), 158–174. doi:10.1080/23273798.2016.1245426


Picture of Language Region

  1. https://www.pinterest.com/pin/678847343807021257/?lp=true



I knew from the minute I set foot in the French customs line at the Charles de Gaulle airport that perhaps I didn’t know French as well as I thought I did. Every conversation around me—except for that of the Americans I followed off the plane—sounded oddly like gibberish. In keeping with my nosy personality, I sidled a little closer to the French couple behind me to see if I could eavesdrop on a word or two—nada. One would think that five years of taking French classes would have gotten me a little farther than that.

Image Courtesy of Google Maps

I still remember my reaction in the first few minutes of French 201 at Emory. My professor greeted all the students in French when we walked in the door. Oh, that’s cute, I thought. But when 11:30 hit, class officially started, and she continued to speak in French, my mouth actually dropped open. How was I supposed to understand her? I could barely understand a word of spoken French. The nerve of my French professor to actually speak in French! Initially, the biggest thing on my mind was finding out a way to get the biggest bang for my buck in returning my newly purchased French textbooks.

Fortunately, a mix of procrastination in dropping the course and unyielding determination—a quitter I was not—led me to eventually decide to tough it out in French for the year. Good thing I did, because a few months later I would find myself in the largest French-speaking city in the world.

Setting foot in Paris a few weeks ago brought me right back to how I felt on the first day of French 201. As the weeks went by, I practiced, spoke to a few French natives, and most importantly, I listened. I started getting lost in the raw melodies of the French language—often I would find myself listening to the intonations of the speech rather than actually paying attention to what was said. I started comparing French to other languages, like English. Would a foreigner to the English language appreciate the melodies that are simply words to us? How does the brain process it? I know that some languages have totally different basic sound units—can a person who is not native to the language even process those units? The budding neuroscientist in me had so many questions.

I looked up this super cool graphic from Medical Daily that basically told me that yes—language changes the way we think. For example, because the Japanese language has more words for shades of blue, from dark to light, than English does, the Japanese perceive more colors than we do. Conversely, languages with fewer color terms have the opposite effect—their native speakers perceive fewer colors (Medical Daily). The ball doesn’t just stop at color perception—there are nearly infinite differences between languages that could change the way we think. Do these differences mean that the brain of one native speaker is set up a little differently from the next? I wondered: do differences in language between native speakers have any effect on the brain?

An article by Ge et al. (2015) asks how native speakers of different languages process that language—specifically Mandarin Chinese and English. Why did experimenters compare English with Chinese and not, say, the second best language on Earth—French? Apart from English and Chinese being the two most widely used languages in the world, Chinese is a tonal language, meaning that the intonation used determines the meanings of the words. English, on the other hand, is atonal (hence the reason why high school geography teachers could get away with fully monotone class sessions). Researchers placed 30 native Chinese speakers and 26 native English speakers in fMRI scanners and used dynamic causal modeling (DCM)—essentially a way to construct a model of how brain regions interact. Our lovely subjects were presented with either intelligible or unintelligible speech in their native languages while being scanned, and then the data were compared between the two groups.

Classic language areas

Now, before we delve into the scintillating results of this study, let’s talk a little about how brain areas relating to language actually work. Most of language is processed in the left hemisphere of the brain. In this classic language area are structures like Broca’s area and Wernicke’s area, which are big names to the brain language nerds. Perhaps more relevant to this article is the pathway associated with the sound-meaning map, which assumes language-processing starts from the temporal lobe, goes to its anterior reaches, and ends up in the frontal lobe. In this paper, researchers think that this sound-meaning area will be more highly activated in native Chinese speakers, since their language relies so heavily on sounds and intonations for understanding speech.

Now for the exciting part: were the researchers right? They found that while the regions themselves that process speech are mostly the same across languages, the pathways through which these regions interact may be different. The brain areas commonly associated with language are the left posterior part of the superior temporal gyrus (pSTG), the anterior part of the superior temporal gyrus (aSTG), and the inferior frontal gyrus (IFG), which—for the purposes of this study—were named regions P, A, and F, respectively.

Essentially, the data showed that when hearing intelligible speech, both the Chinese (tonal) and English (atonal) brains showed increased activation in the P-to-A pathway—that’s shown by the green arrow on the first brain in the diagram below. Chinese speakers showed more activation than English speakers when listening to intelligible speech in both of the pathways coming out of the A area (red arrows in the middle brain). This may be due to the further semantic processing that is needed for word identification in Chinese. This also happens to be one of the pathways of the sound-meaning map that we talked about before. So yes, the researchers were right in their hypothesis (big surprise)—the Chinese brain had more activation in this particular pathway than the English brain did. Finally, good ol’ English speakers showed more activation than Chinese speakers when listening to intelligible speech in the P-to-F pathway (the red arrow on the final brain). This pathway is usually implicated in phonological speech processing (Obleser et al., 2007), and the first phonological features are usually enough to identify words in atonal languages. Long story short, this data tells us that while there are common pathways used in understanding speech in both languages, some of the pathways between the brain regions are also different. To the big language nerds—and now to us—that’s pretty exciting stuff.

Figure 2A from Ge et al (2015)


What’s great about this paper is that it uses two languages that have really clear differences—tonal Chinese vs. atonal English. Scientific experiments are usually best with wide and clear-cut variables like those seen between English and Chinese, so the languages they tested for in this study were great. However, because of the way that this experiment was designed, we don’t know whether their main question—how is language processed in the brain by native speakers of a different language—really has anything to do with whether the subject was a native speaker or not. We don’t know if the pathway activation that we saw was due to a different general functioning of the brain in a given subject, or if it was due to the subject simply understanding a language that required certain pathways to be activated. In other words, is the difference in activated pathways due to the inherent way a native speaker’s brain works, or is it due to the pathways required to understand the language—regardless of the brain that’s doing the processing? In defense of the article, their question may not have been this complex. Maybe in the future, researchers could do a further experiment with native English speakers who also understood Chinese (or vice versa), and compare activated pathways when they heard intelligible Chinese to the pathways activated in a native Chinese speaker.

Either way, it’s definitely interesting to know that different languages require different brain pathways for processing. Maybe one day—preferably after an especially delicious Nutella crepe—the language pathways in my brain used for understanding French will become activated, and I can finally eavesdrop on all the airport conversations I want.





Image #2 from: thebrain.mcgill.ca


Crinion JT, et al. (2009) Neuroanatomical markers of speaking Chinese. Hum Brain Mapp 30(12):4108–4115.

Ge J, Peng G, Lyu B, Wang Y, Zhuo Y, Niu Z, Tan LH, Leff A, Gao J (2015) Cross-language differences in the brain network subserving intelligible speech. PNAS 112(10):2972-2977.

Obleser J, Wise RJ, Dresner MA, Scott SK (2007) Functional integration across brain regions improves speech perception under adverse listening conditions. J Neurosci 27(9):2283–2289.





Bonjour, Do You Speak English?


If you asked me what the hardest thing about living in Paris has been, my answer would be simple – the language barrier. Before leaving for Paris, I didn’t know any French besides how to say hello and goodbye. While I have picked up a few useful phrases in the past 4 weeks, it has still been very difficult to remember what I’ve learned. I began to wonder why I was having such a hard time with French, especially based on my previous experiences with language. When I was a young child, my mother used to teach me Chinese words and phrases. While I am nowhere near fluent in Chinese, I can still easily remember words and recognize phrases that I learned many years ago. On the other hand, learning French has been quite the struggle. I can spend a while reading my French traveler’s guide and practicing my accent, yet hardly remember any of it the next day. Language is a very important field in neuroscience, so this experience led me to ask several questions: Why is it more difficult to learn a second language as we get older? Are there differences in the anatomy of language areas in the brain depending on what age you learned a second language? While it is generally well known that children are able to learn languages much more quickly than adults (Johnson et al., 1989), I wanted to look further into how the age of learning a second language affects brain structure.

In 2014, Klein et al. published a study that examined how the age at which a second language is learned shapes brain structure. This study used four groups of participants: monolinguals who spoke only one language (monolinguals), bilinguals who learned two languages either simultaneously from birth or up until age 3 (simultaneous bilinguals), bilinguals who learned their second language from early childhood ages 4-7 (early sequential bilinguals), and bilinguals who learned their second language during late childhood ages 8-13 (late sequential bilinguals). All participants were interviewed and given questionnaires about their language background to determine which group they belonged to. It’s important to know that monolinguals were considered fluent only in their native language even if they received some formal training of another language, so taking a few years of Spanish in school doesn’t count as being bilingual. This study used magnetic resonance imaging scans (MRI), which allowed researchers to take an image of the brain and compare anatomical differences between participants’ brains.

Image: Cerebral Cortex, the outer layer of tissue in the brain that researchers measured for thickness

Animation: Inferior Frontal Gyrus Location (left side)

First, researchers used MRI to test for general differences in cortical thickness (how thick the outer layer of tissue in the brain was) between monolinguals and the different groups of bilinguals. They were interested in measuring cortical thickness to see exactly how being bilingual affects growth in language areas of the brain during development. A thicker cortex meant that there was more neuronal development (growth of brain cells and their connections) in that brain region. Researchers found that there was a significant difference in cortical thickness between the groups in a brain region called the left inferior frontal gyrus (LIFG). The LIFG is very important for phonological and syntax processing in language (Vigneau et al., 2006). Phonological processing means using sounds to understand language, and syntax refers to understanding the order of words to form sentences. Researchers found that the LIFG was much thicker in the early and late sequential bilingual groups compared to the monolingual group. Put more simply, the LIFG was much thicker only in bilinguals who learned their second language after early childhood. These differences in cortical thickness were not surprising, since the LIFG is a key brain area involved in language processing. These results demonstrated that learning a second language after becoming fluent in the first language changes brain structure during development. This was a very significant finding, because it shows the “plasticity” of the brain, or the brain’s ability to reorganize itself and form new connections in different environments! To explain why the cortex becomes thicker in the early and late sequential bilingual groups, researchers suggested that learning a second language after early childhood causes neurons and the connections between them to grow in brain areas involved in language.

Figure 1: Klein et al., 2014

MRI scans showed that there was no difference in cortical thickness between the monolingual group and the simultaneous bilingual group. This was another very important finding, because it showed that being bilingual only affects brain development when a person learns their second language after early childhood. Researchers reasoned that these differences in cortex thickness might mean that there are different learning processes involved in first and second language learning only when the languages are learned separately after early childhood. These different learning processes might cause the cortex in language areas to become thicker as neurons and their connections grow. These results also show that the age when learning a second language is very important for setting up the brain structures involved in language.

Neurons and their many connections

Once researchers determined general differences in cortex thickness between monolinguals and bilinguals, they wanted to further study the relationship between brain structure and age of language learning in the bilingual participants. They found that the later a second language was learned after an individual learned their first language, the thicker the cortex was in the LIFG. Based on these results, researchers suggested that the thicker cortex associated with later second language learning might reflect the brain using less than optimal neural circuits for language learning. An easy way to think about the brain is as a huge switchboard with lots of connections between each area. A neural circuit is like a path that information follows to get from one part of the brain to another. There are neural circuits that are direct and very quick, but there are also more roundabout ways to send information from one area to another. As we mature, our brain begins to solidify its connections, so the neural circuits used when a second language is learned at a later age may not be as direct and quick. Using suboptimal circuits could contribute to the cortex becoming thicker, as neurons increase their connections to follow a roundabout path. Learning both languages at the same time during early childhood appeared to use optimal neural circuits for language learning, because there were no differences in thickness between monolinguals and simultaneous bilinguals.

I found this study to be very interesting because it showed that there are anatomical differences in language regions of the brain that depend on the age at which a participant learned their second language. It was also very informative because it shows that the brain isn’t a set-in-stone structure, and our environment can significantly contribute to our development. As a follow-up, for more concrete conclusions about the neural circuits involved in language learning, I’d like to see a study where researchers measure activation of the LIFG rather than just differences in cortex thickness. For example, functional magnetic resonance imaging (fMRI) measures brain activity by detecting blood flow to specific brain regions. Participants could read or listen to their native language followed by their second language in an fMRI machine to measure and compare how active the language areas of the brain are. Results from this would be even more informative in understanding how the age at which a second language is learned plays a role in language processing. For example, variation in brain activity could confirm differences in optimal and suboptimal neural circuits depending on what age the second language was learned. This would allow researchers to understand more about how neural processing, rather than just anatomy, is affected in language areas by learning a new language.


Until next time,

  • Sarah



Johnson JS and Newport EL (1989). Critical period effects in second language learning: The influence of maturational state on the acquisition of English as a second language. Cognitive Psychology, 21(1), 60–99.

Klein D, Mok K, Chen JK, and Watkins KE (2014). Age of language learning shapes brain structure: A cortical thickness study of bilingual and monolingual individuals. Brain and Language, 131, 20–24.

Vigneau M, Beaucousin V, Herve PY, Duffau H, Crivello F, Houde O, and Tzourio-Mazoyer N (2006). Meta-analyzing left hemisphere language areas: Phonology, semantics, and sentence processing. NeuroImage, 30(4), 1414–1432.

Cerebral cortex image (Creative Commons): http://www.neuroscientificallychallenged.com/blog/know-your-brain-cerebral-cortex

Left inferior frontal gyrus animation (Creative Commons): https://commons.wikimedia.org/wiki/File:Inferior_frontal_gyrus_animation_small.gif

Neural connection image (Creative Commons): http://maxpixel.freegreatpicture.com/Network-Brain-Cells-Brain-Structure-Brain-Neurons-1773922

French Phrasebook Image: https://images-na.ssl-images-amazon.com/images/I/51pqTbOV1qL._SX350_BO1,204,203,200_.jpg

Figure 1 from Klein et al., 2014

“Hello” or “Bonjour” ?

Hello world,

This past week has been extremely interesting and exciting, to say the least. After a TERRIBLE delay at JFK airport, I finally made it to Paris (about 6 hours behind schedule…). Once settled into my room, I met up with my friend, Sasha, to grab a quick dinner. We decided to go to a small restaurant close to where we live, as our long day of traveling had left us extremely tired. When we sat down at the restaurant, the waiter walked over and said, “Bonjour, comment puis-je vous aider?” This caught me extremely off guard, as it was the first time I had engaged in a conversation with a true francophone.


Sasha (left) and me (right) at dinner


Sasha and me at the Eiffel Tower


Let me rewind a little bit. I have studied French since 6th grade, and although it may not be my primary concentration in college, it plays a huge role in my academic career. However, this was my first time in a French-speaking country, so I have not had much experience with French conversation aside from with my fellow French-speaking peers and professors. So, when the waiter confronted me and asked a question in French, I was, rightfully so, caught off guard.



(Anyway, returning to the restaurant…) Sasha, being from Montreal and having grown up speaking French with her family, swiftly answered the waiter. After a few seconds of gathering myself and adjusting my vocabulary, I too answered him (in French, of course). This event made me wonder what physiological differences, if any, occurred in my brain when switching between English and French vocabulary. Were different areas of my brain active for French words versus English words and vice versa? This question sparked my interest, so, upon returning to my room, I searched for an answer.

Before I try to explain the studies I found, let me give you a quick and easy lesson concerning neuroscience and language. Broca’s area, a region of the frontal part of the brain, is linked to the production of speech, while Wernicke’s area, a region of the temporal part of the brain (slightly above where your ears are), is linked to the comprehension aspects of speech. In order to engage in a coherent conversation with another individual, one must use both of these areas, as the language one hears must be understood (via Wernicke’s area) and the language one speaks must be intelligible (via Broca’s area). So, when looking for an answer to my original question about language, I immediately thought that this must be the sole system affected, but boy was I wrong.


After some quick searching, I stumbled upon an article by Correia et al. (2014) concerning brain activation in bilingual individuals. The researchers in this study subjected bilingual participants, fluent in English and Dutch, to a series of experiments in which the participants were placed inside an fMRI scanner and told to listen to a series of words. The words consisted of the names of specific animal species, and the language spoken varied between English and Dutch. The fMRI constructed images of the participants’ brains, highlighting the regions most active during this process. By examining and comparing the fMRI images created by solely Dutch words, solely English words, and a combination of the two, Correia et al. isolated several regions of the brain active for both languages. The main region of activity they observed was the anterior temporal lobe (ATL). This cortical region is associated with semantic memory, that is, memory of physical objects, people, information, and (most important to this study) words (Bonner and Price, 2013). This finding is significant as it provides evidence that semantic knowledge is processed in a language-independent form in the brains of bilingual listeners (Correia et al., 2014). Essentially, this means that as the participants listened to either English or Dutch words, their ATLs became equivalently active for each. So, when I was in the restaurant with Sasha, although I may have been caught off guard by the waiter speaking French, similar regions of my brain became active compared to if the waiter had spoken English to me.


A figure from Correia et al. (2014) depicting the language-independent regions of the brain, one of which being the anterior temporal lobe (ATL)

Another interesting study I found was conducted by Mohades et al. in 2012. In this study, the researchers assessed the brain circuitry associated with language in children aged 8–11 years old. They compared this circuitry in children raised monolingual to that of children raised bilingual. Through this, the researchers discovered significantly different white matter density in specific brain regions involved with spoken language and the comprehension of language. Certain areas of bilinguals’ brains contained different densities of white matter in comparison to the brains of monolinguals (Mohades et al., 2012). This means that the circuitry of the brain involved with language differs depending on one’s language capabilities. So, in relation to my brain and Sasha’s brain, we have different densities of white matter in specific regions of our brains, since Sasha was raised bilingual (woah).


The type of MRI imaging (diffusion tensor imaging, or DTI) used by Mohades et al. (2012) to measure white matter integrity (density).


I found both of these articles very interesting because they offer different findings regarding brain activation in bilinguals. In my NBB classes I learn about many regions of the brain discussed in these studies, yet I never knew the role they played in bilingual individuals. With this newfound knowledge, I am interested in doing further research to discover more differences in brain activation associated with language.

~ Ethan Siegel


Bonner M, Price A (2013) Where is the anterior temporal lobe and what does it do? The Journal of Neuroscience. 33(10): 4213-4215

Correia J, Formisano E, Valente G, Hausfeld L, Jansma B, Bonte M (2014) Brain-based translation: fMRI decoding of spoken words in bilinguals reveals language-independent semantic representations in anterior temporal lobe. The Journal of Neuroscience. 34(1):332–338

Mohades S, Struys E, Van Schuerbeek P, Mondt K, Van de Craen P, Luypaert R (2012) DTI reveals structural differences in white matter tracts between bilingual and monolingual children. Brain Research. 1435: 72-80

How Can You Tell I’m American?

One of the greatest challenges of being in Paris is being constantly exposed to a foreign language. I have found that fewer Parisians than expected speak English. Having studied French for a number of years, I am always eager to test my ability to communicate with native French speakers. I try to practice French in Paris as often as possible, whether it is through ordering food (obviously my most important application of the language), asking for directions, or even giving directions sometimes. Just a couple of days ago I asked a young lady for directions while on the metro, and even though I knew that I had appropriately phrased my sentence in French, she responded in English. I knew that my accent had given away the fact that my native language is English, but I had expected her to respond in French. I have found myself in similar situations on many occasions. One time I was speaking French to an angry security guard at Cite Universitaire, the campus on which we are living, after I had been locked out of my room, and he responded in English, “I don’t speak English.” I was puzzled and let him know in French that I can speak French, and he responded, “no you can’t.” I chose not to take this second encounter personally and instead began to wonder what about my speaking bothered him so much. It must have been my accent. Accent perception is such an interesting concept. We can tell what country a person is from, or even perhaps the city in which they were born, not by listening to the words that they say, but by listening to the way that they say them.

Cite Universitaire (labeled as A)- where we have been living for the past 3 weeks.

In a study done by Adank et al. (2011), native monolingual Dutch speakers were played Dutch phrases in a familiar Dutch accent and the same phrases in an unfamiliar accent. While they listened, the subjects’ brains were monitored using an fMRI scanner, a machine that uses magnetic resonance imaging to monitor brain activity. The study showed that when the sound stimuli changed from the familiar to the unfamiliar accent, more of the subjects’ superior temporal gyrus (STG), a brain area involved in basic auditory language processing, became activated. The STG has also been shown to be associated with phonetic-analytic listening to speech. Perhaps this gives insight into why more of the STG is activated when listening to an unfamiliar accent: the brain is recruiting more cells to help analyze the phonetics of the speech because the speech is foreign. This is important because when French individuals hear me speaking French with an English accent, their STG becomes increasingly activated, and they recognize not only that I am speaking with an accent but also, by using other areas of the brain, may be able to work out what my native language is.

An exhibit in the Louvre Museum spelling out "love differences" in many different languages.

Whenever I’m on the metro and everyone around me is speaking French, it is difficult for me to decipher what they are saying unless they are speaking directly to me. I was curious as to how my brain would have responded to the sounds on the metro had I learned French at a younger age, but second to English. I wondered how the bilingual brain responds to language perception in general. In a study done by Archila-Suerte et al. (2012), a group of bilingual Spanish-English speaking children (whose native language was Spanish) and a group of monolingual English-speaking children were played the English syllables “saf,” “sof,” and “suf” while watching a silent film. These syllables were chosen because they are pronounced similarly in Spanish and would provide more insight into activation of the bilingual brain (perhaps because they may activate regions involved in the perception of both languages). The subjects were told to focus on the silent film while the syllables were played to them, and the researchers simultaneously analyzed the subjects’ brains using an fMRI scanner. The study was performed with both younger and older bilingual and monolingual children. The group found that the young monolingual and bilingual children had the STG activated (let’s call this area 1) while listening to the syllables. This implies that bilingual children, when just beginning to learn a second language, perhaps relate it to their first language and process it in the same brain area. However, the older monolingual children still had only area 1 activated during the task, whereas the older bilingual children had area 1 as well as other areas of the brain activated. This suggests that as bilingual children begin to further master a second language, their brain recruits areas other than area 1 to help distinguish between the two languages. Perhaps my brain is similar to the brains of the younger bilingual children, since I have not yet begun to master the French language.
My brain may not be able to recruit other areas to help area 1 decipher a language other than English, and this may be why I am unable to easily pick out French words and conversations while on the train. However, French individuals who are able to easily recognize my accent, process what my native language is, and then respond in my native language perhaps have activation of other brain areas that help the STG decipher the language. This may be because they are bilingual and no longer need to relate their second, mastered language to their native language. It would be interesting to see what the combined results of the first and second studies would be: a study that looked at monolingual and bilingual individuals’ brain activation in response to their common language spoken in an accent. I am curious to see whether, by being well versed in more than one language, bilingual children are able to recognize accents more easily. Maybe one day I’ll master the French language enough to not have to constantly compare it to English! I guess I’ll just have to ensure that this isn’t my last trip to Paris…

–          Ankita Gumaste

Adank P, Noordzij ML, Hagoort P (2011) The role of planum temporale in processing accent variation in spoken language comprehension. Human Brain Mapping 33: 360-372.

Archila-Suerte P, Zevin J, Ramos AI, Hernandez AE (2012) The neural bases of non-native speech perception in bilingual children. NeuroImage 67: 51-63.