Polyglot or Noob?

by Kaitlynn Love

art by Anonymous

Language is one of the primary ways we communicate with each other, and it is a fundamental aspect of human social behavior that is rapidly acquired early in life. Languages can be sound, sign, and tactile based, and language acquisition mechanisms differ depending on the time period during which the acquisition occurs. Second language acquisition may be more difficult for individuals due to the alternative mechanisms that are used beyond the sensitive period, a time span during which a young individual’s brain is especially malleable and shaped by experiences. Understanding the differences between the mechanisms used during versus beyond this period may be helpful in explaining some of the specific challenges involved with second language acquisition. 

 

Learning a new language naturally occurs during the sensitive, or critical, period, when children have access to language exposure. During this time, the brain can rapidly learn a language thanks to its cognitive processing capabilities and neural plasticity. Neural plasticity involves the brain modulating its functions and connections to accommodate newly acquired information, such as exposure to a new language. The plasticity of children’s frontotemporal systems allows a wide variety of languages to be acquired, and language acquisition has been shown to proceed remarkably similarly regardless of modality or style. In particular, deaf children who have access to sign language during the sensitive period acquire it using neural mechanisms similar to those of hearing children acquiring spoken language [1]. Access to language exposure during the sensitive period is essential for long-term language proficiency. Research on deaf children has shown that individuals with late exposure to sign language have less activation of the left frontotemporal pathways used for language comprehension [2]. The brain attempts to compensate for this by relying on other neural pathways; strong right-side activity typically implies a lack of the left-side neural connections that form in early youth. Even so, despite this compensation for inadequate exposure, significant delays in language comprehension and proficiency remain [2]. Since language is a fundamental aspect of our social interactions, it is important that babies are screened for hearing impairments: screening allows caregivers to adjust their approach and ensure adequate exposure to language.

 


Language acquisition that occurs in early life involves left hemisphere specialization [1]. Two important areas in the left hemisphere are Wernicke’s area and Broca’s area. Wernicke’s area refers to the left posterior superior temporal gyrus and the supramarginal gyrus, which sit in front of the occipital lobe at the back of the brain, and it is routinely referenced in discussions about the comprehension of language [3]. Recent research supports the view that comprehension of key aspects of a first language depends on significant left temporal and inferior parietal lobe involvement, which demonstrates that speech comprehension is not isolated to Wernicke’s area. The left temporal and parietal lobes lie between the left frontal lobe and the left occipital lobe at the back of the brain, and their involvement indicates a more extensive language comprehension network. Additionally, Wernicke’s area appears to play a role in speech production [3]. Another area of the brain involved with language is Broca’s area, located in the left inferior frontal gyrus at the bottom of the frontal lobe [4]. It is traditionally defined as the key area involved with speech production, but modern research has demonstrated that Broca’s area plays a vital role in sending information across broader speech production networks, contradicting earlier ideas that speech production is isolated to Broca’s area [4]. Both of these findings demonstrate the extensive neural interactions between different areas of the left hemisphere that are responsible for language, interactions that allow language acquisition in early life to succeed long term.

 

 

The acquisition of a second language is increasingly valuable in the modern world. Bilingualism is also beneficial on an individual level because of the associated cross-domain transfer and cognitive control observed throughout the lifespan [5]. Compared to monolingual individuals, bilingual individuals may experience improvements in non-linguistic general and executive functions, such as switching between tasks and selective monitoring [5]. The sensitive period also has implications for second language acquisition, though recent data suggest that for a second language it extends until adolescence [6]. Additionally, while acquiring a first language after the sensitive period has significant consequences for socialization and cognitive development, the consequences of missing the sensitive period for a second language are not as substantial. Limited proficiency in a second language can hamper communication in that language, but the stakes are lower in that it does not interfere with typical development throughout childhood.

 


While individuals who acquire a second language beyond the sensitive period have more limited proficiency long term, older language learners may make significant gains in the short term. Second language acquisition later in life comes with specific challenges, but the brain retains mechanisms of neuroplasticity that allow that acquisition to occur [5]. The first language can also interfere with second language acquisition: when specific sounds are unfamiliar in the first language, it is especially difficult for individuals to make the necessary distinctions in the second language [5]. Despite those challenges, adults may be able to apply knowledge from the first language to the second if there is significant overlap between the languages. In fact, recent research suggests that adults even outperform children in the short term when learning a second language with similar materials [6]. However, children have long-term advantages in second language acquisition. Data have shown a rapid decline in second language learning ability at around seventeen to eighteen years old, specifically in the ability to learn and comprehend grammar. It is not yet clear why this occurs, but it may be due to changes in neural maturation and plasticity, increased interference from the first language, or cultural and social factors, including the common transition toward a career in late adolescence [6]. Ultimately, these findings contradict earlier claims that the sharp decline in language learning ability occurs before adolescence, which offers more favorable implications for acquiring a second language in late adolescence and early adulthood.

 

 

Language acquisition within the sensitive period is a significant factor in typical childhood development; therefore, it is paramount that children receive adequate language exposure for lifelong language proficiency. Additionally, beyond this time period, the brain acquires language through the use of alternative mechanisms, which include right hemisphere involvement. The use of alternative mechanisms has a wide range of implications for second language learners, but data also suggests that second language acquisition later in life is still achievable. These findings provide valuable insight into human social behavior and communication mechanisms throughout the lifespan. 


Love on the Brain

by Manju Karthikeyan

art by Danielle Mather


We all know that feeling… 

Your special someone walks into the room and before you know it, all reason is thrown out the window. Your heart races, your cheeks flush, and a swarm of butterflies starts bursting in your stomach. 

But why is that? Why do our bodies and our brains fall into a frenzy in the face of attraction? And what is the effect of love, or rather, the chemistry of love, on our brains?

 

The Immediate Reactions: Brain Regions and the Autonomic Nervous System

The initial feeling of desire and attraction starts in a region of the brain known as the medial prefrontal cortex (MPFC). This region is like Tinder, in that it will immediately swipe right or left on what’s in front of you. The MPFC has the ability to instantaneously evaluate something as minute as facial appearance, making lasting judgments from a short conversation and dictating the brain’s decision-making process from there [1]. 

Once the medial prefrontal cortex (MPFC) sees something it likes, the rest of the body is alerted. The central nervous system (CNS) plays a crucial role in responding to what we find desirable in a person. This is where those traditional symptoms of the “love bug” come in: sweating, blushing, and nausea. Acting through the autonomic nervous system (ANS), the CNS sends a variety of signals that raise your heart rate and blood pressure, dilate your pupils, and elicit parasympathetic responses tied to sexual desire [2, 3]. 

The prefrontal cortex isn’t the only player when it comes to love. The limbic system, responsible for mediating primal attraction and behaviors, is also activated in response to attraction in humans. In particular, the hypothalamus, the hippocampus, the caudate nucleus, the anterior cingulate cortex, and the ventral tegmental area (VTA) become active [4, 5]. 

The VTA is crucial to the brain’s reward system and dopamine production. When we see something we are attracted to, the VTA is activated in the same way as if it were given a reward, reacting to love like a drug [6]. Moreover, the synaptic firing within the VTA, paired with the accompanying influx of dopamine, mimics the neurological patterns of cocaine rushes in addiction circuits [7]. Thus, acting “high” when we’re in love is not that far-fetched of an idea. 

However, love, like the brain, has many grey areas. When considering love as a spectrum, from platonic relationships to lust, the science of attraction becomes far more complicated: a complex mix of neurotransmitters, hormones, and other chemical signals that influence behavior.

 


The Role of Neurotransmitters: Lasting Love versus Attraction and Lust

At the basis of attachment is oxytocin, often referred to as the love hormone. Coupled with another hormone, vasopressin, oxytocin is responsible for the formation of romance and pair bonding, especially in the beginning stages of a relationship [8, 9]. Oxytocin and vasopressin receptors are abundant in regions like the hypothalamus and often interact with the brain’s dopamine reward system [8]. Levels of these hormones vary as you move through the different stages of love. Lust, given its primary focus on arousal and reproduction, is more readily associated with activating sex hormones like testosterone and estrogen than with neurotransmitters [10]. 

However, the transition from lust to romantic attachment tends to be reflected in declining testosterone levels in men [11, 12]. When lust transitions into attachment, oxytocin and vasopressin begin to dominate. This differs from attraction, the intermediary stage between lust and attachment, where the brain’s behavior is particularly centered on reward. As previously mentioned, dopamine reaches peak levels, providing a foundation for love’s reward-centric influences. Norepinephrine is also involved in this euphoria, which is why we sometimes can’t eat or sleep when we’re in love [10]. 

 

Love as Addiction: Impaired Judgment and Anti-Love Technology

Given this, it is evident that the neurotransmitters, hormones, and chemical signals induced during attraction alter our neurological state. But to what extent? 

One possibility is that love attacks the brain like an addiction. It has been noted that attachment-oriented pair-bonding mechanisms and chemical sequences often overlap with reward-learning and addiction mechanisms of the brain [13, 14]. Scientific literature further emphasizes that one can indeed get addicted to love, and that to be in love with someone is much like being addicted to them [13]. While there are differences in oxytocin levels between romantic love and drug addiction, dopamine reward patterns are quite similar [15]. 

Thus, the ability to get addicted to love is not improbable, and the experience of love in the brain can be overwhelming. If addiction or the consumption of drugs impairs our judgment, and love mimics those effects through a similar addiction-prone mechanism, how does love affect our decision-making ability, our social cognition, or our self-control?

A proposed approach to combating a love addiction is time. While the early stages of romantic love resemble patterns of addiction, these symptoms diminish as the relationship progresses [15]. As futuristic as it sounds, these patterns of addiction could also be harnessed to create anti-love technologies that induce chemical breakups. 

With lust, there is a plethora of drug interventions, such as antidepressants and androgen blockers that inhibit the release of hormones like testosterone [13]. However, altering one’s sense of attraction and attachment is incredibly subjective, and many raise ethical questions about dictating a person’s ability to love. Nevertheless, there have been MDMA trials aimed at inducing feelings of love and ecstasy in patients with PTSD. Similarly, SSRI interventions, which act on the brain’s serotonin system, are known to cause emotional blunting, detachment, and an overall lack of romantic stimulation when used to treat obsessive-compulsive disorder (OCD) [16, 17]. Additionally, drugs that stimulate love have been used to treat depressed patients [13, 16, 17]. However, the credibility, accessibility, and safety of these chemical breakups are controversial and yet to be determined. 

 

Nevertheless, this highlights how humans have developed to love and be loved, creating a neurological dependency on affection. Love truly has an effect on the brain, mediated by a variety of hormones, neurotransmitters, and patterns of brain activation. So the next time you feel your heart racing, cheeks flushing, and butterflies bursting in your stomach from talking to that special someone, ask yourself: is this me, or my brain on love?

 


Synesthesia is Green, Synesthesia Tastes Like Oranges

by Sonali Poobalasingham

art by Kate Richardson


What do Billie Eilish, Pharrell Williams, Kanye West, Duke Ellington, and Stevie Wonder all have in common? Besides being accomplished musicians and singers, they are all synesthetes, or individuals with at least one form of synesthesia [1]. Synesthesia is formally defined as the phenomenon in which “stimulation of one [sensory] modality simultaneously produces sensation in a different modality” [2]. In other words, senses may blend together, leading to experiences such as perceiving music as color or associating certain words with a particular taste. While there are numerous forms of synesthesia, all cases share three defining characteristics: 1) a crossover between two or more of the five senses, 2) inducer-specificity (such as the letter “S” always being paired with the color pink), and 3) the ability to provide detailed accounts of synesthetic experiences [3]. In this article, we will start with hypotheses about the origins of synesthetic experiences and then explore recent research on grapheme-color synesthesia. Finally, we will investigate attempts to use hallucinogens to induce synesthetic perceptions. 


Synesthesia may be more common than you think – in infants, at least. A scientific review of synesthesia suggests that everyone is actually born a synesthete, challenging the widely-held assumption that synesthesia is a phenomenon affecting only a select few individuals. One hypothesis posits that the root cause of synesthesia in adults is abnormal synaptic pruning, the process by which connections between brain cells are cut due to lack of use [4]. The human brain is composed of billions of brain cells called neurons. A synapse is the junction through which a neuron connects with other neurons, allowing it to communicate with its neighbors. Infants have many synapses – the most they will ever have in their life, in fact! Because of these abundant connections, infants’ five senses are highly interlinked, which induces synesthesia. Synaptic pruning is the process by which unneeded synapses between neurons are removed, leaving behind only the most efficient and useful connections to be carried into adulthood. If this seems abstract, picture the way a skilled gardener turns a shapeless hedge into a beautiful hedge sculpture just by cutting out the parts of the bush that were overgrown. This is something the human brain has the power to do to itself! Synaptic pruning in infancy optimizes neural networks such that most people do not experience synesthesia in adulthood. Pruning is a highly controlled process, but mistakes occasionally occur. This incomplete pruning hypothesis explains that an “overabundance of [neuronal] connections” caused by incomplete synaptic pruning allows different brain areas to remain cross-linked when their connections should have been severed [5]. This linkage, in turn, allows different sensory regions of the brain to be tied together, leading to the synesthetic perceptual experience in which certain senses are inextricably linked. 

 

To test this hypothesis, researchers compared levels of connectivity between brain areas in adults with and without synesthesia. Indeed, this study showed that synesthetes had considerably more connectivity – that is, more neural pathways connecting various brain areas – than individuals without synesthesia [5]. Additional evidence shows that hearing spoken words triggers both the auditory and visual cortices in infants, but this effect diminishes once the child reaches around three years of age [6]. This further supports the explanation that synesthetic experiences are produced by an excess of neuronal connections in the brain.

 

Now, let’s examine a specific subset of synesthesia called grapheme-color synesthesia. Of the numerous ways synesthesia can present itself, grapheme-color synesthesia is one of the most common presentations and is the best studied. Grapheme-color synesthesia occurs when the processing of numbers and letters becomes cross-wired with the perception of color, leading to individuals associating colors with numbers and letters. These colors may appear over the number or letter being viewed, or the color may present itself “in the mind’s eye” [7]. The cross-activation theory, which proposes that many brain areas may fire together in response to a single stimulus, may explain the occurrence of grapheme-color synesthesia. In the brain, the visual word form area (VWFA) lies next to an area that processes color, called hV4. Proponents of the cross-activation theory believe that neurons in hV4 fire synchronously with neurons in the VWFA in grapheme-color synesthesia, leading to the experience of seeing colors when viewing numbers, letters, and words [8, 9].  


One way to study grapheme-color synesthesia is by using a form of the Stroop Test. The Stroop Test is a classic, well-documented psychological phenomenon: when instructed to read the word “blue,” most people can do so quickly and easily. However, when asked to name the ink color of the word “blue” when it is printed in a non-blue color, people often stutter, falter, and take longer to respond because the word and its color conflict [10]. Studies investigating grapheme-color synesthesia utilize the Stroop Test, but instead of changing the colors of color words, the printed letter or number is made to differ from each individual’s synesthetic color for that character. If the printed color matches the synesthete’s color perception, the trial is called congruent; conversely, if it does not, the trial is called incongruent [10]. For example, if an individual consistently sees the color blue while viewing the letter “Q”, a modified Stroop trial may involve displaying “Q” in blue (a congruent trial) followed by “Q” displayed in red (an incongruent trial). When synesthetic individuals were timed on how long it took them to respond, they took significantly longer in the incongruent trials than in the congruent trials. Just as automatically reading the printed word is the default response in classic Stroop trials, the default response in the modified Stroop trials is for a synesthetic individual to perceive the presented character in the color that matches their synesthetic association. In other words, the modified Stroop trials support the notion that individuals with grapheme-color synesthesia have no control over their synesthetic perceptions; their synesthetic experiences are as natural and involuntary as blinking. 
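To make the congruent/incongruent logic concrete, here is a minimal illustrative sketch in Python. The grapheme-color associations, trials, and response times below are invented for demonstration and are not data from the studies cited above; slower responses on incongruent trials are what Stroop-like interference would look like.

```python
# Illustrative toy sketch: label modified Stroop trials as congruent or
# incongruent for one hypothetical grapheme-color synesthete and compare
# mean response times. All values below are made up for demonstration.

from statistics import mean

# Hypothetical synesthetic associations: grapheme -> perceived color
synesthetic_map = {"Q": "blue", "S": "pink", "7": "green"}

# Each trial: (grapheme shown, printed color, response time in seconds)
trials = [
    ("Q", "blue", 0.52), ("Q", "red", 0.71),
    ("S", "pink", 0.49), ("S", "green", 0.68),
    ("7", "green", 0.55), ("7", "blue", 0.74),
]

# A trial is congruent when the printed color matches the synesthetic color
congruent = [rt for g, color, rt in trials if synesthetic_map[g] == color]
incongruent = [rt for g, color, rt in trials if synesthetic_map[g] != color]

print(f"mean congruent RT:   {mean(congruent):.3f} s")
print(f"mean incongruent RT: {mean(incongruent):.3f} s")
```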

 

Another study on grapheme-color synesthetes sought to examine when exactly synesthetic perceptions occur in the stages of perceptual processing. Researchers used a congruence-incongruence design similar to the one used in the modified Stroop trials and combined it with a simple search task. In each trial, synesthetic participants were asked to find a letter overlaid on a colored background. In congruent trials, the participant’s perceived color of the letter matched the background color; in incongruent trials, it did not [11]. For example, a congruent search trial for someone who associates “Q” with the color blue would consist of “Q” displayed on a blue background, while an incongruent trial would consist of “Q” displayed on any non-blue background. Across these search trials, synesthetes identified the letter faster in the incongruent trials than in the congruent trials [11]. While this may seem like an obvious conclusion, it has fascinating implications. If synesthetic perception occurred in a later processing stage, synesthetes would quickly and easily find the letter in a congruent trial before the color perception appeared, because there would be a period of time in which the letter’s perceived color did not match the background, making it easily detectable. In order for background color to interfere with one’s ability to correctly identify a letter, the synesthetic perception of color (like seeing the letter “Q” as blue) must occur in an extremely early processing stage [11]. 


Interestingly, synesthesia-like experiences may also be temporarily inducible via psychoactive drugs. A controlled experimental study performed in 2013 showed that administration of lysergic acid diethylamide (commonly known as LSD) can induce synesthesia-like experiences in individuals without synesthesia, particularly perceptions of the grapheme-color and sound-color varieties [12]. However, these temporarily induced perceptions lack consistency and inducer-specificity; the links between graphemes, colors, and sounds are not reliably fixed. Consistency and inducer-specificity are important hallmarks of natural synesthesia that LSD-induced perceptions lack. Nonetheless, the possibility of hallucinogens producing synesthesia-like perceptions warrants further study of how LSD causes these experiences.  

 

In summary, synesthesia is a fascinating neurological condition. Current research suggests that synesthesia is universal during infancy, and that the neural connections responsible for synesthetic experiences are ordinarily severed via synaptic pruning. The incomplete pruning hypothesis proposes that synesthesia in adults is caused by incomplete synaptic pruning that leaves unrelated neural connections intact, as seen in grapheme-color synesthetes, who demonstrate robust connections between the VWFA and hV4 brain areas. Hallucinogens such as LSD have been shown to temporarily induce synesthesia-like experiences in non-synesthetes. Studies utilizing a modified Stroop Test and search tasks in individuals with grapheme-color synesthesia support the notion that synesthetic perceptions occur in a very early processing stage. While synesthesia has gained more attention in recent decades, much about it remains undiscovered and merits future scientific study. After all, what better demonstrates the rich human experience than examining the delicate, beautiful interplay between the human brain and our perception of the world around us?


The Rampant Disease We Call Social Discrimination

by Chloe Helsens

art by Owen Helsens & Kayla Barry

What Is Social Discrimination?

Imagine a disease, one that humans have created, is ravaging our population, and we are willfully withholding the cure from those who are suffering most. In a sense, that is exactly what social discrimination is: we made it, and although it is preventable, we have turned a social construct into a tangible reality within each victim’s body. Social discrimination is “the differential treatment of individuals based on their ethnicity, cultural background, social class, educational attainment, or other sociocultural distinctions” [1]. Social discrimination is rampant throughout societies around the world and is widely recognized as a type of psychological stressor, one that is detrimental to mental health. Discrimination has had, and continues to have, immeasurably damaging effects on people all over the globe. Recent studies have investigated how the distress of social discrimination leads to negative physical and mental health outcomes. We will examine how a discriminatory society has forced its way into our physiological functions.

How Does Social Discrimination Change Our Brains?

Hundreds of types of discrimination have remained rampant throughout societies for centuries, yet only recently have scientific studies begun to recognize the severity of social discrimination’s effects on the brain. For instance, a 2017 study examined seventy-four adults from traditionally marginalized communities, all of whom completed a self-report regarding their exposure to discrimination. They were then assessed with functional magnetic resonance imaging (fMRI), which measures blood flow in the brain. The fMRI scans measured activity in the amygdala, a brain region associated with emotion, as well as functional connectivity, a measure of the interactions between two regions of the brain. The results showed that individuals reporting more frequent discrimination had greater amygdala activity. The researchers also discovered increased connectivity between the amygdala and brain regions belonging to the salience network, a set of regions that tell us which stimuli we should pay attention to [2]. The results were eye-opening because greater amygdala activity is associated with higher levels of chronic stress, and greater connectivity between the two regions “has been reported in individuals with PTSD and single-episode depression” [3]. In summary, adults from traditionally marginalized communities had neural activity reflective of extremely stressed individuals, which can lead to higher risks for chronic stress and depression. This sheds light on the ever more apparent links between social discrimination and negative neurophysiological outcomes. 
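For readers curious what “functional connectivity” means in practice, here is a minimal illustrative sketch in Python: connectivity between two regions is commonly estimated as the correlation between their fMRI signal time courses. The region names and signals below are invented toy data, not values from the cited study.

```python
# Illustrative toy sketch: functional connectivity estimated as the Pearson
# correlation between the signal time courses of two brain regions.
# The signals are randomly generated stand-ins for real fMRI data.

import numpy as np

rng = np.random.default_rng(0)
n_timepoints = 200

# Hypothetical signal time course for one region (e.g., the amygdala)
amygdala = rng.standard_normal(n_timepoints)
# Make a second region partially track the first, plus independent noise
salience_region = 0.6 * amygdala + 0.8 * rng.standard_normal(n_timepoints)

# Functional connectivity as the correlation of the two time courses
connectivity = np.corrcoef(amygdala, salience_region)[0, 1]
print(f"functional connectivity (Pearson r): {connectivity:.2f}")
```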

Another study, examining how social discrimination affects the mental health of US college students, also found distressing results. Using data collected from the National College Health Assessment (NCHA), researchers surveyed around 417,000 college students from a myriad of universities across the US. Among the 7.9% of students who had experienced social discrimination in some form, there was a 37% increase in mental health symptoms and a 94% rise in the number of mental health diagnoses compared to students who reported no discriminatory experiences [4]. They also found other alarming statistics; for example, cisgender men who experienced discrimination were 210% more likely to consider suicide, 900% more likely to attempt suicide, and over 1000% more likely to self-diagnose with schizophrenia compared to cisgender men who had not experienced discrimination. Discriminatory experiences were reported most frequently among Hispanic/Latino, African-American, Asian, and multiracial students, although Indigenous students had the highest association with poor mental health outcomes [4].  

Allostasis as a Result of Continual Discriminatory Encounters

Now that we see the abhorrent impacts that discrimination has on mental health, it is no surprise that these effects can also manifest in physiological functions. But how do we get from discrimination to disease? Only recently have scientists discovered the means by which this can occur: allostatic load, the cumulative burden that chronic stress places on the body [5]. One example of its effects is a 2014 meta-analysis of the physiological impacts of racial discrimination, which found that allostatic load can exacerbate the toll discrimination takes on the body by leading to pathophysiology, that is, by thwarting how the body normally functions. Symptoms can include chronic stress, metabolic changes, and increased disease vulnerability. Essentially, the greater the chronic stress associated with social discrimination, the greater the cumulative impact on a person’s health.

Racial discrimination can also lead to the over-activation of the hypothalamic-pituitary-adrenal (HPA) axis, a feedback system in the body in which different hormone pathways communicate and interact with each other; principally, it acts as our body’s primary stress response pathway [6]. Over-activation of the HPA axis produces increased levels of cortisol, a hormone commonly released in response to stress. Chronic release of cortisol can cause “metabolic changes and effects on the immune system as well as behavioral alterations,” connecting effects such as “mood changes and cognitive impairment” to discrimination [7]. In short, social discrimination can lead to allostatic overload, which acts as the mechanism by which discrimination converts itself into disease within the body. 

Not only does discrimination affect our immune system and behavior, it also activates a separate stress-triggered pathway in the body, the sympathetic-adrenal-medullary (SAM) axis, which changes our cardiovascular function by increasing blood pressure and heart rate [8]. The over-activation of these pathways consequently becomes a catalyst for heart conditions such as cardiovascular disease. Consistent with these findings, traditionally marginalized groups in the US, such as African Americans and Hispanics, exhibit “chronic cardiovascular, inflammatory, and metabolic risk factors,” likely because of the discrimination they experience on a daily basis, according to Berger [9].

More recent research has uncovered the fact that discrimination compromises white matter integrity through constant states of stress. The study, conducted at Emory University with two groups of Black and White women, investigated links between incidences of medical disorders, racial discrimination, and white matter integrity [10]. White matter enables information to be transmitted between different structures within the brain [11]. The scientists found that white matter can act as a bridge between discrimination and physiological health through chronic inflammation: chronic inflammation damages the connectivity of white matter, which then translates into diseases such as hypertension, cardiovascular disease, and arthritis. Dr. Fani and her fellow researchers identified a pathway by which racist experiences may increase the risk for health problems via effects on select stress-sensitive brain pathways: “Now we can see that these changes may enhance the risk for negative health outcomes, possibly by influencing regulatory behaviors” [10]. Altogether, these studies expose the pathways by which discrimination harms marginalized groups and how it thrashes its way into our physiological functions. 

How much longer are we going to tolerate social discrimination within our societies? Why is it still acceptable? We continually say, in theory, we are advocates for inclusivity and social justice, yet shy away from condemning our peers, friends, or family members’ discriminatory remarks. To what extent will countless bodies pay for a sickness our society instituted? Discriminatory experiences accumulate in our bodies, eliciting both psychological and physiological consequences, while also altering stress load, white matter integrity, and even our risk for diseases. Social discrimination continues to seep its way into all facets of our world, while many of us passively watch as it rots our brains and bodies. Instead of watching as societal discrimination manifests itself into mental health disorders, diseases, and other horrifying effects, we should finally recognize and enact preventative measures against the overwhelming mental and physical burden discrimination poisons us with.


I’ll Be There for You

by Kiara Mehta

art by Laura Zang

Who is your best friend? Perhaps it is someone you met years ago from your hometown or simply someone you met this year in school, at your job, or enjoying a hobby. Think about how you established a friendship with this person and how much they impact your life. Did your relationship come naturally? Does it feel as though your friend makes up an essential part of who you are? Can you imagine your life without this person? While these questions might not be the first that come to mind when you think of your best friend, they are the basis of what allows us to understand why we choose to be friends with certain individuals and how they impact who we are. Making friends and establishing relationships is a fundamental human behavior affected by multiple factors, including culture, age, and interests. Among these, the human brain is an equally, if not more, influential part of whom we seek to call our friends. The brain plays a part not only in whom we choose to establish friendships with, but also in how our friends shape, influence, and change our brains.

 

Who Do Humans Consider Friends?

 

From the people we see on a daily basis to those in whom we confide our thoughts and everyday worries, it can be challenging to pinpoint what a friend is. That is why there is no exact definition of a friend; it varies from person to person. However, Lauren Brent, a neurobiology professor at Duke University, defines ‘friends’ as “pairs of individuals that engage in bi-directional affiliative (nonaggressive, nonreproductive) [nonsexual] interactions with such frequency and consistency so as to differentiate them from nonfriends” [1]. From this, friends seem to be two individuals who choose to continually surround themselves with each other on a regular basis. Nevertheless, the subjective nature of friendship needs further examination. For example, are we truly limited to only one friend? What degrees of frequency and consistency are necessary to consider individuals friends? Are people we feel fond of but see twice a year not actually our friends? If the definition excludes reproductive behavior, do relationships with a sexual component count as friendships? While the answers to these questions vary from person to person, Christopher Roberts-Griffin, a researcher who studies frequent and desired qualities of friendship, notes that similarity, among other factors, deeply affects what those answers could be [2].

 

How Does the Human Brain Influence and Decide Who Are Friends?

 

Have you ever come across someone you got along with remarkably well despite having only just met? If you have, you might have experienced the act of “clicking”. One of the most influential factors our brain takes into account when deciding who our friends are is our ability to click with certain people. In other words, when the key components of an individual’s personality fit with someone else’s so well that, over time, establishing a friendship becomes almost natural, these two individuals have clicked. However, the underlying science of how people click extends past face value into the world of complex neural processes and components. Carolyn Parkinson, a psychology professor at the University of California, Los Angeles, measured individuals’ brain activity while they watched videos and related the similarity of their neural responses to their closeness in a shared social network. Parkinson notes, “more generally, people who responded more similarly to the videos shown in the experiment were more likely to be closer to one another in their shared social network, and these effects were significant even when controlling for inter-subject similarities in demographic variables, such as age, gender, nationality, and ethnicity” [3]. From this, Parkinson concludes that similar activity in parts of the brain like the nucleus accumbens and amygdala makes us click with certain people and thus makes friendships more likely. The nucleus accumbens plays a major role in our motivation, actions, and reward experiences, while the amygdala is vital for basic emotions, processing our responses to external stimuli [4]. Together, the nucleus accumbens and amygdala influence our everyday actions and our emotional responses to others and the environment. Perhaps individuals whose brain activity shares similar emotional and motivational characteristics are able to click with one another, given the weight that our motivations, goals, and emotional responses carry in today’s society. Thus, how similar individuals are is a fundamental factor in establishing friendships.

 

While clicking with certain people is vital to establishing formative friendships, first impressions are just as important. Daniela Schiller, a researcher at the Center for Neural Science at New York University, demonstrates that within seconds of meeting someone, the amygdala and the posterior cingulate, which plays a role in cognition, form conclusions and quick decisions about people [5]. Not only does the amygdala take on important roles such as processing our environment and emotions, but it also enables us to form quick ideas and make assumptions about the people around us. Ultimately, our amygdala weighs these ideas to decide whether we should pursue a closer friendship after the initial impression.

 

How Does Brain Activity and Behavior Reflect Those of Friends?

 

Spending an extensive amount of time around our friends can lead to our brain activity and behavior reflecting theirs. For example, Tanya Chartrand, a professor at Duke University, refers to a concept known as the chameleon effect, in which people perform “nonconscious mimicry” of those around them [6]. Across three experiments, Chartrand demonstrated this effect by observing participants who unintentionally matched certain behaviors. In the first experiment, the participants’ motor behaviors inadvertently matched the motor behaviors of strangers when working on a task together [6]. In the second experiment, mimicking posture and movement was correlated with interactional flow (smoother interaction), and imitating posture and movement was also positively correlated with liking among participants [6]. Finally, the third experiment demonstrated that individuals who were more empathetic carried out the chameleon effect to a greater extent than others [6]. While there is much more to learn about why and how the human brain unconsciously picks up our friends’ behaviors, Chartrand’s results show that by unintentionally mimicking the actions and behaviors of those around us, even during simple tasks, we become more fond of them. We then continue to strengthen relationships, which, in turn, increases the frequency of the chameleon effect. This provides valuable insight into how our ability to click with others and cooperate is one way our friends influence how we carry out our actions. 

It is clear that how we are a reflection of our friends stems from a neurological basis. If our friends affect our physical behaviors, does this mean they are directly affecting our brain’s behaviors too?

Yes, but not in the way you might think. Referring back to Parkinson’s research, friends have “exceptionally similar neural responses” compared to individuals who are distant in a social network. Parkinson had a cohort of friends watch a collection of video clips that varied in topic and genre to keep familiarity among the participants constant and attention constrained. All participants were shown the same video clips in the same order, so that differences in neural responses reflected the viewers rather than the stimuli. Activity in the hippocampus (which plays a role in memory), the putamen (which aids in movement), and the amygdala was recorded. fMRI recordings of the amygdala and putamen showed that friends are very similar to each other in terms of how they “perceive, interpret, and react to the world around them”. Emotionally, we are very similar to our friends on a neurological basis. As a result, we tend to surround ourselves with individuals who not only think but also feel the same way we do. In addition to behaving analogously to our friends, our brain activity shows we are emotionally alike as well. Nevertheless, Parkinson notes that further research must be done to determine whether neural response similarity is a “cause or consequence of friendship” [3].

 

How Do Friends Change the Human Brain?

 

The human brain has a predominant role in choosing our friends. But how do our own friends influence and change our brains over time? Brittany Woods, a psychologist at Boston University, notes that close friends positively affect one another’s brains by lowering the response of the lateral prefrontal cortex. The lateral prefrontal cortex is believed to handle affect reappraisal, the process of reevaluating emotionally salient situations in order to respond and cope differently; a lowered response reduces overcontrol of positive emotions, resulting in an overall more positive affect [7]. Woods also noted that “neural response to their own close friend relative to an unfamiliar peer was related to greater activation in a cluster that encompassed both the caudate head and the septum, a region implicated in many affiliative processes such as unconditional trust” [7]. It is important to recognize that while individuals vary in how much they trust someone they have just met, over time they may learn to become more trusting as a result of establishing formative friendships.

 

Conclusion

 

Given the research that has been done exploring how our brain influences who our friends are and how our friends affect our brain, it is clear that our brain plays a prominent role in our social life. However, that is not to say that the brain is the sole factor that affects how we make friends. Making friends, as straightforward and natural as it may seem, is the result of a very complex process. Even more so, this process differs from person to person. With these underlying variations, it may be beneficial to research why the process of making friends is so different among people. Furthermore, more research can provide insight into why, neurologically, some individuals have a more difficult time establishing close and lasting friendships compared to others. In any case, it is important to recognize that our friends affect us in different ways and that our brain is a direct reflection of their effect on us as well as our decision to establish friendships with them. So, the next time you are around your best friend, remember your friendship, in part, stemmed from your very own brain.

 


The “Iron Man” of Biological Salvation: Biofeedback Use in Disease Mitigation

by Sujay Edavalapati

art by Leia Marshall & Nicole Cobb with Dream AI

Everyone knows that one Marvel fanatic, whether it be a friend or a sibling, who has watched all of the movies three times over, so you are likely familiar with Tony Stark’s beloved piece of augmented reality technology, J.A.R.V.I.S., from the Iron Man movies. J.A.R.V.I.S. was designed by Tony Stark, who used his own thoughts to control the AI, which gave him formidable power amongst the Avengers. What was only an idea to Stark materialized into reality as a result of his creativity. Although the Iron Man movies never explain how Stark was able to create such technology, it seems the modern world found a solution without Stark’s help. A concept first coined by researchers at Harvard University, biofeedback has taken the world by storm [1]. However, instead of becoming a formidable military weapon as in the Marvel universe, biofeedback technology in the real world serves as one of the premier sources of information for understanding human physiology and disease. 

Prescription drugs have been a longstanding pillar of medical treatment in modern society. However, if the evolution of modern epidemiology has taught society anything, it is that medicine is never black and white. Biofeedback therefore provides an alternative approach to understanding certain diseases in cases where prescription drugs may not be effective or available at all. Biofeedback is the technique of using data from sensors connected to an individual’s body to measure and monitor bodily processes and gather information for providers [2]. For example, neurofeedback, a form of biofeedback, can track information such as vitals through electrodes attached to the scalp, which was one of the ways J.A.R.V.I.S. was able to process Stark’s vitals [3]. Stark created a neural interface that connected to his armor, which allowed him to receive critical sensory and vital information during many of his important battles. 

Now, biofeedback isn’t as convoluted as the Iron Man movies may have made it seem; in reality, it works in four main steps for the average user. The first step involves monitoring one’s vitals to collect data about average heart rate (HR), breathing rate, blood pressure (BP), pulse, and temperature of the extremities. This information acts as a baseline for feedback and can give researchers more insight into what type of stress a particular patient is experiencing [4]. Notable physiological indicators of stress include higher HR, higher BP, and lower temperature and sensation in the extremities [5]. Individuals who use biofeedback therapeutically try to mitigate and relieve these stress indicators to help promote greater well-being. 

After this data is collected, a biofeedback technique is employed. The technique is specifically focused on giving patients information about their own physiological data, which allows them or their providers to decide on further courses of action. A common misconception is that biofeedback refers to one specific technique applied to all patients. In reality, biofeedback therapy falls within a larger set of positive adaptive coping techniques, therapeutic activities that many people already follow in their lives without noticing. Some biofeedback techniques include progressive relaxation exercises, meditation, yoga, exercise, deep breathing, or even discussion of the issue [4]. The goal of implementing one of these exercises is to assess the vital information before, during, and after the technique, monitor fluctuations, and provide positive feedback based on the results. This leads into the second step: linking the data received to physiological processes in the human body. For example, an increase in skin temperature is associated with increased blood circulation. The third step is to channel that feedback signal as a learning tool for improving physiological responses, which allows an individual to control the body’s response to certain stimuli by changing thoughts, actions, or behaviors. For example, after receiving a feedback signal from the machine that their skin temperature has dropped, an individual can practice relaxation techniques such as slower, deeper breathing to increase blood flow throughout the body [4]. The last step is to sustain this response over weeks or months to gain greater control of the body’s physiological response, ultimately without the use of a biofeedback device, to help tackle stress and other harmful behaviors. While this treatment does not produce an immediate response, if done correctly it can provide long-term, sustainable benefits that can be more effective than medication [4]. 
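As a rough illustration of the monitor-feedback-adjust cycle described above, here is a minimal toy sketch in Python. It is not a clinical protocol or a real device interface; the sensor function, baseline, and threshold are invented stand-ins for the kinds of readings and feedback a biofeedback session involves.

```python
# Illustrative toy sketch: a minimal biofeedback loop that compares a live
# reading against a resting baseline and prompts a relaxation exercise when
# the reading drifts into a "stressed" range. All values are invented.

import random
import time

BASELINE_HR = 70      # hypothetical resting heart rate (beats per minute)
STRESS_MARGIN = 15    # how far above baseline counts as a stress signal

def read_heart_rate() -> float:
    """Stand-in for a wearable sensor reading."""
    return BASELINE_HR + random.uniform(-5, 25)

def biofeedback_loop(cycles: int = 5) -> None:
    for _ in range(cycles):
        hr = read_heart_rate()
        if hr > BASELINE_HR + STRESS_MARGIN:
            # Feedback signal: prompt the user to apply a coping technique
            print(f"HR {hr:.0f} bpm, above baseline: try slow, deep breathing")
        else:
            # Positive feedback reinforces the learned response
            print(f"HR {hr:.0f} bpm, near baseline: relaxed")
        time.sleep(1)  # in practice, readings stream continuously

if __name__ == "__main__":
    biofeedback_loop()
```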

Indeed, it has been shown that people who complete a biofeedback treatment sequence experience health benefits. In 2016, a study was performed among managers at a workplace dealing with stress: subjects kept a daily stress diary and wore devices that detected changes in blood volume and heart rate over a five-week period. Overall, subjects reported a lower resting heart rate, less stress and anxiety, more energy, less fatigue, and better health perception and social functioning [6]. Biofeedback has also been shown to help those who are struggling with their mental health, because it can be delivered through wearable technology, or through unconventional methods like video games, to help regulate anxiety and stress levels and promote greater well-being [7]. At a biological level, biofeedback helps inhibit increases in cortisol, a hormone induced by chronic stressors, which is why those who use biofeedback can experience lower cortisol levels over the treatment period. The contributions of biofeedback are also evident in physical health, such as the use of EMG feedback after knee surgery to help improve the knee’s range of motion or to aid limb rehabilitation in elderly patients [8]. Yet, we can’t forget about its applications to neuroscience, can we? Neurofeedback, a form of biofeedback, makes use of the brain’s neuroplasticity, the ability of the brain to change its functionality over time, to help regulate homeostasis, mitigate disease, and strengthen the stability of the body [9]. 

Biofeedback is generally regarded as a safe procedure, but there are some risk factors, such as certain heart rhythm problems or skin conditions [10]. Biofeedback has not been tested for every condition, and there is not yet sufficient evidence to say that it will completely treat a disease, but research is ongoing at many institutions to gain a better understanding of its applications and effects on numerous human diseases. Some key limitations of biofeedback research include small sample sizes and a lack of consistency between treatment groups, which can arise from differences in triggers and compliance between subjects [11]. However, future studies hope to compensate for these limitations by including longer treatment durations, larger sample sizes, more physiological parameters, a method to track the change in symptoms over the course of treatment, and a long-term follow-up appointment to record anxiety and physiological levels after the treatment [12]. 

Biofeedback has been an up-and-coming therapeutic technique in medicine in recent decades, and while its effects are still being fully studied, it has helped encourage a dramatic shift in the way people view medical treatment. Current research has shown how instrumental a tool biofeedback can be, not only in relieving a plethora of medical conditions but also in helping the average person mitigate detrimental stressors. Although modern biofeedback therapy must make significant strides to become as advanced as J.A.R.V.I.S., its clinical applications already mark a promising step in that direction. 

 


Head Transplants: Medicine or Experimental Indulgence?

by Elena Perez

art by Victoria Amorim

Eerie music begins to play, fog fills the room, lightning strikes, and a man in a white lab coat exclaims, “IT’S ALIVE!” A figure sits up from the cold metallic examination table, only to look down at their body and discover it is not their own, but someone else entirely. You might recognize elements of this scene from Mary Shelley’s Frankenstein. However, in our story, the mad scientist is not Dr. Frankenstein but Dr. Sergio Canavero, an Italian neurosurgeon, and the creature is not a man brought back from the dead but someone who has experienced HEAVEN. To me, what Canavero calls HEAVEN seems more occult than it is angelic. The head anastomosis venture (HEAVEN) is a surgical procedure in which a head is transplanted from one body to another. Canavero claims the procedure could bring mobility back to people with quadriplegia (paralysis of all four limbs), restore quality of life for individuals suffering from neurodegenerative diseases such as Spinal Muscular Atrophy or Amyotrophic Lateral Sclerosis, and even pave the way for immortality. Some call him a visionary, others call him a nutcase. Although this might sound like an outlandish endeavor, never meant to escape the realm of science fiction, Canavero and his team have already made significant headway. When someone has a faulty organ, it can be replaced with a live organ transplant, so why would someone not have their head transplanted from a faulty body to a healthy one?

 

The Surgical Procedure

No easy feat, the 36-hour procedure would involve four teams of surgeons, each including a variety of medical personnel (surgeons of different specialties, anesthesiologists, nurses, and technicians) [1]. To begin, the teams would anesthetize, intubate, and cool the bodies of a brain-dead body donor and the transplant recipient to 15 degrees Celsius [2]. Deliberate hypothermia slows down metabolic processes, reducing inflammation and cell death, which gives the surgeons about one hour before tissue damage occurs [3]. Cutting both patients’ necks, the surgeons link the blood vessels from the head of the transplant recipient to the blood vessels of the donor body via tubes. The most critical phase of the procedure comes next, in a process called the GEMINI spinal cord fusion protocol, in which the doctors sever each patient’s spinal cord and place the head of the transplant recipient onto the donor’s body [2]. They use a substance called polyethylene glycol (PEG) as a “biological glue” to help promote spinal cord fusion [4]. Then, the team sutures the muscles and blood vessels of the head to those of the body. To assist the healing process, the patient is put into a medically induced coma for about one month, and the medical staff carries out extensive immunosuppression protocols to reduce the chances of the immune system rejecting the transplant [2].

Endless Possibilities versus Empty Claims

Over the past decade, Canavero has worked with Xiaoping Ren, a Chinese orthopedic surgeon, to refine the procedure and perform experiments on mice, dogs, and even human corpses [1,5,6]. As you can imagine, many medical professionals are skeptical of the feasibility of the procedure for a live patient. In response, Canavero and Ren emphasize that only 10% of descending spinal tracts (neuronal pathways that send motor signals from the brain to the spinal cord) are required for voluntary movement [7]. The GEMINI spinal cord fusion protocol has a success rate of up to 20% reconnection, which Ren and Canavero have demonstrated is sufficient to allow some motor function in mice and dogs [2,5]. This is largely due to the use of an extremely sharp blade, which cleanly cuts each neuron’s axon in the spinal cord, and the use of PEG, which promotes axon reconnection. Before these techniques, animals who underwent head transplantation were left paralyzed from the neck down [8]. Although PEG-mediated axonal recovery is not perfect, the return of some motor control is a feat in itself. Even the subject surviving the procedure is an accomplishment. The ischemic period, that is, the time the brain is without blood flow (mitigated by induced hypothermia), can cause massive cell death. One mouse study by Ren showed 12 out of 80 mice surviving past 24 hours [9]. However unfavorable this might seem, a 15% survival rate for head transplantation in a mouse model would have been thought impossible 20 years ago. 

As impressive as Ren and Canavero’s progress has been, the procedure still has many practical limitations. While the GEMINI protocol does induce some axon regeneration, the effects of such minimal nerve recovery on the spinal cord’s many other functions, including the transmission of sensory information, pain sensation, and proprioception, are uncertain [10]. Functional outcomes, however, are of little importance if immune rejection occurs. Transplant recipients must take immunosuppressive drugs (often for the rest of their lives) to reduce the chance of the body’s immune system attacking and destroying the foreign organ or tissue [11]. Rejection remains a prevalent obstacle: about 85% of hand transplant recipients experience acute immune rejection within the first year [12].

Even if the immune response is controlled, coming to terms with having an entirely new body might make someone lose their mind. Transplant recipients frequently experience depression or psychosis as a result of mind-body dissonance [13,14]. The recipients of the first hand transplant and the first penile transplant can attest to this: the severe psychological distress that ensued led both to have their transplants removed [15]. Considering that head transplantation is a far more extreme change in appearance than a single organ transplant, critics argue that the procedure could drive patients to insanity and suicide [16]. Canavero contends that the “self is highly plastic and easily manipulatable,” so with a combination of immersive virtual reality (IVR) and hypnosis, the individual would psychologically adapt. Seemingly unfazed by the weaknesses of the procedure, Canavero continues his pursuit to push the boundaries of modern medicine, proclaiming that every life-saving procedure was deemed outrageous or impossible at some point [7].

 

Immortality Imagined

Canavero proposes that the surgery could be used to transcend the laws of man and let us live forever. In his book, Head Transplants and the Quest for Immortality, Canavero posits that by taking the head of an aging recipient and transplanting it onto the body of a younger clone, a person’s life could be extended by up to 40 years [17]. Supposedly, the blood from the young body would have rejuvenating effects on the brain [18,19]. In his vision of the future, individuals would body swap with younger clones of themselves in an ongoing cycle [17]. At a press conference in Vienna, Austria, in 2017, Canavero said this:


For too long nature has dictated her rules to us. We’re born, we grow, we age and we die. For millions of years humans have evolved and 100 billion humans have died. That’s genocide on a mass scale. We have entered an age where we will take our destiny back in our hands. It will change everything. It will change you at every level.


As Canavero puts it, aging is genocide, and our only hope is HEAVEN. This places Canavero at the heart of the transhumanist movement, which advocates for the enhancement of the human race using science and technology to extend life spans and improve cognition [20]. It is human nature to fear death and fight against it, but it is also our biological nature to die. In disputing death, Canavero fancies himself a god. After the monster comes to life, Dr. Frankenstein exclaims, “In the name of God, now I know what it feels like to be God!” [21]. Perhaps delusions of grandeur are just the trademark of a mad scientist. Still, this raises the question: should head transplantation really be done, or has Canavero’s inflated ego clouded his judgment?


Cerebrocentrism in the Era of Embodiment

While head transplantation possesses many operational flaws, Canavero’s proposal suffers from one principal philosophical pitfall: cerebrocentrism [10]. Fundamentally, the HEAVEN protocol assumes that personhood, selfhood, and identity are localized to the brain, and that the body is merely a physical conduit through which the brain can interact with the external environment. Similarly, mind-body dualism, articulated by the 17th-century French philosopher René Descartes, defines the mind and body as completely separate entities that can exist independently of one another. As Descartes famously said, “cogito, ergo sum,” meaning “I think, therefore I am” [22]. But are we actually nothing more than our thoughts? Consider the “brain in a vat” thought experiment proposed by Gilbert Harman, a 20th-century American philosopher. Imagine your brain was removed from your body, placed in a vat, and connected to a supercomputer that supplied your neurons with all the same electrical impulses that your body normally would. Your “disembodied” brain would register these virtual stimuli as a conscious reality. Many philosophers have tackled the “brain in a vat” scenario to question how one can ever know that what they are experiencing is real and not just a bunch of electrical impulses from a supercomputer [23]. But I would like to pose a different question: if you were just a brain in a vat, would you still be you? This thought experiment, like Ren and Canavero’s HEAVEN, rests on the assumption that a person is their brain, or rather, that the essence of a person consists merely of electrical impulses and that the body is expendable to the human experience.

To me, this 17th-century philosophy seems a bit out of date. We live in an era of embodiment, which recognizes that our bodies are integral to our personal identity and our experience of being. In many ways, our bodies characterize who we are, to ourselves and others, defining how we move throughout the world [24]. As Dr. Thomas Fuchs, a professor of philosophy and psychiatry, puts it in his book In Defense of the Human Being, the brain “is only the necessary, but by no means sufficient condition for personal experience and behavior. It is not the brain, but the living person who feels, thinks, and acts” [25]. Indeed, advances in the field of embodied cognition show us that our bodies influence emotion, personality, and identity in profound ways. For instance, the enteric nervous system (ENS), often called a “second brain,” controls the gastrointestinal tract and has five times more neurons than the entire spinal cord [26]. Research shows that the ENS might have a significant influence over our emotions and mental wellbeing [27]. The gut-brain connection is further governed by the gut microbiome, the collection of microbes (bacteria, fungi, and viruses) that live inside the gut [28]. Gut microbiota can influence sociality, perceptions of stress, and even the development of neurological disorders [29,30,31]. Although the roles of the ENS and gut microbiome in shaping personhood are not fully understood, the body is evidently integrated with cognition in complex ways. Ren and Canavero’s stance that bodies are transient frames that can be arbitrarily switched out without distortion to the self is an irresponsible one that completely disregards the prevailing view of embodiment.


Takeaways

Understandably, the prospect of functionally curing every disease or injury that does not affect the brain is enticing, however unrealistic. The 10-million-dollar procedure strikes me more as a pompous passion project than an altruistic undertaking. Canavero’s radical and idealistic mission generates many more ethical dilemmas than one commentary can engage with, provoking questions about body sourcing, animal experimentation, and accessibility. This is not to say that there are no practical benefits to his work. Ren and Canavero’s advancements in the understanding of neuronal plasticity and axon regeneration in the spinal cord could be applied to helping those with spinal cord injuries, tens of thousands of which occur every year [32]. From a utilitarian perspective, the greatest good that could come of this research would be that which improves the lives of the most people [33]. But alas, Canavero is not a utilitarian, and perhaps neither are you. I urge you to consider the evidence presented here and decide for yourself: Is head transplantation ethically supportable? Should humans pursue immortality, and at what cost? Are you your brain? Why or why not?



Out of Body, Out of Focus: The Disconnect Between Mind and Body

by Drew Lawless

art by Charlotte Kaufmann

Without warning, our minds have the terrifying power to pull us from being in focus into a sea of disorientation. This feeling, known to most as “dissociation,” is a bodily response that causes one to feel disconnected from their thoughts, emotions, and surroundings, which can be a startling and abstract experience. Sometimes this cognitive response goes beyond its general protective function; that is when dissociation can be officially recognized as a disorder. This article aims to explain the neuroscientific perspective on why dissociation occurs, what causes it to go haywire in the form of a disorder, and how it varies from person to person.

The documentation and study of dissociation can be traced as far back as the late 18th century, but only after the Vietnam War, in the late 1970s, was dissociation truly recognized as a legitimate psychological state of being. Over time, it also began to be identified as a common symptom of other mental disorders [1]. This was primarily due to dissociation being tied to post-traumatic stress disorder, or PTSD for short. PTSD is now known to be heavily influenced by dissociation, mainly because affected veterans were seen to confuse the traumatic event (such as gunfire or an explosion) with non-traumatic, everyday stimuli (such as a car backfiring) [2]. The sounds and sights of wartime carried over into certain veterans’ post-war lives, leaving them to suffer through an unknown “out-of-focus” sensation when triggered by their environment [2]. Following this period, various tests and DSM (Diagnostic and Statistical Manual of Mental Disorders) classifications were developed for the “subcategories” of dissociation and for disorders that have it as a major symptom. It has since become a major subject of psychological research and inquiry [1].

Unfortunately, dissociation is still a relatively unexplored field of neuroscience. Consequently, psychologists and therapists lack the knowledge to fully accommodate the needs of their patients. There is little general knowledge about dissociation and no single accepted definition of the condition. Such a large variety of related symptoms and perspectives exists that it is difficult to pinpoint what exactly it means to dissociate or to have a dissociative disorder [1]. Generally, however, dissociation can be grouped into two separate sub-experiences: depersonalization and derealization. Depersonalization describes “out-of-body” moments, while derealization is associated with “out-of-focus” moments. These sub-experiences vary in how extreme they are, how often different individuals notice them, and which other symptoms accompany an episode of dissociation.

From a physiological perspective, dissociation is the human body trying to protect us from negative external or internal stimuli, such as a stressful situation or general anxiety. From a neuroscience perspective, research points to the brain’s fear response and emotion-processing centers as the primary cause [3]. Electrical signals from active brain cells, or neurons, are slowed down, so fewer activating messages or commands are sent to other brain regions, and those that are sent take much longer to reach their designated targets. As a result, they have no lasting effect on the “cortico-limbic system,” the portion of the brain that associates our thoughts and awareness with our emotions, and the individual cannot fully comprehend what is actually happening [3]. The overarching result is a lowered sensory awareness of one’s self and one’s surroundings [3].

What happens, however, when this natural response occurs without a reasonable or legitimate stimulus? This is where general dissociation turns into a dissociative disorder. Various dissociative disorders exist, but three prevalent disorders worth focusing on are dissociative amnesia, depersonalization-derealization disorder, and dissociative identity disorder. While plenty of other mental disorders involve dissociation, these three in particular showcase especially explicit or extreme dissociative symptoms.

Dissociative amnesia is characterized by an unusual inability to recall information or memories related to a traumatic or stressful event in one’s past. It usually develops in response to trauma at a young age, mainly because those years are so critical for the brain’s development [4]. There are, of course, instances of adults experiencing severe trauma and developing the disorder as well. In either case, the person’s mind works extra hard to shut out the traumatizing stimuli, to the point that it becomes quite debilitating and prevents easy retrieval of memories related to that time or event [4].

Depersonalization-derealization disorder is just what it sounds like: it features extreme, maladaptive cases of both depersonalization and derealization symptoms. The person diagnosed might feel lifeless or aimless, as if they were watching a movie of their life instead of actually experiencing it. They are, of course, still conscious, active humans, but they function much differently internally when going through an episode of the disorder [5]. The symptoms can be brought on by the biological response to stress or by no apparent trigger at all; either way, the response is kicked into overdrive and makes the person hyper-aware of their “trance-like” state [5].

Dissociative identity disorder, easily one of the most interesting and debated mental disorders, is what was once referred to as “multiple personality disorder.” It can be defined as the creation of alternate “senses of self” that appear to most outside observers as distinct personality changes. This occurs in response to intense childhood trauma, abuse, or neglect, mainly to protect the original personality from the stressful stimulus or pain being suffered. The dissociative aspect causes the original personality to forfeit control of the body, and awareness of the surroundings, to an alternate personality [6]. This is not a voluntary action by the primary self; to them, it is as if time has skipped forward, and they have no memory of what happened between their last thought and now. Of course, the experience differs for everyone who has the disorder, but some very extreme cases, such as someone having 27 personalities, have made the illness a major cultural spectacle. Psychologists and neuroscientists have studied the illness for decades, and despite long-standing skepticism that it was not a real disorder, more recent scientific evidence has shown it to have a biological basis as a legitimate chronic disorder [6].

With regard to these and other disorders related to dissociation, treatment is a feasible yet abstract process for those suffering through its wide range of symptoms. No specific medication is available to target the areas of the brain responsible for dissociation, and there is no one definitive therapy option or style in use. However, a mixture of various therapeutic strategies and available medications can improve one’s well-being [7]. Common examples include talking through the traumatic stimulus that causes the symptoms with a licensed therapist or psychiatrist, learning how to bring oneself back into focus when dissociating through a strategy called “grounding,” and regulating the stress itself with medication [7].

Since most disorders related to dissociation involve issues with the body’s stress response system, antidepressants, antipsychotics, and anti-anxiety medications can help lower the afflicted mind’s natural tendency to pull one’s consciousness away from one’s surroundings. The intended result is to lessen the severity and frequency of the dissociative symptoms. Neurotransmitters, chemicals responsible for changing the electrical activity (or inactivity) of certain brain regions, are the main focus of these medications [8]. These include serotonin, which regulates mood; dopamine, which governs feelings of reward and happiness; GABA, which inhibits anxiety or stress; and glutamate, which supports learning and memory [8].

Current research on dissociation attempts to pinpoint the source of dissociative problems in the brain more precisely. A recent study from Stanford found interesting evidence pointing to new areas of the brain, and even of our cells, that might be the root cause of these dissociative feelings: specifically, the posteromedial cortex, which is involved in subjective thought, and ion channels on the cell membrane, which control the sending of signals by neurons. When researchers stimulated these regions in people diagnosed with epilepsy (a condition marked by recurrent seizures), the patients experienced mild dissociation but no seizure whatsoever [9]. Although this does not yet amount to a treatment for dissociative symptoms, the research greatly advances our understanding of neural signaling mechanisms that can cause dissociation when underactive, in contrast to the overactivity seen in epilepsy, and it gives us a better idea of where to look for the cause of dissociation [9].

Due to the complex nature of dissociation, there is no single correct answer. Despite these unfortunate circumstances, there are various means of seeking help for symptoms. The amount of research available on dissociation remains inadequate to properly understand and treat its many facets, but with time, resources, and continued support for the neuroscience community, there is great hope for what can be discovered and done for those dealing with any form of dissociative or related disorder. What can be done now is to inform oneself about what people with these disorders go through. Although the average person can’t change dissociation in a single night, it helps tremendously to be aware of dissociation in general and of how prevalent it actually is in human life.
