All About Georgia OTTers

The otter is an amphibious mammal found throughout the United States and in many other regions of the world. Happily, these charming animals can be found in our home state of Georgia. The North American river otter (Lontra canadensis) is a playful, amphibious member of the weasel family, as are the mink and the sea otter. A river otter’s fur ranges from dark brown to almost black.

In the United States, many states have experienced a gradual decline in river otter populations over time. Otters are resilient creatures, however: during a period of intense international fur trade in the late 1800s, they were hunted almost to extinction for their luxurious fur, until the International Fur Seal Treaty of 1911 banned the sale of otter fur.

Today, Georgia’s river otter population remains fairly widespread throughout much of the state. These otters hunt, play, and sleep in Georgia’s rivers and streams, and they now thrive even in areas of north Georgia where their populations were once at risk of diminishing. One major factor influencing river otter populations is pollution: in areas with heavily polluted water, otter populations are more limited. Like many animals, river otters depend directly on habitat quality and availability. They build dens in well-vegetated locations near waterways, and these dens protect them from flooding and from predators (Malzahn, Caven & Wiese, 2020).

In Georgia, there are several places where residents and visitors can see river otters up close. The Georgia Aquarium is a wonderful place to see, interact with, and even feed otters! It offers a unique opportunity to touch, feed, and talk to sea otters alongside a professional trainer. The North Georgia Wildlife Park and Zoo also offers interactive experiences for learning about and seeing our native Georgia river otter, with options that include the Otter Splash, the Otter Experience, and the VIP Otter Experience. Both places offer a great opportunity to learn more about river otters and how they behave.

Facts about the River Otter


Malzahn, J. M., Caven, A. J., & Wiese, J. D. (2020). Characteristics of a river otter (Lontra canadensis) maternal den in the central Platte River Valley, NE.

What is Informed Consent?

Informed consent is the process of obtaining a patient’s or participant’s permission before conducting a medical procedure or investigation on that person. It involves ensuring that the participant fully comprehends and agrees to the potential consequences of any procedures they will undergo. Examples include a health care provider asking a patient to consent to a surgical procedure before performing it, or a psychologist discussing a study with a prospective research participant before enrolling them in an experiment. Informed consent is collected according to guidelines from the fields of medical ethics and research ethics, and it centers on the protection of patient welfare and security.

When a healthcare provider recommends specific medical care, the provider must clearly outline all aspects of a given procedure to the patient, who may agree to the entire procedure or only parts of it. Beforehand, the patient must complete and sign a consent form, which serves as a legal document of agreement and participation. This form typically contains essential information about the procedure, such as the name of the patient’s condition, the intervention the provider recommends, the risks and benefits of that intervention, and the risks and benefits of any alternatives (including not performing the intervention). For consent to be properly given, the patient must have received all information about potential treatments, understood that information, had a chance to ask questions, used the information to decide whether to receive the recommended treatment options, and agreed to receive some or all of them. Only when these essential steps are completed can informed consent be satisfactorily given in a medical context.

Meanwhile, the main purpose of clinical trials is to study new medical products in people. As such, informed consent for research or clinical trials is also required, as newly-developed medical products may contain unforeseen side effects or risks. It is therefore important for those considering participation in a clinical trial to understand their role as a research subject rather than a patient, allowing them to make educated decisions about their participation in a study. Participants must be informed about what will be done to them, how the research will proceed, what risks or discomforts they may experience, and that their participation is a completely voluntary decision. A potential research subject must also have had the opportunity to read the consent form, ask questions about anything they do not understand, and have had a sufficient amount of time to make an informed decision.

The processes for healthcare and research are similar in nature, with both having three main ideas that must be fulfilled in order for an individual to have given valid informed consent:

  1. Disclosure. The provider has supplied the subject with the information necessary to make an autonomous decision.
  2. Capacity. The subject has both understood the information provided and formed a reasonable judgment based on the potential consequences of their decision.
  3. Voluntariness. The subject has made an autonomous decision without being subjected to unfair external pressures.

In general, informed consent can only be given by adults who are capable of making their own medical decisions. For children and others who cannot make their own medical decisions, such as individuals with mental disabilities, informed consent must be given by a parent, guardian, or other surrogate: someone legally responsible for making decisions on that person’s behalf. The duty of obtaining informed consent for a child’s participation in a research study therefore falls to the parents or guardians, who are presumed to have the best interests of the child in mind. Nonetheless, issues can arise. For instance, there is evidence that even adults giving informed consent for themselves often do so imperfectly, with poor comprehension of the voluntary nature of study participation, the meaning of the language used in the trial, and other issues.

Institutional Review Boards (IRBs) have been put in place by the FDA to preserve the rights of human subjects in biomedical research. An IRB has the ability to review research and can request modifications, approve, and disapprove research to ensure the safety and wellbeing of the research’s subjects. However, an IRB is also able to grant complete waivers of informed consent in the case of research on medical records if it is not practical to obtain consent and as long as there are appropriate guidelines in place to protect the sensitive information. With institutions that serve as “learning healthcare systems,” such as Emory, people may be involved in research that will serve to benefit society, without knowing which studies their records are being utilized for specifically. However, the patient is able to request a record of all disclosures of their HIPAA-protected information for research purposes at any time.

One of the only exceptions to informed consent is in the context of medical emergencies, when a decision must be made urgently and the patient or their surrogate is unable to partake in decision making. Under such circumstances, physicians may initiate treatment without prior informed consent. Even then, the physician should seek to inform the patient or surrogate at the earliest opportunity and obtain consent for ongoing treatment in order to maintain ethical standards.

Overall, informed consent is a procedure that protects patients and participants from undergoing procedures that they may not completely understand or agree to. Through the key principles of disclosure, capacity, and voluntariness, informed consent also protects individuals from potential mistreatment or falsified information. The process of informed consent thus plays a vital role in medical and research ethics, allowing for greater transparency in work that continues to improve our society and world.

Further Resources

An Introduction to OTTers


The otter is a resilient and fascinating animal found throughout the United States and on most continents. From Asia to Alaska, otters are loved around the world. Otters belong to the weasel family, Mustelidae, and are its most aquatic members. These charming animals have short ears, elongated bodies, and very soft fur, and they are relatively small, averaging about four feet in length and weighing up to 30 pounds. Thirteen otter species have been identified in total. Because they are amphibious, otters live primarily on land very close to bodies of water. Their dense, soft fur is important because it plays a large role in insulating these animals while they are in the water.

Environmental Impact and Otters

It is important to acknowledge that the environment’s health has a direct relationship to the health of many animals, including the otter. Today, many otter populations are still at risk! As countries industrialize, their wildlife can slowly but surely disappear (Duplaix & Savage, 2020). The otter’s history is particularly striking: because of their luxurious fur, these animals were trapped, killed, and illegally sold across continents. These pressures caused a period of major population loss during the 1960s and 1970s, followed by a slow population recovery (Mason & Macdonald, 2009).

How Otters Use Tools to Eat

The otter is an expert hunter, and otters often use tools to eat. They are very innovative creatures and are among the few animals that use tools to obtain food. Their diet can vary with the season; however, they are considered opportunistic predators and thrive on fish. Sea otters, for instance, are able to open mussels by smashing them against stones. Otter habitats vary by species, but these animals thrive both on land and in water, where they form dens and adapt to their natural surroundings.


  • Duplaix, N., & Savage, M. (2020). The global otter conservation strategy. eScholarship, University of California.

  • Mason, C. F., & Macdonald, S. M. (2009). Otters: ecology and conservation. Cambridge University Press.

History of Ventilators

Ventilators are machines that can help patients breathe or, in some cases, breathe for them. Doctors use ventilators in severe cases, when it is determined that the patient cannot get enough oxygen through regular breathing, even with supplemental oxygen. While on a ventilator, the patient’s lungs have the opportunity to begin healing and receive much-needed medications until breathing can be restored. Ventilators are now a standard part of critical care, and their technology has evolved significantly over the last 100 years.

The earliest attempts to support breathing mechanically can be traced back to the late 18th century. These early ventilators relied on negative pressure, the same principle behind the most widely used ventilation device of the 20th century: the iron lung. During the polio epidemic of the early 20th century, children with paralyzed breathing muscles were placed in these machines, which expanded and contracted to force air into and out of the lungs. The technique required the patient to be fully encased in the iron lung with only their head sticking out. In the 1960s, researchers began developing positive-pressure machines, which force air directly into the lungs. This technology caught on quickly, and all modern ventilators now rely on positive pressure. These machines require the insertion of a tube into the sedated patient’s trachea (intubation), making them more invasive than negative-pressure ventilators.

Modern mechanical ventilators are much more portable than their predecessors and provide many adjustable features that can facilitate air flow and adjust the pressure and rate according to the patient’s needs. The goal is to optimize the process for each patient, to ensure as much comfort as possible and have a better outcome. While they are generally computerized microprocessor-controlled machines, patients can also be ventilated with a simple hand-operated bag valve mask in case of emergency.

Given the importance of ventilators in hospitals, we expect that future developments will allow them to integrate even further with other components of critical care. This will likely be assisted by electronic means of communication between different bedside devices for a more efficient interaction. Other possible features are the incorporation of ventilator management protocols into the basic operation of the ventilator, displays with organized information instead of rows of unrelated data, and smart alarm systems. Doctors hope that these improvements will lead to better outcomes for the patient and a higher level of care.

For a more in-depth study of the past, present, and future of ventilators, visit:

The Institutional Review Boards 101

New therapies and drug mechanisms are not always the daily news headline, but ethical guidelines now exist to maintain standards for the production of any new medication or treatment. The history of clinical research, however, has not always been so ethical. The PHS Syphilis Study in Tuskegee, AL and the Willowbrook Hepatitis Experiments are only two of many notorious examples of horrifically unethical clinical trials. The purpose of this article is to shed light on the role of Institutional Review Boards (IRBs) in overseeing clinical trials today to ensure the safety of human participants.

In 1974, President Nixon signed the National Research Act into law. The act was created to ensure the quality and integrity of biomedical and behavioral research within the United States, and its guidelines emphasized respect for autonomy, beneficence, and justice for research participants. As a result, the Institutional Review Board (IRB) was formalized as a required committee for all DHHS-funded research. This committee, which may reside within the research institution or be external (e.g., a commercial IRB), is an ethical review board designated to protect the rights and well-being of human research participants.

Although an IRB is often made up of faculty and staff of the institution whose research it reviews, it must remain independent in its judgments to avoid bias. The IRB functions to review and monitor research involving human subjects, and it has the power to approve a clinical trial, require changes to it, or reject it. Patient safety is a priority in clinical trials, and the IRB plays a fundamental role here: the board reviews study protocols and monitors progress throughout the study. The main goal of the IRB is to confirm that the right steps are taken to protect the welfare and rights of participants, and it also works to verify the integrity and quality of the data being collected. IRB review is required by Food and Drug Administration (FDA) regulations, and the IRB may audit clinical trial study records.

Before a patient is recruited for a clinical trial, legal and ethical steps must be taken to ensure the patient fully understands what their part in the trial will entail. The IRB reviews the documentation presented to participants to ensure that procedures, risks, and benefits are discussed. This process is known as informed consent. Informed consent consists of verbal and written documentation confirming that participants acknowledge and understand their part in the clinical trial in its entirety. A signed informed consent document is part of ensuring that the institution is compliant. The process is designed to help patients thoroughly understand what to expect, as well as the risks and benefits of participating. It is important to note that the consent form is only one part of the informed consent process; consent must be ongoing, including updating the participant with any new information that emerges during the study.

In conclusion, while some of the history of clinical trials is disheartening, the IRB today provides advocacy and protection for every participant in a clinical trial and remains an integral safeguard of the welfare and safety of human participants.


The History and Role of Institutional Review Boards: A Useful Tension:

Being in a Clinical Trial:

Clinical Trials: What Patients Need to Know:

Thinking about joining a clinical trial? Here’s what you need to know:

Fats: The Good, The Bad, and The Ugly

Fats are confusing. There are some good ones, a lot of bad ones, and it is hard to keep track of the ones you want and the ones you don’t. Hopefully, this article will help keep things straight.

The body contains three types of lipids. Lipids are a class of organic compounds that are insoluble in water. One of the least discussed but most important types of lipids in the body is the phospholipid. Phospholipids are the main constituent of cell membranes and play an important role in determining what enters the cell and what is kept out.

The second type of lipid is the sterol. Cholesterol is a sterol and is used by the body in the synthesis of hormones. Cholesterol is, of course, infamous for its links to cardiovascular disease. However, there are two types of cholesterol, “good” cholesterol and “bad” cholesterol, classified by the type of lipoprotein in which the cholesterol is contained. Lipoproteins are essentially large droplets of fat: their core is composed of a mix of triglycerides and cholesterol and is enclosed in a layer of phospholipids. There are five types of lipoproteins, but the two best known are low-density lipoproteins (LDL), or “bad cholesterol,” and high-density lipoproteins (HDL), or “good cholesterol.”

In high quantities, bad cholesterol accumulates in the walls of arteries, where LDLs are oxidized. Oxidized LDL damages artery walls, and this damage leads to inflammation, which in turn causes constriction of the arteries (raising blood pressure) and further accumulation of cholesterol, forming plaques. These plaques further narrow the arteries, decreasing the flow of blood and oxygen to tissues.

High-density lipoproteins, or good cholesterol, on the other hand, play an important role in reverse cholesterol transport, a process by which excess bad cholesterol is transported to the liver for disposal. Good cholesterol also has anti-inflammatory and vasodilatory properties and protects the body from LDL oxidative damage.

Perhaps unsurprisingly, fried food, fast food, processed meats, and sugary desserts lead to increased bad cholesterol levels while fish, nuts, flax seeds and – you guessed it! – avocados lead to increases in good cholesterol levels.

The final type of lipid in the body is the triglyceride. Triglycerides are the fat in the blood: any calories not used by the body are stored in the form of triglycerides. The effect of high triglyceride levels on the heart has historically been less well understood. Excessive triglyceride levels typically accompany high levels of bad cholesterol, and recent research has indicated a relationship between high triglycerides and risk for cardiovascular disease.

The fats that we consume are broken down during digestion and converted into the triglycerides and cholesterol circulating in the body. The major dietary fats are classified into saturated fats, trans fats, monounsaturated fats, and polyunsaturated fats.

Saturated fats are fats whose molecules contain no carbon-carbon double bonds. They are fats to avoid because they raise LDL levels by inhibiting LDL receptors and enhancing lipoprotein production. Saturated fats are solid at room temperature and are found in fatty beef, lamb, pork, butter, lard, cream, and cheese.

Trans fats are also bad fats. They are typically found in margarine, baked goods, and fried food. They suppress chemicals that protect against the buildup of plaques in artery walls, increase bad cholesterol, and decrease good cholesterol.

Monounsaturated fats and polyunsaturated fats are fats that have one (mono) and many (poly) carbon-carbon double bonds in their molecules respectively. These fats are liquids at room temperature and are found in salmon, nuts, seeds, and vegetable oils. Polyunsaturated fats are associated with decreased bad cholesterol and triglyceride levels.

Keeping track of which fats are found in which foods can seem intimidating, but the foods that raise good cholesterol are the ones typically considered healthy, such as nuts, seeds, fish, fruits, and vegetables, while the foods that raise bad cholesterol are the ones we are taught to avoid in excess anyway, such as processed and fatty meats, processed food, and fried food.

Information on the various types of fats and on the foods that contain them:
A guide to choosing healthy fats:

A History of the Hippocratic Oath

The Hippocratic Oath is arguably one of the most famous oaths of ethics in history. Originating in Ancient Greece, it centers on medical practitioners swearing, “by all gods and goddesses,” to uphold various ethical standards in their medical practice. Contrary to popular belief, the oath does not actually contain the renowned phrase “First, do no harm,” an expression that has become synonymous with the oath itself. Dated to between the fifth and third centuries B.C., the oath is often attributed to the Greek physician Hippocrates, though scholars have contended that it could instead be a work of the Pythagoreans. While its oldest surviving fragments date to AD 275, the oath has been continually rewritten and adapted over the centuries to better suit the values and beliefs of evolving cultures and ethical standards.

Following the collapse of the Roman Empire and its religious ideals, today’s “multiethnic, multicultural, and pluralistic world” no longer worships ancient divinities such as Apollo or Asclepius (Indla & Radhika, 2019). As history has progressed, the Hippocratic Oath has faced ideological challenges posed by new and emerging technologies that did not exist in the era of Hippocrates. For instance, the oath did not anticipate a patient in a vegetative state, a patient suffering from pain, a patient requesting an abortion, or other questions of patient autonomy. As technology has advanced and continues to advance, the ancient Hippocratic Oath has faced many modern-day dilemmas.

The period following World War II saw one of the Hippocratic Oath’s most significant revisions: the Declaration of Geneva. During this period, the tradition of medical graduates reciting the Hippocratic Oath became more than a mere formality. The World Medical Association (WMA) altered the oath in the 1960s to state that providers would “maintain the utmost respect for human life from its beginning.” The custom also became a more secular obligation: the oath was no longer to be taken before divine figures, but only before other people. It served as a test of a practicing physician’s ethical, moral, and emotional standards, a notion that remained especially important after the atrocities of WWII.

The oath faced further revision in 1964. These alterations are most notably reflected in Dr. Louis Lasagna’s 1964 version, which states that “[doctors] do not treat a fever chart, a cancerous growth, but a sick human being, whose illness may affect the person’s family and economic stability.” Such changes represent the increasing humanization of the relationship between a doctor and a patient. Despite the controversies that have accompanied these changes, they illustrate the influence that cultural identities and contextual values exert on the form of the oath. Indeed, in 1973, the US Supreme Court rejected the Hippocratic Oath as a guide to medical ethics, determining that the oath could not keep pace with changing medical ethics and codes. The most heavily modified descendant of the Hippocratic Oath is known as “Pellegrino’s Precepts,” a set of principles that speak directly to doctors and offer a “universal set of precepts about the nature of medicine” in contrast to the original oath.

In modern times, the Hippocratic Oath has essentially been replaced by more extensive and pragmatic ethical codes issued by national medical associations, such as the AMA Code of Medical Ethics or the British General Medical Council’s Good Medical Practice. These documents offer a more comprehensive overview of the responsibilities and professional behavior expected of a doctor toward patients and society, rather than toward healing gods and other divinities. In the United States, many medical schools likewise use modified oaths in place of the Hippocratic Oath. Schools such as New York Medical College, the University of California, and Tulane have had medical students vow not to discriminate against patients based on “gender, race, religion, or sexual orientation.” As time passes, many of today’s doctors face ethical issues that the Hippocratic Oath never addressed. Therein lies the question: is our society in a post-Hippocratic era? As modern society continues to evolve, physicians have begun to question whether the Hippocratic Oath holds outdated principles and, if so, how medical education can adapt to an evolving society to protect patients. Even so, many providers argue that the Hippocratic Oath epitomizes ideals of gratitude, beneficence, and humility.

While there is no direct punishment for breaking the Hippocratic Oath, a notable, modern equivalent is ‘medical malpractice’ which carries a wide range of punishments from legal action to civil penalties. Doctors who violate these principles are at risk of being subjected to disciplinary proceedings, including the loss of their license to practice medicine.

Overall, what began as an ethical code in Ancient Greece has been transformed repeatedly through time by contemporary ideals and beliefs. From a prominent ideal to a mere formality, the importance of the Hippocratic Oath has fluctuated almost as much as its content. While it may no longer be the centerpiece of medical ethics, its ideas ultimately pioneered modern practice and form the crux of what we now call medicine. Today, nearly all medical school graduates in the United States swear to some variation of the Hippocratic Oath; the responsibility to continue pursuing beneficence, compassion, and humility within the field of medicine therefore retains its utmost significance.

See the Emory Class of 2020 Hippocratic Oath at Emory School of Medicine!
“The Oath of Hippocrates”
“As the ancient Greeks swore by their pagan gods, so do I solemnly affirm that as a student in medicine at Emory University, according to my ability and judgment, I will keep this oath and stipulation. I will consider dear to me those who have taught me this art and will impart the precepts and instruction of the profession to all those who qualify as students of the art and agree to the standards of the profession. I will follow that system of regimen, which according to my ability and judgment I consider for the benefit of my patients, and abstain from whatever is deleterious and mischievous. Into whatever house I enter I will go into it for the benefit of the sick, and will abstain from every voluntary act of mischief and corruption. Whatever in connection with my professional practice or not in connection with it, I see or hear in the lives of men and women which ought not be spoken of abroad, I will not divulge, as reckoning that all such should be kept secret. While I continue to keep this oath inviolate, may it be granted to me to enjoy life and the practice of the art, respected by all people in all times, but should I trespass and violate this oath may the reverse be my lot.”
– Emory School of Medicine Class of 2020


How Intellectual Property Gave Rise to the Film Industry

Documenting the history of the film industry through patents and Thomas Edison provides an interesting and entertaining perspective. From the invention of the first movie camera to the movie industry that exists today, patents have played a key role in the industry’s change and growth. The story begins in the early 1890s, when Thomas Edison developed a movie camera called the Kinetograph. Although it was not the first camera invented to capture sequential motion, Edison’s camera differed from earlier inventions because the Kinetograph used celluloid film.

This allowed Edison to receive a patent for his unique movie camera. Edison also filed and was granted numerous U.S. patents for other motion picture technologies, giving him ownership of the majority of the existing U.S. patents in the field. The Edison Manufacturing Company used these patents to eliminate its competition on the East Coast by filing patent infringement lawsuits. In 1898, Edison sued a studio called American Mutoscope and Biograph (Biograph), founded by his former assistant W. K. L. Dickson, claiming that the studio infringed his patent for the Kinetograph. In 1902, the U.S. Court of Appeals rejected his case, ruling that Edison’s patent covered only the system that moved perforated film through the camera, not the entire concept of the movie camera.

In response to this decision, as well as the rise of studios and cinemas across the country, Edison and Biograph joined forces with other competitors in 1909 to create a patent licensing company called the Motion Picture Patents Company. The company operated in New York and other East Coast cities with the intention of protecting patents and controlling the film industry. The Motion Picture Patents Company, also known as the Movie Trust, possessed most of the available motion-picture patents for camera and projection equipment from 1909 through 1912, and it dominated the market by refusing equipment to uncooperative filmmakers and theater owners.

The authority of the Movie Trust began to weaken in 1912 due to the success of European and independent producers. The end came for the Trust in 1915, when the District Court ruled in United States v. Motion Picture Patents Co. that the Movie Trust had exceeded its patent rights. The court ordered that the Movie Trust be dissolved, stating that while the patent and antitrust laws must be accommodated to one another, “it cannot be that the grant of a patent right confers a license to do that which the law condemns.” A patentee may enforce his right to exclude infringement, but he must not use his patent “as a weapon to disable a rival contestant, or to drive him from the field,” for “he cannot justify such use.”

Innovation, and patents, continue to fuel the movie industry today, and there are several examples of recent patents in the film industry. For example, the Steadicam® (U.S. Patent No. 4,017,168), patented in 1977 by cinematographer and inventor Garrett Brown, has been used in films such as The Shining and Star Wars: Return of the Jedi. The invention is important for filmmakers who want to capture a smooth action shot uninterrupted by a camera operator’s movement. Steven Spielberg has also patented a method and apparatus for producing a screenplay (U.S. Patent No. 8,091,028). Automated Story Generation (U.S. Patent No. 8,422,852), in which theme scripts are used to produce a finished product with minimal user input or direction, is another recent and interesting patent in the industry.

Just as the Movie Trust possessed most of the movie patents in the industry’s beginning, today Sony and Samsung lead the film industry in the number of film-related patent applications. Sony recently filed for patent protection of the animation process and technologies used in the widely popular Spider-Man: Into the Spider-Verse. The animation style in the film is regarded as original and “envelope-pushing,” which is why Sony wants to protect and patent it. The Walt Disney Company also holds a considerable number of filmmaking patents and has filed more than 2,650 patent applications since the year 2000.

The patents that protect these inventions are important because they encourage inventors to continuously improve, change, and bring creativity to the industry. This is just one entertaining example of how intellectual property protection can build and support an industry.



The Four Parts of Blood

Blood accounts for about seven percent of your body weight. This important substance has many different elements that make it the main carrier of oxygen, carbon dioxide, and essential nutrients throughout the body. There are four parts of blood: platelets, plasma, and red and white blood cells.

When an injury to a blood vessel occurs, platelets, which are fragments of cells, rush in to help the blood clotting process. They bind to the site of the damaged blood vessel and create a layer that the blood clot can build on. In their active shape, platelets resemble an octopus, spreading tentacle-like extensions over the injured site of the blood vessel.

[Graphic: the components of blood]

Plasma is the liquid in your blood that carries all the other parts of blood throughout the body. Plasma is a mixture of salts, proteins, sugars, water, and fat, and it makes up more than half of your blood! The role of plasma is to transport necessary nutrients to the body, but it also helps remove waste excreted from cells.

The most abundant type of cells in blood are red blood cells, or erythrocytes. Red blood cells are shaped like donuts, and after maturing in the bone marrow, they are released to the bloodstream. The main function of red blood cells is to carry oxygen to the body and carbon dioxide to the lungs, aided by the protein hemoglobin.

White blood cells account for only 1 percent of your blood, but they are vital to fighting off bacteria and viruses to protect the body against infection. The most common type of white blood cell is the neutrophil, which is deployed first to the site of an infection and releases enzymes that kill harmful microorganisms in the blood.

These four parts of the blood work together to create an extensive system of protection, transportation, and healing that allows your body to perform at the highest level.

Source: American Society of Hematology



15 Good Minutes: Hari Trivedi

After completing an undergraduate degree in engineering at Georgia Tech, Emory Assistant Professor Dr. Hari Trivedi began medical school with an open mind about which field to specialize in. While exploring different fields, Trivedi grew interested in the intersection of medicine and technology. He eventually settled on radiology after seeing how it combined his interests in both medicine and engineering.

“During radiology rotations, I thought radiology was just so cool because radiologists get all the newest toys,” Trivedi said. “I remember seeing my first 3D reconstruction of a CT scan, and that’s when I was like, OK, this is really interesting and powerful stuff.”

Today, Dr. Trivedi is both a practicing radiologist and a researcher in the field. He has worked on innovative improvements to medical diagnostic procedures such as breast cancer screening. Much of Trivedi’s research involves using Artificial Intelligence (AI) algorithms to deliver faster and more reliable diagnoses from medical imaging. His work involves balancing the development time and accuracy of this technology to ensure it can be deployed within a reasonable time frame while providing accurate diagnoses. Getting innovations deployed so they can improve patient outcomes is something Trivedi always tries to stay focused on.

“Deployment is something that’s often overlooked, as 99% of AI machine learning technology gets stuck in the lab,” Trivedi said. “So, while getting it deployed and integrated into a healthcare system is extremely complicated, unless you take that step, you really haven’t necessarily created anything of value.”

Trivedi views being a practicing clinician as an advantage for his research, as it provides him with a firsthand look at clinical issues that could be addressed by new innovations. This dual role can also be a challenge, however, given the added complexity of juggling different responsibilities. For Trivedi, a key component of successful research is keeping in mind the expertise of each individual involved in a given project.

“There are a lot of people that need to come together for a project to succeed, which can sometimes take time, but that persistence is the key,” Trivedi said. “As long as you’re persistent and stay on the radar, I think people are generally very good about making sure things get done.”

Trivedi has also had success commercializing some of his innovations, a process he views as a “natural extension of utility.” Under this principle, Trivedi always tries to provide his innovations for free to other researchers who can find use for them. In some cases, however, he has filed for intellectual property protection and licensed the technology to commercial entities when that is the only way to financially sustain it. One example is his work on algorithms for the anonymization of medical data; the tool requires ongoing maintenance and support, which necessitates charging a fee for use. Any other proceeds from commercialization go toward supporting the needs of the lab and future research.

“That’s the way we look at commercialization: if we build something useful, we do everything we can to give it away for free,” Trivedi said. “But if it’s not going to be sustainable by giving it away for free, then we would try to license it to the appropriate person and use those funds to support the project.”

For those also seeking to become researchers, Trivedi’s key piece of advice is filtering out the noise to focus on individual goals and pursuits. This means worrying less about what others are doing and striving more to maintain focus on one’s own projects. As Trivedi believes, there’s never going to be a lack of discoveries to be made, and every researcher can make their own unique contributions to the scientific community.

“No problem is ever really solved; there’s always room to innovate,” Trivedi said. “Things that we do literally hundreds of thousands of times per year are still not perfect. So, I’d recommend not staying fixated on what others are doing, and instead focusing on actually fixing and solving the problem at hand.”
