“Being the vanguard of ice cream has vanquished its radical sensation.”

I have a favorite local pub that specializes in offering a wide variety of craft beers that rotate weekly – they have dozens of taps, and whenever I go there’s always a couple of beers on special I’ve never heard of.  Let’s just say, it’s not the kind of place you order a Budweiser.  But I was surprised when they stopped serving Guinness to focus their selection on unique flavors from a variety of breweries.  As an American pub, Guinness still seems exotic enough to offer, right?

I didn’t want to order Guinness that night and I have never ordered it there before; in fact, I barely noticed it was absent from the list. Yet I was still surprised and annoyed that they wouldn’t offer the “classic” or “go-to” stout. C’mon, it’s a classic.  So what if Guinness is the “vanilla” of the stout world?  Ben & Jerry’s has dozens of flavors and yet they always have vanilla.

Of course, I never order vanilla at ice cream shops either because who wants vanilla ice cream?

It’s just so common and plain.  Multitudes of things have vanilla varieties – yogurts, cookies, cereals, sodas, even alcohols.  Then there are the candles, lotions, air fresheners, and body washes.  If there are multiple flavors or scents available for a product, it is likely one of them is vanilla.

Vanilla has become synonymous with “plain” and in contexts other than just flavors.  A “vanilla” person has a conventional and unadventurous personality type.  A “vanilla” computer game is the original game with no expansions or extras.  It can be used to describe anything that is boring, modest, basic, or simple.

But vanilla hasn’t always had such a boring reputation.  It was once worshiped by the Aztecs as a sacred plant, and when Cortes brought it back to Europe in the 1520s it quickly became one of Spain’s hottest commodities.

Europe’s demand was met mainly by exports from Mexico, because the plant was difficult to cultivate in non-native regions – it required pollination from a local bee species not found in Europe.  The exotic origin of the plant and the difficulty of its growth and acquisition made it all the more desirable and prestigious.

According to the Totonac, an early civilization that lived in Mexico, the vanilla plant sprang from the blood of two beheaded lovers, an immortal princess and the mortal she was forbidden to marry.

In the mid-19th century, the invention of hand-pollination allowed the plant to be cultivated anywhere, and global production began. But even in non-native regions it remained a labor-intensive process: the flower dies within hours of blooming if not pollinated, the pods take nine or more months to grow, and the curing process takes several more months.

Even after vanilla production increased, its price and demand, and therefore prestige, stayed high.

So what changed?  When did “vanilla” stop being a widely sought exotic spice and become the bland flavor we see it as today?

Vanilla first came to America when Thomas Jefferson brought it back from France as an ice cream flavoring.  It is possible the flavor began to acquire its “plain” reputation here: ice cream was commonly flavored with nuts and berries, so although vanilla was exotic, its white color and smooth texture must have seemed very plain in comparison.

A portion of Thomas Jefferson’s personal vanilla ice cream recipe.

However, it is more likely that vanilla’s great popularity is what caused its slow slide into anonymity and blandness. Vanilla was such a great spice that it was used for everything, but once it was available everywhere it ceased to be special.

Most things now aren’t even made with real vanilla anymore, but rather with vanillin.  Vanillin is the main flavor compound in natural vanilla, but while real vanilla extract also contains hundreds of other compounds, imitation vanilla is essentially pure vanillin.

Production of vanillin, which started as a profitable use for certain by-products of the paper-making industry, further led to the widescale use of “vanilla” as a flavoring, where it eventually faded to a background or complementary flavor.  Now it seems vanilla needs a special flavor partner to really catch one’s eye – Vanilla Java, Vanilla Caramel, Vanilla Toffee, Vanilla Rum, the list goes on.

But it still holds true that vanilla is the one flavor you can get anywhere, so why order it at (insert name of the next bakery or ice cream shop you visit)?  Because really, who is famous for their vanilla ice cream? Or their pastries with vanilla frosting?

But who knows, maybe its absence from the food spotlight for a while will respark our love of vanilla. Maybe when chocolate goes out of style?

Although I doubt that’s ever happening.

“That doesn’t look very scary. More like a six-foot turkey.”

One of the things I enjoy saying the most about my experience with wildlife rehabilitation is that I have worked with raptors – birds of prey like owls, hawks, and falcons. They are amazing, interesting birds but I’ll admit that part of that fun is that when people hear “raptor” they often imagine this:

This association is probably partly due to the blockbuster “Jurassic Park” and other dinosaur movies. A major theme of the film, and one that I love as a student of evolutionary anthropology, is its demonstration of the bird-dinosaur relationship. The species Compsognathus is described as walking like a chicken, and the final scene of Jurassic Park shows a flock of birds flying from Isla Nublar and the dinosaurs that dwell on it. After Dr. Alan Grant is laughed at for stating that dinosaurs obviously became birds, he retorts with this:

“Well, maybe dinosaurs have more in common with present-day birds than they do with reptiles. Look at the pubic bone: turned backward, just like a bird. Look at the vertebrae: full of airsacs and hollows, just like a bird. And even the word ‘raptor’ means ‘bird of prey.’”

However, this quote implies that there was foresight in naming both birds of prey and certain dinosaurs “raptors” because birds are the evolutionary descendants of some dinosaur species. But the way language evolves through translation and interpretation means one cannot assume the naming was intentional. And given the history of our knowledge of evolution, this naming was actually more of a coincidence.

The word “raptor” comes from the Latin verb rapere, “to seize by force”. Birds of prey are called raptors because of the way they hunt, seizing their prey out of the air, and Webster’s dictionary first defined birds of prey as raptors in 1823.

The term “velociraptor”, and other species names containing the suffix -raptor, probably originated in the scientific literature around 1924. They were so named because velox means “swift”, and so velociraptors were thought to be speedy predators who seized their prey.

However, both groups received the designation “raptor” long before paleontologists were able to identify birds as the descendants of dinosaurs. The bird-dinosaur evolutionary lineage was not concretely supported until the 1980s, when dinosaur phylogeny became more fully understood through increasing knowledge of genetics and evolution.

Assuming that the two share a name because they share an ancestor is a false-cause logical fallacy: the mistaken belief that a relationship between two things automatically implies one caused the other.  In fact, birds of prey and certain dinosaur species were named before their evolutionary relationship was understood, so their shared name is more a convenient coincidence of word choice.

If early ornithologists had decided that birds of prey didn’t seize their prey but rather grabbed, snatched, clasped, clutched, or caught them, the Latin translation never would have matched the dinosaur term that came later.

“Scientists, especially those who are Catholics, will by their research establish the truth of the Church’s claim”

Pope Francis recently said, to the outrage of the more traditional followers of the Catholic Church, that Church teachings have focused too heavily on matters of homosexuality, abortion, and birth control. However, his personal stance and the official stance of the Church still oppose the use of contraceptives except in extreme cases where they are used to address another medical issue.

Not surprising, as this has been the view of the Church since Pope Paul VI outlawed the use of oral contraceptives in 1968. What may be surprising to some, however, is that the pill was sanctioned for a brief time by the Church, and that its very development and design were intended to merit the approval of the Catholic Church.

John Rock was a deeply devout Catholic. He was also a pioneer in sperm cell preservation and in-vitro fertilization, and a major developer and proponent of oral contraceptives. He wholeheartedly believed that his work on the birth control pill was “perfectly compatible” with his Catholic faith.

He was by no means a radical proponent of women’s reproductive rights – he was traditionally minded in many ways, a conservative who still opposed the admittance of women to medical schools. But he supported birth control because, as a doctor, he saw the necessity of preventing pregnancy in ill patients and in families who could not afford more mouths to feed.


The complicated chart used to help women determine their periods of infertility according to the Rhythm Method, “Nature’s Method”

However, in the field of contraception, he was radical.  He boldly signed a petition to repeal the Massachusetts ban on the sale of contraceptives (1931) and later became the first medical doctor to open a rhythm method clinic in Boston (1936). At the time, the rhythm method was the only contraceptive method sanctioned by the Church because it was a “natural” way of regulating procreation, unlike other methods which killed sperm (spermicides) or disrupted natural biological processes (vasectomies).

The pill works by providing women with a constant dose of progestin, a synthetic version of progesterone.  Progesterone is a hormone released during pregnancy to prevent the release of more eggs, which could threaten the current pregnancy. The pill was therefore arguably an extension of nature, duplicating what already happens naturally, just more often and more consistently.  This was the logic by which Rock believed the pill would be approved by the Church.  Plus, to its credit, the pill regulated women’s cycles and could be used as an aid to the rhythm method.

However, the design of the pill contains one aspect which is biologically unnecessary – a week of placebo pills which enable menstruation to occur.  Ironically, this aspect is present only to satisfy the whims of Church approval.

Menstruation occurs because ovulation produces an egg and the lining of the uterus becomes flooded with blood and nutrients in expectation of fertilization. If fertilization does not occur, the swollen endometrium is shed. The pill prevents ovulation altogether. No ovulation means no swelling, and no need for the menstrual shedding.

There is no medical reason why women should have to have the week of placebo pills which allow menstruation. Yet it is found in nearly all birth control regimens, for two reasons:

  1. It was Rock’s belief that women who took the pill would feel safer and more natural if they still had their monthly cycles.
  2. By providing women with a reliable cycle of menstruation, it technically aided in the rhythm method (though the method is unnecessary with the pill).  If the pill aided in an already acceptable form of birth control, logically, one could go one step further and say the pill itself was pre-sanctioned by the Church.

Unfortunately, only 8 years after the first birth control pill was released for mass purchase, the Church rejected it and banned its use.  And Catholic women are still stuck fighting for their own reproductive rights today.

“Diets, like clothes, should be tailored to you”

Fad diets, as their name suggests, are easy-to-follow and trendy diets focused more on short-term success than long-term maintenance of health and weight. Focus is usually placed on eliminating or emphasizing a particular food group for health benefits.

However, these diets tend to be difficult, and sometimes even dangerous, to follow for long periods of time. The Paleo Diet is considered by many to be a fad diet, which is why I was surprised that it is still somewhat prevalent.

The Paleo Diet, short for Paleolithic Diet, is based on the diet of the early humans (hominids) who lived during the Paleolithic era, from 2.5 million to 10 thousand years ago. It is believed to follow the general “ancestral human diet”.

It claims that humans are best adapted to the diet of their Paleolithic ancestors – fish, eggs, vegetables, fruits, roots, nuts, and wild game. Therefore sugars, salts, oils, and the grains and vegetables which came about during the Agricultural Revolution are discouraged.

The claim of the Paleo Diet is that natural selection has not strongly acted on humans since the rise of agriculture, thus humans are maladjusted to the modern diets which they now consume. Hunter-gatherer societies, which still follow the basic ancestral human diets, have overall lower prevalence of disease.

By this reasoning, foods which humans are considered ill-adapted for are partly responsible for the increasing rates of obesity, cardiovascular disease, high blood pressure, Type 2 diabetes, and even cancer.  To help avoid the diseases caused by the “modern affluent diet”, humans should follow the Paleo Diet and avoid modern foods.

But, while abstaining from excesses of sugar and oil is beneficial to human health, the actual scientific reasoning behind the Paleo Diet is shaky at best.

1. Lifestyle is a major factor in any diet, not just the Paleo Diet

The lifestyle that the Paleo Diet promotes is itself healthy – growing or hunting one’s own food and not eating in excess are healthy additions to any diet protocol.  But the bulk of the health advantages stems not from what hunter-gatherer peoples actually eat, but from how they live; their diets are a product of their culture.

Basically, hunter-gatherers are more active and tend to consume fresher foods because they must acquire them for themselves.  Any modern American who hunts and gathers their own food will surely already lead a more active lifestyle and is also less likely to consume processed foods and junk food, whether or not they are following the textbook Paleo Diet.

However, the changes that have occurred among urban societies, even in the past century or so, have made following a hunter-gatherer lifestyle next-to-impossible.

2. The Paleo Diet assumes human evolution abruptly stopped at the rise of agriculture

The assumption that humans have not changed or adapted to their environment in the past 10,000 years ignores everything we know about evolutionary change. If adaptation comes naturally over time, why should change suddenly stop? It is true that natural selection may act less strongly on populations now, with modern medicine and more consistent food sources, but fluctuations in population genetics are always present.

Furthermore, there is evidence that rapid changes have occurred in human populations since the rise of agriculture, and the evolution of lactose tolerance is a good example. A few thousand years ago there was an increased interest in animal husbandry among human populations, which led to more access to dairy products like cheese and milk. Early humans naturally lost lactose tolerance after weaning, but populations with access to dairy have since evolved a genetic ability to continue digesting lactose into adulthood and therefore retain the ability to acquire nourishment from dairy products.

3. Humans are opportunistic omnivores and are designed to eat a variety of diets

The premise of the Paleo Diet is that humans are specifically adapted to a certain diet and maladjusted to any other, which the evidence of change in lactose tolerance has already proven false. Furthermore, this assumes that all early human groups ate the same diet, ignoring the fact that human populations arose all over the world and couldn’t possibly have eaten the same foods. And anthropologists can never be sure of the exact diet of any ancient group because it would have varied widely with location, season, and food availability.

Furthermore, the fact that humans only had access to certain foods during their evolutionary history doesn’t automatically mean that they aren’t capable of eating other things.

4. Diet alone cannot guarantee health

There is no one-to-one relationship between genotype (one’s genes) and phenotype (one’s physical traits), so even if humans were specifically adapted for a certain diet, genes are not the only factors which influence health; environment and early development can have equally strong impacts later in life.  Hormones, musculature, and metabolism, among other factors, mean that some people will naturally weigh more or less no matter the diet they follow.

So, while the reasoning and science behind the Paleo Diet seem shaky at best, this isn’t to say that following some of its suggestions can’t be beneficial to one’s health. However, strictly following the Paleo Diet in the belief that you are eating as ancestral humans did won’t be any more beneficial than any other fad diet out there.

“Who was the fool, who the wise, who the beggar, or the emperor?”

Living during the Dark Ages, as the name suggests, was quite a struggle – plague and famine were rampant, wars and persecutions were common, and the medicine of the time could be worse than the condition it sought to cure. Science and technology could do nothing to prevent this, and people simply had to accept the dangers of day-to-day Medieval life.


“The Triumph of Death”
Pieter Brueghel the Elder, 1562

This created an understandable gloom, and the uncertainty of daily survival led to constant forced confrontations with morbidity and death. But out of this fear, people were inspired to live in the moment, and accepting death as natural and inevitable allowed it to become an artistic subject.

Themes such as the universality of death and the uselessness of vanity were common, and were demonstrated most strongly in la danse macabre, an artistic genre of Medieval Europe. It focused on the understanding that treasures and worldly possessions were useless after death, and that life was a fragile gift. Furthermore, stripped of vanity and wealth, everyone is equal in death.

One of my favorite products of this genre is the graveyard soliloquy in Hamlet, where Hamlet ponders the possibility that the remains of great men such as Alexander and Caesar may now be present in the mud used to seal buildings and barrels.  When Hamlet finds the skull of Yorick, he then comments:

“Now get you to my lady’s chamber and tell her, let her paint her face an inch thick, yet to this favor she must come”
Hamlet, Act V, Scene I

The most common expression of this genre is a dancing skeleton leading victims to their graves – the dancers come from all walks of life, but having been stripped of worldly goods, they enter the next world together as equals.

But these images should not be taken as particularly dark, or as implying that Medieval culture was death-obsessed.  Rather, they come from an understanding and acceptance of the nature of life and death.

Beyond reminding people to cherish life, constant reminders of the inevitability of death can turn one’s thoughts to the afterlife. With the feeling of death all around, both literally and in the art and literature of the time, there was an increased desire for religious absolution and preparation of one’s soul.

The emptiness of earthly treasures combined with the frailty of life might work to turn one’s thoughts to the future and the afterlife.  This in turn may encourage people to focus more on faith and piety for a permanent, and more important, existence after death.

La danse macabre belongs to a larger genre known as memento mori, literally meaning “remember that you will die”.

Supposedly this phrase comes from a tradition of the Roman triumphal parade: a conquering general would march his legion, along with captured treasures and slaves, through Rome in a glorious parade to demonstrate his greatness.  But all the while a servant would murmur something along the lines of “memento mori” as a way of keeping him humble even during one of his greatest moments.

In modern times, one may find a parallel in the Catholic practice of Ash Wednesday, which is celebrated as a day of mourning, repentance, and a reminder of mortality. An observer receives a mark of ash, sometimes a cross, upon their forehead while a priest repeats a famous line from scripture noting the inevitability of death.

“Remember that thou art dust, and to dust shalt thou return”
Genesis 3:19

But beyond practices and scripture, art too could turn man’s thoughts toward death and therefore piety. Michelangelo’s fresco “The Last Judgment” did just this when it was originally revealed in 1541. Centuries before television and movies could provide a rich visual source for emotions to feed off and fears to take form, paintings and sculptures served this purpose.

The image, located directly behind the altar, focuses on Christ’s Second Coming and the Judgment of all mankind. Saints and the Virgin Mary hover with Christ along with the pure souls that rise to Heaven, while demons and monsters drag the guilty to hell. People are skinned, burned, beaten, and consumed by serpents in a general atmosphere of chaos and fear. Even the Saints and Mary, who sit safely among Christ, seem fearful of the display below them.

The painting was so terrifying and so real that it inspired fear, and faith, in all who saw it. Legend says that Pope Paul III, who commissioned it, was so filled with fear upon seeing it that he fell to his knees and exclaimed, “Don’t charge me with my sins when you come on Judgment Day!”

One can only imagine the fear that this painting might instill, if during a sermon the only place to gaze is upon it, while the world outside is filled with death, famine, war, and disease. I probably would have paid a little closer attention to the sermon too.


“What does my praying avail me now? I must step into the dead’s dance”


“Now I have – against knight’s order – become coerced to this dance”

“Now I must dance and can’t yet walk”


“Emperor, your sword won’t help you out
Scepter and crown are worthless here
I’ve taken you by the hand
For you must come to my dance”

Unknown, ~1460

“And so many died that all believed it was the end of the world”

The infamous outbreak of the Bubonic Plague in the 14th century was one of the worst epidemics in human history, killing 30–60% of the population of Europe.  Its impact was such that if the world’s population over time is charted, the decline caused by the pestilence is clearly visible, and it took decades for the population to recover from this devastating blow.

After the wave of Black Death slowed, there continued to be major outbreaks for centuries and even now plague is present in most of the world, causing hundreds of cases a year.

Before the Black Death there was the Plague of Justinian in the mid-500s, and after it the London Plague of the mid-1600s.  But the medieval outbreak of the mid-1300s was by far the worst.  It seemed no one was safe, and people died so quickly it was said there wasn’t time to bury the dead.  This unsanitary situation only propagated the disease further.

“How many valiant men, how many fair ladies breakfasted with their kinsfolk and the same night supped with their ancestors in the next world?!”
-Giovanni Boccaccio

It strongly impacted human population growth for years, but beyond this it also led to major political, cultural, and religious upheavals in Europe as society tried to cope with the devastation which surrounded it.

The Bubonic plague, also called the “Black Death” because it caused lymph nodes and extremities to necrotize and turn black, is caused by the bacterium Yersinia pestis. It arrived in Europe on Asian trade ships, carried by rats and their fleas.


Yersinia pestis

However, at the time of the Medieval outbreak, little was known about bacteria and disease transmission. Instead, it was believed that diseases were caused by spirits or demons. An early belief about the Black Death was that it was a plague sent by God to punish Europe for descending into sinful ways.

A related, but more secular, theory was miasma – the idea that diseases were spread by bad air and bad smells. Breathing the same air as an infected individual could therefore spread the disease, as could foul-smelling things.  While there is a bit of truth in this, as airborne plague infections are possible, the theory suggested that anything which smelled bad could cause disease.

Miasma theory inspired the traditional image of the Medieval plague doctor with a “beaked mask”: the mask was meant to hold strong-smelling herbs and oils to block the smell of decay, and therefore block transmission of the disease.

Well, obviously this didn’t work.

Doctors got infected just as often as everyone else. Monks and priests were also at especially high risk because they took care of the sick, buried the dead, and were constantly exposed. The commoners started to notice that even the servants of God were getting sick in the so-called time of judgment. This led to two things:

  1. people began to believe they were not merely being punished, but that the Apocalypse was coming
  2. people began to question the Church’s authority and power because it failed to protect them

The 4 Horsemen of the Apocalypse, the harbingers of the world’s end, are named Pestilence, War, Famine, and Death. During medieval times, a common portrayal of the Horseman of Pestilence was a horse with black spots, perhaps as a reference to the Black Death and a prophecy of future events.

When people began to believe that the Church could not protect itself, they feared the Church could not protect them or prepare them for the rapture. Faith in traditional churches started to crumble and radical Catholic groups which promised salvation in a new way began to rise.

One of these groups was the Flagellants, a militant and radical sect of the Catholic Church that believed self-mutilation was a form of penance, usually performed publicly along with chanting and singing. The group was outlawed by the Church because it contradicted the basic Church teaching that Jesus’ death removed all need for sacrifice by mankind.

Nevertheless, these alternate forms of worship were used as a form of protest by people dissatisfied with the Church, and as a last resort when the Church failed them. The popularity of the Flagellants and similar groups wavered over time, but the era of the Black Death marked the peak of their membership.

Xenophobia also rose with suspicion and fear, and in their terror the afflicted masses sought a scapegoat.  With faith in the Church wavering, religious leaders began to panic and tried to keep order by shifting the blame to anyone they could: Jews, lepers, witches, pagans, minorities, beggars, foreigners, and even widows.

Due to their isolation within cities, both geographic (Jewish ghettos were typically far from docks and city centers) and cultural (Jews followed much stricter sanitary laws than most of the population), the prevalence of disease was much lower among Jewish populations.  Those suffering noticed this and became suspicious.  Jews were often accused of poisoning wells, and Jewish communities faced mass persecution for years to come.


Jews being burned alive as part of Medieval persecutions in response to plague

In Northern Europe (where the Jewish population was lower) widows and old women suspected of witchcraft were seen as a strong threat.  A common metaphor for a plague infection striking a village was the arrival of an old hag in black robes. If she brought a broom with her, the whole village was “swept away”. But if she only brought a rake then some of the village would survive being hit by the pestilence.

The Church, and society as a whole, also suffered from the sheer loss of manpower – laborers were especially susceptible to disease, causing a labor shortage which led to a food shortage.  Within the Church, the loss of faith, together with the loss of monks and priests who contracted the disease at higher rates than most of the population, caused a decline in the number of men entering its service.

Although the major outbreak of plague during the Middle Ages lasted only about five years, it took human populations decades to recover. Its effects on culture echoed for centuries after the initial outbreak as a constant reminder of the fear and panic that had once swept through all of Europe – which did not emerge from the Dark Ages until the birth of the Renaissance, over a hundred years later.

“Food is not rational. Food is culture, habit, craving and identity”

“And of their flesh shall ye not eat, and their carcass shall ye not touch; for they are unclean to you.”
-Leviticus 11:8

“Having well considered the origin of flesh-foods, and the cruelty of fettering and slaying corporeal beings, let man entirely abstain from eating flesh.”
-Manusmrti 5.49

“O ye who believe! Fasting is prescribed to you as it was prescribed to those before you, that ye may learn self-restraint. But fear Allah, and know that Allah is with those who restrain themselves.”
-Quran 2:183, 194

Religion can hold strong sway over culture; it influences how people behave in social situations by placing a strong emphasis on moral behavior and promoting group cooperation. However, the benefits of religion may come at a price, and sometimes an individual must forego his or her own personal interests for the good of the religious group – perhaps by becoming a celibate religious leader, paying tithes, or following dietary restrictions and fasting laws.

Religious taboos that prohibit the consumption of certain foods or food during certain times are particularly interesting because they seem to go against all basic survival instincts that humans have.

  • Judaism mandates that its followers keep kosher – eating only foods acceptable under Jewish Law – and there are many restrictions and taboos during the time of Passover, including a prohibition of leavened bread.

  • Followers of Islam should only eat foods that are Halal or “permissible in Islamic Law” and must abstain from consuming any food during the fasting periods in the month of Ramadan.
  • Catholics may not eat red meat on Friday or during the time of Lent, when it is typical for other forms of luxuries to be given up as well.

Many different explanations for the historical origins of dietary restrictions have been proposed in the past, but research in the fields of anthropology and psychology suggests that the most plausible explanation for these seemingly detrimental rituals is that they signal devotion to a group.

An individual associates with a specific group of like-minded individuals and this membership grants them the benefits of others’ altruistic acts – aid that is given simply because someone is in the “in-group”. Being part of the group therefore provides safety, relationship opportunities, and the possibility of help from a group member. Individuals form a group by entering into a social compact where they all agree to work together and adhere to rules of the group for the greater collective good of all members.

This is known as reciprocal altruism: you help a member of the group because you expect that at some point, they would do the same for you, and everyone wins.

However, this system can only work if everyone follows the rules and if their promises of aid are honest. Otherwise the group breaks down when people invest and are not rewarded. And it is difficult to organize voluntary group cooperation without the risk of some people taking advantage of the system – so-called free riders, who reap the benefits of being in a group without returning the favor.

If a group relies on cooperation and altruism to function, there must be a way to determine who is part of the group, usually through shared behaviors, customs, dress, etc. Common forms of signaling group membership can include clothing style, such as identifying oneself as a Michigan student by donning a blue and maize sweatshirt or identifying oneself as Christian by wearing a rosary or crucifix.

Similar behavior and dress clearly identifies all members of this group as loyal fans of the University of Michigan.

The flaw with these signs of group membership is that anyone who wants to take advantage of the benefits of group camaraderie can do so: by simply wearing these articles of clothing, they can appear as though they too are part of the group.

A so-called “wolf in sheep’s clothing” can integrate themselves into a group to benefit from it without any intention of returning the favor.

A group’s capacity to find and then punish or oust cheaters increases the overall success of the group, so a more effective and selective form of group identification is often required. 

Therefore, a more complex way of signaling group membership may arise in the form of a costly signal.  This is a behavior that does not directly benefit the member of the group or the group as a whole, but demonstrates a commitment to the group. If an individual is willing to go out of their way to demonstrate that they want to be part of the group, it is more likely that they have a true vested interest in the group’s outcome.

It can be argued that a dietary restriction or food taboo is an example of this type of costly group signal – health and happiness are not gained by following any such rule (except the happiness one finds in being devout in their religion). Yet, nearly every religion in the history of mankind has requested that its followers obey some sort of dietary law.

An early Judeo-Christian belief held that pork was prohibited because pigs were used by pagans such as the Romans to worship false idols, and therefore the animals were tainted in the eyes of God with a connection to idolatry and were unclean for believers to consume. However, if this were the case, then most domesticated animals should have been considered unclean to eat, because many other animals associated with pagan practices, such as the bull, ox, or sheep were not considered unclean.

This relief from the Ara Pacis in Rome indicates that sheep and cattle were equally important animals in Roman ritual, yet there are few Western taboos regarding the consumption of their meat.

Many different theories and explanations have been proposed for why most major religions demand that their followers obey a variety of dietary restrictions and taboos, and they cite reasons that range from historical symbolism to biological issues.  Clearly, traditions in a religious practice have important symbolic meaning for its followers.  The practices need not require sacrifice in order to maintain this symbolism, but typically, they do.

But it turns out that where history cannot, evolutionary theory can provide an explanation for the persistence of dietary laws: following dietary restrictions is a way to show one’s commitment to a group and indicate a genuine interest in cooperation and altruism.

Any rule that elicits a food restriction immediately divides people into groups of those who follow it and those who do not. Every culture has special protocols or traditions associated with acquiring or eating certain foods, and food taboos figure prominently into many societies around the world:

  • A traditional American Thanksgiving would not be complete without the male head of household sitting at the head of the table, ready to carve the family’s turkey.
  • A successful Netsilik Eskimo seal hunt ends when the meat has been meticulously divided among a hunter’s lifelong “seal partners” during a village-wide celebration.
  • A Catholic communion involves the drinking of wine and eating of bread in a highly symbolic and meaningful way, and only members of the Catholic church may participate in this special event.

These rituals are performed in such a way that anyone who is not a member of that group would not fully understand and would thus be disconnected from the others during celebrations. Consequently, it is easy for others to determine which group an individual associates with through their knowledge of food customs, taboos, and restrictions. Furthermore, anyone willing to follow complicated rules that require a sacrifice of luxury demonstrates they are not simply fair-weather followers but devoted members of the group.  

By this obvious outward sign of who is part of the culture, the religion, the “in-group”, dietary laws can function as a way of keeping groups more united because members can be more assured that their fellow group members are equally committed to the group.

 

Anyone interested in reading more on these ideas should definitely check out my inspiration:

Irons, W. 2001. Religion as a Hard-to-Fake Sign of Commitment. Evolution and the Capacity for Commitment. R.M. Nesse (ed), pp 292-309. New York, NY: Russell Sage Foundation.

Also, researchers conducted a case study of group membership signaling among religious communes.  Their findings indicated that groups which require more commitment, more “inside” knowledge, and more adherence to ritual, were more likely to be successful.

Sosis R. and E. R. Bressler. 2003. Cooperation and Commune Longevity: A Test of the Costly Signaling Theory of Religion. Cross-Cultural Research. 37, 211-239.

“If we can teach people about wildlife, they will be touched”

Out of genuine interest rather than any particular career planning, I spent this past summer working an internship at the Avian Wildlife Center, which rehabilitates and releases injured wild birds – anything from hummingbirds to herons. Most of the birds we dealt with were brought to us after unfortunate interactions with humans – nests disrupted, hit by cars, poisoned by pollution, etc. At the center, birds receive care until they can be released back into the wild.

Three little victims of an illegal nest removal, these fledgling American Robins are a few weeks away from release back into the wild.

Before release they are also tested for parasites, ability to self-feed, and feather condition.  During their time at the center people interact with them as little as possible so they don’t learn to associate humans with food and approach them after being released.

It’s a great and rewarding job, if you aren’t expecting high pay, flexible hours, or a stress-free work environment. It’s also pretty interesting, and I could (and did) leave work every day with multiple bird stories to share.

This baby Lesser Sandhill Crane was everyone’s favorite, and an opportunity to take charge of his bi-hourly hand-feeding routine was a contested role during his visit.

One particularly interesting case we had was a lady who brought in a fallen sparrow nest with three baby birds. She commented that she was surprised one of the babies was twice the size of the other two. That’s because one wasn’t a sparrow at all, but a cowbird. Cowbirds are incredibly interesting, particularly in how they raise their young – they don’t. Instead, they are nest parasites: the mother cowbird flies around laying eggs in other birds’ nests to be raised by an unsuspecting parent bird, in this case a sparrow.

A juvenile Brown-headed Cowbird being fed by its foster parent, a Chipping Sparrow, in Baltimore Co., Maryland (6/5/2011). Photo by Jon Corcoran (http://www.flickr.com/photos/thrasher72/).

The lady, who before this information had been impressed by his advanced growth, was suddenly appalled at the poor little cowbird in her sparrow nest. She then asked if we would euthanize the “parasite” since it disrupted the life cycles of the other birds. Of course we would not, and we explained that we would take care of it just the same – the center takes any injured wild bird, regardless of how many individuals of that species it might already have, because it makes no attempt to influence natural population ratios.

She wasn’t convinced it should be saved, a common sentiment among the cowbird rescuers we spoke to over the summer.

Perhaps the term “parasite” gives them a bad reputation, but cowbirds are truly fascinating. Most other species imprint on whatever they first see – imagine the classic example of a baby duckling that imprints on a human when it hatches and spends its days following people instead of fellow ducks. Cowbirds, however, are smart enough to know what they are without having to see another cowbird during their entire infancy. This is because they recognize their own coloration and use that information to find mates in the future.

Though barely related, I had to include this image of 2 ducklings imprinted on a Corgi

Generally, we simply try to explain to the public that this is the bird’s natural behavior and should not be tampered with. Cowbirds are not an invasive species; they evolved to coexist with other birds in their natural habitat, which ranges across North America from southern Canada to southern Mexico.

They can’t thrive without this method of reproduction, which arose naturally through co-evolution with competing bird species.  It is simply how they live and reproduce, and the individual should not be blamed for its innate biological behavior, any more than a hawk should be blamed when it kills a dove for its dinner.

This isn’t to say that cowbirds don’t harm other birds – I am sure that unknowingly raising a baby cowbird takes its toll on a sparrow mother, who will be half the size of her baby before it leaves the nest. But they don’t outright kill their hosts (a good parasite doesn’t kill its host, or it loses its livelihood), and the parents with whom the cowbird tries to leave her eggs are not completely defenseless in the matter, as they sometimes detect and eject foreign eggs.

Cowbirds are known to parasitize over 100 different species, so their eggs seldom match those they are laid with.  Here, a large speckled cowbird egg sits alongside three smaller blue Chipping Sparrow eggs.

Still, there is so much love for (and funding to care for) birds of prey, which must kill and consume at least 20% of their body weight each day to sustain themselves. People marvel over the beauty of an eagle soaring in the sky, while nest parasites such as cowbirds, cuckoos, and several other species are met with animosity – even though they are usually not responsible for the deaths of any other birds and are equally fascinating creatures.

  • (An exception is when a cowbird egg or chick is discovered and tossed from the nest by the duped parent. In a response nicknamed “mafia behavior”, the mother cowbird will return to the nest and destroy the other eggs, in hopes of forcing the victim to build a new nest and lay a new brood – giving her another chance to lay new eggs of her own.)

Cowbirds are somewhat infamous for contributing to the near extinction of the Kirtland’s Warbler, and there were even several mass attempts to remove cowbird eggs from warbler nests – although it was later found that several other factors contributed to the warblers’ decline, mostly human damage to the ecosystem. Studies have even shown that when humans try to remove cowbirds, we end up helping them: removing birds from an area signals less competition, so the remaining cowbirds reproduce more and end up parasitizing even more nests than they normally would have.

As with any animal that makes its way through life by competing with others, there are winners and losers. For a rehabilitator, helping one animal means eventually harming another, as the circle of life continues in the wild and something must be preyed upon or parasitized. That doesn’t mean efforts to protect the environment are any less meaningful, and perhaps the best thing we can do is try to fix the damage done by humans and restore the balance that existed before human activity began to cause serious disruptions.

After all, these species got along just fine before humans showed up to observe, monitor, and “fix” nature.

“Why’d it have to be snakes?”

A common motif in Western literature and art is the representation of snakes as the embodiment of evil and deceit. We could ask ourselves, as Indiana Jones does on his adventures, “why’d it have to be snakes?” You would be hard-pressed to find a positive portrayal of a serpent in a film or book. In Harry Potter, Voldemort has a pet snake in which he places part of his soul, and a dark wizard is identified by his ability to communicate with snakes. The long-running television show Doctor Who depicts the Mara, a being of pure hatred that survives on fear, as a giant snake. In classic stories like The Jungle Book, Kaa the python is always up to something treacherous through hypnosis and deceit. And older still is the famous example in the Book of Genesis, when a serpent seals Adam and Eve’s expulsion from the Garden of Eden through cunning trickery.

Countless more examples abound, but why are serpents always linked to evil?  Why not sharks, spiders, lions, or bats, all of which tend to instill equal amounts of fear among people and yet don’t have the long-lasting associations with evil like that of snakes?  Most species of snake are not particularly vicious or dangerous, at least compared to any other animal that might be associated with evil, and yet it is always snakes.

Historically, snakes have always been a common symbolic motif, and in many early human cultures they did indeed represent evil.  Nearly all cosmogonies of early civilizations – origin myths that explained the creation of the world – depicted snakes as evil beings set on world destruction.

  • The story of Gilgamesh from early Mesopotamia told how a serpent stole the plant that provides eternal youth, causing Gilgamesh to lose his immortality – a bit like the story of the Garden of Eden, where immortality in the garden was lost through the trickery of a serpent.
  • In Ancient Egypt, Apophis was the serpent that tried to stop the sun god Re from bringing forth morning and thus he had to be battled and conquered every night before the sun could rise again.

  • The Vikings believed that Jörmungandr was a serpent so large it could encircle the earth and bite its own tail. It was the serpentine arch-nemesis of Thor, fated to one day kill him and, by releasing its tail, initiate Ragnarok and the destruction of the world.

And even when they aren’t screwing over all of mankind with plans to destroy the world, snakes are still up to lesser bouts of evil – in Greek mythology the half-human monster Medusa, who could turn men to stone with a single glance, had snakes for hair. This probably fueled later medieval folklore that warned of a giant serpent called a basilisk, whose gaze rendered its victims dead.

The Basilisk has remained a popular mythical monster, starting in Ancient Greece and continuing on through the Dark Ages, and famously reappearing in Harry Potter and the Chamber of Secrets

Although the stealthy behavior and sometimes venomous bite of a snake are part of the natural, biological world, they still give people reason to dislike snakes, and the way snakes move and slither through grass unseen makes them a useful metaphor for a deceitful or sly person. They flick their tongues in and out in a slightly sinister way, so having a “serpent’s tongue” makes one untrustworthy. If there is a legitimate reason to fear or dislike something in the physical world, it is easier to transfer that fear onto a symbol of pure or supernatural evil, because people already have a negative association with it.

But in non-Western cultures snakes aren’t always evil, and they were sometimes represented in a duality of good and evil – although Apophis opposed the sun god Re, the uraeus, a cobra image atop the crown of an Egyptian king, was meant to protect him. Serpents meant different things to different members of society and to different societies as a whole.

The Caduceus of Greek mythology was the “messenger” staff carried by Hermes and Iris, wrapped with two winged snakes. In modern times it is sometimes confused with the Rod of Aesculapius, which has only one snake and no wings. Aesculapius was the god of medicine and healing, and his followers worshiped snakes; products derived from them, especially venom, were thought to have medicinal properties in ancient times.

On the left is Hermes carrying the Caduceus and on the right is Aesculapius carrying the Rod of Aesculapius.  Although the Rod of Aesculapius was carried by the god of medicine and healing, the Caduceus is sometimes portrayed in modern times as a symbol of emergency and medical services.

The imagery of a snake shedding its skin and emerging anew has also led to representations of rebirth, especially in Hindu cultures. During the festival days of Shravana, Nag Panchami involves snake worship in a quest for fame and knowledge. But a snake can also represent sexual desires and passions, both positive and negative, and can therefore contribute to an individual’s downfall.

And remember Apophis, whom the Egyptians had to defeat every night? Though a figure of evil, he might not have been truly hated but instead seen as a power to be reckoned with and a necessary part of life. Good cannot triumph over evil if there is none to defeat and the ancient Egyptians valued a balance of good and evil, order and chaos, ma’at and isfet – the world is not balanced if there is no evil and an unbalanced world was seen as the true danger.

Likewise, in Norse belief there was no avoiding Ragnarok – it was fated to happen, and Thor knew long beforehand that he would die fighting Jörmungandr. From this cultural perspective, the serpent may be seen less as an agent of evil and more as an agent of fate, a messenger that acts to ensure the world carries on as it is meant to, whether this be good or bad for everyone involved.

The definitive imagery of serpents as fully evil didn’t really exist until Christianity came along. It is possible that the early ideas of the duality of snakes – both good and evil – are what sealed the future perception of snakes in a negative light. The early Church did not like duality in ideas; early Christianity usually saw things as all good or all bad, with little middle ground. Dualities left too much open to interpretation by commoners, which was a disadvantage at a time when the Church was trying to spread quickly across cultures and into new territories.

Sadly, it was not uncommon for Christians to vilify the pagan beliefs they did not adopt, recasting them as wholly evil and therefore more straightforward to the common people. (Even the words “vilify” and “villain” trace back to an early Christian effort to turn the French “vilain”, simply meaning a peasant farmhand, into something evil – because peasants were, after all, pagan.) Snakes, which represented many ideas of both good and evil, came to be associated with devil worship, sorcery, and deceit, because an individual could be deceived by an idea with more than one meaning. Placing a serpent in the Garden of Eden as the ultimate downfall of mankind – the form Satan himself chooses when he tempts Eve – created a permanent connection between serpents and “evil” that has lasted for hundreds of years and still persists in Western culture today.

“Trust not too much in appearances”

I’ve been re-watching some of my favorite childhood movies – I think it’s a quarter-life crisis thing as graduation draws near – and one that I especially made a point to watch was “The Prince of Egypt”. It was one of DreamWorks’ first animated films and is based on the Biblical story of Moses and the Exodus out of Egypt.

Val Kilmer voices Moses and Ralph Fiennes (Voldemort) voices Ramses II. Patrick Stewart, Jeff Goldblum, Michelle Pfeiffer, Sandra Bullock, Danny Glover, Helen Mirren, Martin Short, and Steve Martin also lend their voices. And you get to hear Ralph Fiennes, Martin Short, and Steve Martin sing. It’s pretty epic.

I always loved it as a child, but I recently realized I loved it for different reasons than were intended – being the story of the Exodus, a viewer is perhaps supposed to learn about the Bible, Moses, and the power of God in freeing his people from oppression.  And I will admit, the burning bush as a sign of God’s power is a well-done scene and very cool… but I always loved the portrayal of the Egyptian gods more.  The Egyptian priests were far more suave than Moses or Aaron, especially with their jackal- and hawk-headed gods and their love of all things “cat”.  Not to mention, young Ramses II had a way cooler haircut.

As a kid I thought the Egyptian gods might have really existed alongside the Christian God (though I surely wasn’t supposed to take that from the Bible), and I always figured they actually had animal heads. As a student who enjoys archaeology and mythology, I now question what the purpose of the animal heads was and how literally they were meant to be taken.  Simply:

“Did the Egyptians actually believe their gods had animal heads?”

And the answer is not a simple one. Our understanding of Egyptian religion is lacking because we still can’t read all of the hieroglyphic texts we have found. Furthermore, Egyptians considered it bad luck to write about the afterlife, religious practices, and ideas about evil, so for some topics we have no information at all.

Early Egyptologists may have believed the Egyptians had somewhat primitive ideas about religion and genuinely accepted the improbability of an animal-headed god. We in modern society tend to be fascinated by exotic cultures and mysterious traditions, and we sometimes forget that our ancient ancestors were intelligent and capable of the same critical thinking we are: their religion was highly organized and complex, and their religious leaders were well-educated scholars. (The Dalai Lama is an incredibly well-educated man, though admittedly his lifestyle, religion, and appearance can seem a bit “exotic” in Western culture.)

The Egyptian gods are described as being “therianthropomorphic”, meaning partly human and partly animal. But representations varied widely – Anubis always has his jackal head, but Osiris is usually represented in human form. Osiris can also be represented by the “crook and the flail”, an “atef” crown, sometimes a bull (these animals were sacred to him), and even the color green (representing rebirth and fertility – understandable considering he is the king of the afterlife and fathered a son after his death). Since Osiris was a deity connected to several aspects of life and death, a mere single representation of him is neither an accurate nor fair way to demonstrate his power.

An ancient image of Osiris – note the green skin on the human form, and that he is wearing an Atef crown and holding a crossed Crook and Flail across his chest.

The famous Classicist and author Edith Hamilton wrote that the Egyptians deliberately made their gods unhuman to distance them from mortals, to make them more awe-inspiring and something to be feared. Indeed the Greeks saw the Egyptian gods as uncompanionable, mysterious, aloof, and beast-like, unlike their own gods who were human-formed with idealized beauty and very human personality traits.

And this may be slightly true, as the afterlife was seen by ancient Egyptians as being very hierarchical – gods were better than kings, who were better than elites, and at some points the possibility of a mere peasant going to the afterlife wasn’t even considered. Therefore, if the gods were meant to demonstrate their status above kings through imposing appearances, perhaps they were meant to appear aloof because they were in fact inhuman.

However, the Egyptians were sophisticated and methodical, and from their art we know they valued symbolism. It is therefore possible that the images of the gods were meant to be entirely symbolic, not literal. If you have ever examined an Egyptian drawing, you will immediately notice that detail, size, perspective, and realism are absent – the information the image projects is much more important. If pictures weren’t meant to be taken literally, we shouldn’t assume that because Horus is drawn hawk-headed, the Egyptians believed that if they physically saw Horus in real life, he would have a beak.

The animal associations of power and magic were much more important than realism. Anubis was likely shown with a jackal head not because he was believed to be half dog, but because jackals were scavengers that haunted cemeteries – it was feared they might unearth buried corpses – and so the jackal became linked with the god of burial, mummification, and protection in the afterlife. Anubis’s jackal head is also an excellent example of the symbolic nature of the animal-headed deity: during mummification rites, priests would often don a jackal mask to emulate Anubis, but in no way was it believed the priest actually became Anubis.

Image taken from the “Book of the Dead”, showing an Egyptian priest wrapping a mummy, meant to invoke protection from the god Anubis.

And the Egyptians weren’t the only ones to do this – in classical Greece, it was believed Zeus often came to earth as a bull or a swan, or disguised as a mortal (usually to seduce a maiden), but this wasn’t the actual likeness of Zeus himself. And how often is Jesus portrayed as a lamb, or the Holy Spirit as a dove, even in modern society? Jesus is described as the “lamb of God”, but not because it is believed he is or ever was an actual lamb. Animal representations simply give us a more basic understanding of the nature of a deity.

This was especially important in Ancient Egypt, where the majority of people could not read. Hieroglyphs are complex and there are thousands of them to learn. It is much easier to represent the violent nature of the female demon Ammut (who eats the hearts of evil men) by showing her as having the body of a river beast like a hippopotamus rather than by describing her wrath in writing.

Another possibility to consider is that the Egyptians might have accepted that they didn’t know what their gods looked like. The gods are described as being able to hide themselves from mortals and even from other gods. Likewise, they could transform themselves and hide their true forms and secret names, never looking the same to two different individuals. Egyptian religion therefore acknowledges there is no single concrete form a god can take, and even the Egyptian idea of the body is multifaceted and complex – there are five aspects of a person, each with a different appearance and function.

So perhaps the safe answer is that we don’t really know what the Egyptians believed, and they might not have fully known either. But they did not worry about literal representations – images of specific individuals, and even their mummy death masks, are never lifelike portraits but idealized representations of what the perfect person or mummy might look like. And the fact that the gods could change their forms, even among each other, might imply there is no one specific way a god would look.