learn stuff, review stuff, just stuff

Rite of Passage

Warning: Please do not use my work and submit it as your own. Students have been caught plagiarizing from this site, and at least one university knows about this site due to that issue. This blog is not peer-reviewed, and thus is also not acceptable for scholarly research. Feel free to read the articles and papers here, but do your own research for your own schoolwork. Thank you!

I suppose you could say that I have been through several rites of passage, from high school graduation to my wedding day, my first period, or my first child. However, I don’t really see these as equivalent to those in most cultures. Many cultures, especially the ones we saw in videos this week, have rites that involve some form of challenge: three days of dancing and behaving stoically, actual pain and suffering, or the requirement to fend for yourself instead of always relying on your parents (or government) to support you. Continue reading


07/05/2012 Posted by | College Papers, Learning, Thinking | , , | 2 Comments

Persephone’s Capture: Duality in Myth

Warning: Please do not use my work and submit it as your own. Students have been caught plagiarizing from this site, and at least one university knows about this site due to that issue. This blog is not peer-reviewed, and thus is also not acceptable for scholarly research. Feel free to read the articles and papers here, but do your own research for your own schoolwork. Thank you!

Persephone’s Capture: Duality in Myth

            “To see life as a poem and yourself participating in a poem is what the myth does for you” (Campbell, 1988, p. 65).  Like poetry, myth uses recurring themes, symbolism, and metaphor, but myth is not about fanciful stories.  Myth is about what it means to be human, our story in its entirety, and how to live in harmony with one’s society.  Sometimes, as in Lévi-Strauss’s analysis of hare-lips, twins, and children born feet first, several myths from varying cultures share similar symbols, speaking to the similarity of humans no matter the culture.  Sometimes, as with the Virgin of Guadalupe, one of these symbols can grow to encompass the entirety of a culture’s beliefs as a “master symbol.”  In both examples, the myth is shown to be symbolic, not factual, and this is the foundation of studying myth.  By identifying and comparing the symbolism inherent in myth, one can envision the universal nature of man.

One common theme in myth is that of duality, whether it be male/female, good/evil, Heaven/Hell, life/death, or spring/winter.  A well-known Greek myth about this duality is that of Persephone.  Persephone, sometimes called Kore when associated with spring as the Maiden of Corn, was a beautiful child, loved by her mother, Demeter.  Demeter was the Greek goddess responsible for bountiful harvest, grain, and growth, and Persephone/Kore was a fitting reflection of her.  One day, she was in the fields with her mother and found a beautiful flower.  She was so entranced by it that she did not hear the ground opening behind her.  Hades, King of the Underworld, rose up with his horse-drawn chariot, abducted the girl, and took her back to the Underworld to be his Queen.  Demeter realized her daughter was missing and went searching for her, forgetting about her duties.  The world experienced its first season of winter as the crops withered and died because the mother was in mourning.  She discovered the location of her daughter, but Hades had tricked Persephone into eating a number of pomegranate seeds, sealing her fate.  In some tellings, she adapted well to her new role as Queen of the Underworld and greeter of new souls, but in all accounts, she missed her mother as well.  Eventually, a balance was struck.  Persephone would come to the middle ground of Earth to be with her mother as Kore, Goddess of Rebirth, in the spring, and Demeter’s happiness was seen in the new life given to plants across the Earth.  She would then return to her throne in the Underworld, with Demeter returning to her mourning for the fall and winter.

Duality is emphasized symbolically throughout this myth.  The obvious example is in spring and winter, with spring emphasizing youth, happiness, and rebirth, while winter brings the death of the crops and shows Demeter as a sad, old crone.  The goddess Persephone herself is a great example of duality.  She is at once Kore, child goddess of grain, youthful and joyous at the feet of her mother, and Persephone, Queen of the Underworld, grim and terrifying at the side of her husband.  In Campbell’s view, duality refers not only to the state before the transcendent entered the field of time, and to the balance required for a full life, but also to the common human belief that to understand, appreciate, and experience the good, one must also experience the bad.


Campbell, J.  (1988).  The Power of Myth.  New York: MJF Books.

Moro, P. & Myers, J.  (2010).  Magic, Witchcraft, and Religion: A Reader in the Anthropology of Religion.  Eighth edition.  New York: McGraw-Hill.

Strong, L.  (n.d.).  The Myth of Persephone: Greek Goddess of the Underworld.  Retrieved June 30, 2012 from http://www.mythicarts.com/writing/Persephone.html

07/03/2012 Posted by | College Papers, Learning, Thinking | | Leave a comment

Kwakwaka’wakw: Masks of Our Ancestors

WARNING: Don’t plagiarize, for crying out loud! Do your own work. A faculty member at my college has contacted me to let me know people have been caught copying my work. Don’t be next!

Kwakwaka’wakw: Masks of Our Ancestors

            “At the beginning of the world, a bird flew down from the sky and sat on the beach near Tsaxis (Fort Rupert).  The bird took off its mask and became a man.  His name was ‘Namugwis, and he became the founder of an important family of the Kwagu’ł” (Umista, n.d., Ṫseka Animal Masks: Xisiwe’ Wolf).  If there is one recurring element of Kwakwaka’wakw myth, it is the ability to transform animals to men and men to animals by removing and donning masks.  The stylized appearance of the masks is also used in family totem poles, describing the animal ancestor that founded the family.  This everyday appearance of the supernatural is not relegated to the artwork of the Kwakwaka’wakw.  They live in a world filled with spirituality, as demonstrated by the way they treat their food, the structure of their families and government, and the ceremonies and rituals they take part in.


Known as the Kwakkewlths by Indian Affairs, the Kwakiutl by anthropologists, and rarely by their individual tribal names like Kwagu’ł, Mamalilikala, and ‘Namgis, the name Kwakwaka’wakw literally means “people who speak Kwak’wala,” and is the chosen name for the group of tribes living on the northeastern part of Vancouver Island and the mainland directly opposite (U’mista, n.d., The Kwak’wala Speaking Tribes).  As is common with many coastal peoples, whether Maryland’s crabs, Louisiana’s crawfish, or Maine’s lobster, the Kwakwaka’wakw rely mostly on seafood, specifically fish, for their diet.  The year is broken up into two distinct parts: the summer months of intensive food collection and the winter, which is set aside for spiritual and social activities (Berman, 2000).  Food collection is ritualized, and always proceeds in order from the oolichan, herring, and king salmon to the halibut, sockeye, coho, humpback, and dog salmon (Berman, 2000).

The importance of the first fish, especially to the nineteenth-century Kwakwaka’wakw, cannot be overstated.  The birthright of the chief of the Qəmqəmtalał descent group of the Da’naxda’xw was to fill his dipnet with oolichan at the exact position where his ancestor had first fished, and to pray to the fish, welcoming them “for [they] were trying to come to [him],” emphasizing the reciprocal nature of the relationship between man and spirit (Berman, 2000, p. 60).  The other fishermen would wait for the chief to fill and empty his net four times before beginning, each praying to his first catch as well (Berman, 2000).  Each species of fish also had different rituals involved with the catching, preparing, and eating of the first catch.  After eating the first coho, the fat was not washed or wiped from the hands; dogs and menstruating women were not permitted to eat fresh fish; and some parts of the fish, such as the intestines, were taboo as well (Berman, 2000).  Finally, once the first fish had been cleaned, all the remains were placed into a basket and poured back into the mouth of the river (Berman, 2000).

Inherent in each of these taboos and rituals is the sense that the fish have chosen to bless men with their flesh as food, and that the proper treatment of their bodies and remains will ensure the spirits decide to bless the men again (Berman, 2000).  By placing the first fish in the water, all the fish would be reincarnated for the next harvest, thanks to the concept of the Water of Life.  In Kwakwaka’wakw myth, many spirits have the Water of Life, a liquid that grants the power of resurrection and is usually associated with the urine of the Chief of Ghosts, but never with the salmon (Berman, 2000).  Salmon have this power in their very skin, activated when they reach water (Berman, 2000).  Many of their spirits, or “deities” if a loose parallel can be drawn, emphasize their reverence for fish, such as Fish Maker and Oolichan Woman (Berman, 2000).  Their two-season cycle and the underlying reciprocal nature of their relationship with the spirits is summed up as “in the summer, man hunts for fish (spirits), and in the winter, spirits hunt for man.”  All creatures must eat to survive, including the spirits, who are sustained by the ceremonies, rituals, and belief of the Kwakwaka’wakw.


As is common with many smaller groupings of people, political structure and kinship among the Kwakwaka’wakw are closely related.  The Kwakwaka’wakw as a whole were a collection of different tribes that all spoke the Kwak’wala language (Kwakiutl, n.d.).  The tribes were composed of groups called ‘na’mima, each of which had a head chief, lesser chiefs, commoners, and their families (Kwakiutl, n.d.).  The members of a ‘na’mima, the ‘na’mima itself, and the tribes were all ranked against each other in terms of prestige (Codere, 1957).  The Kwagiulth Museum itself has organized its collection into the ranked order of the owners at the time of the potlatch confiscation, emphasizing the view that the rankings and rights to privileges were the backbone of Kwakwaka’wakw society (Mauze, 2003).  A chief gained prestige for his ‘na’mima or his tribe through the tradition of potlatch, discussed in greater detail later in this paper.  Individuals were granted status and nobility by their peers, and titles were generally passed on to someone else, such that even chiefs died as commoners (Codere, 1957).  In addition, warriors could claim the names, positions, family crests, and privileges of their victims as spoils of war, further emphasizing the fluid nature of the so-called “class” system (McLuckie, 2007).

Franz Boas reported in 1906 that there were four classes of members (chiefs, nobility, commoners, and slaves), but refused to classify them as such in later work, saying in 1920 that while the ranking system existed, the Kwakwaka’wakw existed as a classless society (Codere, 1957).  The head chiefs were direct descendants of the founding ancestor, usually thought to be an animal that removed its mask, like the Thunderbird, the seagull, the orca, or the grizzly, but they might also be descended from humans from distant places (Berman, 2000).  These ancestors were displayed prominently on the totem poles, giving visitors an easy way to tell where they might find kin in a new village by simply looking for their common ancestor (Berman, 2000).  Chiefs were responsible for organizing the management of resources, and were given a portion of the harvest in return, in an arrangement analogous to government taxation (Kwakiutl, n.d.).  The somewhat misleading term “commoner” in Kwakwaka’wakw culture refers to a person who, at that very moment, does not hold a “potlach [sic] position, chief’s position, or standing place,” or to a person who has a low ranking but still holds a “standing place” or position (Codere, 1957, p. 474).  Slaves were generally prisoners of war, but were not segregated from the family in any observable way except, perhaps, through burial practices (Ames, 2001).  Typically, slaves would be held for ransom, but even if the expensive ransom were paid, the former slave would carry that shame throughout his life (Ames, 2001).

In the winter months, when spirits were believed to visit the villages, everything changed, from individuals’ names to the classes of society.  The uninitiated were simply the audience for the ceremonies and dances (Berman, 2000).  The “Seals” were high-ranking members under the influence of spirits, and the “Sparrows” were hereditary officials who managed the proceedings (Berman, 2000).


The Kwakwaka’wakw may embody spirituality and ritual in everyday life, but they also have intricate ceremonies and celebrations.  In the nineteenth century, when First Nations people were still being discovered and the thrill of finding “savages” still motivated whites, the Hamat’sa, a dance featured in the Winter Ceremonial, was everything those “civilized people” had hoped for.  There still exists quite a bit of controversy over whether the Hamat’sa ever included actual cannibalism, or whether it was ceremonial, dramatized, and simply misunderstood by the non-native audience.  Ruth Benedict, a well-known anthropologist, certainly believed that “the Cannibal ate the bodies of slaves who had been killed for the purpose” as late as 1959 (McLuckie, 2007, p. 150).  Other sources state that the “bites” were actually created with a knife, and every piece of flesh “consumed” was either hidden using sleight-of-hand or regurgitated after the performance, each piece closely tracked to ensure that none was actually ingested (McLuckie, 2007).  The dance is a reenactment of the origin story of Baxbakualanuxsiwae, the Man-Eater-at-the-Mouth-of-the-River, who was killed by the sons of a chief, Nanwaqawe, with help from a long-lost sister, the qominoqa (McLuckie, 2007).  In the ceremony, one initiate is abducted after being “sacrificed” to the Man-Eater, in reality sequestered away learning the rites and rituals associated with the dance (McLuckie, 2007).  The initiate is always male and has earned the right to participate, whether through inheritance, as dowry, or as spoils of war (McLuckie, 2007).  When he enters society again, it is as a wild creature who must be tamed by other members in a ritual dance, providing a metaphor for the effects of a strong society against the unpredictable, often dangerous forces of the spirits (McLuckie, 2007).  McLuckie points out that this is akin to the Greeks dramatizing violence as a way to confirm cultural values and transfer them to the next generation (McLuckie, 2007).  In all, the dance is representative of a common theme: contact with the supernatural causes a frenzy that is once again tamed by society (McLuckie, 2007).

The Winter Ceremonial itself, of which the Hamat’sa is a part, belongs to a series of myths about Raven, his friend and possibly younger brother, Mink, and the Wolves.  Raven is seen as a great benefactor of the Kwakwaka’wakw, traveling among the spirits and striving for balance and cycles in all things, like the weather, high and low tides, and light and darkness (Berman, 2000).  After a great deal of trickery, insults, and threats between members of both parties, the wolves decided to hold a winter dance, complete with the red-dyed, shredded cedar bark regalia that is worn during the Winter Ceremonial by the Kwakwaka’wakw (Berman, 2000).  They tried to keep the dance a secret, especially from their enemy, Raven, but he had already been listening in (Berman, 2000).  Wolf’s children continued to trick Mink, who retaliated by killing the wolves and then dancing in the ceremony wearing the eldest son’s head as a mask (Berman, 2000).  When Mink also brandished a two-headed serpent he had captured, the remaining wolves ran away in shame, becoming true wolves forever and leaving the dance to Raven and Mink (McLuckie, 2007).  This action brought about the permanent separation between humans and animals in many different versions of the mythology (McLuckie, 2007).

The potlatch has been viewed in many different ways by outsiders, but remains a sort of social contract for the Kwakwaka’wakw.  It was banned by Canada in 1884, partially for economic reasons, but also because of the perceived religious implications of the ceremony (Mauze, 2003).  Many natives continued the tradition, not only because it was a part of their culture, but also because it was a part of their record-keeping (Umista, n.d.).  In 1921, Dan Cranmer, a Nimpkish chief of Village Island, organized a large potlatch to repay his wife’s bride-price (Mauze, 2003).  To clarify, while a payment was made for marriage, it was not the woman who was purchased; rather, the hereditary rights of the future children of the union were purchased from the bride’s family (McLuckie, 2007).  Between three and four hundred people attended Cranmer’s potlatch, which Indian agent William Halliday heard about despite the attempts at secrecy (Mauze, 2003).  Thirty-four people were charged with such terrible crimes as “distributing gifts, delivering speeches, singing songs, and so forth,” with all pleading guilty and required to surrender their potlatch “coppers, masks, head dresses [sic], potlach [sic] blankets and boxes and other paraphernalia used solely for potlatch purposes” (Mauze, 2003, p. 505).  Those who agreed were given a suspended sentence, while the others were sent to jail in Vancouver (Mauze, 2003).  While the anti-potlatching law was never officially repealed, it was deleted from the legal codes in 1951; the Kwakwaka’wakw still potlatch to the present day, and have been mostly successful at repatriating their confiscated potlatch goods (Umista, n.d.).

The potlatch ceremony itself is social, religious, legal, and cultural all in one event.  “[F]amilies gather and names are given, births are announced, marriages are conducted, and … families mourn the loss of a loved one” (Umista, n.d., The Potlatch).  In addition, the potlatch is where, as mentioned earlier, a chief will pass on his rights, titles, and privileges to his eldest son (Umista, n.d.).  The events occur in a specified order, from the T̓seka (or t’seka) dance, which includes the Hamat’sa, the T’łasala or Peace Dance, the sała mourning ceremony, and the sale or transfer of the ceremonial coppers, to marriage ceremonies, feasts, and a grand gift-giving (Umista, n.d.).  The gift-giving is often likened to a redistribution of wealth, since the chief receives a portion of all harvests for his management of the resources, which he uses to throw potlatches; but the gifts are given for witnessing, recording, and passing on the events as a sort of social contract (Umista, n.d.).  A chief, and thus his tribe or ‘na’mima, will gain status based on how much they can afford to give away at these gatherings, and it is indeed a wealthy and enviable chief who can afford to hold several of these in a relatively short time (Umista, n.d.).


As is common with Native American and First Nations peoples, the separation between sacred and profane in the world of the Kwak’wala speakers is nonexistent.  Spirituality infuses their entire lives, from the food they eat, to the structure of their tribes, to the ceremonies they take part in.  To cut a “pagan” religion out of the culture of the Kwakwaka’wakw would be to cut out the culture itself.  The band is currently thriving and committed to bringing their heritage with them into the technological future, updating and upgrading where necessary to ensure their culture and beliefs remain relevant in today’s ever-changing world.


Ames, K.  (2001, June).  Slaves, Chiefs and Labour on the Northern Northwest Coast.  World Archaeology, 33(1), pp. 1-17.  Retrieved June 8, 2012 from JSTOR.

Berman, J.  (2000, May 1).  Red Salmon and Red Cedar Bark: Another Look at the Nineteenth-century Kwakwaka’wakw Winter Ceremonial.  BC Studies, (125/126), p. 53.  Retrieved June 8, 2012 from EBSCOhost.

Codere, H.  (1957, June).  Kwakiutl Society: Rank without Class.  American Anthropologist, 59(3), pp. 473-486.  Retrieved June 8, 2012 from JSTOR.

Kwakiutl. (n.d.).  Kwakiutl Indian Band homepage.  Retrieved June 20, 2012 from http://www.kwakiutl.bc.ca

Lobo, S., Talbot, S., & Morris, T.  (2010).   Native American Voices: A Reader.  Third Edition.  Boston: Prentice Hall.

Mauze, M.  (2003, June 1).  Two Kwakwaka’wakw Museums: Heritage and Politics.  Ethnohistory 50(3), pp. 503-522.  Retrieved June 8, 2011 from EBSCOhost.

McLuckie, A.  (2007).  Reinterpreting the Kwakiutl Hamatsa Dance As an Expression of the Apollonian and Dionysian Synthesis.  Religious Studies and Theology, 26(2), p. 149.  Retrieved June 8, 2012 from ProQuest database.

U’mista (n.d.).  U’Mista Cultural Society.  Retrieved June 8, 2012 from http://www.umista.ca/kwakwakawakw/index/php

Umista.  (n.d.).  Virtual Museum Canada.  The Story of the Masks.  Retrieved June 24, 2012 from http://www.umista.org/masks_story/en/ht/index.html

06/25/2012 Posted by | College Papers, Learning | , , | Leave a comment


Over 10,000 hits! Just a quick note to say thanks to everyone who reads this. I know it can’t all be my mom!

06/25/2012 Posted by | Uncategorized | Leave a comment

Populating the New World: Theories of Initial Migration

Warning: Please do not use my work and submit it as your own. Students have been caught plagiarizing from this site, and at least one university knows about this site due to that issue. This blog is not peer-reviewed, and thus is also not acceptable for scholarly research. Feel free to read the articles and papers here, but do your own research for your own schoolwork. Thank you!

Populating the New World: Theories of Initial Migration

            For decades, if not centuries, the origins of man have produced some of the most heated scientific battles ever known.  The biggest debate is probably between the “Out of Africa” model and the multiregional theory of the first Homo sapiens, but the debate about the initial population of the North and South American continents cannot be far behind.  Nearly everyone has heard of the popular “Bering Strait land bridge” theory of migration, as it is the one typically taught in schools.  There are actually two theories that stem from the Beringian model, one claiming that the Paleo-Indians found an interior passage that was free of ice, and the other arguing for a coastal route.  Finally, there is also a theory that the first people to make their way to North America were not from Siberia at all, but were Southwestern Europeans of the Solutrean culture.  This paper will give a brief overview of the two main theories, followed by an analysis and the author’s opinion.

The reigning theory holds that man migrated from the Asian landmass across a land bridge between modern Russia and Alaska.  That is where the semblance of agreement ends.  Until fairly recently, it was generally accepted that the initial migration occurred about 12,900 years ago, after the Last Glacial Maximum (LGM) (Schurr, 2004).  The people who migrated found an ice-free corridor into the continent, likely following large animals, and used “Clovis” tools (Schurr, 2004).  Because no older sites had been found, and other early man sites seem to derive from these Clovis people, the Clovis First model claims to represent the “earliest occupancy of the Americas by modern human groups” (Schurr, 2004, p. 552).  However, this claim has been contested.  Findings at sites such as Meadowcroft, Cactus Hill, Topper, and Calico suggest earlier migration, as early as 16,000 years ago.  Again, until recently, this was thought impossible because the migration would predate the earliest known settlement of the Siberian region, but the Yana River site shows human occupation as early as 30,000 years ago, well before the LGM (Schurr, 2004).  The ice-free corridor theory has also taken a hit due to the lack of any animal bones found in the area between 21,000 and 11,500 years ago (Schurr, 2004).  In all, the evidence seems to point towards a coastal route of migration, which would also help to explain the incredibly rapid expansion of man throughout the two continents, estimated to have happened in only a few centuries (Schurr, 2004).

Bradley and Stanford have presented a new theory to challenge that notion.  They maintain that the Beringian theory is simply educated conjecture, not evidence, and that as scientists we must be careful not to create dogma and ideology in our studies (Bradley & Stanford, 2004).  In “The North Atlantic Ice-Edge Corridor,” they go into exacting detail about what makes Clovis points so unique, and then explain that no tool like the Clovis point has ever been discovered prior to Clovis except in southwestern Europe (Bradley & Stanford, 2004).  They do not claim that people never came across the Beringian land bridge, just that they were not the first and were not the ancestors of Clovis (Bradley & Stanford, 2004).  The biggest challenges to the Solutrean migration are the 6,000-year difference in time and the incredible difficulty of a journey across the North Atlantic (Bradley & Stanford, 2004).  The “complexity and difficulty of [overshot flaking to create thin bifaces] and its rarity” lead Bradley and Stanford to believe it is more likely that the Atlantic was crossed than that Clovis points and Solutrean points were independently developed (Bradley & Stanford, 2004, p. 465).  The claim is that the Solutrean people took to the sea to hunt seals and eventually chased their prey “too far” until they found land on the other side, and may even have started a cyclic hunt that spanned the ocean, setting up camps with entire kin groups on the opposite coast (Bradley & Stanford, 2004).  Sites like Meadowcroft (Pennsylvania), Cactus Hill (Virginia), and Page-Ladson (Florida) have been dated to pre-Clovis times and contain some of the same tools found at Solutrean sites (Bradley & Stanford, 2004).  Dating of Clovis sites seems to uphold this model, with the oldest dates coming in the east and the youngest in the west (Bradley & Stanford, 2004).

Personally, I believe that minds must always be kept open, particularly in the pursuit of science and history.  There was a time when we knew the world was flat, and a time when we knew that there was no way man was in the Americas before 4,000 years ago.  Now we know both of these claims are false, but we are still claiming to “know” facts that have not been proven.  In any case, I always seek out “new” or “controversial” studies, because if a claim is true, it will withstand any dissent.  Therefore, I admire Bradley and Stanford for publishing their views.  I also hesitate to “pick sides,” particularly when the answers are not mutually exclusive.  Without years of intensive research, I could not possibly add any insight to this discussion, nor could I determine from two articles which is “more correct,” but I must say that though I had not heard of the Solutrean connection before this paper, I find it fascinating and possible.  If genetics says Native Americans came from Asia, and tools say they came from Europe, why can it not be said that they both came to the Americas and created the first “melting pot,” long before the melting pot we refer to now?  Why is it not possible that a smaller group of Solutreans spread their tools to a large group of Asian migrants who wound up overtaking the former?  Without considerable evidence to back either claim, it seems the discussion will have to wait until “we know” where the Paleo-Indians came from.


Bradley, B. & Stanford, D.  (2004). The North Atlantic Ice-Edge Corridor: A Possible Paleolithic Route to the New World.  World Archaeology 36(4) pp. 459-478.  Retrieved June 1, 2012 from JSTOR.

Schurr, T.  (2004).  The Peopling of the New World: Perspectives from Molecular Anthropology.  Annual Review of Anthropology Vol. 33 pp. 551-583.  Retrieved June 1, 2012 from JSTOR.

06/04/2012 Posted by | College Papers, Learning, Thinking | , , , , | Leave a comment

Case Study: Ethics of EA and Susan G. Komen

Warning: Please do not use my work and submit it as your own. Students have been caught plagiarizing from this site, and at least one university knows about this site due to that issue. This blog is not peer-reviewed, and thus is also not acceptable for scholarly research. Feel free to read the articles and papers here, but do your own research for your own schoolwork. Thank you!

Case Study: Susan G. Komen for the Cure and Electronic Arts


Electronic Arts, a video game company, and Susan G. Komen for the Cure, a non-profit breast cancer awareness and research charity, are both under fire over ethical concerns. EA has misused employees and treated customers as cash cows, while Komen tends to make decisions that show a desire for money to spend on research and awareness, regardless of how that money is raised. This paper emphasizes the need for a clear ethical stance that is consulted at every step of decision-making.


Regardless of the theory of ethics used, acting ethically is often among the priorities of any individual or group. Sometimes what is ethical to one party is unethical to another, which leads to perceived ethics violations. Often, the ethical issues that arise stem from a failure to fully determine ethical guidelines for oneself or one’s organization. Two groups will be examined here in light of ethical conduct. One, Electronic Arts, is a company whose primary objective is to make money for shareholders. The other, Susan G. Komen for the Cure, is a non-profit organization dedicated to raising money for its chosen cause. Both have had their ethics called into question, and both have ongoing and resolved issues. Both of these organizations have made missteps that could have been avoided if they had clear ethical stances that were used in decision-making.


Electronic Arts (EA), a “global interactive entertainment software company,” was founded in 1982 and is responsible for many high-profile video game franchises like Madden NFL, The Sims, Star Wars: The Old Republic, Dragon Age, and Mass Effect (About, n.d.). The company operates many smaller studios under the EA umbrella and has acquired many smaller studios like EA Sports and BioWare. The ethical issues faced by EA come from their business practices and their use of Digital Rights Management (DRM).

EA rose to the top of the video game industry using a very effective business model. They used licenses granting sole rights to many non-game franchises to create tie-in games, such as those with the NFL or Harry Potter. This means that even though a rival company could make a football game, it could not use actual player names or teams, a major draw for football fans who play video games. Similarly, fans of a particular movie or movie series are more likely to purchase a game using licensed material such as plot lines, characters, and so on. A common complaint among gamers is that licensed games are generally less entertaining than an original IP (intellectual property). Movie tie-in games especially have been plagued by bugs and poor quality due to rushed production in the attempt to release the game the same week as the movie. The same issue is present in the yearly releases of sports games that change very little beyond the updated rosters. Madden NFL, for example, has a number of “legacy issues” that have been apparent for several years but go unaddressed each year in favor of a slicker UI (user interface) and other cosmetic changes. In addition, the company has, in the past, been referred to as the “Evil Empire” due to its tendency to buy out smaller studios essentially just to purchase their IPs, disregarding the acquired company and its employees (Chella, 2005). These business practices have often caused gamers to feel as though the company sees them as little more than cash cows who will blindly purchase any game simply due to the title, regardless of actual quality, using customers as a means to an end rather than an end in themselves. That may have worked in the past, but with gaming reaching mainstream popularity, more information is available about games prior to release, allowing more informed purchasing decisions and costing EA quite a bit of popularity.
Beyond its consumers, EA has a history of treating its employees unethically, paying salaries but demanding as much as a 100-hour work week even outside of “crunch time,” prompting comparisons to sweatshops in descriptions of working conditions (Mieszkowski, 2004). Recently, EA has shifted its business practices toward models more like its competitors’, allowing acquired companies to maintain most of their autonomy and creative control and treating employees more fairly by providing an hourly wage. It has also started to focus more on new and original IPs, and even this year’s edition of Madden, Madden NFL 13, acknowledges the legacy issues and tries to address them (Miller, 2012).

Piracy is always an issue for any sort of digital content, whether in gaming, music, or movies. EA has attempted to curb piracy using various Digital Rights Management (DRM) methods. Unfortunately, most of them have backfired. One method was to limit the number of times a game could be installed, at first restricting Spore, an ambitious creature-creation simulation, to only three installations over the life of the game. Considering that many “hard-core” gamers own multiple computers and upgrade them constantly, three installations might last only a short time. Gamers complained that this essentially meant renting the game for full price (Copyright, 2008). Adding insult to injury, Spore became the most pirated game in history, followed by EA’s The Sims, partially due to backlash against the “draconian” DRM policy (Rosenberg, 2008). EA’s next big release increased the number of activations to five, but consumers were still not satisfied. EA eventually went back to a traditional “CD-key” method of anti-piracy, but it was not to last. In fact, the DRM wars seemed to have faded into the past when issues related to Dragon Age: Origins started appearing. EA publicized its decision not to use the Spore-era SecuROM in Dragon Age: Origins, but its use of online verification was less well known. In April of 2011, gamers with legitimate copies of the game were locked out of their offline, single-player campaigns due to problems with EA’s authentication servers (Ewalt, 2011). Meanwhile, pirates playing “cracked” versions, with the authentication stripped out, were happily enjoying their illegal product (Ewalt, 2011). Essentially, EA’s anti-piracy DRM was encouraging piracy and punishing the consumers who purchased legal copies of its games.
It is not uncommon to find gamers asking one another whether it is ethical to purchase a copy of a game but download a “cracked” version to actually play, so that the company still receives monetary compensation for the product while the issues related to DRM are bypassed. Once again, EA is using its customers as a means to an end, forcing them to put up with intrusive restrictions in an attempt to bolster the bottom line. Unfortunately for EA, it seems to have backfired every time.

Susan G. Komen for the Cure is a non-profit organization that raises money for breast cancer awareness and research. The organization was founded by Nancy G. Brinker in 1982 as a promise to her sister, Suzy, who lost her life in a battle with breast cancer (Komen Home, n.d.). Originally the “Susan G. Komen Breast Cancer Foundation,” the name and logo were changed in 2007 for the organization’s twenty-fifth anniversary (Brainerd, 2007). As the most well-known and popular breast cancer research charity, if not the most well-known non-profit organization in the country, Komen is responsible for turning October into the month of pink ribbons. The overuse of pink and the unhealthiness of some of the products adorned with the pink ribbon have raised ethical concerns in a practice termed “pinkwashing.” In addition to pinkwashing, a recent public relations nightmare involving Planned Parenthood has revealed questionable ethics on the part of the board members.

Pinkwashing is a term coined by Breast Cancer Action, a watchdog group that operates the website ThinkBeforeYouPink.org to discourage consumers from blindly supporting any group with a pink ribbon (Think Before You Pink, n.d.). There are two aspects of pinkwashing. The first is what many survivors feel is the commercialization of their disease (Begos, 2011). The pink ribbon is not trademarked, so companies are legally allowed to decorate their goods with pink ribbons without making any commitment to donate proceeds to breast cancer awareness. In addition, many of the companies that do partner with Komen, or other cancer charities, have particular methods of donation that are not always clear to customers. According to Think Before You Pink, some companies cap their total donation, meaning that purchases made after the cap is reached contribute nothing to breast cancer awareness, even though the companies have no intention of removing the pink labels (Think Before You Pink, n.d.). Kentucky Fried Chicken used another tactic: KFC donated a flat amount to Komen and then sold special pink buckets of chicken (Anonymous, 2010). The amount donated did not change no matter how many buckets of chicken were purchased, or whether those buckets were pink or not. In effect, this all amounts to free advertising for both Komen and the company involved.

The second form of pinkwashing comes from Komen partnering with products that are unhealthy, sometimes even carcinogenic themselves. Returning to the KFC scandal: fried chicken can contribute to an unhealthy lifestyle, and obesity can be a factor in a number of diseases, including cancer (Anonymous, 2010). Komen itself has marketed a perfume called “Promise Me,” referencing Brinker’s promise to her sister Suzy, that contained chemicals known to cause cancer (Szabo, 2011). In addition, according to Brenda Coffee, a breast cancer survivor, “patients treated with chemotherapy often become hypersensitive to scents, and perfumes can give them headaches, dizzy spells or nausea, even years after treatment” (Szabo, 2011). When confronted with this information, Komen said it would reformulate the perfume, but did not remove existing products from the shelf (Think Before You Pink, n.d.). “Promise Me” in particular, and these examples in general, show a growing insensitivity to the very victims Komen attempts to help, relying on an “ends justify the means” method of fund-raising. Apparently the method of raising the money makes little difference as long as research is done, awareness is spread, and Nancy Brinker and friends are paid.

Another example of Komen failing to determine its ethical stance prior to policy making came early this year. A conservative politician opened an investigation into Planned Parenthood, and a conservative board member urged Komen to stop the grants that funded it, citing a previously unknown policy against funding organizations under investigation. Whether this policy was created specifically to target Planned Parenthood or had always been in place is unclear. Liberals across the nation reacted with fury to the news that Komen was pulling funding for the “sexual and reproductive health care provider and advocate,” which is also the country’s leading provider of abortions (About us, n.d.). Donations poured into both organizations, with pro-life groups celebrating Komen and pro-choice groups making up for lost funds by donating directly to Planned Parenthood (Feldmann, 2012). Facing the backlash over the defunding, Komen issued a statement retracting its previous decision and pledging to continue the grants, a reversal accompanied by the resignations of some Komen decision makers, including the conservative who first suggested the defunding (Feldmann, 2012). With this action, Komen simply looks inept, or willing to reverse its ethical stance when convenient, neither of which is favorable. The move to continue the Planned Parenthood grants alienated pro-life donors, some of whom were unaware of the ties to Planned Parenthood before the dispute, and did little to convince pro-choice donors to return (Hopfensperger, 2012). This was evident in May, when registration for Komen’s annual Race for the Cure was down over 5,000 runners from the previous year (Hopfensperger, 2012). Some accused board members of letting personal politics color their judgment; others accused the organization of straying too far from its stated mission. Some wonder why Komen funds Planned Parenthood at all, considering that it does not provide mammograms and only refers patients to clinics that do (McCormack, 2011).

Alternatives and Solutions

One of the most important actions a group can take, especially in a field littered with ethical problems, is to establish a clear understanding of the group’s ethical stance. Often this can be simplified by taking a utilitarian or deontological approach to issues. Sometimes, however, a more in-depth review must take place.

EA’s former business model was not in line with either school of ethical thought. Initially, it provided shareholders a great profit as EA rose through the ranks to become a top-three company in the industry, but as the product began to suffer and complaints from employees started to surface, the stock dropped. In all, the plan did not suit any of the parties involved. With a firm ethical standpoint, EA could have avoided the “sweatshop” and “monopoly” claims and instead taken up a position more in line with its current business model. Among the changes were a fairer approach to employee pay, more creative control for acquired studios, and a new approach toward franchises, attempting to fix legacy issues rather than just pumping out the same product under new names.

The ethical treatment of one’s customers is also an issue that can be settled well ahead of problems such as EA’s DRM scandals. Purposefully annoying your customers in an attempt to stop pirates, in a manner that only hurts legal consumers, is madness. EA should realize that this kind of DRM only encourages piracy and stop punishing its customers.

Not all pinkwashing is Komen’s fault, though one could argue that Komen is what made the pink ribbon so “fashionable,” and some blame can be laid at the feet of companies looking to boost sales unethically. However, the majority of the trend is indeed due to Komen’s tireless efforts to get the pink out. Some survivors are tired of the cheer associated with “celebrating pink,” which simply glosses over the pain and loss of a terrible disease (Begos, 2011). The pervasiveness of the pink ribbon makes one wonder why Komen still needs to spend so much of its budget on “awareness” rather than “research.” Companies should be honest with their customers about what donations are made and to which organization.

Komen seems to have made one ethical standpoint clear: money is money no matter where it comes from, and money is needed to combat cancer. It shows little, if any, concern about endorsing an unhealthy product, focusing more on how the money will be used to support survivors and patients and less on how the money was earned. The partnership with KFC and Komen’s own “Promise Me” perfume prove this.

Finally, had Komen’s ethical stance been clear, there might not have been an issue regarding Planned Parenthood. Komen should have known whether it was going to be influenced by politics and, if so, which “side” it would take. When funding an organization as controversial as Planned Parenthood, it is amazing that Komen was not better prepared for this eventuality. In addition, the reversal of the decision made it look as though Komen’s ethics were a matter of convenience, not of right and wrong.


Each company must have a clear vision of its ethical stance. That stance should be reviewed often, and practices should be held against it to determine whether the company is on the right course. EA must continue to treat employees fairly and attempt to create products that gamers will continue to purchase. By creating good products and actually improving existing franchises, it will benefit not only gamers but also its shareholders as the stock improves. EA must also deal with piracy in another manner. This author suggests that DRM is a failure, that piracy will always happen, and that attempting to end piracy will only punish legal consumers. Remove DRM and ship your games. If the product is worth the price tag, even pirates will purchase a copy, understanding that sales are what drive more content into production.

Komen must make every decision with breast cancer patients and survivors at the forefront of its mind. While the choice between maximizing research money and limiting partnerships to appropriate products is a difficult one, this author believes that awareness is no longer an issue. Komen, especially after the most recent controversy, needs to be dedicated to nothing but the best interests of its constituency. It should only partner with companies that are appropriate and that have clear donation plans easily available to consumers. It must also stick by its decisions. If a decision needs to be revoked, it was not properly considered prior to implementation. We all have the right to learn from our mistakes, but such an easy reversal does not look good for anyone.

In short, know your ethical stance and follow it with every decision.
About. (n.d.). Electronic Arts home page. Retrieved May 2, 2012 from http://www.ea.com/about
About Us. (n.d.) Planned Parenthood About Us page. Retrieved May 20, 2012 from http://www.plannedparenthood.org
Anonymous. (2010, April 22). Breast Cancer Action calls shame on KFC’s Pink Buckets campaign: KFC’s new pinkwashing campaign to ‘raise money for breast cancer’ is half-cooked! U.S. Newswire. Retrieved May 2, 2012 from ProQuest.
Begos, K. (2011, October 12). ‘Pinkwashing’ has some seeing red. Telegraph- Herald. Retrieved May 2, 2012 from ProQuest.
Brainerd Dispatch. (2007, January 22). Susan G. Komen for the Cure: New name, renewed mission to fight breast cancer. Brainerd Dispatch. Retrieved May 2, 2012 from http://brainerddispatch.com/stories/012207/new_20070122016.shtml
Chella, B. (2005, January 12). The Evil Empire of Video Games: Electronic Arts. Buffalo News, p. N8. Retrieved May 2, 2012 from ProQuest.
Copyright. (2008, September 10). Copyright row dogs Spore release. BBC News. Retrieved May 2, 2012 from http://news.bbc.co.uk/2/hi/technology/7604405.stm
Ewalt, D. M. (2011). Dragon Age: Origins owners locked out due to DRM failure. Forbes.com, 16. Retrieved May 2, 2012 from EBSCOhost.
Feldmann, L. (2012, February 3). Susan G. Komen Foundation relents: Planned Parenthood grants restored. Christian Science Monitor. Retrieved May 2, 2012 from EBSCOhost.
Hopfensperger, J. (2012, May 6). Planned Parenthood flap puts Race for Cure off former pace; Registration for the yearly fundraiser hits a decade low. Star Tribune. Retrieved May 7, 2012 from ProQuest.
Komen Home. (n.d.). Susan G. Komen for the Cure Home Page. Retrieved May 2, 2012 from http://ww5.komen.org
McCormack, J. (2011, March 30). Planned Parenthood President Falsely Claimed Clinics Provide Mammograms. The Weekly Standard. Retrieved May 2, 2012 from http://www.weeklystandard.com/blogs/planned-parenthood-president-falsely-claimed-their-clinics-provide-mammograms_556015.html
Mieszkowski, K. (2004, December 2). Santa’s sweatshop. Salon. Retrieved May 1, 2012 from http://www.stage1.salon.com/2004/12/02/no_fun_and_games/singleton/
Miller, G. (2012, April 25). Madden NFL 13 Sees What’s Broken, Tries to Fix It. Retrieved May 15, 2012 from http://xbox360.ign.com/articles/122/1223710p1.html
Rosenberg, D. (2008, December 6). ‘Spore’ leads 2008’s most pirated PC games. CNET News. Retrieved May 2, 2012 from http://news.cnet.com/8301-13846_3-10116502-62.html
Szabo, L. (2011, July 17). Komen’s pink ribbons raise lots of green, many questions. Gannett News Service. Retrieved May 2, 2012 from ProQuest.
Think Before You Pink. (n.d.). Think Before You Pink. Retrieved May 1, 2012 from thinkbeforeyoupink.org/?page_id=13

05/21/2012 Posted by | College Papers, Learning, Thinking | , , | 3 Comments

Voter ID Laws


Another warning: this post may not make a whole lot of sense in the later paragraphs due to the specific question I was responding to; however, I believe the general idea is clear enough.

In 1965, as part of the Civil Rights movement, the Voting Rights Act (VRA) was put into effect. The act essentially restated the 15th Amendment: the “right of citizens of the United States to vote shall not be denied or abridged by the United States or by any state on account of race, color, or previous condition of servitude” (LOC, n.d.). Of particular note in the VRA are Sections 2 and 5. Section 2 prohibits practices that discriminate against racial, color, or language minority groups, just like the amendment, and specifically deals with intent to discriminate. As shown by Village of Arlington Heights v. Metropolitan Housing Development Corp. in 1977, a discriminatory effect alone is not sufficient to deem an action unconstitutional (Case Briefs, n.d.); beyond the effect of discrimination, intent to discriminate must be proven. This case, despite having nothing to do with voting, provided the framework to determine whether intent to discriminate is part of proposed legislation. Section 5, however, has some distinct differences. It covers nine states, and parts of seven others, that have a history of voter discrimination. I do not know how recent that history is supposed to be, but every state that had been in the Confederacy, with the exception of Arkansas and Tennessee, as well as embattled Arizona, is on that Section 5 list. There are two major differences under Section 5. First, every change to laws regarding voting policy must be pre-approved by the US Attorney General before being put into place. Second, the law must show no discrimination in intent OR effect. This is where states like Texas and South Carolina have had issues.

A law that works in Ohio may not work in Texas, or a law that works in part of California may not work in another part of the same state, because Section 2 states must only satisfy “intent” while Section 5 states must satisfy both “intent” and “effect.” That should provide enough background to understand why the law is so hotly contested in Texas and South Carolina while 31 states currently have ID laws, 15 of which require photo ID (Herting, 2012).

Advocates of this legislation typically claim that it prevents voter fraud, and many of the states requiring ID have provisions for free IDs at DMVs or Department of Public Safety offices (KDH News, 2012; America Live, 2012). Opponents claim that there is no significant voter fraud, and that in the known cases it has been supervisors and officials, not the actual voters, who performed the fraudulent actions (America Live, 2012; Yost, 2012). However, if IDs are not required, it would be nearly impossible to “prove” fraud without a confession or otherwise monitoring the ballots, itself an ethical issue. The Huffington Post reported that Hispanics in Texas are between 50 and 200% more likely to lack ID than non-Hispanics, calling this a discriminatory effect (Yost, 2012). The DOJ says that the distance to the nearest ID facility, and the short hours of most ID facilities in Texas, particularly in rural areas, make the burden too great for these voters. Therefore, because Texas is a Section 5 state and a discriminatory effect is claimed, this instance of identification law has the potential to be unethical. I believe more research would need to be performed to determine the accuracy of the DOJ’s claims, considering the wide range in the estimate of potentially affected voters. Also, considering that IDs are required to drive, work, or even buy beer, it is hard to believe that law-abiding, voting citizens do not already have some form of ID. Personally, my view is that the only people the concept of voter ID discriminates against are those who cannot legally obtain an ID because they are not citizens, and thus should not be permitted to vote anyway. That said, there are times when the implementation of an idea can stray from the ethical path, and that is something that must be guarded against.
More information, beyond what is revealed in these news articles, is needed to determine the actual ethical status of the Texas ID law in particular, but the concept is sound for utilitarians: nobody wants election fraud. The fact that voting under multiple names or dead people’s names is easy makes legitimate, law-abiding voters feel as though their voice does not matter in an election. After an election, the losing side (whether the Democrats in Florida in 2000 or the Republicans in 2008) will claim that the other is rigging the vote. Without such measures, there is no way to track fraud or prevent it. The greatest good for the greatest number would be to institute measures that restore confidence in our democracy, specifically in the electoral process.

It is 2012, and it would be very difficult to find anyone who will willingly stand up and say “I believe in discrimination based on sex, race, color, religion, etc.,” except maybe Fred Phelps of the Westboro Baptist Church. It is nearly universally condemned. However, there are a number of unintentional examples of discrimination that people face around the country. Some people cannot let go of old white/black animosity. Some people think anyone with brown skin is in America illegally. Some people think everyone who believes in Islam is out to destroy America.

Once again, I must refer back to the Navy, since it is the only group I have been a part of since graduating high school. The Navy officially has a zero-tolerance policy for a number of things, including sexual harassment and discrimination based on a number of factors. For example, in training I was the class leader and the only female in a class of 20; after I told a fellow student to return to his seat, he called me a derogatory term for “female.” Word got around to my instructor, who asked me to come to his office. As I entered, the other student was standing at parade rest outside the door in his dress uniform, a sure sign that he was about to receive a serious butt-chewing. The instructor told me that the student was going to be disciplined for insubordination (not listening when I told him to sit down) and disruption of the classroom. He then asked if I wanted to file sexual harassment charges for the slur he had used. Officially, then, the Navy is proactive in addressing instances of discrimination. On the other hand, after I was assigned to a ship, I was one of three women in a division of nearly 90. Once you report aboard, you are often branded a “slut” or a “b—-,” depending on whether you are friendly or not. I had a very tense relationship with one superior in particular, who constantly changed standards to exclude me, gave me stricter requirements to qualify than other sailors, required me to stay to perform menial tasks after releasing the rest of the sailors for the day, and went out of his way to look for ways to get me in trouble. I have strong suspicions that if those actions were not specific to my gender, then they were based on his suspicions about my personal life. When he finally did find a way to get me demoted, I wanted to appeal, but his boss (a Chief) told me not to. When the Chief was one day from transfer, he told me I should go through with my appeal. This, to me, says that the Chief knew I was being unfairly treated but did not want it to tarnish his own reputation along with my supervisor’s. As a whole, what this tells me is that what may be stated as policy on the grand scale may not always translate to individuals with personal prejudices.

America Live. (2012, March 12). DEBATE: Are Voter Identification Laws Discriminatory. Retrieved April 26, 2012 from foxnewsinsider.com/2012/03/12/debate-are-voter-identification-laws-discriminatory
Case Briefs. (n.d.). Village of Arlington Heights v Metropolitan Housing Development Corp. Retrieved April 26, 2012 from http://www.casebriefs.com/blog/law/constitutional-law/constitutional-law-keyed-to-chemerinsky/equal-protection/village-of-arlington-heights-v-metropolitan-housing-development-corp-2/
Herting, K. (2012, April 12). DOJ Calls Texas Voter ID Law Discriminatory. Retrieved from jurist.org/paperchase/2012/04/doj-calls-texas-voter-id-law-discriminatory.php
KDH News. (2012, March 17). Voter ID Push Valid, despite Texas setback. Retrieved April 26, 2012 from http://www.kdhnews.com/news/story.aspx?s=65015
LOC. (n.d.). 15th Amendment to the Constitution. Retrieved April 26, 2012 from http://www.loc.gov/rr/program/bib/ourdocs/15thamendment.html
Yost, P. (2012, March 12). Texas Voter ID Law Blocked by Justice Department. Retrieved April 26, 2012 from http://www.huffingtonpost.com/2012/03/12/texas-voter-id-justice-department_n_1339004.html

04/26/2012 Posted by | College Papers, Learning | , , , , , , | Leave a comment

Black Death


Yersinia Pestis: The Black Death and Religion

            “[S]uch terror was struck into the hearts of men and women by this calamity, that brother abandoned brother . . . the wife her husband.  What is even worse and nearly incredible is that fathers and mothers refused to see and tend their children, as if they had not been theirs. . . .” (Sanders, Nelson, Morillo, & Ellenberger, 2006, p. 398).  As told by Giovanni Boccaccio in The Decameron, the Black Death was a terror unrivaled by any other, one that turned life upside down for nearly all inhabitants of Eurasia.  The plague bacterium, Yersinia pestis, once called Pasteurella pestis, was spread by ravenous fleas aboard burrowing rodents (McNeill, 1976).  The fleas’ throats would close due to the disease, making it impossible for them to swallow the blood they took from the rodents (McNeill, 1976).  Since they could not swallow, they would spit the blood, now mixed with infected matter, back into the wound and continue trying to feed to stave off starvation (McNeill, 1976).  When the rodents, black rats in the case of the Black Death in Europe, died, the fleas would find new hosts, such as people, and the disease would manifest as massive swellings in the groin and armpit that turned dark with internal bleeding (McNeill, 1976).  States fell, from the Mongol Yuan dynasty in China to the Mamluk dynasty in Egypt, to be replaced by the Ming dynasty and the Ottoman Empire, respectively (Bentley, Ziegler & Streets, 2008).  Between sixty and seventy percent of all people afflicted by the disease died within days of symptoms appearing, and nobody, from doctors to the Church, could stop it (Bentley, et al., 2008).  When natural means could not explain the horrors afflicting the people, they began attributing the plague to God, and even Pope Clement VI referred to “this pestilence with which God is afflicting the Christian people” (Sanders, et al., 2006, p. 392).  In a deeply religious period, the Black Death and its repercussions proved to be a challenge to each of the three major religions in Europe: Islam, Christianity, and Judaism.

During the Middle Ages, Europe was experiencing what is often called the “Dark Ages,” while Islam had something of a “Golden Age.”  There are doubts as to whether either was as good, or as bad, as it sounds, but the Black Death not only evened the playing field, it eventually helped knock dar al-Islam out of Europe.  The Prophet, Muhammad, had addressed epidemic disease, providing a guideline for his followers: “When you learn that epidemic disease exists in a country, do not go there; but if it breaks out in the country where you are, do not leave” (McNeill, 1976, p. 198).  This may have helped stem the spread of disease through the Islamic lands, but many people disregarded the sage advice, particularly those who were not Muslim.  Ibn Battuta especially seemed to disregard it, as he traveled extensively during the time of the Black Death, often passing through cities that were actively suffering the effects of the plague (Sanders, et al., 2006).  In addition to the advice against traveling, Muslims who died of plague were guaranteed entry into the Paradise of the afterlife, perhaps as another way to prevent fear and panic; in fact, they were thought of as highly as those who had died in jihad (Sanders, et al., 2006).  Like the Christians, Muslims believed the plague was sent from God, or Allah, as a divine punishment; however, they disapproved of attempts to heal the afflicted or otherwise escape Allah’s will, and thus suffered a larger share of deaths than the Christians (McNeill, 1976).  This feeling was documented when an imperial ambassador to Constantinople asked the Ottoman Sultan for permission to move from his home because the plague had broken out in the house next to his.  The Sultan replied, “Is not the plague in my own palace, yet I do not think of moving?” (McNeill, 1976, p. 199).

In fact, in the Balkan peninsula, Muslim casualties were so great that the only way the Muslims managed to stay in power was through a steady stream of conversions (McNeill, 1976).  The Muslims in that area constituted a ruling class and often lived in cities, where disease is more common than in rural areas due to the higher density of people, while the lower classes stayed in their own faith (McNeill, 1976).  McNeill (1976) postulates that the 19th-century wars of independence throughout the Balkans by Christians, such as the Greeks, would not have been successful had the Muslim casualty rate not been so high in the 14th century.  In these same lands, however, were physicians seeking to explain the plague in natural terms and treat it with natural therapies, using the preserved medical texts of antiquity in well-organized hospitals (Sanders, et al., 2006).  These hospitals were forbidden from turning away anyone who wanted treatment but was unable to pay (Sanders, et al., 2006).  With the losses sustained due to the plague, the Muslims could not hold their tiny remaining piece of western Europe against the return of Christianity.

Christianity offered many advantages to its believers over non-believers, advantages that may have actually strengthened the Church and the faith during this time of pestilence.  Like the Muslims, Christians believed that death was not necessarily something to fear or fight, and Pope Clement VI granted forgiveness from penalty to the dying through their confessors, allowing believers to die with less burden on their souls (Sanders, et al., 2006).  Death would allow believers to go on to their eternal life with Christ while their enemies were sent into Hell (McNeill, 1976).  As with many of the salvational religions, this coping mechanism is very attractive during times of high mortality (McNeill, 1976).  In addition, care for the sick is a religious duty for Christians, and even basic care, such as providing food and water for those who cannot serve themselves, can help reduce the death count (McNeill, 1976).  With this care, survivors were more likely to feel thankful toward their Christian nurses, and thus Christianity strengthened while other faiths sputtered (McNeill, 1976).  Not everything about Christianity was a ring around the rosies, however.  Due to their increased contact with the sick and dying, many priests and monks also died of the plague, so many that there often were not enough priests to perform sacraments for the dead or dying (Getz, 1991).  Because the replacements for those priests were often less experienced or less trained, the public became even more upset with the Church, and the anticlericalism that stemmed from this discontent contributed to Martin Luther’s later success (McNeill, 1976).  In addition, the plague encouraged the shift from Latin to vernacular tongues, as well as mysticism and personal relationships with God, which all major branches of Christianity embraced (McNeill, 1976; Osborne, 1996).  Movements like the devotio moderna encouraged approaching God through “personal contemplation and an intimate relation with their own spirituality” rather than through the bureaucracy of the Church (Osborne, 1996, p. 217).  Unfortunately, there was another, more destructive variant of Christianity: the Flagellants.

The vast majority of information about Judaism during the Black Death comes from accounts of the persecution of Jews by the Flagellants or the peasantry.  The Flagellants were a Christian sect, deemed heretical by Pope Clement VI once he learned of it, whose members gathered in town squares to beat themselves and each other with weighted scourges, often tipped with iron that bit into the skin (Sanders, et al., 2006).  They believed that they were “proclaimers of a new time, that of the preparation for the end of the world” (Lerner, 1981, p. 535).  Not only did they inflict punishment upon themselves, but they often killed Jews they came across, as well as Christians who spoke out against them (Lerner, 1981).  Jews were often seen as the cause of the plague, possibly because they suffered less during its heightened stages.  One explanation could be that the Jews had removed all grain from their homes for Passover, during the peak season of plague, causing the plague-bearing rats to avoid their homes (McNeil, 2009).  Jews were accused of poisoning wells, streams, and food, and were sentenced to death across Christian Europe (Cohn, 2007; McNeill, 1976).  In Spain, it was the Catalans who took the brunt of persecution (Cohn, 2007).  While popular history states that the Jews were persecuted by peasants and the common rabble, close investigation shows that it was the aristocracy and nobles who created the atmosphere for wholesale murder.  Aristocrats were the most common clients of Jewish usurers, the ones with the power to authorize pogroms (organized, officially sanctioned massacres of Jews), and the ones who forgave debts owed to the now-dead Jews (Cohn, 2007).  Nobles would accuse Jews of spreading or causing the plague, capture them, and torture them until they confessed, and by the end, some two hundred Jewish settlements were utterly destroyed, their people thrown into the fire alive and their houses demolished (Cohn, 2007).
Jean de Venette reported that, to ensure their small children would not be captured and baptized into the Christian faith, Jewish mothers would first throw their children into the flames and then join them in death (Sanders, et al., 2006).  These attacks were most severe in German-speaking lands and recurred with later outbreaks of the plague (Cohn, 2007), possibly fueling the deep-seated anger and hatred that came to a head in the Holocaust of the twentieth century.  The attacks also accelerated the eastward shift of the Jewish population, as western Jews were killed and the survivors fled east into places like Poland (McNeill, 1976).  Though some attacks occurred in Poland, the royals there welcomed urban, skilled Jews, and a market-oriented agricultural society rose in the Vistula and Nieman valleys under Jewish management (McNeill, 1976).  While anti-Semitism began well before the Black Death, and continues today, the use of the Jews as a scapegoat for the plague highlights the challenges Judaism faced during these times.

The Black Death was a scourge across Eurasia and had no known cause at the time.  Doctors were powerless to stop or even slow the course of the disease, so the deeply religious people of the Middle Ages tended to look toward God for explanation and relief.  Common to Islam, Christianity, and Judaism was the belief that God (or Allah) brought the plague down on the people to punish them for their wicked ways.  Muslims suffered greatly due to their acceptance of “Allah’s will” and their refusal to try to avoid the plague, as in the case of the Ottoman Sultan.  Christians also suffered greatly, but their care for the sick and the remission of sins offered by the Pope brought inner strength to the people.  Finally, the Jews did not suffer as much from the plague itself, possibly because of the coincidental timing of Passover, but suffered greatly at the hands of Christians who accused them of poisoning wells.  All these factors led to western Europe becoming overwhelmingly Christian.  The Spanish would chase the Muslims from their land in the Reconquista, and the Jews shifted their population centers to eastern Europe.  Now that European countries no longer needed to argue about which God to follow, they would become embroiled in wars and infighting in the coming centuries over how to follow God, whether through Catholicism or the Protestant movements of Lutheranism and Calvinism.


Bentley, J. H., Ziegler, H.F., & Streets, H.E. (2008). Traditions and encounters: A brief global history. Ashford University edition.  Boston: The McGraw-Hill Companies.

Cohn, S. (2007). The Black Death and the burning of Jews. Past & Present 196, pp. 3-36. Retrieved February 5, 2012 from Project Muse.

Getz, F. M. (1991). Black death and the silver lining: Meaning, continuity, and revolutionary change in histories of medieval plague. Journal of the History of Biology 24(2), pp. 265-289. Retrieved January 22, 2012 from JSTOR.

Lerner, R. E. (1981, June). The Black Death and Western European eschatological mentalities. The American Historical Review 86(3), pp. 533-552. Retrieved February 5, 2012 from JSTOR.

McNeil, D. (2009, September 20). Laying blame for disease; Humans love to find a scapegoat for pandemics- Jews, Mexicans, pigs, storks, and even planets have been singled out- but the truth is that diseases are so complex that pointing blame is useless. Edmonton Journal, E.6. Retrieved February 5, 2012 from ProQuest.

McNeill, W. H.  (1976). Plagues and peoples. Garden City, NY: Anchor Press.

Osborne, R. (2006). Civilization: A new history of the western world.  New York: Pegasus Books.

Sanders, T., Nelson, S. H., Morillo, S. & Ellenberger, N. (2006).  Encounters in world history: Sources and themes from the global past: Vol. 1. To 1500.  Boston: The McGraw-Hill Companies.

02/06/2012 Posted by | College Papers | , , , , , , , , | Leave a comment

The Greco-Roman Mediterranean



The Mediterranean cultures of Greece and Rome have long been considered the foundations of the modern western world.  Their cultural traditions, political conventions, and philosophical, religious, ethical, and moral standards have survived since the days of Socrates, Plato, Solon, Caesar, and Augustus, and still play a role in our culture today.  The Greeks and Romans were part contemporary, part successive, and sometimes indistinguishable from one another, as seen in the spread of their cultures, the role of women in society, and the effect of social distinctions.  It must be said, however, that generalizations about the “Greeks” more often than not exclude the polis of Sparta, which differed notably in its treatment of women and its social distinctions.

The Mediterranean, throughout history, has “acted more as a bridge than a barrier, encouraging trade and social contact between the countries bordering it” (Gilmore, 1982, p. 177).  Both the Greeks and the Romans used the sea to spread their culture from one end to the other, but in different ways.  The Greeks were always able seamen, relying on maritime trade to supplement the meager harvests of their rocky terrain.  By the mid-eighth century BCE, the Greeks were colonizing areas along the Mediterranean coastline (Bentley, et al., 2008).  These colonies, while maintaining many Greek cultural practices and benefitting from Greek trade routes, were not part of any Greek kingdom or empire and were mostly left to their own devices.  The Mediterranean, on the other hand, served Rome as an invasion route against Carthage and many of its other conquests.  At the high point of the Roman Empire, a wide strip of Roman land entirely surrounded the Mediterranean Sea, which the Romans referred to as “mare nostrum (‘our sea’)” (Bentley, et al., 2008, p. 149).  While the Greeks were content to maintain cultural unity among independent city-states, or poleis, the Romans centralized power in a Republic, then an Empire, but allowed conquered peoples to keep much of their own culture.

Gender equality is both an ancient concept and a relatively new one.  In the days of hunters and gatherers, small family units are generally thought to have been egalitarian- all members of the family played a role in food production, so all members were considered equal.  With the rise of agriculture, men started doing the heavy outdoor work of tending the fields, and with it came power within the household and the culture.  Both the Greeks and Romans had strong patriarchal traditions, with the Greeks granting citizenship only to men and giving men total control of their families, including the legal ability to abandon children in the wilderness (Bentley, et al., 2008).  The only public position available to Greek women was priestess, and one of the very few exceptions is Sappho, the 6th century BCE poet, who was eventually ostracized for probable homosexuality- another example of something open to men but unacceptable for women (Bentley, et al., 2008).  However, Greek women did have some power within their households, and upper-class women were valued for their “pedigree” (Sanders, Nelson, Morillo & Ellenberger, 2006). In Rome, the pater familias was given great authority over his household, including the ability to sell family members into slavery or even execute them (Bentley, et al., 2008).  Roman women also had some power within the domestic sphere, which gradually extended to small shops and stalls, and by working around the laws regarding inheritance, women came to own a considerable amount of property by the third and second centuries BCE (Bentley, et al., 2008).

Finally, and most applicable to our modern world, is the effect of social distinctions.  Both the Greeks and Romans had an upper class of wealthy landowners and a lower class unhappy with its lot (Bentley, et al., 2008).  In both cases, the underprivileged threatened to revolt or secede, and in both cases, additional concessions were granted to help close the gap between rich and poor.  In Greece, the 6th century BCE statesman Solon brokered a compromise, allowing the aristocrats to keep their land but cancelling all debts, freeing those enslaved for debt, and outlawing debt slavery; eventually, statesmen were even paid so that “financial hardship would not exclude anyone” from holding office (Bentley, et al., 2008, p. 135).  In Rome, plebeians were granted the right to elect tribunes, then gained eligibility for almost all state offices, and eventually one of the consuls could even be elected from among the plebeians (Bentley, et al., 2008).  By the early third century BCE, plebeians held a majority in the Senate, giving the lower classes political power whose decisions bound the rest of the Romans (Bentley, et al., 2008).

The geographic proximity of Greece and Rome likely accounts for many of their similarities, while time and experimentation can account for many of their differences.  Even the term “Greco-Roman” points toward the inevitable comparison between these two cultural, political, and philosophical powerhouses.  From their use of the Mediterranean to spread their cultures to the treatment of women in their societies and the effect of social distinctions, one can see their similarities, their differences, and their effect on the western world.

Bentley, J. H., Ziegler, H.F., & Streets, H.E. (2008). Traditions and encounters: A brief global history. Ashford University edition.  Boston: The McGraw-Hill Companies.

Gilmore, D.  (1982).  Anthropology of the Mediterranean area.  Annual Review of Anthropology 11, pp. 175-205. Retrieved January 14, 2012 from JSTOR.

Nowak, B., & Laird, P. (2010). Cultural Anthropology (S. Wainwright & D. Moneypenny, Eds.).  Retrieved from http://content.ashford.edu/books

Sanders, T., Nelson, S. H., Morillo, S. & Ellenberger, N. (2006).  Encounters in world history: Sources and themes from the global past: Vol. 1. To 1500.  Boston: The McGraw-Hill Companies.

01/16/2012 Posted by | College Papers, Learning, Thinking | , , , , , , , , | Leave a comment

The Importance of Agriculture



Agriculture is one of the most important discoveries and inventions in human history.  Nearly everything in civilization relies on the transition from nomadic hunter-gatherer and pastoral lifestyles to permanent settlement based on plant and animal domestication.  Over the centuries, women collecting grains would probably have started learning the peculiarities of their “prey”- the regions, types of soil, and amount of wetness required for the best harvests.  They probably would have noticed that a spilled pile of grain turned into plants the next season.  Likewise, men on the hunt would have gathered knowledge about their prey- what it eats, where it goes, what sorts of needs it has, and its breeding habits.  With a likely combination of higher population and some sort of shortage or famine, they would have tried planting crops and perhaps herding animals away from their competitors.  Necessity is the mother of invention, and with more tribes fighting over the same resources, every edge counts.

Once horticulture (non-mechanized agriculture) caught on, people likely would have started using horses and cattle to pull plows and prepare fields.  Present-day horticultural peoples tend to migrate between their fields, but once agriculture became the norm, people settled in permanent villages.  They were then able to stockpile and accumulate items, food, and wealth, since they no longer had to carry everything they owned on their backs.  With accumulation comes hoarding, and when wealth was transferred from generation to generation within a family, social classes were born.  More food meant more people could be supported, so a population explosion occurred.  When people were no longer required to search constantly for their next meals, fewer people were needed to produce food, and with the surplus, non-producers could pursue other skills and occupations.

This surplus is mandatory for a culture to have monuments, temples, palaces, a standing military, or any art form more elaborate than cave painting.  Without the ability to support people who do not harvest their own food, none of these other pursuits is possible.  In addition, once a society grows beyond a single family tribe, it must have some method of settling conflicts and of maintaining a standing military force against invaders who want to steal the surplus.  This leads to the formation of government, whether a chiefdom, a kingdom, or an empire.  Agriculture allows civilization to exist.

Bentley, J. H., Ziegler, H. F., & Streets, H. E. (2008). Traditions & encounters: A brief global history (Ashford University ed.). Boston: The McGraw-Hill Companies.

Nowak, B., & Laird, P. (2010). Cultural Anthropology (S. Wainwright & D. Moneypenny, Eds.). Retrieved from https://content.ashford.edu/books/AUANT101.10.2/sections/ch00

01/02/2012 Posted by | College Papers, Learning | , , , , , | Leave a comment